Sample records for background field method

  1. Non-perturbative background field calculations

    NASA Astrophysics Data System (ADS)

    Stephens, C. R.

    1988-01-01

    New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.

  2. Magnetic imager and method

    DOEpatents

    Powell, J.; Reich, M.; Danby, G.

    1997-07-22

    A magnetic imager includes a generator for practicing a method of applying a background magnetic field over a concealed object, with the object being effective to locally perturb the background field. The imager also includes a sensor for measuring perturbations of the background field to detect the object. In one embodiment, the background field is applied quasi-statically. And, the magnitude or rate of change of the perturbations may be measured for determining location, size, and/or condition of the object. 25 figs.

  3. Magnetic imager and method

    DOEpatents

    Powell, James; Reich, Morris; Danby, Gordon

    1997-07-22

    A magnetic imager 10 includes a generator 18 for practicing a method of applying a background magnetic field over a concealed object, with the object being effective to locally perturb the background field. The imager 10 also includes a sensor 20 for measuring perturbations of the background field to detect the object. In one embodiment, the background field is applied quasi-statically. And, the magnitude or rate of change of the perturbations may be measured for determining location, size, and/or condition of the object.

  4. Limitations of the background field method applied to Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Nobili, Camilla; Otto, Felix

    2017-09-01

We consider Rayleigh-Bénard convection as modeled by the Boussinesq equations, in the case of infinite Prandtl number and with no-slip boundary conditions. There is broad interest in bounds on the upward heat flux, as given by the Nusselt number Nu, in terms of the forcing via the imposed temperature difference, as given by the Rayleigh number, in the turbulent regime Ra ≫ 1. In several studies, the background field method applied to the temperature field has been used to provide upper bounds on Nu in terms of Ra. In these applications, the background field method comes in the form of a variational problem where one optimizes a stratified temperature profile subject to a certain stability condition; the method is believed to capture the marginal stability of the boundary layer. The best available upper bound via this method is Nu ≲ Ra^{1/3} (ln Ra)^{1/15}; it proceeds via the construction of a stable temperature background profile that increases logarithmically in the bulk. In this paper, we show that the background temperature field method cannot provide a tighter upper bound in terms of the power of the logarithm. However, by another method, one does obtain the tighter upper bound Nu ≲ Ra^{1/3} (ln ln Ra)^{1/3}, so that the result of this paper implies that the background temperature field method is unphysical in the sense that it cannot provide the optimal bound.

  5. A novel background field removal method for MRI using projection onto dipole fields (PDF).

    PubMed

    Liu, Tian; Khalidov, Ildar; de Rochefort, Ludovic; Spincemaille, Pascal; Liu, Jing; Tsiouris, A John; Wang, Yi

    2011-11-01

    For optimal image quality in susceptibility-weighted imaging and accurate quantification of susceptibility, it is necessary to isolate the local field generated by local magnetic sources (such as iron) from the background field that arises from imperfect shimming and variations in magnetic susceptibility of surrounding tissues (including air). Previous background removal techniques have limited effectiveness depending on the accuracy of model assumptions or information input. In this article, we report an observation that the magnetic field for a dipole outside a given region of interest (ROI) is approximately orthogonal to the magnetic field of a dipole inside the ROI. Accordingly, we propose a nonparametric background field removal technique based on projection onto dipole fields (PDF). In this PDF technique, the background field inside an ROI is decomposed into a field originating from dipoles outside the ROI using the projection theorem in Hilbert space. This novel PDF background removal technique was validated on a numerical simulation and a phantom experiment and was applied in human brain imaging, demonstrating substantial improvement in background field removal compared with the commonly used high-pass filtering method. Copyright © 2011 John Wiley & Sons, Ltd.
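The projection step described above can be illustrated with a small numerical sketch: the field measured inside the ROI is fit, in the least-squares sense, by a linear combination of unit-dipole fields placed outside the ROI, and the residual is taken as the local field. Everything below (grid size, ROI radius, shell of candidate sources, source positions and amplitudes) is a hypothetical toy geometry, and the single least-squares solve merely stands in for the paper's Hilbert-space projection machinery.

```python
import numpy as np

ax = np.arange(-8.0, 8.0)                       # 16^3 voxel grid (toy)
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")
grid = np.stack([X, Y, Z], axis=-1).reshape(-1, 3)
r = np.linalg.norm(grid, axis=1)

def dipole_field(pos):
    """z-dipole kernel field sampled on the grid (zero at its own voxel)."""
    d = grid - np.asarray(pos, dtype=float)
    r2 = (d ** 2).sum(axis=1)
    out = np.zeros(len(grid))
    ok = r2 > 0
    out[ok] = (3.0 * d[ok, 2] ** 2 - r2[ok]) / (4.0 * np.pi * r2[ok] ** 2.5)
    return out

roi = r < 5.0                                    # region of interest: a ball
shell = (r >= 6.5) & (r < 7.5)                   # candidate background sources
sources = grid[shell][::3]                       # subsampled shell of dipoles

local_true = dipole_field((0.0, 0.0, 0.0))       # source inside the ROI
bg_true = 500.0 * dipole_field((0.0, 0.0, 10.0)) # strong source outside it
total = local_true + bg_true                     # "measured" total field

# Projection onto dipole fields: least-squares fit inside the ROI only.
A = np.stack([dipole_field(s)[roi] for s in sources], axis=1)
coef, *_ = np.linalg.lstsq(A, total[roi], rcond=None)
local_est = total[roi] - A @ coef                # residual = local-field estimate

err = np.linalg.norm(local_est - local_true[roi])
base = np.linalg.norm(bg_true[roi])              # error if nothing were removed
```

Because the background field is (approximately) in the span of the external-dipole basis while the local field is approximately orthogonal to it, subtracting the fit should leave an estimate much closer to the true local field than the raw measurement.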

  6. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain

    NASA Astrophysics Data System (ADS)

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C. M.; Chen, Zhong

    2017-08-01

Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest (e.g. the brain), such as the air-tissue interface. In the vicinity of air-tissue boundaries, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing local field information, and thus susceptibility measures, in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We use it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center, in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated by external sources can be effectively removed, yielding a more accurate estimate of the local field and thus a more accurate QSM dipole inversion for mapping local tissue susceptibility sources.
Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions are preserved in the brain mask. Shadow artifacts due to strong susceptibility variations in the derived QSM maps could also be largely eliminated using the R-SHARP method, leading to more accurate QSM reconstruction.

  7. Background field removal using a region adaptive kernel for quantitative susceptibility mapping of human brain.

    PubMed

    Fang, Jinsheng; Bao, Lijun; Li, Xu; van Zijl, Peter C M; Chen, Zhong

    2017-08-01

Background field removal is an important MR phase preprocessing step for quantitative susceptibility mapping (QSM). It separates the local field induced by tissue magnetic susceptibility sources from the background field generated by sources outside a region of interest (e.g. the brain), such as the air-tissue interface. In the vicinity of air-tissue boundaries, e.g. the skull and paranasal sinuses, where large susceptibility variations exist, present background field removal methods are usually insufficient, and these regions often need to be excluded by brain mask erosion at the expense of losing local field information, and thus susceptibility measures, in these regions. In this paper, we propose an extension to the variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP) background field removal method using a region adaptive kernel (R-SHARP), in which a scalable spherical Gaussian kernel (SGK) is employed with its kernel radius and weights adjustable according to an energy "functional" reflecting the magnitude of field variation. Such an energy functional is defined in terms of a contour and two fitting functions incorporating regularization terms, from which a curve evolution model in level set formulation is derived for energy minimization. We use it to detect regions with a large field gradient caused by strong susceptibility variation. In such regions, the SGK will have a small radius and high weight at the sphere center, in a manner adaptive to the voxel energy of the field perturbation. Using the proposed method, the background field generated by external sources can be effectively removed, yielding a more accurate estimate of the local field and thus a more accurate QSM dipole inversion for mapping local tissue susceptibility sources.
Numerical simulation, phantom and in vivo human brain data demonstrate improved performance of R-SHARP compared to V-SHARP and RESHARP (regularization enabled SHARP) methods, even when the whole paranasal sinus regions are preserved in the brain mask. Shadow artifacts due to strong susceptibility variations in the derived QSM maps could also be largely eliminated using the R-SHARP method, leading to more accurate QSM reconstruction. Copyright © 2017. Published by Elsevier Inc.

  8. Separation of foreground and background from light field using gradient information.

    PubMed

    Lee, Jae Young; Park, Rae-Hong

    2017-02-01

Studies of computer vision or machine vision applications using a light field camera have been increasing in recent years. However, the capabilities of the light field camera are not fully exploited in these applications. In this paper, we propose a method for direct separation of foreground and background that uses gradient information and can be employed in various applications as a pre-processing step. From the optical phenomenon whereby the bundles of rays from the background are flipped, we derive that the disparity sign of the background in the captured three-dimensional scene is opposite to that of the foreground. Using a majority-weighted voting algorithm based on the gradient information, with the Lambertian assumption and the gradient constraint, the foreground and background can be separated at each pixel. As a pre-processing step, the proposed method can serve various applications such as occlusion detection, saliency detection, and disparity estimation. Experimental results with the EPFL light field dataset and Stanford Lytro light field dataset show that the proposed method achieves better performance in terms of occlusion detection, and thus can be effectively used in pre-processing for saliency detection and disparity estimation.

  9. Measuring Extinction in Local Group Galaxies Using Background Galaxies

    NASA Astrophysics Data System (ADS)

    Wyder, T. K.; Hodge, P. W.

    1999-05-01

    Knowledge of the distribution and quantity of dust in galaxies is important for understanding their structure and evolution. The goal of our research is to measure the total extinction through Local Group galaxies using measured properties of background galaxies. Our method relies on the SExtractor software as an objective and automated method of detecting background galaxies. In an initial test, we have explored two WFPC2 fields in the SMC and two in M31 obtained from the HST archives. The two pointings in the SMC are fields around the open clusters L31 and B83 while the two M31 fields target the globular clusters G1 and G170. Except for the G1 observations of M31, the fields chosen are very crowded (even when observed with HST) and we chose them as a particularly stringent test of the method. We performed several experiments using a series of completeness tests that involved superimposing comparison fields, adjusted to the equivalent exposure time, from the HST Medium-Deep and Groth-Westphal surveys. These tests showed that for crowded fields, such as the two in the core of the SMC and the one in the bulge of M31, this automated method of detecting galaxies can be completely dominated by the effects of crowding. For these fields, only a small fraction of the added galaxies was recovered. However, in the outlying G1 field in M31, almost all of the added galaxies were recovered. The numbers of actual background galaxies in this field are consistent with zero extinction. As a follow-up experiment, we used image processing techniques to suppress stellar objects while enhancing objects with non-stellar, more gradual luminosity profiles. This method yielded significant numbers of background galaxies in even the most crowded fields, which we are now analyzing to determine the total extinction and reddening caused by the foreground galaxy.
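The completeness tests described above (superimposing artificial sources and counting how many are recovered) can be sketched numerically. The toy detector below, a smooth-threshold-label pipeline, merely stands in for SExtractor, and all numbers (image size, blob brightness, matching radius) are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(7)
img = rng.normal(0.0, 1.0, (128, 128))          # pure-noise "sky" frame

# Inject artificial galaxies (bright Gaussian blobs) at known positions.
yy, xx = np.mgrid[0:128, 0:128]
positions = [(y, x) for y in (24, 64, 104) for x in (24, 64, 104)]
for y0, x0 in positions:
    img += 20.0 * np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / (2 * 1.5 ** 2))

# Toy detector standing in for SExtractor: smooth, threshold, label.
sm = ndimage.gaussian_filter(img, 1.5)
labels, nsrc = ndimage.label(sm > 5.0 * np.std(sm))   # crude global threshold
centers = ndimage.center_of_mass(sm, labels, range(1, nsrc + 1))

# Completeness: fraction of injected sources matched within 3 pixels.
def matched(p):
    return any((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 < 9.0 for c in centers)

completeness = sum(matched(p) for p in positions) / len(positions)
```

In an uncrowded noise field like this one, essentially all injected sources are recovered; adding a dense stellar foreground before detection is the analogue of the crowded SMC and M31 bulge fields, where the recovered fraction collapses.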

  10. General heat kernel coefficients for massless free spin-3/2 Rarita-Schwinger field

    NASA Astrophysics Data System (ADS)

    Karan, Sudip; Kumar, Shashank; Panda, Binata

    2018-04-01

We review the general heat kernel method for the Dirac spinor field as an elementary example in an arbitrary background. We then compute the first three Seeley-DeWitt coefficients for the massless free spin-3/2 Rarita-Schwinger field without imposing any limitations on the background geometry.

  11. Background field removal technique based on non-regularized variable kernels sophisticated harmonic artifact reduction for phase data for quantitative susceptibility mapping.

    PubMed

    Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2018-06-11

    We developed a non-regularized, variable kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and save cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In a numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout a digital whole brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field could sufficiently achieve minimal boundary losses and phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.
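The SMV kernels at the heart of SHARP-type methods exploit the mean value property of harmonic functions: a background field generated by sources outside the region of interest is harmonic inside it and therefore equals its spherical mean, so applying (delta − SMV) annihilates it while leaving non-harmonic local fields visible. The 2D toy below (disk kernel, linear "background", Gaussian "local" field) only illustrates that property; the deconvolution and kernel-size variation of the actual V-SHARP/NR-VSHARP pipeline are omitted.

```python
import numpy as np
from scipy import ndimage

# Disk ("spherical" in 2D) mean-value kernel of radius 3.
ky, kx = np.mgrid[-3:4, -3:4]
disk = (ky ** 2 + kx ** 2 <= 9).astype(float)
disk /= disk.sum()

n = 64
y, x = np.mgrid[0:n, 0:n].astype(float)
background = 0.02 * x + 0.01 * y                 # linear, hence harmonic
local = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 2.0 ** 2))
field = background + local

# Apply (delta - SMV); only the interior (a kernel radius away from the
# edge, here with margin) is meaningful, as in SHARP's eroded mask.
resid = field - ndimage.convolve(field, disk, mode="nearest")
resid_bg = background - ndimage.convolve(background, disk, mode="nearest")
interior = (slice(6, -6), slice(6, -6))
```

Inside the interior, the harmonic background is removed to numerical precision, while the local Gaussian field survives with an O(1) residual that the subsequent SHARP deconvolution would restore to its full amplitude.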

  12. Spectral characterization of natural backgrounds

    NASA Astrophysics Data System (ADS)

    Winkelmann, Max

    2017-10-01

As the distribution and use of hyperspectral sensors constantly increase, the exploitation of spectral features is a threat to camouflaged objects. To improve camouflage materials, the spectral behavior of backgrounds must first be known so that the spectral reflectance of camouflage materials can be adjusted and optimized. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method for how this characterization of backgrounds should be done. It is obvious that the spectral characterization of a background will be quite an effort. To compare and exchange data internationally, the measurements will have to be done in a similar way. To test and further improve this method, an international field trial has been performed in Storkow, Germany. In the following, we present first impressions and lessons learned from this field campaign and describe the data that have been measured.

  13. Nonlocal interactions in color perception: nonlinear processing of chromatic signals from remote inducers.

    PubMed

    Wachtler, T; Albright, T D; Sejnowski, T J

    2001-05-01

    The perceived color of an object depends on the chromaticity of its immediate background. But color appearance is also influenced by remote chromaticities. To quantify these influences, the effects of remote color fields on the appearance of a fixated 2 degrees test field were measured using a forced-choice method. Changes in the appearance of the test field were induced by chromaticity changes of the background and of 2 degrees color fields not adjacent to the test field. The appearance changes induced by the color of the background corresponded to a fraction of between 0.5 and 0.95 of the cone contrast of the background change, depending on the observer. The magnitude of induction by the background color was modulated on average by 7.6% by chromaticity changes in the remote color fields. Chromaticity changes in the remote fields had virtually no inducing effect when they occurred without a change in background color. The spatial range of these chromatic interactions extended over at least 10 degrees from the fovea. They were established within the first few hundred milliseconds after the change of background color and depended only weakly on the number of inducing fields. These results may be interpreted as reflecting rapid chromatic interactions that support robustness of color vision under changing viewing conditions.

  14. Influence of magnetic field configuration on magnetohydrodynamic waves in Earth's core

    NASA Astrophysics Data System (ADS)

    Knezek, Nicholas; Buffett, Bruce

    2018-04-01

    We develop a numerical model to study magnetohydrodynamic waves in a thin layer of stratified fluid near the surface of Earth's core. Past studies have been limited to using simple background magnetic field configurations. However, the choice of field distribution can dramatically affect the structure and frequency of the waves. To permit a more general treatment of background magnetic field and layer stratification, we combine finite volume and Fourier methods to describe the wave motions. We validate our model by comparisons to previous studies and examine the influence of background magnetic field configuration on two types of magnetohydrodynamic waves. We show that the structure of zonal Magnetic-Archimedes-Coriolis (MAC) waves for a dipole background field is unstable to small perturbations of the field strength in the equatorial region. Modifications to the wave structures are computed for a range of field configurations. In addition, we show that non-zonal MAC waves are trapped near the equator for realistic magnetic field distributions, and that their latitudinal extent depends upon the distribution of magnetic field strength at the CMB.

  15. 3D Inversion of Natural Source Electromagnetics

    NASA Astrophysics Data System (ADS)

    Holtham, E. M.; Oldenburg, D. W.

    2010-12-01

    The superior depth of investigation of natural source electromagnetic techniques makes these methods excellent candidates for crustal studies as well as for mining and hydrocarbon exploration. The traditional natural source method, the magnetotelluric (MT) technique, has practical limitations because the surveys are costly and time consuming due to the labor intensive nature of ground based surveys. In an effort to continue to use the penetration advantage of natural sources, it has long been recognized that tipper data, the ratio of the local vertical magnetic field to the horizontal magnetic field, provide information about 3D electrical conductivity structure. It was this understanding that prompted the development of AFMAG (Audio Frequency Magnetics) and recently the new airborne Z-Axis Tipper Electromagnetic Technique (ZTEM). In ZTEM, the vertical component of the magnetic field is recorded above the entire survey area, while the horizontal fields are recorded at a ground-based reference station. MT processing techniques yield frequency domain transfer functions typically between 30-720 Hz that relate the vertical fields over the survey area to the horizontal fields at the reference station. The result is a cost effective procedure for collecting natural source EM data and for finding large scale targets at moderate depths. It is well known however that 1D layered structures produce zero vertical magnetic fields and thus ZTEM data cannot recover such background conductivities. This is in sharp contrast to the MT technique where electric fields are measured and a 1D background conductivity can be recovered from the off diagonal elements of the impedance tensor. While 1D models produce no vertical fields, two and three dimensional structures will produce anomalous currents and a ZTEM response. For such models the background conductivity structure does affect the data. 
In general however, the ZTEM data have weak sensitivity to the background conductivity and while we show that it is possible to obtain the background structure by inverting the ZTEM data alone, it is desirable to obtain robust background conductivity information from other sources. This information could come from a priori geologic and petrophysical information or from additional geophysical data such as MT. To counter the costly nature of large MT surveys and the limited sensitivity of the ZTEM technique to the background conductivity we show that an effective method is to collect and invert both MT and ZTEM data. A sparse MT survey grid can gather information about the background conductivity and deep structures while keeping the survey costs affordable. Higher spatial resolution at moderate depths can be obtained by flying multiple lines of ZTEM data.

  16. Background field Landau mode operators for the nucleon

    NASA Astrophysics Data System (ADS)

    Kamleh, Waseem; Bignell, Ryan; Leinweber, Derek B.; Burkardt, Matthias

    2018-03-01

The introduction of a uniform background magnetic field breaks three-dimensional spatial symmetry for a charged particle and introduces Landau mode effects. Standard quark operators are inefficient at isolating the nucleon correlation function at nontrivial field strengths. We introduce novel quark operators constructed from the two-dimensional Laplacian eigenmodes that describe a charged particle on a finite lattice. These eigenmode-projected quark operators provide enhanced precision for calculating nucleon energy shifts in a magnetic field. Preliminary results are obtained for the neutron and proton magnetic polarisabilities using these methods.
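The two-dimensional Laplacian eigenmodes referred to above can be sketched numerically: on a periodic lattice, a uniform field enters through Peierls link phases with quantized total flux, and the lowest eigenmodes of the gauged Laplacian are the lattice Landau modes. The gauge and boundary-twist convention below is one common choice, not necessarily the one used in the paper, and the lattice size and flux are arbitrary.

```python
import numpy as np

N, nb = 12, 3                        # lattice extent, number of flux quanta
B = 2 * np.pi * nb / N**2            # quantized uniform field strength

def idx(x, y):
    return (x % N) * N + (y % N)

# Gauged 2D lattice Laplacian with Peierls phases (Landau-like gauge).
H = np.zeros((N * N, N * N), dtype=complex)
for x in range(N):
    for y in range(N):
        i = idx(x, y)
        H[i, i] = 4.0                                        # diagonal of -Laplacian
        ux = np.exp(-1j * B * N * y) if x == N - 1 else 1.0  # boundary twist in x
        uy = np.exp(1j * B * x)                              # x-dependent y-link
        for j, u in ((idx(x + 1, y), ux), (idx(x, y + 1), uy)):
            H[i, j] -= u
            H[j, i] -= np.conj(u)

evals = np.linalg.eigvalsh(H)        # ascending, real (H is Hermitian)
```

With this construction every plaquette carries the same flux B, the zero mode of the free Laplacian is lifted, and the lowest nb eigenvalues form a nearly degenerate lowest Landau band separated by a clear gap from the rest of the spectrum; projecting quark fields onto these modes is the idea behind the eigenmode-projected operators.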

  17. Andromeda (M31) optical and infrared disk survey. I. Insights in wide-field near-IR surface photometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sick, Jonathan; Courteau, Stéphane; Cuillandre, Jean-Charles

We present wide-field near-infrared J and Ks images of the Andromeda Galaxy (M31) taken with WIRCam at the Canada-France-Hawaii Telescope as part of the Andromeda Optical and Infrared Disk Survey. This data set allows simultaneous observations of resolved stars and near-infrared (NIR) surface brightness across M31's entire bulge and disk (within R = 22 kpc), permitting a direct test of the stellar composition of near-infrared light in a nearby galaxy. Here we develop NIR observation and reduction methods to recover a uniform surface brightness map across the 3° × 1° disk of M31 with 27 WIRCam fields. Two sky-target nodding strategies are tested, and we find that strictly minimizing sky sampling latency cannot improve background subtraction accuracy to better than 2% of the background level due to spatio-temporal variations in the NIR skyglow. We fully describe our WIRCam reduction pipeline and advocate using flats built from night-sky images over a single night, rather than dome flats that do not capture the WIRCam illumination field. Contamination from scattered light and thermal background in sky flats has a negligible effect on the surface brightness shape compared to the stochastic differences in background shape between sky and galaxy disk fields, which are ∼0.3% of the background level. The most dramatic calibration step is the introduction of scalar sky offsets to each image that optimize surface brightness continuity. Sky offsets reduce the mean surface brightness difference between observation blocks from 1% to <0.1% of the background level, though the absolute background level remains statistically uncertain to 0.15% of the background level. We present our WIRCam reduction pipeline and performance analysis to give specific recommendations for the improvement of NIR wide-field imaging methods.

  18. Measurement of volatile plant compounds in field ambient air by thermal desorption-gas chromatography-mass spectrometry.

    PubMed

    Cai, Xiao-Ming; Xu, Xiu-Xiu; Bian, Lei; Luo, Zong-Xiu; Chen, Zong-Mao

    2015-12-01

    Determination of volatile plant compounds in field ambient air is important to understand chemical communication between plants and insects and will aid the development of semiochemicals from plants for pest control. In this study, a thermal desorption-gas chromatography-mass spectrometry (TD-GC-MS) method was developed to measure ultra-trace levels of volatile plant compounds in field ambient air. The desorption parameters of TD, including sorbent tube material, tube desorption temperature, desorption time, and cold trap temperature, were selected and optimized. In GC-MS analysis, the selected ion monitoring mode was used for enhanced sensitivity and selectivity. This method was sufficiently sensitive to detect part-per-trillion levels of volatile plant compounds in field ambient air. Laboratory and field evaluation revealed that the method presented high precision and accuracy. Field studies indicated that the background odor of tea plantations contained some common volatile plant compounds, such as (Z)-3-hexenol, methyl salicylate, and (E)-ocimene, at concentrations ranging from 1 to 3400 ng m(-3). In addition, the background odor in summer was more abundant in quality and quantity than in autumn. Relative to previous methods, the TD-GC-MS method is more sensitive, permitting accurate qualitative and quantitative measurements of volatile plant compounds in field ambient air.

  19. Background oriented schlieren in a density stratified fluid.

    PubMed

    Verso, Lilly; Liberzon, Alex

    2015-10-01

Non-intrusive quantitative fluid density measurement methods are essential in stratified flow experiments. Digital imaging has led to synthetic schlieren methods, in which the variations of the index of refraction are reconstructed computationally. In this study, an extension to one of these methods, called background oriented schlieren, is proposed. The extension enables an accurate reconstruction of the density field in stratified liquid experiments. Typically, the experiments are performed with the light source, background pattern, and camera positioned on opposite sides of a transparent vessel. The multimedia imaging through air-glass-water-glass-air leads to an additional aberration that destroys the reconstruction. A two-step calibration and an image remapping transform are the key components that correct the images through the stratified media and provide non-intrusive full-field density measurements of transparent liquids.
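At the core of any background oriented schlieren setup is the measurement of the apparent displacement of the background pattern between a reference image and an image seen through the refracting medium. As a minimal sketch of that step only (the calibration, remapping, and density inversion are not modeled), the toy below recovers a known integer-pixel shift of a random pattern by FFT phase correlation, one standard stand-in for the cross-correlation or optical-flow stage.

```python
import numpy as np

rng = np.random.default_rng(1)
ref = rng.random((64, 64))                    # reference background pattern
true_shift = (3, 5)                           # apparent displacement (pixels)
img = np.roll(ref, true_shift, axis=(0, 1))   # toy "refracted" image: uniform shift

# Phase correlation: the normalized cross-power spectrum peaks at the shift.
F = np.fft.fft2(img) * np.conj(np.fft.fft2(ref))
corr = np.fft.ifft2(F / np.abs(F)).real
dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
```

In a real experiment the displacement field varies across the image and is measured in small interrogation windows with sub-pixel refinement; integrating it then yields the refractive index, and hence density, field.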

  20. Nuclear Quadrupole Resonance (NQR) Method and Probe for Generating RF Magnetic Fields in Different Directions to Distinguish NQR from Acoustic Ringing Induced in a Sample

    DTIC Science & Technology

    1997-08-01

TITLE OF THE INVENTION: NUCLEAR QUADRUPOLE RESONANCE (NQR) METHOD AND PROBE FOR GENERATING RF MAGNETIC FIELDS IN DIFFERENT DIRECTIONS TO DISTINGUISH NQR FROM ACOUSTIC RINGING INDUCED IN A SAMPLE. BACKGROUND OF THE INVENTION: 1. Field of the Invention. The present invention relates to a nuclear quadrupole resonance (NQR) method and probe for generating RF magnetic fields in different directions towards a sample.

  1. Background oriented schlieren measurement of the refractive index field of air induced by a hot, cylindrical measurement object.

    PubMed

    Beermann, Rüdiger; Quentin, Lorenz; Pösch, Andreas; Reithmeier, Eduard; Kästner, Markus

    2017-05-10

To optically capture the topography of a hot measurement object with high precision, the light deflection caused by the inhomogeneous refractive index field, which is induced by the heat transfer from the measurement object to the ambient medium, has to be considered. We used the 2D background oriented schlieren method with an illuminated wavelet background, an optical flow algorithm, and Ciddor's equation to quantify the refractive index field located directly above a red-glowing, hot measurement object. A heat transfer simulation was implemented to verify the magnitude and shape of the measured refractive index field. Provided that no forced external flow disturbs the shape of the convective flow originating from the hot object, a laminar flow can be observed directly above the object, resulting in a sharply bounded, inhomogeneous refractive index field.

  2. US Fish and Wildlife Service biomonitoring operations manual, Appendices A--K

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gianotto, D.F.; Rope, R.C.; Mondecar, M.

    1993-04-01

Volume 2 contains Appendices and Summary Sheets for the following areas: A-Legislative Background and Key to Relevant Legislation, B-Biomonitoring Operations Workbook, C-Air Monitoring, D-Introduction to the Flora and Fauna for Biomonitoring, E-Decontamination Guidance Reference Field Methods, F-Documentation Guidance, Sample Handling, and Quality Assurance/Quality Control Standard Operating Procedures, G-Field Instrument Measurements Reference Field Methods, H-Ground Water Sampling Reference Field Methods, I-Sediment Sampling Reference Field Methods, J-Soil Sampling Reference Field Methods, K-Surface Water Reference Field Methods. Appendix B explains how to set up a strategy to enter information in the "disk workbook". Appendix B is enhanced by DE97006389, an on-line workbook for users to make revisions to their own biomonitoring data.

  3. Background feature descriptor for offline handwritten numeral recognition

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Wang, Hao; Tian, Tian; Jie, Feiran; Lei, Bo

    2011-11-01

    This paper puts forward an offline handwritten numeral recognition method based on a background structural descriptor (a sixteen-value numerical background expression). By encoding the background pixels of the image according to a fixed rule, 16 distinct feature values are generated that capture the background condition around every digit and thus the structural features of the digits. Through a pattern-language description of images in terms of these features, automatic segmentation of overlapping digits and numeral recognition can be realized. The method offers strong resistance to deformation, high recognition speed, and easy implementation. Finally, experimental results and conclusions are presented. Recognition results on datasets from various practical application fields show that the method achieves a good recognition effect.
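
    The abstract does not spell out the coding rule, so the sketch below assumes one plausible sixteen-value scheme: each background pixel receives a 4-bit code recording whether a stroke (foreground) pixel is encountered when scanning up, down, left, and right from it, giving 2^4 = 16 values that describe how the digit encloses the background:

```python
import numpy as np

# Illustrative sketch only: the paper's exact coding rule is not given in the
# abstract, so we assume a hypothetical 16-value scheme (4-bit directional
# enclosure code per background pixel).

def background_code(img):
    """img: 2D array, nonzero = stroke. Returns a 0..15 code per background pixel."""
    h, w = img.shape
    code = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            if img[y, x]:
                continue  # stroke pixels are not coded
            bits = 0
            if img[:y, x].any():      bits |= 1  # stroke above
            if img[y + 1:, x].any():  bits |= 2  # stroke below
            if img[y, :x].any():      bits |= 4  # stroke to the left
            if img[y, x + 1:].any():  bits |= 8  # stroke to the right
            code[y, x] = bits
    return code

# A hollow "0": the interior background pixel is enclosed in all 4 directions.
digit = np.zeros((5, 5), dtype=np.uint8)
digit[1:4, 1:4] = 1
digit[2, 2] = 0
print(background_code(digit)[2, 2])  # -> 15 (enclosed on all sides)
```
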

  4. Spectral methods for coupled channels with a mass gap

    NASA Astrophysics Data System (ADS)

    Weigel, H.; Quandt, M.; Graham, N.

    2018-02-01

    We develop a method to compute the vacuum polarization energy for coupled scalar fields with different masses scattering off a background potential in one space dimension. As an example we consider the vacuum polarization energy of a kinklike soliton built from two real scalar fields with different mass parameters.

  5. The effect of different methods to compute N on estimates of mixing in stratified flows

    NASA Astrophysics Data System (ADS)

    Fringer, Oliver; Arthur, Robert; Venayagamoorthy, Subhas; Koseff, Jeffrey

    2017-11-01

    The background stratification is typically well defined in idealized numerical models of stratified flows, but it is more difficult to define in observations. This has important ramifications for estimates of mixing, which rely on knowledge of the background stratification against which turbulence must work to mix the density field. Using direct numerical simulation data of breaking internal waves on slopes, we demonstrate a discrepancy in ocean mixing estimates that depends on how the background stratification is computed. Two common methods are employed to calculate the buoyancy frequency N: one based on a three-dimensionally resorted density field (often used in numerical models) and one based on a locally resorted vertical density profile (often used in the field). We show that how N is calculated has a significant effect on the flux Richardson number Rf, which is often used to parameterize turbulent mixing, and on the turbulence activity number Gi, leading to errors when estimating the mixing efficiency using Gi-based parameterizations. Supported by ONR Grant N00014-08-1-0904 and LLNL Contract DE-AC52-07NA27344.
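
    The two definitions of N can be illustrated in a toy Boussinesq setting (assumptions: g = 9.81 m/s^2, a constant reference density rho0, a uniform grid; the resorting scheme below is a generic adiabatic resorting, not necessarily the authors' exact implementation):

```python
import numpy as np

g, rho0 = 9.81, 1000.0  # assumed gravity and reference density

def n_squared_from_sorted(rho, dz):
    """N^2 from the whole 2D field adiabatically resorted into a stable profile."""
    nx, nz = rho.shape
    rho_sorted = np.sort(rho.ravel())[::-1]          # heaviest parcels first
    rho_b = rho_sorted.reshape(nz, nx).mean(axis=1)  # fill levels bottom to top
    return -(g / rho0) * np.gradient(rho_b, dz)

def n_squared_from_local_profile(rho, dz, i):
    """N^2 from a single locally resorted vertical column, as from one cast."""
    rho_b = np.sort(rho[i, :])[::-1]
    return -(g / rho0) * np.gradient(rho_b, dz)

# Linearly stratified tank with one overturning event in the first column.
nx, nz, dz = 4, 50, 0.1
z = np.arange(nz) * dz                  # z increases upward
rho = rho0 - 1.0 * z + np.zeros((nx, 1))
rho[0, 10:20] = rho[0, 10:20][::-1].copy()   # local overturn
print(n_squared_from_sorted(rho, dz).mean())
```

Because resorting removes the overturn, both estimates recover the linear background here; they diverge once the local profile is not statically sorted before differencing.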

  6. Radiative improvement of the lattice nonrelativistic QCD action using the background field method and application to the hyperfine splitting of quarkonium states.

    PubMed

    Hammant, T C; Hart, A G; von Hippel, G M; Horgan, R R; Monahan, C J

    2011-09-09

    We present the first application of the background field method to nonrelativistic QCD (NRQCD) on the lattice in order to determine the one-loop radiative corrections to the coefficients of the NRQCD action in a manifestly gauge-covariant manner. The coefficients of the σ·B term in the NRQCD action and the four-fermion spin-spin interaction are computed at the one-loop level; the resulting shift of the hyperfine splitting of bottomonium is found to bring the lattice predictions in line with experiment.

  7. Data Friction Meets Social Friction: Challenges for standardization in emerging fields of geoscience

    NASA Astrophysics Data System (ADS)

    Darch, P. T.

    2017-12-01

    Many interdisciplinary endeavors in the geosciences occur in emergent scientific fields. These fields are often characterized by heterogeneity of methods for production and collection of data, and by data scarcity. This paper presents findings about processes of methods standardization from a long-term case study of an emergent, data-scarce field, the deep subseafloor biosphere. Researchers come from many physical and life science backgrounds to study interactions between microbial life in the seafloor and the physical environment it inhabits. Standardization of methods for collecting data promises multiple benefits to this field, including: addressing data scarcity by enabling greater data reuse and promoting better interoperability with large-scale infrastructures; and fostering stronger collaborative links between researchers distributed across institutions and backgrounds. Ongoing standardization efforts in the field involve more than scientific judgments about which among a range of methods is most efficient, least biased, or most reliable. These efforts also encounter multiple difficult social challenges, including: lack of agreed-upon criteria for judging competing methods (should efficiency, bias, or reliability take priority?); lack of resources to carry out the work necessary to determine standards, which is particularly acute in emergent fields; concerns that standardization is premature in such a new field, foreclosing the possibility of better methods being developed in the future; concerns that standardization could prematurely shut down important scientific debates; and concerns among some researchers that their own work may become obsolete should the methods chosen as standard differ from their own. The success of these standardization efforts will depend on addressing both scientific and social dimensions, to ensure widespread acceptance among researchers in the field.

  8. A beam hardening and dispersion correction for x-ray dark-field radiography.

    PubMed

    Pelzer, Georg; Anton, Gisela; Horn, Florian; Rieger, Jens; Ritter, André; Wandner, Johannes; Weber, Thomas; Michel, Thilo

    2016-06-01

    X-ray dark-field imaging promises information on the small-angle scattering properties even of large samples. However, the dark-field image is correlated with the object's attenuation and phase shift if a polychromatic x-ray spectrum is used. A method to remove part of these correlations is proposed. The experimental setup for image acquisition was modeled in a wave-field simulation to quantify the dark-field signals originating solely from a material's attenuation and phase shift. A calibration matrix was simulated for ICRU46 breast tissue. Using the simulated data, a dark-field image of a human mastectomy sample was corrected for the fingerprint of the attenuation and phase images. Comparing the simulated, attenuation-based dark-field values to a phantom measurement, a good agreement was found. Applying the proposed method to mammographic dark-field data, a reduction of the dark-field background and anatomical noise was achieved. The contrast between microcalcifications and their surrounding background was increased. The authors show that the influence of beam hardening and dispersion can be quantified by simulation and, thus, measured image data can be corrected. The simulation allows one to determine the corresponding dark-field artifacts for a wide range of setup parameters, such as tube voltage and filtration. The application of the proposed method to mammographic dark-field data shows an increase in contrast compared to the original image, which might simplify a further image-based diagnosis.

  9. Detection and correction of laser induced breakdown spectroscopy spectral background based on spline interpolation method

    NASA Astrophysics Data System (ADS)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and the experimental environment. This continuous background significantly influences the analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting, and model-free methods; however, few of these have been applied in LIBS, particularly in qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum, exploiting its smoothness. In a background correction simulation, the spline interpolation method achieved the largest signal-to-background ratio (SBR) of all methods tested: the SBR before background correction was 10.0992, whereas the SBR values after background correction by spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method were 26.9576, 24.6828, 18.9770, and 25.6273, respectively. After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method still achieved a large SBR value, whereas polynomial fitting and the model-free method yielded low SBR values. All of the background correction methods improved the quantitative results for Cu: the linear correlation coefficient before background correction was 0.9776, and after background correction using spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method it was 0.9998, 0.9915, 0.9895, and 0.9940, respectively. The proposed spline interpolation method exhibits better linear correlation and smaller error in the quantitative analysis of Cu than polynomial fitting, Lorentz fitting, and the model-free method. The simulation and quantitative experimental results show that the spline interpolation method can effectively detect and correct the continuous background.
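
    The baseline idea can be sketched with a dependency-light stand-in (piecewise-linear np.interp instead of the paper's spline; scipy.interpolate.CubicSpline would follow the paper more closely; the anchor rule, test spectrum, and SBR definition below are illustrative assumptions): anchor points are taken at local minima of the spectrum, assumed to sample the smooth continuum between emission lines.

```python
import numpy as np

def estimate_background(wavelength, intensity):
    """Interpolate a baseline through local minima (assumed background samples)."""
    y = intensity
    is_min = (y[1:-1] <= y[:-2]) & (y[1:-1] <= y[2:])
    idx = np.concatenate(([0], np.flatnonzero(is_min) + 1, [len(y) - 1]))
    return np.interp(wavelength, wavelength[idx], y[idx])

def signal_to_background_ratio(intensity, background):
    corrected = intensity - background
    return corrected.max() / max(background.mean(), 1e-12)

# Synthetic spectrum: two narrow Gaussian emission lines on a sloped continuum.
wl = np.linspace(400.0, 410.0, 1001)
continuum = 50.0 + 2.0 * (wl - 400.0)
lines = (500.0 * np.exp(-0.5 * ((wl - 403.0) / 0.05) ** 2)
         + 300.0 * np.exp(-0.5 * ((wl - 407.0) / 0.05) ** 2))
spectrum = continuum + lines
bg = estimate_background(wl, spectrum)
print(signal_to_background_ratio(spectrum, bg))
```
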

  10. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

    Calcium imaging has become a routine technique in neuroscience for subcellular to network-level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would benefit most of the calcium imaging research field. A background-subtracted fluorescence transient estimation method that does not require any independent background measurement is therefore developed. The method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when the background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. The method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Improvements in Technique of NMR Imaging and NMR Diffusion Measurements in the Presence of Background Gradients.

    NASA Astrophysics Data System (ADS)

    Lian, Jianyu

    In this work, a modification of the cosine current distribution rf coil, PCOS, has been introduced and tested. The coil produces a very homogeneous rf magnetic field, and it is inexpensive to build and easy to tune to multiple resonance frequencies. The geometrical parameters of the coil are optimized to produce the most homogeneous rf field over a large volume. To avoid rf field distortion when the coil length is comparable to a quarter wavelength, a parallel PCOS coil is proposed and discussed. For testing rf coils and correcting B_1 in NMR experiments, a simple, rugged and accurate NMR rf field mapping technique has been developed. The method has been tested and used in 1D, 2D, 3D and in vivo rf mapping experiments, and has proven very useful in the design of rf coils. To preserve the linear relation between the rf output applied to an rf coil and the modulating input for an rf modulating-amplifying system of an NMR imaging spectrometer, a quadrature feedback loop is employed in an rf modulator with two orthogonal rf channels to correct the amplitude and phase non-linearities caused by the rf components in the rf system. The modulator is very linear over a large range and can generate an arbitrary rf shape. A diffusion imaging sequence has been developed for measuring and imaging diffusion in the presence of background gradients. Cross terms between the diffusion sensitizing gradients and background gradients or imaging gradients can complicate diffusion measurement and make the interpretation of NMR diffusion data ambiguous, but these have been eliminated in this method. Further, the background gradients have been measured and imaged. A dipole random distribution model has been established to study the background magnetic fields ΔB and background magnetic gradients G_0 produced by small particles in a sample when it is in a B_0 field. 
    From this model, the minimum distance that a spin can approach a particle can be determined by measuring ⟨ΔB^2⟩ and ⟨G_0^2⟩. Also from this model, the particle concentration in a sample can be determined by measuring the lineshape of a free induction decay (FID).

  12. A comparison of a two-dimensional variational analysis method and a median filter for NSCAT ambiguity removal

    NASA Astrophysics Data System (ADS)

    Henderson, J. M.; Hoffman, R. N.; Leidner, S. M.; Atlas, R.; Brin, E.; Ardizzone, J. V.

    2003-06-01

    The ocean surface vector wind can be measured from space by scatterometers. For a set of measurements observed from several viewing directions and collocated in space and time, there will usually exist two, three, or four consistent wind vectors. These multiple wind solutions are known as ambiguities. Ambiguity removal procedures select one ambiguity at each location. We compare results of two different ambiguity removal algorithms: the operational median filter (MF) used by the Jet Propulsion Laboratory (JPL) and a two-dimensional variational analysis method (2d-VAR). We applied 2d-VAR to the entire NASA Scatterometer (NSCAT) mission, orbit by orbit, using European Centre for Medium-Range Weather Forecasts (ECMWF) 10-m wind analyses as background fields. We also applied 2d-VAR to a 51-day subset of the NSCAT mission using National Centers for Environmental Prediction (NCEP) 1000-hPa wind analyses as background fields; this second data set uses the same background fields as the MF data set. When both methods use the same NCEP background fields as a starting point for ambiguity removal, agreement is very good: only about 3% of the wind vector cells (WVCs) have different ambiguity selections. However, most of the WVCs with changes occur in coherent patches. Since at least one of the selections is in error, this implies that errors due to ambiguity selection are not isolated but horizontally correlated. When we examine ambiguity selection differences at synoptic scales, we often find that the 2d-VAR selections are more meteorologically reasonable and more consistent with cloud imagery.
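
    A minimal sketch (invented numbers) of the initialization step both schemes share: at each wind vector cell, pick the ambiguity closest to the NWP background wind. The MF then iterates a median filter over this field, while 2d-VAR minimizes a variational cost function; both refinements are omitted here.

```python
import numpy as np

def select_ambiguities(ambiguities, background):
    """ambiguities: (ncells, nambig, 2) candidate (u, v); background: (ncells, 2).
    Returns, per cell, the ambiguity nearest the background wind vector."""
    d2 = np.sum((ambiguities - background[:, None, :]) ** 2, axis=-1)
    best = np.argmin(d2, axis=1)
    return ambiguities[np.arange(len(best)), best]

amb = np.array([[[5.0, 0.5], [-5.0, -0.5]],   # cell 1: two nearly opposite solutions
                [[0.0, 8.0], [0.0, -8.0]]])   # cell 2
bg_wind = np.array([[4.0, 1.0], [1.0, -7.0]])
print(select_ambiguities(amb, bg_wind))       # picks [5, 0.5] and [0, -8]
```
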

  13. Heterodyne effect in Hybrid CARS

    NASA Astrophysics Data System (ADS)

    Wang, Xi; Zhang, Aihua; Zhi, Miaochan; Sokolov, Alexei; Welch, George; Scully, Marlan

    2009-10-01

    We study the interaction between the resonant Raman signal and a non-Raman field, either the concomitant nonresonant four-wave-mixing (FWM) background or an applied external field, in our recently developed hybrid coherent anti-Stokes Raman scattering (CARS) scheme. Our technique combines instantaneous coherent excitation of several characteristic molecular vibrations with subsequent probing of these vibrations by an optimally shaped, time-delayed, narrowband laser pulse. This pulse configuration mitigates the nonresonant FWM background while maximizing the Raman-resonant signal, and allows rapid and highly specific detection even in the presence of multiple scattering. We apply this method to non-invasive monitoring of blood glucose levels. Under certain conditions we find that the measured signal is linearly proportional to the glucose concentration, owing to optical interference with the residual background light, which allows reliable detection of spectral signatures down to medically relevant glucose levels. We also study the interference between the CARS field and an external field (the local oscillator) by controlling their relative phase and amplitude. This control allows direct observation of the real and imaginary components of the third-order nonlinear susceptibility (χ^(3)) of the sample. We demonstrate that the heterodyne method can be used to amplify the signal and thus increase detection sensitivity.
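
    Why interference with a stronger background field makes the signal linear in concentration can be seen from the detected intensity |E_res + E_bg|^2: the cross term 2 Re(E_res E_bg*) is linear in the resonant amplitude, and hence in concentration, while |E_res|^2 is quadratic and negligible for weak resonant fields. A small numerical check (all amplitudes invented):

```python
import numpy as np

c = np.linspace(0.0, 0.05, 6)             # concentration (arbitrary units)
E_res = 1.0j * c                          # resonant field, proportional to c
E_bg = 0.5 * np.exp(1j * np.pi / 3)       # residual nonresonant / LO background
I = np.abs(E_res + E_bg) ** 2             # detected intensity

# The signal above the constant background is nearly linear in c:
excess = I - np.abs(E_bg) ** 2
print(np.polyfit(c, excess, 1))           # slope ~ 2*|E_bg|*sin(pi/3)
```
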

  14. Worldline approach to helicity flip in plane waves

    NASA Astrophysics Data System (ADS)

    Ilderton, Anton; Torgrimsson, Greger

    2016-04-01

    We apply worldline methods to the study of vacuum polarization effects in plane wave backgrounds, in both scalar and spinor QED. We calculate helicity-flip probabilities to one-loop order, treating the background field exactly, and provide a toolkit of methods for use in investigations of higher-order processes. We also discuss the connections between the worldline, S-matrix, and lightfront approaches to vacuum polarization effects.

  15. Chem I Supplement: Nuclear Synthesis and Identification of New Elements.

    ERIC Educational Resources Information Center

    Seaborg, Glenn T.

    1985-01-01

    As background material for a paper on the transuranium elements (SE 537 837), this article reviews: (1) several descriptive terms; (2) nuclear reactions; (3) radioactive decay modes; (4) chemical background; and (5) experimental methods used in this field of research and more broadly in nuclear chemistry. (Author/JN)

  16. Studies of Isolated and Non-isolated Photospheric Bright Points in an Active Region Observed by the New Vacuum Solar Telescope

    NASA Astrophysics Data System (ADS)

    Liu, Yanxiao; Xiang, Yongyuan; Erdélyi, Robertus; Liu, Zhong; Li, Dong; Ning, Zongjun; Bi, Yi; Wu, Ning; Lin, Jun

    2018-03-01

    Properties of photospheric bright points (BPs) near an active region have been studied in TiO λ 7058 Å images observed by the New Vacuum Solar Telescope of the Yunnan Observatories. We developed a novel recognition method that was used to identify and track 2010 BPs. The observed evolving BPs are classified into isolated (individual) and non-isolated (where multiple BPs are observed to display splitting and merging behaviors) sets. About 35.1% of the BPs are non-isolated. For both isolated and non-isolated BPs, the brightness varies from 0.8 to 1.3 times the average background intensity and follows a Gaussian distribution. The lifetimes of BPs follow a log-normal distribution, with characteristic lifetimes of (267 ± 140) s and (421 ± 255) s, respectively. Their sizes also follow a log-normal distribution, with an average area of about (2.15 ± 0.74) × 10^4 km^2 and (3.00 ± 1.31) × 10^4 km^2, and an average diameter of (163 ± 27) km and (191 ± 40) km, respectively. Our results indicate that regions with a strong background magnetic field have higher BP number density and higher BP area coverage than regions with a weak background field. Apparently, the brightness/size of BPs does not depend on the background field. Lifetimes in regions with a strong background magnetic field are, on average, shorter than those in regions with a weak background field.

  17. Twisted versus braided magnetic flux ropes in coronal geometry. II. Comparative behaviour

    NASA Astrophysics Data System (ADS)

    Prior, C.; Yeates, A. R.

    2016-06-01

    Aims: Sigmoidal structures in the solar corona are commonly associated with magnetic flux ropes whose magnetic field lines are twisted about a mutual axis. Their dynamical evolution is well studied, with sufficient twisting leading to large-scale rotation (writhing) and vertical expansion, possibly leading to ejection. Here, we investigate the behaviour of flux ropes whose field lines have more complex entangled/braided configurations. Our hypothesis is that this internal structure will inhibit the large-scale morphological changes. Additionally, we investigate the influence of the background field within which the rope is embedded. Methods: A technique for generating tubular magnetic fields with arbitrary axial geometry and internal structure, introduced in part I of this study, provides the initial conditions for resistive-MHD simulations. The tubular fields are embedded in a linear force-free background, and we consider various internal structures for the tubular field, including both twisted and braided topologies. These embedded flux ropes are then evolved using a 3D MHD code. Results: Firstly, in a background where twisted flux ropes evolve through the expected non-linear writhing and vertical expansion, we find that flux ropes with sufficiently braided/entangled interiors show no such large-scale changes. Secondly, embedding a twisted flux rope in a background field with a sigmoidal inversion line leads to eventual reversal of the large-scale rotation. Thirdly, in some cases a braided flux rope splits due to reconnection into two twisted flux ropes of opposing chirality - a phenomenon previously observed in cylindrical configurations. Conclusions: Sufficiently complex entanglement of the magnetic field lines within a flux rope can suppress large-scale morphological changes of its axis, with magnetic energy reduced instead through reconnection and expansion. 
The structure of the background magnetic field can significantly affect the changing morphology of a flux rope.

  18. X-ray radiative transfer in protoplanetary disks. The role of dust and X-ray background fields

    NASA Astrophysics Data System (ADS)

    Rab, Ch.; Güdel, M.; Woitke, P.; Kamp, I.; Thi, W.-F.; Min, M.; Aresu, G.; Meijerink, R.

    2018-01-01

    Context. The X-ray luminosities of T Tauri stars are about two to four orders of magnitude higher than the luminosity of the contemporary Sun. As these stars are born in clusters, their disks are not only irradiated by their parent star but also by an X-ray background field produced by the cluster members. Aims: We aim to quantify the impact of X-ray background fields produced by young embedded clusters on the chemical structure of disks. Further, we want to investigate the importance of the dust for X-ray radiative transfer in disks. Methods: We present a new X-ray radiative transfer module for the radiation thermo-chemical disk code PRODIMO (PROtoplanetary DIsk MOdel), which includes X-ray scattering and absorption by both the gas and dust component. The X-ray dust opacities can be calculated for various dust compositions and dust-size distributions. For the X-ray radiative transfer we consider irradiation by the star and by X-ray background fields. To study the impact of X-rays on the chemical structure of disks we use the well established disk ionization tracers N2H+ and HCO+. Results: For evolved dust populations (e.g. grain growth), X-ray opacities are mostly dominated by the gas; only for photon energies E ≳ 5-10 keV do dust opacities become relevant. Consequently the local disk X-ray radiation field is only affected in dense regions close to the disk midplane. X-ray background fields can dominate the local X-ray disk ionization rate for disk radii r ≳ 20 au. However, the N2H+ and HCO+ column densities are only significantly affected in cases of low cosmic-ray ionization rates (≲10^-19 s^-1), or if the background flux is at least a factor of ten higher than the flux level of ≈10^-5 erg cm^-2 s^-1 expected for clusters typical for the solar vicinity. Conclusions: Observable signatures of X-ray background fields in low-mass star-formation regions, like Taurus, are only expected for cluster members experiencing a strong X-ray background field (e.g. 
due to their location within the cluster). For the majority of the cluster members, the X-ray background field has relatively little impact on the disk chemical structure.

  19. On the detection of a stochastic background of gravitational radiation by the Doppler tracking of spacecraft

    NASA Technical Reports Server (NTRS)

    Mashhoon, B.; Grishchuk, L. P.

    1980-01-01

    Consideration is given to the possibility of detecting an isotropic background of gravitational radiation of a stochastic nature by the method of Doppler tracking of spacecraft. Attention is given, in the geometrical optics limit, to the general formula for the frequency shift of an electromagnetic signal in the gravitational radiation field, which is shown to be gauge independent. The propagation of a free electromagnetic wave in a gravitational radiation field is examined, with the conclusion that no resonance phenomena can be expected. Finally, the 'Doppler noise' due to a stochastic background is evaluated and shown to depend on the total energy density of the background and on a parameter characteristic of the radiation spectrum and the detection system used.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, Elsayed

    Purpose: To characterize and correct for radiation-induced background (RIB) observed in the signals from a class of scanning water tanks. Methods: A method was developed to isolate the RIB through detector measurements in the background-free linac console area. Variation of the RIB against a large number of parameters was characterized, and its impact on basic clinical data for photon and electron beams was quantified. Different methods to minimize and/or correct for the RIB were proposed and evaluated. Results: The RIB is due to the presence of the electrometer and connection box in a low background radiation field (by design). The absolute RIB current with a biased detector is up to 2 pA, independent of the detector size, which is 0.6% and 1.5% of the central axis reference signal for a standard and a mini scanning chamber, respectively. The RIB monotonically increases with field size, is three times smaller for detectors that do not require a bias (e.g., diodes), is up to 80% larger for positive (versus negative) polarity, decreases with increasing photon energy, exhibits a single curve versus dose rate at the electrometer location, and is negligible for electron beams. Data after the proposed field-size correction method agree with point measurements from an independent system to within a few tenths of a percent for output factor, head scatter, depth dose at depth, and out-of-field profile dose. Manufacturer recommendations for electrometer placement are insufficient and sometimes incorrect. Conclusions: RIB in scanning water tanks can have a non-negligible effect on dosimetric data.

  1. Amplification due to two-stream instability of self-electric and magnetic fields of an ion beam propagating in background plasma

    NASA Astrophysics Data System (ADS)

    Tokluoglu, Erinc K.; Kaganovich, Igor D.; Carlsson, Johan A.; Hara, Kentaro; Startsev, Edward A.

    2018-05-01

    Propagation of charged particle beams in background plasma as a method of space charge neutralization has been shown to achieve a high degree of charge and current neutralization and therefore enables nearly ballistic propagation and focusing of charged particle beams. Correspondingly, the use of plasmas for propagation of charged particle beams has important applications for transport and focusing of intense particle beams in inertial fusion and high energy density laboratory plasma physics. However, the streaming of beam ions through a background plasma can lead to the development of two-stream instability between the beam ions and the plasma electrons. The beam electric and magnetic fields enhanced by the two-stream instability can lead to defocusing of the ion beam. Using particle-in-cell simulations, we study the scaling of the instability-driven self-electromagnetic fields and consequent defocusing forces with the background plasma density and beam ion mass. We identify plasma parameters where the defocusing forces can be reduced.

  2. Cosmic Microwave Background Mapmaking with a Messenger Field

    NASA Astrophysics Data System (ADS)

    Huffenberger, Kevin M.; Næss, Sigurd K.

    2018-01-01

    We apply a messenger field method to solve the linear minimum-variance mapmaking equation in the context of Cosmic Microwave Background (CMB) observations. In simulations, the method produces sky maps that converge significantly faster than those from a conjugate gradient descent algorithm with a diagonal preconditioner, even though the computational cost per iteration is similar. The messenger method recovers large scales in the map better than conjugate gradient descent, and yields a lower overall χ2. In the single, pencil beam approximation, each iteration of the messenger mapmaking procedure produces an unbiased map, and the iterations become more optimal as they proceed. A variant of the method can handle differential data or perform deconvolution mapmaking. The messenger method requires no preconditioner, but a high-quality solution needs a cooling parameter to control the convergence. We study the convergence properties of this new method and discuss how the algorithm is feasible for the large data sets of current and future CMB experiments.
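
    A toy mapmaker in the spirit of the approach described above (sizes, noise levels, and the pointing below are invented for illustration). The data model is d = P m + n, with P a one-sample-one-pixel pointing matrix and N = diag(sigma^2); the messenger field t lives in the time domain with covariance T = tau*I, tau = min(sigma^2), and the two update equations collapse to one line each:

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_samp = 8, 400
pix = rng.integers(0, n_pix, n_samp)          # pointing: pixel hit by each sample
m_true = rng.normal(size=n_pix)
sigma2 = rng.uniform(1.0, 3.0, n_samp)        # heteroscedastic noise variances
d = m_true[pix] + rng.normal(size=n_samp) * np.sqrt(sigma2)

tau = sigma2.min()
w = tau / sigma2                              # tau * N^{-1}, elementwise
m = np.zeros(n_pix)
for _ in range(300):
    t = w * d + (1.0 - w) * m[pix]            # messenger field update
    m = (np.bincount(pix, weights=t, minlength=n_pix)
         / np.bincount(pix, minlength=n_pix)) # map update (T is proportional to I)

# The fixed point is the usual noise-weighted (GLS) map:
m_gls = (np.bincount(pix, weights=d / sigma2, minlength=n_pix)
         / np.bincount(pix, weights=1.0 / sigma2, minlength=n_pix))
print(np.abs(m - m_gls).max())
```

With a diagonal N the GLS map is available directly, so this toy only demonstrates the fixed point; the method pays off when N is correlated in time and each update is cheap in its natural (time or frequency) basis.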

  3. Noise covariance incorporated MEG-MUSIC algorithm: a method for multiple-dipole estimation tolerant of the influence of background brain activity.

    PubMed

    Sekihara, K; Poeppel, D; Marantz, A; Koizumi, H; Miyashita, Y

    1997-09-01

    This paper proposes a method of localizing multiple current dipoles from spatio-temporal biomagnetic data. The method is based on the multiple signal classification (MUSIC) algorithm and is tolerant of the influence of background brain activity. In this method, the noise covariance matrix is estimated using a portion of the data that contains noise, but does not contain any signal information. Then, a modified noise subspace projector is formed using the generalized eigenvectors of the noise and measured-data covariance matrices. The MUSIC localizer is calculated using this noise subspace projector and the noise covariance matrix. The results from a computer simulation have verified the effectiveness of the method. The method was then applied to source estimation for auditory-evoked fields elicited by syllable speech sounds. The results strongly suggest the method's effectiveness in removing the influence of background activity.
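
    A generic toy of the core idea (not MEG-specific: random vectors stand in for a dipole forward model, and all sizes are invented). The noise covariance is estimated from a signal-free data segment and used to whiten the measured-data covariance before forming the MUSIC noise-subspace projector; this is equivalent to working with generalized eigenvectors of the two covariance matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
n_chan, n_grid, n_t = 32, 60, 2000
L = rng.normal(size=(n_chan, n_grid))        # candidate-source "lead fields"
L /= np.linalg.norm(L, axis=0)
src = [7, 41]                                # indices of the two true sources

mix = rng.normal(size=(n_chan, 5))           # spatially colored background activity
noise = lambda n: mix @ rng.normal(size=(5, n)) + 0.3 * rng.normal(size=(n_chan, n))
s = rng.normal(size=(2, n_t))                # source time courses
data = L[:, src] @ s * 3.0 + noise(n_t)
baseline = noise(n_t)                        # signal-free portion of the record

Cn = baseline @ baseline.T / n_t             # noise covariance estimate
Cd = data @ data.T / n_t                     # measured-data covariance
W = np.linalg.inv(np.linalg.cholesky(Cn))    # whitener from the noise covariance
evals, evecs = np.linalg.eigh(W @ Cd @ W.T)
En = evecs[:, :-2]                           # noise subspace (2 sources assumed)
Lw = W @ L
music = 1.0 - np.sum((En.T @ Lw) ** 2, axis=0) / np.sum(Lw ** 2, axis=0)
print(np.argsort(music)[-2:])                # should recover the true sources
```
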

  4. Hybrid Magnetic Shielding

    NASA Astrophysics Data System (ADS)

    Royal, Kevin; Crawford, Christopher; Mullins, Andrew; Porter, Greg; Blanton, Hunter; Johnstone, Connor; Kistler, Ben; Olivera, Daniela

    2017-09-01

    The search for the electric dipole moment of the neutron requires the ambient magnetic field to be on the pT scale which is accomplished with large magnetic shielding rooms. These rooms are fitted with large mu-metal sheets to allow for passive cancellation of background magnetic fields. Active shielding technology cannot uniformly cancel background magnetic fields. These issues can be remedied by combining the methods into a hybrid system. The design used is composed of panels that have an active layer of cancellation between two sheets of mu-metal. The panels form a cube and draw in magnetic fields perpendicular to the surface which can then be reduced using active shielding. This work is supported by the Department of Energy under Contract DE-SC0008107.

  5. Far-field detection of sub-wavelength Tetris without extra near-field metal parts based on phase prints of time-reversed fields with intensive background interference.

    PubMed

    Chen, Yingming; Wang, Bing-Zhong

    2014-07-14

Time-reversal (TR) phase prints are used for the first time in far-field (FF) detection of sub-wavelength (SW) deformable scatterers without any extra metal structure positioned in the vicinity of the target. The 2D prints derive from the discrete short-time Fourier transform of 1D TR electromagnetic (EM) signals. Because the time-invariant intensive background interference is effectively centralized by the TR technique, the time-variant weak signature of FF SW scatterers can be highlighted. This method shows a different use of the TR technique, in which the focusing peak of the TR EM waves is removed and the most useful information is conveyed by the remaining part of the signal.

  6. Evaluation of Field-deployed Low Cost PM Sensors

    EPA Science Inventory

    Background Particulate matter (PM) is a pollutant of high public interest regulated by national ambient air quality standards (NAAQS) using federal reference method (FRM) and federal equivalent method (FEM) instrumentation identified for environmental monitoring. PM is present i...

  7. Identification of source velocities on 3D structures in non-anechoic environments: Theoretical background and experimental validation of the inverse patch transfer functions method

    NASA Astrophysics Data System (ADS)

    Aucejo, M.; Totaro, N.; Guyader, J.-L.

    2010-08-01

In noise control, identification of the source velocity field remains a major problem open to investigation. Consequently, methods such as nearfield acoustical holography (NAH), principal source projection, the inverse frequency response function and hybrid NAH have been developed. However, these methods require free-field conditions that are often difficult to achieve in practice. This article presents an alternative method known as inverse patch transfer functions (iPTF), designed to identify source velocities and developed in the framework of the European SILENCE project. The method is based on the definition of a virtual cavity; the measurement of both the pressure and particle velocity fields on the aperture surfaces of this volume, divided into elementary areas called patches; and the inversion of impedance matrices, numerically computed from a modal basis obtained by FEM. Theoretically, the method is applicable to sources with complex 3D geometries, and measurements can be carried out in a non-anechoic environment even in the presence of other stationary sources outside the virtual cavity. In the present paper, the theoretical background of the iPTF method is described and the results (numerical and experimental) for a source with simple geometry (two baffled pistons driven in antiphase) are presented and discussed.

  8. Incentive Pay for Remotely Piloted Aircraft Career Fields

    DTIC Science & Technology

    2012-01-01

Summary: Background and ... manning requirement, even with the current incentive pays and reenlistment bonuses. The mathematical foundations, data, and estimation methods for the ...

  9. Novel symmetries in Weyl-invariant gravity with massive gauge field

    NASA Astrophysics Data System (ADS)

    Abhinav, K.; Shukla, A.; Panigrahi, P. K.

    2016-11-01

The background field method is used to linearize the Weyl-invariant scalar-tensor gravity coupled with a Stückelberg field. For a generic background metric, this action is found not to be invariant under either diffeomorphism or generalized Weyl symmetry, the latter being a combination of gauge and Weyl transformations. Interestingly, the quadratic Lagrangian, emerging from a background of Minkowski metric, respects both transformations independently. The Becchi-Rouet-Stora-Tyutin symmetry of scalar-tensor gravity coupled with a Stückelberg-like massive gauge particle, possessing both diffeomorphism and generalized Weyl symmetry, reveals that in both cases negative-norm states with unphysical degrees of freedom exist. We then show that, by combining diffeomorphism and generalized Weyl symmetries, all the ghost states decouple, thereby removing the unphysical redundancies of the theory. During this process, the scalar field does not represent any dynamical mode, yet modifies the usual harmonic gauge condition through non-minimal coupling with gravity.

  10. Electric-magnetic dualities in non-abelian and non-commutative gauge theories

    NASA Astrophysics Data System (ADS)

    Ho, Jun-Kai; Ma, Chen-Te

    2016-08-01

Electric-magnetic dualities are equivalences between strong and weak coupling constants. A standard example is the exchange of electric and magnetic fields in an abelian gauge theory. We present three methods for performing electric-magnetic dualities in the case of the non-commutative U(1) gauge theory. The first method is to use covariant field strengths as the electric and magnetic fields. We find an invariant form of the equation of motion after performing the electric-magnetic duality. The second method is to use the Seiberg-Witten map to rewrite the non-commutative U(1) gauge theory in terms of the abelian field strength. The third method is to use the large Neveu-Schwarz-Neveu-Schwarz (NS-NS) background limit, in which the non-commutativity parameter has only one degree of freedom, to consider the non-commutative U(1) gauge theory or D3-brane. In this limit, we introduce or dualize a new one-form gauge potential to get a D3-brane in a large Ramond-Ramond (R-R) background via field redefinition. We also use perturbation theory to study the equivalence between the two D3-brane theories. Comparison of these methods in the non-commutative U(1) gauge theory gives different physical implications. The comparison reflects the differences between the non-abelian and non-commutative gauge theories in the electric-magnetic dualities. For a complete study, we also extend our studies to the simplest abelian and non-abelian p-form gauge theories, and to a non-commutative theory with a non-abelian structure.

  11. Background-Oriented Schlieren for Large-Scale and High-Speed Aerodynamic Phenomena

    NASA Technical Reports Server (NTRS)

    Mizukaki, Toshiharu; Borg, Stephen; Danehy, Paul M.; Murman, Scott M.; Matsumura, Tomoharu; Wakabayashi, Kunihiko; Nakayama, Yoshio

    2015-01-01

Visualization of the flow field around a generic re-entry capsule in subsonic flow, and shock-wave visualization with cylindrical explosives, have been conducted to demonstrate the sensitivity and applicability of background-oriented schlieren (BOS) for field experiments. The wind tunnel experiment suggests that BOS with a fine-pixel imaging device has a density-change detection sensitivity on the order of 10^-5 in subsonic flow. In a laboratory setup, the structure of the shock waves generated by explosives has been successfully reconstructed by a computed tomography method combined with BOS.

  12. On the local well-posedness of Lovelock and Horndeski theories

    NASA Astrophysics Data System (ADS)

    Papallo, Giuseppe; Reall, Harvey S.

    2017-08-01

    We investigate local well-posedness of the initial value problem for Lovelock and Horndeski theories of gravity. A necessary condition for local well-posedness is strong hyperbolicity of the equations of motion. Even weak hyperbolicity can fail for strong fields so we restrict to weak fields. The Einstein equation is known to be strongly hyperbolic in harmonic gauge so we study Lovelock theories in harmonic gauge. We show that the equation of motion is always weakly hyperbolic for weak fields but, in a generic weak-field background, it is not strongly hyperbolic. For Horndeski theories, we prove that, for weak fields, the equation of motion is always weakly hyperbolic in any generalized harmonic gauge. For some Horndeski theories there exists a generalized harmonic gauge for which the equation of motion is strongly hyperbolic in a weak-field background. This includes "k-essence" like theories. However, for more general Horndeski theories, there is no generalized harmonic gauge for which the equation of motion is strongly hyperbolic in a generic weak-field background. Our results show that the standard method used to establish local well-posedness of the Einstein equation does not extend to Lovelock or general Horndeski theories. This raises the possibility that these theories may not admit a well-posed initial value problem even for weak fields.

  13. Ionization signals from diamond detectors in fast-neutron fields

    NASA Astrophysics Data System (ADS)

    Weiss, C.; Frais-Kölbl, H.; Griesmayer, E.; Kavrigin, P.

    2016-09-01

In this paper we introduce a novel analysis technique for measurements with single-crystal chemical vapor deposition (sCVD) diamond detectors in fast-neutron fields. This method exploits the unique electronic property of sCVD diamond sensors that the signal shape of the detector current is directly proportional to the initial ionization profile. In fast-neutron fields the diamond sensor acts simultaneously as target and sensor. The interaction of neutrons with the stable isotopes 12C and 13C is of interest for fast-neutron diagnostics. The measured signal shapes of detector current pulses are used to identify individual types of interactions in the diamond, with the goal of selecting neutron-induced reactions in the diamond and suppressing neutron-induced background reactions as well as γ-background. The method is verified with experimental data from a measurement in a 14.3 MeV neutron beam at JRC-IRMM, Geel/Belgium, where the 13C(n,α)10Be reaction was successfully extracted from the dominating background of recoil protons and γ-rays, and the energy resolution of the 12C(n,α)9Be reaction was substantially improved. The presented analysis technique is especially relevant for diagnostics in harsh radiation environments, such as fission and fusion reactors. It allows the neutron spectrum to be extracted from the background, and is particularly applicable to neutron flux monitoring and neutron spectroscopy.

  14. Universal field matching in craniospinal irradiation by a background-dose gradient-optimized method.

    PubMed

    Traneus, Erik; Bizzocchi, Nicola; Fellin, Francesco; Rombi, Barbara; Farace, Paolo

    2018-01-01

The gradient-optimized methods are overtaking the traditional feathering methods for planning field junctions in craniospinal irradiation. In this note, a new gradient-optimized technique, based on the use of a background dose, is described. Treatment planning was performed with RayStation (RaySearch Laboratories, Stockholm, Sweden) on the CT scans of a pediatric patient. Both proton (by pencil beam scanning) and photon (by volumetric modulated arc therapy) treatments were planned with three isocenters. An 'in silico' ideal background dose was created first to cover the upper-spinal target and to produce a perfect dose gradient along the upper and lower junction regions. Using it as background, the cranial and lower-spinal beams were planned by inverse optimization to obtain dose coverage of their relevant targets and of the junction volumes. Finally, the upper-spinal beam was inversely planned after removal of the background dose and with the previously optimized beams switched on. In both the proton and photon plans, the optimized cranial and lower-spinal beams produced a perfect linear gradient in the junction regions, complementary to that produced by the optimized upper-spinal beam. The final dose distributions showed homogeneous coverage of the targets. Our simple technique allowed us to obtain high-quality gradients in the junction region. The technique works universally for photons as well as protons and could be applied with any TPS that allows a background dose to be managed. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  15. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    PubMed

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and the relationships between the true local field and the estimated local field in REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP, respectively. The REV-SHARP result exhibited the highest correlation between the true local field and the estimated local field. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP, respectively, in regions of interest on the three-dimensional head phantom. In the human experiments, no obvious errors due to artifacts were present in REV-SHARP. The proposed REV-SHARP is a new method combining a variable spherical kernel size with Tikhonov regularization. This technique may enable more accurate background field removal and help achieve better accuracy in QSM. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Limitations of STIRAP-like population transfer in extended systems: the three-level system embedded in a web of background states.

    PubMed

    Jakubetz, Werner

    2012-12-14

This paper presents a systematic numerical investigation of background-state participation in STIRAP (stimulated Raman adiabatic passage) population transfer among vibrational states, focusing on the consequences for the robustness of the method. The simulations, which are performed over extended grids in the parameter space of the Stokes and pump pulses (frequencies, field strengths, and pulse lengths), involve hierarchies of (3 + N)-level systems of increasing complexity, ranging from the standard three-level STIRAP setup in Λ-configuration (N = 0) up to N = 446. A strongly coupled three-level core system is selected from the full Hamiltonian of the double-well HCN/HNC system, and the couplings connecting this core system to the remaining states are (re-)parameterized in different ways, from very weak to very strong. The systems so obtained represent a three-level system embedded in various ways in webs of cross-linked vibrational background states and incorporate typical molecular properties. We first summarize essential properties of population transfer in the standard three-level system and quantify the robustness of the method and its dependence on the pulse parameters. Against these reference results, we present results obtained for four (3 + 446)-level systems and several subsystems. For pulse lengths of at most a few picoseconds, the intrinsic robustness of STIRAP with respect to variations in the field strength disappears as soon as the largest core-background couplings exceed about one tenth of the STIRAP couplings. In such cases robustness with respect to variations in the field strength is entirely lost, since at higher field strengths, except for irregularly spaced narrow frequency ranges, transfer probabilities are strongly reduced. STIRAP-like population transfer is maintained, with some restrictions, at low field strengths near the onset of adiabatic transfer.
The suppression of STIRAP is traced back to different mechanisms based on a plenitude of single- and multiphoton transitions to background states, which at the high field strengths characteristic of STIRAP proceed readily even along weakly coupled pathways.

  17. WDR-PK-AK-018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollister, R

    2009-08-26

Method - CES SOP-HW-P556 'Field and Bulk Gamma Analysis'. Detector - High-purity germanium, 40% relative efficiency. Calibration - The detector was calibrated on February 8, 2006 using a NIST-traceable sealed source, and the calibration was verified using an independent sealed source. Count Time and Geometry - The sample was counted for 20 minutes at 72 inches from the detector. A lead collimator was used to limit the field-of-view to the region of the sample. The drum was rotated 180 degrees halfway through the count time. Date and Location of Scans - June 1, 2006 in Building 235 Room 1136. Spectral Analysis - Spectra were analyzed with ORTEC GammaVision software. Matrix and geometry corrections were calculated using ORTEC Isotopic software. A background spectrum was measured at the counting location. No man-made radioactivity was observed in the background. Results were determined from the sample spectra without background subtraction. Minimum detectable activities were calculated by the NUREG 4.16 method. Results - Detected Pu-238, Pu-239, Am-241 and Am-243.

  18. Anatomical background noise power spectrum in differential phase contrast breast images

    NASA Astrophysics Data System (ADS)

    Garrett, John; Ge, Yongshuai; Li, Ke; Chen, Guang-Hong

    2015-03-01

    In x-ray breast imaging, the anatomical noise background of the breast has a significant impact on the detection of lesions and other features of interest. This anatomical noise is typically characterized by a parameter, β, which describes a power law dependence of anatomical noise on spatial frequency (the shape of the anatomical noise power spectrum). Large values of β have been shown to reduce human detection performance, and in conventional mammography typical values of β are around 3.2. Recently, x-ray differential phase contrast (DPC) and the associated dark field imaging methods have received considerable attention as possible supplements to absorption imaging for breast cancer diagnosis. However, the impact of these additional contrast mechanisms on lesion detection is not yet well understood. In order to better understand the utility of these new methods, we measured the β indices for absorption, DPC, and dark field images in 15 cadaver breast specimens using a benchtop DPC imaging system. We found that the measured β value for absorption was consistent with the literature for mammographic acquisitions (β = 3.61±0.49), but that both DPC and dark field images had much lower values of β (β = 2.54±0.75 for DPC and β = 1.44±0.49 for dark field). In addition, visual inspection showed greatly reduced anatomical background in both DPC and dark field images. These promising results suggest that DPC and dark field imaging may help provide improved lesion detection in breast imaging, particularly for those patients with dense breasts, in whom anatomical noise is a major limiting factor in identifying malignancies.
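The power-law index β can be estimated by fitting the radially averaged power spectrum in log-log space. The sketch below synthesises a 2D field with a known β and recovers it; this is a generic illustration of the β-fitting procedure, not the paper's mammographic pipeline (which works on real detector images with region selection).

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta_true = 256, 3.0

# Spectral synthesis: amplitude ~ f^(-beta/2) with random phases gives
# an image whose power spectrum follows P(f) ~ 1/f^beta
fx = np.fft.fftfreq(n)
f = np.sqrt(fx[:, None] ** 2 + fx[None, :] ** 2)
f[0, 0] = 1.0                                   # avoid division by zero at DC
amp = f ** (-beta_true / 2)
amp[0, 0] = 0.0                                 # zero-mean image
phase = rng.uniform(0, 2 * np.pi, (n, n))
img = np.real(np.fft.ifft2(amp * np.exp(1j * phase)))

# Noise power spectrum of the image, fitted as a power law in a band
# away from DC and the Nyquist corners
P = np.abs(np.fft.fft2(img)) ** 2
mask = (f > 2.0 / n) & (f < 0.4)
beta_fit = -np.polyfit(np.log(f[mask]), np.log(P[mask]), 1)[0]
print(round(beta_fit, 2))                        # close to beta_true
```

Lower fitted β, as reported for the DPC and dark-field channels, corresponds to a flatter spectrum, i.e. less large-scale anatomical clutter.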

  19. A salient region detection model combining background distribution measure for indoor robots.

    PubMed

    Li, Na; Xu, Hui; Wang, Zhenhua; Sun, Lining; Chen, Guodong

    2017-01-01

Vision systems play an important role in the field of indoor robotics. Saliency detection methods, which capture regions that are perceived as important, are used to improve the performance of the visual perception system. Most state-of-the-art methods for saliency detection perform outstandingly on natural images but cannot work in complicated indoor environments. Therefore, we propose a new method comprising graph-based RGB-D segmentation, a primary saliency measure, a background distribution measure, and their combination. In addition, region roundness is proposed to describe the compactness of a region so that background distribution can be measured more robustly. To validate the proposed approach, eleven influential methods are compared on the DSD and ECSSD datasets. Moreover, we build a mobile robot platform for application in an actual environment, and design three different kinds of experimental conditions: different viewpoints, illumination variations and partial occlusions. Experimental results demonstrate that our model outperforms existing methods and is useful for indoor mobile robots.

  20. Exponential nonlinear electrodynamics and backreaction effects on holographic superconductor in the Lifshitz black hole background

    NASA Astrophysics Data System (ADS)

    Sherkatghanad, Z.; Mirza, B.; Lalehgani Dezaki, F.

We analytically describe the properties of the s-wave holographic superconductor with exponential nonlinear electrodynamics in the four-dimensional Lifshitz black hole background. Employing the assumption that the scalar and gauge fields backreact on the background geometry, we calculate the critical temperature as well as the condensation operator. Based on the Sturm-Liouville method, we show that the critical temperature decreases with increasing exponential nonlinear electrodynamics parameter and Lifshitz dynamical exponent z, indicating that condensation becomes harder. We also find that the effects of backreaction play a more important role in the critical temperature and condensation operator at small values of the Lifshitz dynamical exponent, when z is around one. In addition, the properties of the upper critical magnetic field in the Lifshitz black hole background are investigated using the Sturm-Liouville approach, in order to describe the phase diagram of the corresponding holographic superconductor in the probe limit. We observe that the critical magnetic field decreases with increasing Lifshitz dynamical exponent z, and it goes to zero at the critical temperature, independent of z.

  1. PE Metrics: Background, Testing Theory, and Methods

    ERIC Educational Resources Information Center

    Zhu, Weimo; Rink, Judy; Placek, Judith H.; Graber, Kim C.; Fox, Connie; Fisette, Jennifer L.; Dyson, Ben; Park, Youngsik; Avery, Marybell; Franck, Marian; Raynes, De

    2011-01-01

    New testing theories, concepts, and psychometric methods (e.g., item response theory, test equating, and item bank) developed during the past several decades have many advantages over previous theories and methods. In spite of their introduction to the field, they have not been fully accepted by physical educators. Further, the manner in which…

  2. Robust foreground detection: a fusion of masked grey world, probabilistic gradient information and extended conditional random field approach.

    PubMed

    Zulkifley, Mohd Asyraf; Moran, Bill; Rawlinson, David

    2012-01-01

    Foreground detection has been used extensively in many applications such as people counting, traffic monitoring and face recognition. However, most of the existing detectors can only work under limited conditions. This happens because of the inability of the detector to distinguish foreground and background pixels, especially in complex situations. Our aim is to improve the robustness of foreground detection under sudden and gradual illumination change, colour similarity issue, moving background and shadow noise. Since it is hard to achieve robustness using a single model, we have combined several methods into an integrated system. The masked grey world algorithm is introduced to handle sudden illumination change. Colour co-occurrence modelling is then fused with the probabilistic edge-based background modelling. Colour co-occurrence modelling is good in filtering moving background and robust to gradual illumination change, while an edge-based modelling is used for solving a colour similarity problem. Finally, an extended conditional random field approach is used to filter out shadow and afterimage noise. Simulation results show that our algorithm performs better compared to the existing methods, which makes it suitable for higher-level applications.
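As an illustration of the colour-constancy building block, a basic grey-world normalisation takes only a few lines; the "masked" variant restricts the gain estimate to a set of background pixels. The mask handling below is a placeholder sketch, not the authors' exact masking rule.

```python
import numpy as np

def grey_world(img, mask=None):
    """Grey-world colour normalisation. If a boolean background mask is
    given, channel gains are estimated from the masked pixels only
    (a sketch of the 'masked grey world' idea)."""
    img = img.astype(float)
    sel = img if mask is None else img[mask]
    means = sel.reshape(-1, 3).mean(axis=0)     # per-channel means
    gains = means.mean() / means                # equalise channel means
    return np.clip(img * gains, 0, 255)

# Toy frame with a sudden blue-tinted illumination change
rng = np.random.default_rng(3)
frame = rng.uniform(60, 200, (32, 32, 3))
tinted = np.clip(frame * np.array([0.7, 0.9, 1.3]), 0, 255)
bal = grey_world(tinted)
print(bal.reshape(-1, 3).mean(axis=0))          # roughly equal channel means
```

In the full system this correction runs before background modelling, so a sudden illumination shift is not misclassified as foreground.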

  3. Background fluorescence estimation and vesicle segmentation in live cell imaging with conditional random fields.

    PubMed

    Pécot, Thierry; Bouthemy, Patrick; Boulanger, Jérôme; Chessel, Anatole; Bardin, Sabine; Salamero, Jean; Kervrann, Charles

    2015-02-01

Image analysis applied to fluorescence live cell microscopy has become a key tool in molecular biology, since it makes it possible to characterize biological processes in space and time at the subcellular level. In fluorescence microscopy imaging, the moving tagged structures of interest, such as vesicles, appear as bright spots over a static or nonstatic background. In this paper, we consider the problem of vesicle segmentation and time-varying background estimation at the cellular scale. The main idea is to formulate the joint segmentation-estimation problem in the general conditional random field framework. Furthermore, segmentation of vesicles and background estimation are alternately performed by energy minimization using a min-cut/max-flow algorithm. The proposed approach relies on a detection measure computed from intensity contrasts between neighboring blocks in fluorescence microscopy images. This approach permits analysis of either 2D + time or 3D + time data. We demonstrate the performance of the so-called C-CRAFT method through an experimental comparison with state-of-the-art methods in fluorescence video-microscopy. We also use this method to characterize the spatial and temporal distribution of Rab6 transport carriers at the cell periphery for two different specific adhesion geometries.

  4. Research on cloud background infrared radiation simulation based on fractal and statistical data

    NASA Astrophysics Data System (ADS)

    Liu, Xingrun; Xu, Qingshan; Li, Xia; Wu, Kaifeng; Dong, Yanbing

    2018-02-01

Clouds are an important natural phenomenon, and their radiation seriously interferes with infrared detectors. Based on fractals and statistical data, a method is proposed to realize cloud background simulation, in which the cloud infrared radiation data field is assigned using satellite radiation data of clouds. A cloud infrared radiation simulation model is established in MATLAB; it can generate cloud background infrared images for different cloud types (low, middle, and high cloud) in different months, bands and sensor zenith angles.
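A common way to generate such fractal cloud fields is the diamond-square algorithm. The sketch below produces a cloud-like heightfield and maps it to a radiance range; the radiance bounds are invented stand-ins for the satellite statistics used in the paper, and the paper's own fractal construction may differ.

```python
import numpy as np

def diamond_square(k, roughness=0.6, rng=None):
    """Fractal heightfield on a (2**k + 1) grid via the diamond-square
    algorithm, a standard way to synthesise cloud-like textures."""
    rng = rng or np.random.default_rng(0)
    n = 2 ** k + 1
    g = np.zeros((n, n))
    g[0, 0], g[0, -1], g[-1, 0], g[-1, -1] = rng.normal(size=4)
    step, scale = n - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: centre of each square gets the corner average + jitter
        for i in range(half, n, step):
            for j in range(half, n, step):
                avg = (g[i - half, j - half] + g[i - half, j + half] +
                       g[i + half, j - half] + g[i + half, j + half]) / 4
                g[i, j] = avg + scale * rng.normal()
        # Square step: edge midpoints get the average of their neighbours
        for i in range(0, n, half):
            for j in range((i + half) % step, n, step):
                s, c = 0.0, 0
                for di, dj in ((-half, 0), (half, 0), (0, -half), (0, half)):
                    if 0 <= i + di < n and 0 <= j + dj < n:
                        s += g[i + di, j + dj]
                        c += 1
                g[i, j] = s / c + scale * rng.normal()
        step, scale = half, scale * roughness
    return g

cloud = diamond_square(6)
# Map heights to a radiance field; the bounds here are made-up placeholders
# for the per-type, per-band satellite radiance statistics
radiance = np.interp(cloud, (cloud.min(), cloud.max()), (5.0, 80.0))
print(radiance.shape)                    # (65, 65)
```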

  5. Scalar field coupling to Einstein tensor in regular black hole spacetime

    NASA Astrophysics Data System (ADS)

    Zhang, Chi; Wu, Chen

    2018-02-01

In this paper, we study the perturbation properties of a scalar field coupled to the Einstein tensor in the background of regular black hole spacetimes. Our calculations show that the coupling constant η enters the wave equation of the scalar perturbation. We calculate the quasinormal modes of the scalar field coupled to the Einstein tensor in regular black hole spacetimes by the third-order WKB method.

  6. Renormalized stress-energy tensor for stationary black holes

    NASA Astrophysics Data System (ADS)

    Levi, Adam

    2017-01-01

    We continue the presentation of the pragmatic mode-sum regularization (PMR) method for computing the renormalized stress-energy tensor (RSET). We show in detail how to employ the t -splitting variant of the method, which was first presented for ⟨ϕ2⟩ren , to compute the RSET in a stationary, asymptotically flat background. This variant of the PMR method was recently used to compute the RSET for an evaporating spinning black hole. As an example for regularization, we demonstrate here the computation of the RSET for a minimally coupled, massless scalar field on Schwarzschild background in all three vacuum states. We discuss future work and possible improvements of the regularization schemes in the PMR method.

  7. Development and Validation of a Simplified Renal Replacement Therapy Suitable for Prolonged Field Care in a Porcine (Sus scrofa) Model of Acute Kidney Injury

    DTIC Science & Technology

    2018-03-01

Objectives/Background: Acute kidney injury (AKI) is a serious

  8. Photovoice and Photodocumentary for Enhancing Community Partner Engagement and Student Learning in a Public Health Field School in Cape Town

    ERIC Educational Resources Information Center

    Wainwright, Megan; Bingham, Shantell; Sicwebu, Namhla

    2017-01-01

    Background: Field school research, which begins by considering community partners as pedagogues and thus exploring their perspectives on student learning, is uncommon. Photovoice is a method for self-expression of such marginalized voices. Purpose: Describe the photovoice to photodocumentary process and present results of its evaluation.…

  9. Near-field diffraction from amplitude diffraction gratings: theory, simulation and results

    NASA Astrophysics Data System (ADS)

    Abedin, Kazi Monowar; Rahman, S. M. Mujibur

    2017-08-01

We describe a computer simulation method by which the complete near-field diffraction pattern of an amplitude diffraction grating can be generated. The technique uses the method of iterative Fresnel integrals to calculate and generate the diffraction images. The theoretical background as well as the techniques used to perform the simulation are described. The program is written in MATLAB and can be implemented on any ordinary PC. Examples of simulated diffraction images are presented and discussed. The generated images in the far field, where they reduce to the Fraunhofer diffraction pattern, are also presented for a realistic grating and compared with the results predicted by the grating equation, which is applicable in the far field. The method can be used as a tool to teach the complex phenomenon of diffraction in classrooms.
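The Fresnel-integral approach can be illustrated with the textbook single-slit case, evaluated with SciPy's Fresnel integrals. The wavelength and geometry values below are arbitrary illustrative choices, and this 1D sketch is a simplification of the full grating computation described in the abstract.

```python
import numpy as np
from scipy.special import fresnel   # returns (S(z), C(z))

lam = 633e-9          # illustrative He-Ne wavelength
L = 0.10              # slit-to-screen distance (near field)
a = 0.5e-3            # slit half-width
x = np.linspace(-2e-3, 2e-3, 1001)   # observation coordinate on the screen

# Scaled Fresnel-integral limits for a slit extending from -a to +a
s = np.sqrt(2 / (lam * L))
S1, C1 = fresnel(s * (-a - x))
S2, C2 = fresnel(s * (a - x))
# Relative intensity, normalised so the unobstructed wave gives I = 1
I = 0.5 * ((C2 - C1) ** 2 + (S2 - S1) ** 2)

mid = len(x) // 2
print(I[mid], I[0])   # bright fringed region behind the slit, deep shadow outside
```

A grating is handled by summing such contributions over many slits (iterating the Fresnel integrals over the aperture structure), which is where the full 2D simulation comes in.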

  10. Stable solitary waves in super dense plasmas at external magnetic fields

    NASA Astrophysics Data System (ADS)

    Ghaani, Azam; Javidan, Kurosh; Sarbishaei, Mohsen

    2015-07-01

Propagation of localized waves in Fermi-Dirac distributed superdense matter in the presence of strong external magnetic fields is studied using the reductive perturbation method. We show that stable solitons can be created in such non-relativistic fluids in the presence of an external magnetic field. Such solitary waves are governed by the Zakharov-Kuznetsov (ZK) equation. Properties of the solitonic solutions are studied in media with different values of background mass density and magnetic field strength.

  11. A universality in pp-waves

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Partha

    2007-06-01

    We discuss a universality property of any covariant field theory in space-time expanded around pp-wave backgrounds. According to this property the space-time lagrangian density evaluated on a restricted set of field configurations, called universal sector, turns out to be same around all the pp-waves, even off-shell, with same transverse space and same profiles for the background scalars. In this paper we restrict our discussion to tensorial fields only. In the context of bosonic string theory we consider on-shell pp-waves and argue that universality requires the existence of a universal sector of world-sheet operators whose correlation functions are insensitive to the pp-wave nature of the metric and the background gauge flux. Such results can also be reproduced using the world-sheet conformal field theory. We also study such pp-waves in non-polynomial closed string field theory (CSFT). In particular, we argue that for an off-shell pp-wave ansatz with flat transverse space and dilaton independent of transverse coordinates the field redefinition relating the low energy effective field theory and CSFT with all the massive modes integrated out is at most quadratic in fields. Because of this simplification it is expected that the off-shell pp-waves can be identified on the two sides. Furthermore, given the massless pp-wave field configurations, an iterative method for computing the higher massive modes using the CSFT equations of motion has been discussed. All our bosonic string theory analyses can be generalised to the common Neveu-Schwarz sector of superstrings.

  12. Advanced Background Subtraction Applied to Aeroacoustic Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Horne, William C.

    2015-01-01

    An advanced form of background subtraction is presented and applied to aeroacoustic wind tunnel data. A variant of this method has seen use in other fields such as climatology and medical imaging. The technique, based on an eigenvalue decomposition of the background noise cross-spectral matrix, is robust against situations where isolated background auto-spectral levels are measured to be higher than levels of combined source and background signals. It also provides an alternate estimate of the cross-spectrum, which previously might have poor definition for low signal-to-noise ratio measurements. Simulated results indicate similar performance to conventional background subtraction when the subtracted spectra are weaker than the true contaminating background levels. Superior performance is observed when the subtracted spectra are stronger than the true contaminating background levels. Experimental results show limited success in recovering signal behavior for data where conventional background subtraction fails. They also demonstrate the new subtraction technique's ability to maintain a proper coherence relationship in the modified cross-spectral matrix. Beam-forming and de-convolution results indicate the method can successfully separate sources. Results also show a reduced need for the use of diagonal removal in phased array processing, at least for the limited data sets considered.
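
    The eigenvalue-based idea can be sketched as a positive-semidefinite projection of the subtracted cross-spectral matrix. This is a generic sketch under that assumption, not necessarily the authors' exact estimator:

    ```python
    import numpy as np

    def subtract_background_psd(G, B):
        """Subtract a background cross-spectral matrix (CSM) B from a measured
        CSM G, then project the difference onto the nearest positive
        semidefinite matrix by zeroing negative eigenvalues. This avoids the
        negative auto-spectra that plain subtraction G - B produces when the
        isolated background is measured higher than signal + background."""
        D = G - B
        D = (D + D.conj().T) / 2                  # enforce Hermitian symmetry
        w, V = np.linalg.eigh(D)
        w = np.clip(w, 0.0, None)                 # negative power is unphysical
        return (V * w) @ V.conj().T

    # toy check: a rank-1 source buried in diagonal background noise
    rng = np.random.default_rng(0)
    a = rng.normal(size=4) + 1j * rng.normal(size=4)
    S = np.outer(a, a.conj())                     # true source CSM
    B_true = 2.0 * np.eye(4)
    G = S + B_true
    S_hat = subtract_background_psd(G, 1.1 * B_true)  # background over-estimated
    ```

    With an exact background estimate this reduces to conventional subtraction; when the background is over-estimated, the clipping removes the negative auto-spectral levels that plain subtraction would leave behind.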

  13. Wide-field absolute transverse blood flow velocity mapping in vessel centerline

    NASA Astrophysics Data System (ADS)

    Wu, Nanshou; Wang, Lei; Zhu, Bifeng; Guan, Caizhong; Wang, Mingyi; Han, Dingan; Tan, Haishu; Zeng, Yaguang

    2018-02-01

    We propose a wide-field absolute transverse blood flow velocity measurement method in vessel centerline based on absorption intensity fluctuation modulation effect. The difference between the light absorption capacities of red blood cells and background tissue under low-coherence illumination is utilized to realize the instantaneous and average wide-field optical angiography images. The absolute fuzzy connection algorithm is used for vessel centerline extraction from the average wide-field optical angiography. The absolute transverse velocity in the vessel centerline is then measured by a cross-correlation analysis according to instantaneous modulation depth signal. The proposed method promises to contribute to the treatment of diseases, such as those related to anemia or thrombosis.
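
    The cross-correlation step can be illustrated as a generic time-of-flight estimate: the transverse velocity follows from the lag between fluctuation signals recorded a known distance apart along the centerline. The sensor spacing, sample rate and waveform below are invented for illustration and are not from the paper:

    ```python
    import numpy as np

    def transit_velocity(up, down, dt, spacing):
        """Velocity = spacing / (time lag of the cross-correlation peak)."""
        u = up - up.mean()
        v = down - down.mean()
        xc = np.correlate(v, u, mode="full")
        lag = np.argmax(xc) - (u.size - 1)     # samples by which `down` trails `up`
        return spacing / (lag * dt)

    # synthetic data: the downstream point sees the same fluctuation waveform,
    # delayed by the transit time (25 samples here)
    rng = np.random.default_rng(1)
    f = np.convolve(rng.normal(size=1100), np.ones(8) / 8, mode="same")
    shift = 25
    up, down = f[shift:], f[:-shift]           # downstream copy is delayed
    v_est = transit_velocity(up, down, dt=1e-3, spacing=0.5e-3)
    ```

    With a 0.5 mm spacing and a 25-sample (25 ms) transit time, the recovered velocity is 0.02 m/s, matching the construction.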

  14. A possible alternative method for collecting mosquito larvae in rice fields

    PubMed Central

    Robert, Vincent; Goff, Gilbert Le; Ariey, Frédéric; Duchemin, Jean-Bernard

    2002-01-01

    Background Rice fields are efficient breeding places for malaria vectors in Madagascar. In order to establish as easily as possible whether a rice field is an effective larval site for anophelines, we compared classical dipping with a net as methods of collecting larvae. Results Using similar collecting procedures, we found that the total number of anopheline larvae collected with the net was exactly double (174/87) that collected by dipping. The number of anopheline species collected was also greater with the net. Conclusions The net is an effective means of collecting anopheline larvae and can be used for qualitative ecological studies and to rapidly determine which rice fields contain malaria vectors. PMID:12057018

  15. T1 and susceptibility contrast at high fields

    NASA Astrophysics Data System (ADS)

    Neelavalli, Jaladhar

    Clinical imaging at high magnetic field strengths (≥ 3 Tesla) is sought after primarily because of the increased signal strength available at these fields. This increased SNR can be used to perform: (a) high-resolution imaging in the same time as at lower field strengths; (b) imaging at the same resolution with much faster acquisition; and (c) functional MR imaging (fMRI), dynamic perfusion and diffusion imaging with increased sensitivity. However, high fields are also associated with increased power deposition (SAR), due to the increase in imaging frequency, and with longer T1 relaxation times. Longer T1s mean longer imaging times for generating good T1-contrast images. For faster imaging at high fields, fast spin echo or magnetization-prepared sequences are conventionally proposed, but these are associated with high SAR values. Imaging with low SAR becomes ever more important as we move towards high fields, particularly for patients with metallic implants such as pacemakers or deep brain stimulators; the SAR limit acceptable for these patients is much lower than the limit acceptable for normal subjects. A new method is proposed for imaging at high fields with good contrast and a simultaneous reduction in power deposition. Further, the T1-based contrast optimization problem in FLASH imaging is considered for tissues with different T1s but the same spin densities. The solution providing optimal imaging parameters is simplified for quick and easy computation in a clinical setting. The efficacy of the simplification is evaluated, and the practical limits under which it can be applied are worked out. The phase difference due to variation in the magnetic susceptibility of biological tissues is another unique source of contrast, distinct from the conventional T1, T2 and T2* contrasts.
    This susceptibility-based phase contrast has become more and more important at high fields, partly because of contrast-generation issues due to longer T1s and shorter T2s, and partly because most tissue susceptibilities are invariant with field strength. This essentially ensures a constant available phase contrast between tissues across field strengths. In fact, with the increased SNR at high fields, the phase CNR actually increases with field strength. Susceptibility weighted imaging (SWI), which uniquely combines this phase and magnitude information to generate enhanced susceptibility-contrast magnitude images, has proven to be an important tool in the study of various neurological conditions such as Alzheimer's, Parkinson's and Huntington's disease and multiple sclerosis, even at the conventional field strength of 1.5 T, and should have even more applicability at high fields. A major issue in using phase images for susceptibility contrast, directly or as processed SWI magnitude images, is the large-scale background phase variation that obscures the local susceptibility-based contrast. A novel method is proposed for removing such geometrically induced large-scale phase variations using a Fourier-transform-based field calculation method. It is shown that the new method not only successfully removes the background field effects but also helps preserve more local phase information.
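
    Fourier-transform-based field calculations of this kind rest on the standard k-space dipole kernel: the field perturbation induced by a susceptibility distribution is a k-space multiplication rather than a real-space convolution. A minimal sketch of that generic forward model (not the author's code) follows; the cosine "slab" example has a single spatial frequency along z, for which the kernel predicts a field of exactly −(2/3)χ:

    ```python
    import numpy as np

    def forward_field(chi, voxel=(1.0, 1.0, 1.0)):
        """Field perturbation (in units of the applied field B0, directed
        along z) induced by a susceptibility distribution chi, computed with
        the standard k-space dipole kernel D(k) = 1/3 - kz^2/k^2."""
        nx, ny, nz = chi.shape
        kx = np.fft.fftfreq(nx, voxel[0])
        ky = np.fft.fftfreq(ny, voxel[1])
        kz = np.fft.fftfreq(nz, voxel[2])
        KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing="ij")
        k2 = KX**2 + KY**2 + KZ**2
        with np.errstate(divide="ignore", invalid="ignore"):
            D = 1.0 / 3.0 - KZ**2 / k2
        D[0, 0, 0] = 0.0                   # k = 0 (mean field) set to zero
        return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

    # susceptibility varying sinusoidally along z only (single kz mode):
    # the kernel gives D = 1/3 - 1 = -2/3, so field = -(2/3) * chi exactly
    z = np.arange(16)
    chi = np.tile(np.cos(2 * np.pi * z / 16), (16, 16, 1))
    field = forward_field(chi)
    ```

    In background-removal schemes, a field computed this way from the known head/air geometry is subtracted from the measured phase, leaving the local tissue contribution.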

  16. Jets and Metastability in Quantum Mechanics and Quantum Field Theory

    NASA Astrophysics Data System (ADS)

    Farhi, David

    I give a high-level overview of the state of particle physics in the introduction, accessible without any background in the field. I discuss improvements of theoretical and statistical methods used for collider physics. These include telescoping jets, a statistical method which was claimed to allow jet searches to increase their sensitivity by considering several interpretations of each event. We find that indeed multiple interpretations extend the power of searches, for both simple counting experiments and powerful multivariate fitting experiments, at least for h → bb̄ at the LHC. Then I propose a method for automation of background calculations using SCET by appropriating the technology of Monte Carlo generators such as MadGraph. In the third chapter I change gears and discuss the future of the universe. It has long been known that our pocket of the standard model is unstable; there is a lower-energy configuration in a remote part of the configuration space, to which our universe will, eventually, decay. While the timescales involved are on the order of 10^400 years (depending on how exactly one counts) and thus of no immediate worry, I discuss the shortcomings of the standard methods and propose a more physically motivated derivation for the decay rate. I then make various observations about the structure of decays in quantum field theory.

  17. Wide-field two-photon microscopy with temporal focusing and HiLo background rejection

    NASA Astrophysics Data System (ADS)

    Yew, Elijah Y. S.; Choi, Heejin; Kim, Daekeun; So, Peter T. C.

    2011-03-01

    Scanningless depth-resolved microscopy is achieved through spatial-temporal focusing and has been demonstrated previously. The advantage of this method is that a large area may be imaged without scanning, resulting in a higher throughput of the imaging system. Because it is a wide-field technique, the optical sectioning is considerably poorer than with conventional spatial-focusing two-photon microscopy. Here we propose wide-field two-photon microscopy based on spatio-temporal focusing and employing background rejection based on the HiLo microscope principle. We demonstrate the effects of applying HiLo processing to wide-field temporally focused two-photon microscopy.

  18. The EPIC-MOS Particle-Induced Background Spectrum

    NASA Technical Reports Server (NTRS)

    Kuntz, K. D.; Snowden, S. L.

    2006-01-01

    We have developed a method for constructing a spectrum of the particle-induced instrumental background of the XMM-Newton EPIC MOS detectors that can be used for observations of the diffuse background and extended sources that fill a significant fraction of the instrument field of view. The strength and spectrum of the particle-induced background, that is, the background due to the interaction of particles with the detector and the detector surroundings, are temporally variable as well as spatially variable over individual chips. Our method uses a combination of the filter-wheel-closed data and a database of unexposed-region data to construct a spectrum of the "quiescent" background. We show that, using this method of background subtraction, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear evidence of solar wind charge exchange (SWCX) emission. We use the blank-sky observations to show that contamination by SWCX emission is a strong function of the solar wind proton flux, and that observations through the flanks of the magnetosheath appear to be contaminated only at much higher solar wind fluxes. We have also developed a spectral model of the residual soft proton flares, which allows their effects to be removed to a substantial degree during spectral fitting.

  19. Nonrelativistic trace and diffeomorphism anomalies in particle number background

    NASA Astrophysics Data System (ADS)

    Auzzi, Roberto; Baiguera, Stefano; Nardelli, Giuseppe

    2018-04-01

    Using the heat kernel method, we compute nonrelativistic trace anomalies for Schrödinger theories in flat spacetime, with a generic background gauge field for the particle number symmetry, both for a free scalar and a free fermion. The result is genuinely nonrelativistic, and it has no counterpart in the relativistic case. Contrary to naive expectations, the anomaly is not gauge invariant; this is similar to the nongauge covariance of the non-Abelian relativistic anomaly. We also show that, in the same background, the gravitational anomaly for a nonrelativistic scalar vanishes.

  20. Maize and prairie root contributions to soil CO2 emissions in the field

    USDA-ARS?s Scientific Manuscript database

    Background and aims: A major hurdle in closing carbon budgets is partitioning soil-surface CO2 fluxes by source. This study aims to estimate CO2 resulting from root growth (RG) in the field. Methods: We used periodic 48-hour shading over two seasons to estimate and compare RG-derived CO2 in one annu...

  1. Photovoice as an Evaluation Tool for Student Learning on a Field Trip

    ERIC Educational Resources Information Center

    Behrendt, Marc; Machtmes, Krisanna

    2016-01-01

    Background: Photovoice is one method that enables an educator to view an experience from a student's perspective. This study examined how teachers might use photovoice during an informal learning experience to understand the students' experiences and experiential gain. Design and methods: Participants in this study consisted of six students, three…

  2. Background Error Covariance Estimation using Information from a Single Model Trajectory with Application to Ocean Data Assimilation into the GEOS-5 Coupled Model

    NASA Technical Reports Server (NTRS)

    Keppenne, Christian L.; Rienecker, Michele M.; Kovach, Robin M.; Vernieres, Guillaume; Koster, Randal D. (Editor)

    2014-01-01

    An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.
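
    The FAST idea — treating states inside a moving window along one trajectory as an ensemble — can be illustrated in a few lines. This is a minimal sketch of the windowed sample covariance only, not the GMAO implementation, and the two-variable "trajectory" is invented for illustration:

    ```python
    import numpy as np

    def fast_covariance(trajectory, window):
        """FAST-style background error covariance, sketched: use the model
        states inside a trailing window along a single trajectory as an
        ensemble, and form the sample covariance of their deviations about
        the window mean."""
        ens = np.asarray(trajectory)[-window:]     # (window, nstate) "ensemble"
        anom = ens - ens.mean(axis=0)              # deviations from window mean
        return anom.T @ anom / (window - 1)        # sample covariance

    # toy trajectory: two coupled state variables drifting in time
    rng = np.random.default_rng(1)
    t = np.arange(200)
    x = np.sin(0.1 * t) + 0.1 * rng.normal(size=t.size)
    y = 2.0 * x + 0.1 * rng.normal(size=t.size)    # strongly coupled to x
    P = fast_covariance(np.column_stack([x, y]), window=30)
    ```

    The off-diagonal element of P is positive, reflecting the flow-dependent coupling between the two variables; in assimilation, such cross-covariances are what let an observation of one variable update the other.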

  3. Background Error Covariance Estimation Using Information from a Single Model Trajectory with Application to Ocean Data Assimilation

    NASA Technical Reports Server (NTRS)

    Keppenne, Christian L.; Rienecker, Michele; Kovach, Robin M.; Vernieres, Guillaume

    2014-01-01

    An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.

  4. Particle and flow field holography: A critical survey

    NASA Technical Reports Server (NTRS)

    Trolinger, James D.

    1987-01-01

    A brief background is provided for the fields of particle and flow visualization holography. A summary of methods currently in use is given, followed by a discussion of more recent and unique applications. The problem of data reduction is discussed. A state of the art summary is then provided with a prognosis of the future of the field. Particle and flow visualization holography are characterized as powerful tools currently in wide use and with significant untapped potential.

  5. An improved correlation method for determining the period of a torsion pendulum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo Jie; Wang Dianhong

    Considering the variation of the environment temperature and the inhomogeneity of the background gravitational field, an improved correlation method is proposed to determine the variational period of a torsion pendulum with high precision. Processing of experimental data shows that the uncertainty in determining the period with this method is improved about twofold compared with the traditional correlation method, which is significant for the determination of the gravitational constant with the time-of-swing method.
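
    The basic correlation method can be sketched as follows: the period is the lag of the first autocorrelation maximum near a guessed value. This illustrates only the generic estimator; the paper's improvement (handling temperature drift and background-field inhomogeneity) is not modelled, and the period, sample rate and noise level are invented for the test:

    ```python
    import numpy as np

    def period_by_correlation(s, dt, guess):
        """Estimate an oscillation period as the lag of the autocorrelation
        maximum in a window of 0.5x to 1.5x a guessed period."""
        s = s - s.mean()
        n = s.size
        ac = np.correlate(s, s, mode="full")[n - 1:]
        ac /= np.arange(n, 0, -1)                  # unbiased: divide by overlap count
        lo, hi = int(0.5 * guess / dt), int(1.5 * guess / dt)
        return (lo + np.argmax(ac[lo:hi])) * dt

    dt = 0.01                                       # sample interval (s)
    t = np.arange(0.0, 60.0, dt)
    T_true = 7.3                                    # period (s), arbitrary test value
    s = np.cos(2 * np.pi * t / T_true) \
        + 0.02 * np.random.default_rng(2).normal(size=t.size)
    T_est = period_by_correlation(s, dt, guess=7.0)
    ```

    Dividing by the overlap count removes the triangular taper of the raw autocorrelation, which would otherwise bias the peak toward shorter lags.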

  6. Pinch technique and the Batalin-Vilkovisky formalism

    NASA Astrophysics Data System (ADS)

    Binosi, Daniele; Papavassiliou, Joannis

    2002-07-01

    In this paper we take the first step towards a nondiagrammatic formulation of the pinch technique. In particular we proceed into a systematic identification of the parts of the one-loop and two-loop Feynman diagrams that are exchanged during the pinching process in terms of unphysical ghost Green's functions; the latter appear in the standard Slavnov-Taylor identity satisfied by the tree-level and one-loop three-gluon vertex. This identification allows for the consistent generalization of the intrinsic pinch technique to two loops, through the collective treatment of entire sets of diagrams, instead of the laborious algebraic manipulation of individual graphs, and sets up the stage for the generalization of the method to all orders. We show that the task of comparing the effective Green's functions obtained by the pinch technique with those computed in the background field method Feynman gauge is significantly facilitated when employing the powerful quantization framework of Batalin and Vilkovisky. This formalism allows for the derivation of a set of useful nonlinear identities, which express the background field method Green's functions in terms of the conventional (quantum) ones and auxiliary Green's functions involving the background source and the gluonic antifield; these latter Green's functions are subsequently related by means of a Schwinger-Dyson type of equation to the ghost Green's functions appearing in the aforementioned Slavnov-Taylor identity.

  7. Improved tomographic reconstructions using adaptive time-dependent intensity normalization.

    PubMed

    Titarenko, Valeriy; Titarenko, Sofya; Withers, Philip J; De Carlo, Francesco; Xiao, Xianghui

    2010-09-01

    The first processing step in synchrotron-based micro-tomography is the normalization of the projection images against the background, also referred to as a white field. Owing to time-dependent variations in illumination and defects in detection sensitivity, the white field is different from the projection background. In this case standard normalization methods introduce ring and wave artefacts into the resulting three-dimensional reconstruction. In this paper the authors propose a new adaptive technique accounting for these variations, allowing one to obtain cleaner normalized data and to suppress ring and wave artefacts. The background is modelled by the product of two time-dependent terms representing the illumination and detection stages. These terms are written as unknown functions, one scaled and shifted along a fixed direction (describing the illumination term) and one translated by an unknown two-dimensional vector (describing the detection term). The proposed method is applied to two data sets (a stem of Salix variegata and a zebrafish, Danio rerio) acquired at the parallel beam of the micro-tomography station 2-BM at the Advanced Photon Source, showing significant reductions in both ring and wave artefacts. In principle the method could be used to correct for time-dependent phenomena that affect other tomographic imaging geometries such as cone-beam laboratory X-ray computed tomography.
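
    A toy stand-in for the problem being solved: if the illumination has drifted since the white field was recorded, dividing by the white field alone leaves a global scale error. Estimating a per-projection scale from air pixels fixes that one drift mode; the paper's two-term time-dependent model is more general than this sketch, and the masks and numbers below are invented:

    ```python
    import numpy as np

    def adaptive_normalize(proj, white, bg_mask):
        """Normalize a projection by the white field, first estimating a
        per-projection illumination scale from pixels known to contain no
        sample (air)."""
        scale = np.median(proj[bg_mask] / white[bg_mask])
        return proj / (scale * white)

    # synthetic frame: illumination drifted by 30% since the white field was taken
    rng = np.random.default_rng(3)
    white = 1000.0 + 50.0 * rng.random((64, 64))
    proj = 1.3 * white.copy()
    proj[20:40, 20:40] *= 0.6                      # absorbing sample in the middle
    bg_mask = np.ones((64, 64), dtype=bool)
    bg_mask[10:50, 10:50] = False                  # use only the outer frame as air
    norm = adaptive_normalize(proj, white, bg_mask)
    ```

    After the adaptive step, air pixels normalize to 1 and the sample to its true transmission, independent of the drift; plain division by the white field would return 1.3 and 0.78 instead.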

  8. CAPELLA: Software for stellar photometry in dense fields with an irregular background

    NASA Astrophysics Data System (ADS)

    Debray, B.; Llebaria, A.; Dubout-Crillon, R.; Petit, M.

    1994-01-01

    We describe CAPELLA, a photometric reduction package developed to automatically process images of very crowded stellar fields with an irregular background. Detection is performed with a derivative filter (the Laplacian of a Gaussian); the measurement of stellar positions and fluxes uses a profile-fitting technique. The point spread function (PSF) is empirical. The traditional multiparametric non-linear fit is replaced by a set of individual linear fits. The determination of the background, the detection, the definition of the PSF and the basics of the methods are successively addressed in detail. The iterative procedure as well as some aspects of the sampling problem are also discussed. Precision tests and performance in uncrowded and crowded fields are given. CAPELLA has been used to process crowded stellar fields obtained with different detectors such as electronographic cameras, CCDs and photographic films coupled to image intensifiers. It has been applied successfully in the extreme cases of close associations in the galaxy M33, of the composite Wolf-Rayet star Brey 73 in the Large Magellanic Cloud (LMC) and of the central parts of globular clusters such as 47 Tuc and M15.

  9. Tweaking one-loop determinants in AdS3

    NASA Astrophysics Data System (ADS)

    Castro, Alejandra; Keeler, Cynthia; Szepietowski, Phillip

    2017-10-01

    We revisit the subject of one-loop determinants in AdS3 gravity via the quasi-normal mode method. Our goal is to evaluate a one-loop determinant with chiral boundary conditions for the metric field; chirality is achieved by imposing Dirichlet boundary conditions on certain components while others satisfy Neumann. Along the way, we give a generalization of the quasinormal mode method for stationary (non-static) thermal backgrounds, and propose a treatment for Neumann boundary conditions in this framework. We evaluate the graviton one-loop determinant on the Euclidean BTZ background with parity-violating boundary conditions (CSS), and find excellent agreement with the dual warped CFT. We also discuss a more general falloff in AdS3 that is related to two dimensional quantum gravity in lightcone gauge. The behavior of the ghost fields under both sets of boundary conditions is novel and we discuss potential interpretations.

  10. Skeletonization of Gridded Potential-Field Images

    NASA Astrophysics Data System (ADS)

    Gao, L.; Morozov, I. B.

    2012-12-01

    A new approach to skeletonization was developed for gridded potential-field data. Generally, skeletonization is a pattern-recognition technique allowing automatic recognition of near-linear features in images, measurement of their parameters, and analysis of their similarities. Our approach decomposes the images into arbitrarily oriented "wavelets" characterized by positive or negative amplitudes, orientation angles, spatial dimensions, polarities, and other attributes. Orientations of the wavelets are obtained by scanning the azimuths to detect the strike direction of each anomaly. The wavelets are connected according to the similarities of these attributes, which leads to a "skeleton" map of the potential-field data. In addition, 2-D filtering is conducted concurrently with the wavelet-identification process, which allows parameters of background trends to be extracted and reduces the adverse effects of low-frequency background (often strong in potential-field maps) on skeletonization. By correlating neighboring wavelets, linear anomalies are identified and characterized. The advantages of this algorithm are the generality and isotropy of feature detection, as well as being designed specifically for gridded data. With several options for background-trend extraction, the stability of lineament identification is improved and optimized. The algorithm is also integrated in a powerful processing system which allows combining it with numerous other tools, such as filtering, computation of the analytic signal, empirical mode decomposition, and various types of plotting. The method is applied to potential-field data for the Western Canada Sedimentary Basin, in a study area which extends from southern Saskatchewan into southwestern Manitoba. The target is the structure of the crystalline basement beneath the Phanerozoic sediments. The examples illustrate that skeletonization aids in the interpretation of complex structures at different scale lengths.
    The results indicate that this method is useful for identifying structures in complex geophysical images and for automatic extraction of their attributes, as well as for quantitative characterization and analysis of potential-field images. Skeletonized potential-field images should also be useful for inversion.

  11. Study of improving signal-noise ratio for fluorescence channel

    NASA Astrophysics Data System (ADS)

    Wang, Guoqing; Li, Xin; Lou, Yue; Chen, Dong; Zhao, Xin; Wang, Ran; Yan, Debao; Zhao, Qi

    2017-10-01

    Laser-induced fluorescence spectroscopy (LIFS), one of the most effective discrimination methods for identifying materials at the molecular level by inducing a fluorescence spectrum, has become popular for its fast and accurate probing results. Violet or ultraviolet lasers are usually used as the excitation light source. However, there is no atmospheric window for violet and ultraviolet light, so the laser is attenuated along its propagation path. Moreover, when the laser reaches the sample, part of the light is reflected; the excitation energy that actually acts on the sample to produce fluorescence is therefore very small, leading to weak fluorescence mingled with the background light collected by the LIFS processing unit when it is used outdoors. In order to extend LIFS to remote probing against a complex background, improving the signal-to-noise ratio of the fluorescence channel is therefore worthwhile. Both enhancing the fluorescence intensity and suppressing the background light can improve the fluorescence signal-to-noise ratio. In this article, three different approaches to suppressing background light are discussed. The first is to increase the proportion of the collecting field occupied by the fluorescence excitation area by expanding the laser beam, if the collecting field is fixed. The second is to change the field angle to match the laser divergence angle. The third is to set a very narrow gating circuit to control the acquisition circuit, which opens briefly only when the fluorescence arrives. To some extent, all of these methods can reduce the background light, but the third is the best: adding a gated acquisition circuit to the acquisition chain instead of changing the light path is both effective and economical.

  12. A depth enhancement strategy for kinect depth image

    NASA Astrophysics Data System (ADS)

    Quan, Wei; Li, Hua; Han, Cheng; Xue, Yaohong; Zhang, Chao; Hu, Hanping; Jiang, Zhengang

    2018-03-01

    Kinect is a motion-sensing input device widely used in computer vision and related fields. However, there are many inaccurate depth data in Kinect depth images, even for Kinect v2. In this paper, an algorithm is proposed to enhance Kinect v2 depth images. According to the principle of its depth measurement, the foreground and the background are considered separately. For the background, holes are filled according to the depth data in the neighborhood. For the foreground, a filling algorithm based on the color image, using both spatial and color information, is proposed. An adaptive joint bilateral filtering method is used to reduce noise. Experimental results show that the processed depth images have a clean background and clear edges, and are better than those of traditional strategies. The method can be applied in 3D reconstruction to pre-process depth images in real time and obtain accurate results.
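
    The background-filling step can be illustrated with a deliberately simple neighborhood fill: replace each invalid depth pixel with the median of its valid neighbors. This is a stand-in for the idea only, not the paper's actual algorithm (which also exploits the color image and a joint bilateral filter):

    ```python
    import numpy as np

    def fill_depth_holes(depth, invalid=0, k=1):
        """Fill pixels equal to `invalid` with the median of the valid
        depths in a (2k+1) x (2k+1) neighborhood; pixels with no valid
        neighbors are left unchanged."""
        out = depth.astype(float)
        ys, xs = np.where(depth == invalid)
        for y, x in zip(ys, xs):
            patch = depth[max(0, y - k):y + k + 1, max(0, x - k):x + k + 1]
            valid = patch[patch != invalid]
            if valid.size:
                out[y, x] = np.median(valid)
        return out

    # a single hole (0) surrounded by valid depth values
    depth = np.array([[5, 5, 5],
                      [5, 0, 5],
                      [5, 5, 5]])
    filled = fill_depth_holes(depth)
    ```

    Real pipelines replace the median with edge-aware weights so that fills do not bleed across depth discontinuities, which is what the joint bilateral filter in the paper addresses.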

  13. O(d,d)-duality in string theory

    NASA Astrophysics Data System (ADS)

    Rennecke, Felix

    2014-10-01

    A new method for obtaining dual string theory backgrounds is presented. Preservation of the Hamiltonian density and the energy-momentum tensor induced by O(d,d)-transformations leads to a relation between dual sets of coordinate one-forms, accompanied by a redefinition of the background fields and a shift of the dilaton. The necessity of isometric directions arises as an integrability condition for this map. The isometry algebra is studied in detail using generalised geometry. In particular, non-abelian dualities and β-transformations are contained in this approach. The latter are exemplified by the construction of a new approximate non-geometric background.

  14. Data Analysis and Data Mining: Current Issues in Biomedical Informatics

    PubMed Central

    Bellazzi, Riccardo; Diomidous, Marianna; Sarkar, Indra Neil; Takabayashi, Katsuhiko; Ziegler, Andreas; McCray, Alexa T.

    2011-01-01

    Summary Background Medicine and biomedical sciences have become data-intensive fields, which, at the same time, enable the application of data-driven approaches and require sophisticated data analysis and data mining methods. Biomedical informatics provides a proper interdisciplinary context to integrate data and knowledge when processing available information, with the aim of giving effective decision-making support in clinics and translational research. Objectives To reflect on different perspectives related to the role of data analysis and data mining in biomedical informatics. Methods On the occasion of the 50th year of Methods of Information in Medicine, a symposium was organized that reflected on opportunities, challenges and priorities of organizing, representing and analysing data, information and knowledge in biomedicine and health care. The contributions of experts with a variety of backgrounds in the area of biomedical data analysis have been collected as one outcome of this symposium, in order to provide a broad, though coherent, overview of some of the most interesting aspects of the field. Results The paper presents sections on data accumulation and data-driven approaches in medical informatics, data and knowledge integration, statistical issues for the evaluation of data mining models, translational bioinformatics and bioinformatics aspects of genetic epidemiology. Conclusions Biomedical informatics represents a natural framework to properly and effectively apply data analysis and data mining methods in a decision-making context. In the future, it will be necessary to preserve the inclusive nature of the field and to foster an increasing sharing of data and methods between researchers. PMID:22146916

  15. Intricate Puzzle of Oil and Gas Reserves Growth

    EIA Publications

    1997-01-01

    This article begins with a background discussion of the methods used to estimate proved oil and gas reserves and ultimate recovery, which is followed by a discussion of the factors that affect the ultimate recovery estimates of a field or reservoir.

  16. Interseasonal precipitation patterns impact the occurrence of waterborne pathogens in an agricultural watershed

    EPA Science Inventory

    Background/Question/Methods: Runoff from agricultural fields undergoing manure applications or housing livestock operations may carry a variety of chemical and microbial contaminants that compromise water quality and increase the possibility of human exposure to pathogenic microo...

  17. Modularization and Flexibilization.

    ERIC Educational Resources Information Center

    Van Meel, R. M.

    Publications in the fields of educational science, organization theory, and project management were analyzed to identify the possibilities that modularization offers to institutions of higher professional education and to obtain background information for use in developing a method for modularization in higher professional education. It was…

  18. Plenoptic background oriented schlieren imaging

    NASA Astrophysics Data System (ADS)

    Klemkowsky, Jenna N.; Fahringer, Timothy W.; Clifford, Christopher J.; Bathel, Brett F.; Thurow, Brian S.

    2017-09-01

    The combination of the background oriented schlieren (BOS) technique with the unique imaging capabilities of a plenoptic camera, termed plenoptic BOS, is introduced as a new addition to the family of schlieren techniques. Compared to conventional single camera BOS, plenoptic BOS is capable of sampling multiple lines-of-sight simultaneously. Displacements from each line-of-sight are collectively used to build a four-dimensional displacement field, which is a vector function structured similarly to the original light field captured in a raw plenoptic image. The displacement field is used to render focused BOS images, which qualitatively are narrow depth of field slices of the density gradient field. Unlike focused schlieren methods that require manually changing the focal plane during data collection, plenoptic BOS synthetically changes the focal plane position during post-processing, such that all focal planes are captured in a single snapshot. Through two different experiments, this work demonstrates that plenoptic BOS is capable of isolating narrow depth of field features, qualitatively inferring depth, and quantitatively estimating the location of disturbances in 3D space. Such results motivate future work to transition this single-camera technique towards quantitative reconstructions of 3D density fields.

  19. Maximizing the quantitative accuracy and reproducibility of Förster resonance energy transfer measurement for screening by high throughput widefield microscopy

    PubMed Central

    Schaufele, Fred

    2013-01-01

    Förster resonance energy transfer (FRET) between fluorescent proteins (FPs) provides insights into the proximities and orientations of FPs as surrogates of the biochemical interactions and structures of the factors to which the FPs are genetically fused. As powerful as FRET methods are, technical issues have impeded their broad adoption in the biologic sciences. One hurdle to accurate and reproducible FRET microscopy measurement stems from variable fluorescence backgrounds both within a field and between different fields. Those variations introduce errors into the precise quantification of fluorescence levels on which the quantitative accuracy of FRET measurement is highly dependent. This measurement error is particularly problematic for screening campaigns since minimal well-to-well variation is necessary to faithfully identify wells with altered values. High content screening depends also upon maximizing the numbers of cells imaged, which is best achieved by low magnification high throughput microscopy. But, low magnification introduces flat-field correction issues that degrade the accuracy of background correction to cause poor reproducibility in FRET measurement. For live cell imaging, fluorescence of cell culture media in the fluorescence collection channels for the FPs commonly used for FRET analysis is a high source of background error. These signal-to-noise problems are compounded by the desire to express proteins at biologically meaningful levels that may only be marginally above the strong fluorescence background. Here, techniques are presented that correct for background fluctuations. Accurate calculation of FRET is realized even from images in which a non-flat background is 10-fold higher than the signal. PMID:23927839

  20. A noninvasive method for measuring the velocity of diffuse hydrothermal flow by tracking moving refractive index anomalies

    NASA Astrophysics Data System (ADS)

    Mittelstaedt, Eric; Davaille, Anne; van Keken, Peter E.; Gracias, Nuno; Escartin, Javier

    2010-10-01

    Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ~5%-7% and are most accurate in regions of pervasive apparent background deformation, which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck'09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ~10°C-15°C effluent reach ~5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
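
    Both stages of DFV reduce to locating the peak of a cross-correlation: first between interrogation windows of consecutive images (the PIV step), then between consecutive deformation maps. A minimal sketch of that core operation, using FFT-based cross-correlation to recover an integer pixel shift (the function name and whole-frame treatment are illustrative; real PIV operates on small interrogation windows with sub-pixel peak fitting):

```python
import numpy as np

def cross_correlation_shift(a, b):
    """Estimate the integer (row, col) shift mapping frame `a` onto frame `b`
    by locating the peak of their FFT-based cross-correlation."""
    A = np.fft.fft2(a - a.mean())
    B = np.fft.fft2(b - b.mean())
    corr = np.fft.ifft2(np.conj(A) * B).real        # peak lies at the relative shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak indices into the signed range [-n/2, n/2)
    return tuple(p if p <= n // 2 else p - n for p, n in zip(peak, a.shape))
```

    Applied window-by-window to consecutive images this yields the apparent background deformation; applied again to consecutive deformation maps it yields the advection velocity of the refractive anomaly.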

  1. Skyrmion based universal memory operated by electric current

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zang, Jiadong; Chien, Chia-Ling; Li, Yufan

    2017-09-26

    A method for generating a skyrmion, comprising: depositing a vertical metallic nanopillar electrode on a first side of a helimagnetic thin film, the helimagnetic thin film having a contact on a second side to provide a current drain; injecting a current through the vertical metallic nanopillar electrode to generate a rotating field; and applying a static upward magnetic field perpendicular to the helimagnetic thin film to maintain an FM phase background.

  2. Experimental investigation of coaxial-gun-formed plasmas injected into a background transverse magnetic field or plasma

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Fisher, Dustin M.; Gilmore, Mark; Hsu, Scott C.; Lynn, Alan G.

    2018-05-01

    Injection of coaxial-gun-formed magnetized plasmas into a background transverse vacuum magnetic field or into a background magnetized plasma has been studied in the helicon-cathode (HelCat) linear plasma device at the University of New Mexico [M. Gilmore et al., J. Plasma Phys. 81, 345810104 (2015)]. A magnetized plasma jet launched into a background transverse magnetic field shows emergent kink stabilization of the jet due to the formation of a sheared flow in the jet above the kink stabilization threshold 0.1kVA [Y. Zhang et al., Phys. Plasmas 24, 110702 (2017)]. Injection of a spheromak-like plasma into a transverse background magnetic field led to the observation of finger-like structures on the side with a stronger magnetic field null between the spheromak and the background field. The finger-like structures are consistent with magneto-Rayleigh-Taylor instability. Jets or spheromaks launched into a background, low-β magnetized plasma show similar behavior as above, respectively, in both cases.

  3. Quantification of plume opacity by digital photography.

    PubMed

    Du, Ke; Rood, Mark J; Kim, Byung J; Kemme, Michael R; Franek, Bill; Mattison, Kevin

    2007-02-01

    The United States Environmental Protection Agency (USEPA) developed Method 9 to describe how plume opacity can be quantified by humans. However, use of observations by humans introduces subjectivity, and is expensive due to semiannual certification requirements of the observers. The Digital Opacity Method (DOM) was developed to quantify plume opacity at lower cost, with improved objectivity, and to provide a digital record. Photographs of plumes were taken with a calibrated digital camera under specified conditions. Pixel values from those photographs were then interpreted to quantify the plume's opacity using a contrast model and a transmission model. The contrast model determines plume opacity based on pixel values that are related to the change in contrast between two backgrounds that are located behind and next to the plume. The transmission model determines the plume's opacity based on pixel values that are related to radiances from the plume and its background. DOM was field tested with a smoke generator. The individual and average opacity errors of DOM were within the USEPA Method 9 acceptable error limits for both field campaigns. Such results are encouraging and support the use of DOM as an alternative to Method 9.
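
    As a rough illustration of the contrast model described above (variable names and the use of simple mean pixel values are assumptions for this sketch; the actual DOM relies on a calibrated camera and specified viewing conditions), opacity can be estimated from the loss of contrast between a bright and a dark background viewed through the plume versus beside it:

```python
def contrast_model_opacity(bright_thru, dark_thru, bright_clear, dark_clear):
    """Estimate plume opacity from the attenuation of contrast between two
    backgrounds: mean pixel values seen through the plume vs. in clear air."""
    contrast_thru = bright_thru - dark_thru      # contrast seen through the plume
    contrast_clear = bright_clear - dark_clear   # unattenuated contrast
    return 1.0 - contrast_thru / contrast_clear
```

    For example, if a clear-air contrast of 160 digital counts drops to 60 when viewed through the plume, the estimated opacity is 1 - 60/160 = 0.625.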

  4. Mitigating fluorescence spectral overlap in wide-field endoscopic imaging

    PubMed Central

    Hou, Vivian; Nelson, Leonard Y.; Seibel, Eric J.

    2013-01-01

    The number of molecular species suitable for multispectral fluorescence imaging is limited due to the overlap of the emission spectra of indicator fluorophores, e.g., dyes and nanoparticles. To remove fluorophore emission cross-talk in wide-field multispectral fluorescence molecular imaging, we evaluate three different solutions: (1) image stitching, (2) concurrent imaging with cross-talk ratio subtraction algorithm, and (3) frame-sequential imaging. A phantom with fluorophore emission cross-talk is fabricated, and a 1.2-mm ultrathin scanning fiber endoscope (SFE) is used to test and compare these approaches. Results show that fluorophore emission cross-talk could be successfully avoided or significantly reduced. Near term, the concurrent imaging method of wide-field multispectral fluorescence SFE is viable for early stage cancer detection and localization in vivo. Furthermore, a means to enhance exogenous fluorescence target-to-background ratio by the reduction of tissue autofluorescence background is demonstrated. PMID:23966226

  5. Hadron electric polarizability from lattice QCD

    NASA Astrophysics Data System (ADS)

    Alexandru, Andrei

    2017-09-01

    Electromagnetic polarizabilities are important parameters for hadron structure, describing the response of the charge and current distributions inside the hadron to an external electromagnetic field. For most hadrons these quantities are poorly constrained experimentally since they can only be measured indirectly. Lattice QCD can be used to compute these quantities directly in terms of quark and gluon degrees of freedom, using the background field method. We present results for the neutron electric polarizability for two different quark masses, light enough to connect to chiral perturbation theory. These are currently the lightest quark masses used in polarizability studies. For each pion mass we compute the polarizability at four different volumes and perform an infinite volume extrapolation. We also discuss the effect of turning on the coupling between the background field and the sea quarks. A.A. is supported in part by the National Science Foundation CAREER Grant PHY-1151648 and by U.S. DOE Grant No. DE-FG02-95ER40907.
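
    In the background field method the energy of a neutral hadron shifts quadratically in a weak external electric field, E(eps) ≈ E0 + (1/2)·alpha·eps², so the polarizability is extracted by fitting energies computed at several field strengths. A schematic of that final fitting step only (normalization conventions for alpha vary between studies, and the lattice computation of the energies themselves is the hard part):

```python
import numpy as np

def extract_polarizability(field_strengths, energies):
    """Fit E(eps) = E0 + 0.5 * alpha * eps**2, i.e. a line in eps**2,
    and return (E0, alpha)."""
    half_alpha, e0 = np.polyfit(np.asarray(field_strengths) ** 2, energies, 1)
    return e0, 2.0 * half_alpha
```

    In practice each energy carries a statistical error from the lattice correlator fits, so a weighted fit with error propagation would replace the plain `polyfit` used here.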

  6. Trapping and dynamic manipulation of polystyrene beads mimicking circulating tumor cells using targeted magnetic/photoacoustic contrast agents

    NASA Astrophysics Data System (ADS)

    Wei, Chen-Wei; Xia, Jinjun; Pelivanov, Ivan; Hu, Xiaoge; Gao, Xiaohu; O'Donnell, Matthew

    2012-10-01

    Results on magnetically trapping and manipulating micro-scale beads circulating in a flow field mimicking metastatic cancer cells in human peripheral vessels are presented. Composite contrast agents combining magneto-sensitive nanospheres and highly optical absorptive gold nanorods were conjugated to micro-scale polystyrene beads. To efficiently trap the targeted objects in a fast stream, a dual magnet system consisting of two flat magnets to magnetize (polarize) the contrast agent and an array of cone magnets producing a sharp gradient field to trap the magnetized contrast agent was designed and constructed. A water-ink solution with an optical absorption coefficient of 10 cm-1 was used to mimic the optical absorption of blood. Magnetomotive photoacoustic imaging helped visualize bead trapping, dynamic manipulation of trapped beads in a flow field, and the subtraction of stationary background signals insensitive to the magnetic field. The results show that trafficking micro-scale objects can be effectively trapped in a stream with a flow rate up to 12 ml/min and the background can be significantly (greater than 15 dB) suppressed. It makes the proposed method very promising for sensitive detection of rare circulating tumor cells within high flow vessels with a highly absorptive optical background.

  7. Research Projects 1968.

    ERIC Educational Resources Information Center

    Swedish Council for Personnel Administration, Stockholm.

    The 39 research projects described cover several fields within the social and behavioral sciences related to personnel administration. The project description format includes: (1) project title, (2) principal investigator, (3) institution, (4) advisor, (5) grants, (6) background and purpose, (7) scope, material, methods, experimental design, (8)…

  8. Using functional data analysis to analyze ecological series data

    EPA Science Inventory

    Background/Question/Methods: A frequent goal in ecology is to understand the relationships among biological organisms and their environment. Most field data are collected as scalar measurements, such that observations are recorded as a collection of datums. The observations are t...

  9. The inception of pulsed discharges in air: simulations in background fields above and below breakdown

    NASA Astrophysics Data System (ADS)

    Sun, Anbang; Teunissen, Jannis; Ebert, Ute

    2014-11-01

    We investigate discharge inception in air, in uniform background electric fields above and below the breakdown threshold. We perform 3D particle simulations that include a natural level of background ionization in the form of positive and O2^- ions. In background fields below breakdown, we use a strongly ionized seed of electrons and positive ions to enhance the field locally. In the region of enhanced field, we observe the growth of positive streamers, as in previous simulations with 2D plasma fluid models. The inclusion of background ionization has little effect in this case. When the background field is above the breakdown threshold, the situation is very different. Electrons can then detach from O2^- and start ionization avalanches in the whole volume. These avalanches together create one extended discharge, in contrast to the 'double-headed' streamers found in many fluid simulations.

  10. Toward particle-level filtering of individual collision events at the Large Hadron Collider and beyond

    NASA Astrophysics Data System (ADS)

    Colecchia, Federico

    2014-03-01

    Low-energy strong interactions are a major source of background at hadron colliders, and methods of subtracting the associated energy flow are well established in the field. Traditional approaches treat the contamination as diffuse, and estimate background energy levels either by averaging over large data sets or by restricting to given kinematic regions inside individual collision events. On the other hand, more recent techniques take into account the discrete nature of background, most notably by exploiting the presence of substructure inside hard jets, i.e. inside collections of particles originating from scattered hard quarks and gluons. However, none of the existing methods subtract background at the level of individual particles inside events. We illustrate the use of an algorithm that will allow particle-by-particle background discrimination at the Large Hadron Collider, and we envisage this as the basis for a novel event filtering procedure upstream of the official reconstruction chains. Our hope is that this new technique will improve physics analysis when used in combination with state-of-the-art algorithms in high-luminosity hadron collider environments.

  11. Multi-feature machine learning model for automatic segmentation of green fractional vegetation cover for high-throughput field phenotyping.

    PubMed

    Sadeghi-Tehran, Pouria; Virlet, Nicolas; Sabermanesh, Kasra; Hawkesford, Malcolm J

    2017-01-01

    Accurately segmenting vegetation from the background within digital images is both a fundamental and a challenging task in phenotyping. The performance of traditional methods is satisfactory in homogeneous environments; however, performance decreases when applied to images acquired in dynamic field environments. In this paper, a multi-feature learning method is proposed to quantify vegetation growth in outdoor field conditions. The introduced technique is compared with state-of-the-art and other learning methods on digital images. All methods are compared and evaluated with different environmental conditions and the following criteria: (1) comparison with ground-truth images, (2) variation along a day with changes in ambient illumination, (3) comparison with manual measurements and (4) an estimation of performance along the full life cycle of a wheat canopy. The method described is capable of coping with the environmental challenges faced in field conditions, with high levels of adaptiveness and without the need for adjusting a threshold for each digital image. The proposed method is also an ideal candidate to process a time series of phenotypic information throughout the crop growth acquired in the field. Moreover, the introduced method has the advantage that it is not limited to growth measurements but can also be applied to other tasks such as identifying weeds, diseases, stress, etc.

  12. Whole head quantitative susceptibility mapping using a least-norm direct dipole inversion method.

    PubMed

    Sun, Hongfu; Ma, Yuhan; MacDonald, M Ethan; Pike, G Bruce

    2018-06-15

    A new dipole field inversion method for whole head quantitative susceptibility mapping (QSM) is proposed. Instead of performing background field removal and local field inversion sequentially, the proposed method performs dipole field inversion directly on the total field map in a single step. To aid this under-determined and ill-posed inversion process and obtain robust QSM images, Tikhonov regularization is implemented to seek the local susceptibility solution with the least norm (LN) using the L-curve criterion. The proposed LN-QSM does not require brain edge erosion, thereby preserving the cerebral cortex in the final images. This should improve its applicability for QSM-based cortical grey matter measurement, functional imaging and venography of the full brain. Furthermore, LN-QSM also enables susceptibility mapping of the entire head without the need for brain extraction, which makes QSM reconstruction more automated and less dependent on intermediate pre-processing methods and their associated parameters. It is shown that the proposed LN-QSM method reduced errors in a numerical phantom simulation, improved accuracy in a gadolinium phantom experiment, and suppressed artefacts in nine subjects, as compared to two-step and other single-step QSM methods. Measurements of deep grey matter and skull susceptibilities from LN-QSM are consistent with established reconstruction methods.
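
    The single-step inversion described above can be sketched in a few lines: in k-space the field map f is related to susceptibility χ by f = D·χ, where D is the unit dipole kernel, and the Tikhonov (least-norm) solution is χ = D·f/(D² + λ). The sketch below is a bare-bones illustration under simplifying assumptions (no L-curve selection of λ, no masking or preprocessing, field map already in scaled units):

```python
import numpy as np

def dipole_kernel(shape):
    """Unit dipole kernel D(k) = 1/3 - kz^2/|k|^2 on the FFT grid (B0 along z)."""
    kz, ky, kx = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - kz**2 / k2
    D[0, 0, 0] = 0.0                 # undefined DC term set to zero
    return D

def tikhonov_qsm(total_field, lam=1e-2):
    """One-step Tikhonov-regularized dipole inversion of a field map."""
    D = dipole_kernel(total_field.shape)
    chi_k = D * np.fft.fftn(total_field) / (D**2 + lam)   # least-norm solution
    return np.real(np.fft.ifftn(chi_k))
```

    Larger λ suppresses streaking artefacts near the zero cone of D at the cost of underestimating susceptibility, which is why the paper selects λ via the L-curve criterion.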

  13. Born iterative reconstruction using perturbed-phase field estimates.

    PubMed

    Astheimer, Jeffrey P; Waag, Robert C

    2008-10-01

    A method of image reconstruction from scattering measurements for use in ultrasonic imaging is presented. The method employs distorted-wave Born iteration but does not require using a forward-problem solver or solving large systems of equations. These calculations are avoided by limiting intermediate estimates of medium variations to smooth functions in which the propagated fields can be approximated by phase perturbations derived from variations in a geometric path along rays. The reconstruction itself is formed by a modification of the filtered-backpropagation formula that includes correction terms to account for propagation through an estimated background. Numerical studies that validate the method for parameter ranges of interest in medical applications are presented. The efficiency of this method offers the possibility of real-time imaging from scattering measurements.

  14. Rocket-triggered lightning strikes and forest fire ignition

    NASA Technical Reports Server (NTRS)

    Fenner, James H.

    1989-01-01

    Background information on the rocket-triggered lightning project at Kennedy Space Center (KSC), a summary of the forecasting problem there, the facilities and equipment available for undertaking field experiments at KSC, previous research activity performed, a description of the atmospheric science field laboratory near Mosquito Lagoon on the KSC complex, methods of data acquisition, and present results are discussed. New sources of data for the 1989 field experiment include measuring the electric field in the lower few thousand feet of the atmosphere by suspending field measuring devices below a tethered balloon. Problems encountered during the 1989 field experiment are discussed. Future prospects for both triggered lightning and lightning-kindled forest fire research at KSC are listed.

  15. An Automatic Technique for Finding Faint Moving Objects in Wide Field CCD Images

    NASA Astrophysics Data System (ADS)

    Hainaut, O. R.; Meech, K. J.

    1996-09-01

    The traditional method used to find moving objects in astronomical images is to blink pairs or series of frames after registering them to align the background objects. While this technique is extremely efficient in terms of the low signal-to-noise ratio that human sight can detect, it has proved to be extremely time-, brain- and eyesight-consuming. The wide-field images provided by the large CCD mosaic recently built at IfA cover a field of view of 20 to 30' over 8192^2 pixels. Blinking such images is an enormous task, comparable to that of blinking large photographic plates. However, as the data are available digitally (each image occupying 260 Mb of disk space), we are developing a set of computer codes to perform the moving object identification in sets of frames. This poster describes the techniques we use in order to reach a detection efficiency as good as that of a human blinker; the main steps are to find all the objects in each frame (for which we rely on "S-Extractor"; Bertin & Arnouts 1996, A&AS 117, 393), then to identify all the background objects, and finally to search the non-background objects for sources moving in a coherent fashion. We also describe the results of this method applied to actual data from the 8k CCD mosaic. This work is being supported, in part, by NSF grant AST 92-21318.
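
    The final search step can be illustrated schematically: once background objects (sources at the same position in every frame) are set aside, coherently moving candidates are those whose positions across frames are consistent with constant-velocity motion. A toy three-frame version of that matching (all names, tolerances and the three-frame restriction are illustrative, not taken from the poster; the actual pipeline works on source catalogs):

```python
import numpy as np

def find_linear_movers(frames, times, tol=1.0, min_speed=0.5):
    """frames: three (N_i, 2) arrays of detected (x, y) source positions.
    Returns index triplets (i, j, k), one index per frame, whose positions
    are consistent with constant-velocity (linear) motion."""
    t0, t1, t2 = times
    movers = []
    for i, p0 in enumerate(frames[0]):
        for j, p1 in enumerate(frames[1]):
            v = (p1 - p0) / (t1 - t0)          # candidate velocity
            if np.linalg.norm(v) < min_speed:  # stationary: background object
                continue
            pred = p1 + v * (t2 - t1)          # extrapolate to the third frame
            d = np.linalg.norm(frames[2] - pred, axis=1)
            k = int(np.argmin(d))
            if d[k] < tol:
                movers.append((i, j, k))
    return movers
```

    With more than three frames the same extrapolation test is applied to each additional epoch, which rapidly suppresses chance alignments.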

  16. Current Status of the Polyamine Research Field

    PubMed Central

    Pegg, Anthony E.; Casero, Robert A.

    2013-01-01

    This chapter provides an overview of the polyamine field and introduces the 32 other chapters that make up this volume. These chapters provide a wide range of methods, advice, and background relevant to studies of the function of polyamines, the regulation of their content, their role in disease, and the therapeutic potential of drugs targeting polyamine content and function. The methodology provided in this new volume will enable laboratories already working in this area to expand their experimental techniques and facilitate the entry of additional workers into this rapidly expanding field. PMID:21318864

  17. Optical Flow for Flight and Wind Tunnel Background Oriented Schlieren Imaging

    NASA Technical Reports Server (NTRS)

    Smith, Nathanial T.; Heineck, James T.; Schairer, Edward T.

    2017-01-01

    Background oriented Schlieren images have historically been generated by calculating the observed pixel displacement between a wind-on and wind-off image pair using normalized cross-correlation. This work uses optical flow to solve for the displacement fields which generate the Schlieren images. A well-established method in the computer vision community, optical flow is the apparent motion in an image sequence due to brightness changes. The regularization method of Horn and Schunck is used to create Schlieren images using two data sets: a supersonic jet plume shock interaction from the NASA Ames Unitary Plan Wind Tunnel, and a transonic flight test of a T-38 aircraft using a naturally occurring background, performed in conjunction with NASA Ames and Armstrong Research Centers. Results are presented and contrasted with those using normalized cross-correlation. The optical flow Schlieren images are found to provide significantly more detail. We apply the method to historical data sets to demonstrate the broad applicability and limitations of the technique.
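
    The Horn and Schunck regularization minimizes the brightness-constancy error plus a smoothness penalty, which leads to a simple iterative update. A compact sketch (derivative stencils and parameter values follow common textbook presentations and are not taken from the paper):

```python
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(im1, im2, alpha=0.5, n_iter=200):
    """Dense optical flow (u, v) between two frames via Horn-Schunck iteration."""
    im1, im2 = im1.astype(float), im2.astype(float)
    kx = np.array([[-1.0, 1.0], [-1.0, 1.0]]) * 0.25   # d/dx stencil
    ky = np.array([[-1.0, -1.0], [1.0, 1.0]]) * 0.25   # d/dy stencil
    Ix = convolve(im1, kx) + convolve(im2, kx)
    Iy = convolve(im1, ky) + convolve(im2, ky)
    It = convolve(im2 - im1, np.full((2, 2), 0.25))
    # weighted neighbourhood average used by the update equations
    avg = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]]) / 12.0
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    for _ in range(n_iter):
        u_bar, v_bar = convolve(u, avg), convolve(v, avg)
        resid = (Ix * u_bar + Iy * v_bar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = u_bar - Ix * resid
        v = v_bar - Iy * resid
    return u, v
```

    In a BOS setting, im1 and im2 would be the wind-off and wind-on images of the background pattern, and (u, v) the apparent background displacement field from which the Schlieren image is rendered.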

  18. Transdisciplinarity in Research: Perspectives of Early Career Faculty

    ERIC Educational Resources Information Center

    Moore, Megan; Martinson, Melissa L.; Nurius, Paula S.; Kemp, Susan P.

    2018-01-01

    Background: Early career faculty experiences and perspectives on transdisciplinary research are important yet understudied. Methods: Assistant professors at 50 top-ranked social work programs completed an online survey assessing perspectives on the salience of transdisciplinary training in their field, obstacles to or negative impacts of…

  19. Proportional basal area method for implementing selection silviculture systems in longleaf pine forests

    Treesearch

    Dale G. Brockway; Edward F. Loewenstein; Kenneth W. Outcalt

    2014-01-01

    Proportional basal area (Pro-B) was developed as an accurate, easy-to-use method for making uneven-aged silviculture a practical management option. Following less than 3 h of training, forest staff from a range of professional backgrounds used Pro-B in an operational-scale field study to apply single-tree selection and group selection systems in longleaf pine (Pinus...

  20. Lessons learned in preparing method 29 filters for compliance testing audits.

    PubMed

    Martz, R F; McCartney, J E; Bursey, J T; Riley, C E

    2000-01-01

    Companies conducting compliance testing are required to analyze audit samples at the time they collect and analyze the stack samples if audit samples are available. Eastern Research Group (ERG) provides technical support to the EPA's Emission Measurements Center's Stationary Source Audit Program (SSAP) for developing, preparing, and distributing performance evaluation samples and audit materials. These audit samples are requested via the regulatory Agency and include spiked audit materials for EPA Method 29-Metals Emissions from Stationary Sources, as well as other methods. To provide appropriate audit materials to federal, state, tribal, and local governments, as well as agencies performing environmental activities and conducting emission compliance tests, ERG has recently performed testing of blank filter materials and preparation of spiked filters for EPA Method 29. For sampling stationary sources using an EPA Method 29 sampling train, the use of filters without organic binders containing less than 1.3 microg/in.2 of each of the metals to be measured is required. Risk Assessment testing imposes even stricter requirements for clean filter background levels. Three vendor sources of quartz fiber filters were evaluated for background contamination to ensure that audit samples would be prepared using filters with the lowest metal background levels. A procedure was developed to test new filters, and a cleaning procedure was evaluated to see if a greater level of cleanliness could be achieved using an acid rinse with new filters. Background levels for filters supplied by different vendors and within lots of filters from the same vendor showed a wide variation, confirmed through contact with several analytical laboratories that frequently perform EPA Method 29 analyses. It has been necessary to repeat more than one compliance test because of suspect metals background contamination levels. 
An acid cleaning step produced an improvement in contamination level, but the difference was not significant for most of the Method 29 target metals. As a result of our studies, we conclude: (1) filters for Method 29 testing should be purchased in lots as large as possible; (2) testing firms should pre-screen new boxes and/or new lots of filters used for Method 29 testing, and random analysis of three filters (top, middle, and bottom of the box) from a new box of vendor filters before they are used in field tests is a prudent approach; and (3) a box of filters from a given vendor should be screened, and filters from this screened box should be used both for testing and as field blanks in each test scenario, to provide the level of quality assurance required for stationary source testing.

  1. Interaction of moving branes with background massless and tachyon fields in superstring theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rezaei, Z., E-mail: z.rezaei@aut.ac.ir; Kamani, D., E-mail: kamani@aut.ac.ir

    2012-02-15

    Using the boundary state formalism, we study a moving Dp-brane in a partially compact space-time in the presence of background fields: the Kalb-Ramond field B_{μν}, a U(1) gauge field A_α, and the tachyon field. The boundary state enables us to obtain the interaction amplitude of two branes in the presence of the above background fields. The branes are parallel or perpendicular to each other. Because of the presence of background fields, the compactification of some space-time directions, the motion of the branes, and the arbitrariness of the dimensions of the branes, the system is rather general. Due to the tachyon fields and the velocities of the branes, the behavior of the interaction amplitude reveals obvious differences from the conventional behavior.

  2. Preferential Heating and Acceleration of Heavy Ions in Impulsive Solar Flares

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Rahul; Gaspari, Massimo; Spitkovsky, Anatoly

    2017-02-01

    We simulate decaying turbulence in a homogeneous pair plasma using a three-dimensional electromagnetic particle-in-cell method. A uniform background magnetic field permeates the plasma such that the magnetic pressure is three times larger than the thermal pressure, and the turbulence is generated by counter-propagating shear Alfvén waves. The energy predominantly cascades transverse to the background magnetic field, rendering the turbulence anisotropic at smaller scales. We simultaneously move several ion species of varying charge to mass ratios in our simulation and show that the particles of smaller charge to mass ratios are heated and accelerated to non-thermal energies at a faster rate. This is in accordance with the enhancement of heavy ions and the non-thermal tail in their energy spectrum observed in impulsive solar flares. We further show that the heavy ions are energized mostly in the direction perpendicular to the background magnetic field, at a rate consistent with our analytical estimate of the rate of heating due to cyclotron resonance with the Alfvén waves, of which a large fraction is due to obliquely propagating waves.

  3. A perfectly conducting surface in electrodynamics with Lorentz symmetry breaking

    NASA Astrophysics Data System (ADS)

    Borges, L. H. C.; Barone, F. A.

    2017-10-01

    In this paper we consider a model which exhibits explicit Lorentz symmetry breaking due to the presence of a single background vector v^{μ } coupled to the gauge field. We investigate such a theory in the vicinity of a perfectly conducting plate for different configurations of v^{μ }. First we consider no restrictions on the components of the background vector and we treat it perturbatively up to second order. Next, we treat v^{μ } exactly for two special cases: the first one is when it has only components parallel to the plate, and the second one when it has a single component perpendicular to the plate. For all these configurations, the propagator for the gauge field and the interaction force between the plate and a point-like electric charge are computed. Surprisingly, it is shown that the image method is valid in our model and we argue that it is a non-trivial result. We show there arises a torque on the mirror with respect to its positioning in the background field when it interacts with a point-like charge. It is a new effect with no counterpart in theories with Lorentz symmetry in the presence of a perfect mirror.
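For orientation, in ordinary Lorentz-invariant electrodynamics the image method gives the familiar attractive force between a point charge \(q\) held at distance \(d\) from a grounded, perfectly conducting plate; the non-trivial claim of the paper is that the same image construction survives the Lorentz-violating background:

```latex
F \;=\; -\,\frac{q^{2}}{4\pi\varepsilon_{0}\,(2d)^{2}}
```

In the model above, the corrections from \(v^{\mu}\) modify the propagator and the strength of this interaction, but not the validity of the image construction itself.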

  4. Reconstruction and separation of vibratory field using structural holography

    NASA Astrophysics Data System (ADS)

    Chesnais, C.; Totaro, N.; Thomas, J.-H.; Guyader, J.-L.

    2017-02-01

    A method for reconstructing and separating vibratory fields on a plate-like structure is presented. The method, called "Structural Holography", is derived from classical Near-field Acoustic Holography (NAH) but operates in the vibratory domain. Here, the plate displacement is measured on one-dimensional lines (the holograms) and used to reconstruct the entire two-dimensional displacement field. As a consequence, Structural Holography makes remote measurements possible in zones that are not directly accessible. Moreover, because it is based on decomposing the field into forward- and backward-travelling waves, Structural Holography makes it possible to separate forces in the case of multi-source excitation. The theoretical background of the Structural Holography method is described first. Then, to illustrate the process and the possibilities of Structural Holography, the academic test case of an infinite plate excited by a few point forces is presented. Using the principle of vibratory field separation, the displacement fields produced by each point force separately are reconstructed. However, the displacement field is not always meaningful by itself, and additional processing is needed, for example to localize the positions of the point forces. Starting from the simple example of an infinite plate, a post-processing step based on reconstructing the structural intensity field is thus proposed. Finally, Structural Holography is generalized to finite plates and applied to real experimental measurements.
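The forward/backward decomposition at the heart of the separation step can be sketched in one dimension (a simplified illustration, not the authors' implementation): a complex-valued field sampled on a line is split into its +k and -k travelling components by zeroing half of the spatial-frequency spectrum.

```python
import numpy as np

# Simplified 1-D wave separation: keep only positive (or negative)
# spatial frequencies of the measured complex field.
N, dx = 256, 0.01                 # samples and spacing (illustrative)
x = np.arange(N) * dx
k0 = 2 * np.pi * 12 / (N * dx)    # wavenumber aligned with an FFT bin

A, B = 1.0, 0.4                   # forward / backward amplitudes
u = A * np.exp(1j * k0 * x) + B * np.exp(-1j * k0 * x)

U = np.fft.fft(u)
k = np.fft.fftfreq(N, d=dx) * 2 * np.pi
u_fwd = np.fft.ifft(np.where(k > 0, U, 0))  # forward-travelling part
u_bwd = np.fft.ifft(np.where(k < 0, U, 0))  # backward-travelling part

assert np.allclose(u_fwd, A * np.exp(1j * k0 * x))
assert np.allclose(u_bwd, B * np.exp(-1j * k0 * x))
```

On a plate the same idea is applied to flexural waves, with evanescent components handled separately.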

  5. Relaxation of the chiral imbalance and the generation of magnetic fields in magnetars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dvornikov, M. S., E-mail: maxdvo@izmiran.ru

    2016-12-15

    A model for the generation of magnetic fields in a neutron star, based on the magnetic field instability caused by the electroweak interaction between electrons and nucleons, is developed. Using quantum field theory methods, the helicity flip rate of electrons scattering off protons in the dense matter of a neutron star is calculated. The influence of the electroweak interaction between electrons and background nucleons on the helicity-flip process is studied. The kinetic equation for the evolution of the chiral imbalance is derived. The results obtained are applied to the description of the evolution of magnetic fields in magnetars.

  6. Speech information retrieval: a review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Henry, Michael J.

    Audio is an information-rich component of multimedia. Information can be extracted from audio in a number of different ways, and thus there are several established audio signal analysis research fields. These fields include speech recognition, speaker recognition, audio segmentation and classification, and audio fingerprinting. The information that can be extracted from tools and methods developed in these fields can greatly enhance multimedia systems. In this paper, we present the current state of research in each of the major audio analysis fields. The goal is to introduce enough background for someone new in the field to quickly gain high-level understanding and to provide direction for further study.

  7. Sources of background light on space based laser communications links

    NASA Astrophysics Data System (ADS)

    Farrell, Thomas C.

    2018-05-01

    We discuss the sources and levels of background light that should be expected on space based laser communication (lasercom) crosslinks and uplinks, as well as on downlinks to ground stations. The analyses are valid for both Earth orbiting satellites and interplanetary links. Fundamental equations are derived suitable for first order system engineering analyses of potential lasercom systems. These divide sources of background light into two general categories: extended sources which fill the field of view of a receiver's optics, and point sources which cannot be resolved by the optics. Specific sources of background light are discussed, and expected power levels are estimated. For uplinks, reflected sunlight and blackbody radiation from the Earth dominate. For crosslinks, depending on specific link geometry, sources of background light may include the Sun in the field of view (FOV), reflected sunlight and blackbody radiation from planets and other bodies in the solar system, individual bright stars in the FOV, the amalgam of dim stars in the FOV, zodiacal light, and reflected sunlight off of the transmitting spacecraft. For downlinks, all of these potentially come into play, and the effects of the atmosphere, including turbulence, scattering, and absorption contribute as well. Methods for accounting for each of these are presented. Specific examples are presented to illustrate the relative contributions of each source for various link geometries.
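The extended-source case reduces to standard radiometry. The sketch below (illustrative numbers, not taken from the paper) computes the background power collected from a source that fills the receiver field of view as P = L · A · Ω · B, with L the spectral radiance, A the aperture area, Ω the FOV solid angle, and B the optical filter bandwidth.

```python
import math

# Back-of-the-envelope radiometry for an extended background source that
# fills the receiver field of view: P = L * A * Omega * B.
def extended_source_power(radiance, aperture_diam_m, fov_halfangle_rad,
                          bandwidth_um):
    """radiance in W m^-2 sr^-1 um^-1; returns collected power in watts."""
    area = math.pi * (aperture_diam_m / 2) ** 2
    solid_angle = math.pi * fov_halfangle_rad ** 2  # small-angle approximation
    return radiance * area * solid_angle * bandwidth_um

# Hypothetical uplink example: sunlit-Earth radiance ~30 W m^-2 sr^-1 um^-1,
# 10 cm aperture, 50 urad half-angle FOV, 1 nm (0.001 um) filter.
P = extended_source_power(30.0, 0.10, 50e-6, 0.001)
print(f"background power ~ {P:.3e} W")
```

Point sources (e.g. a star in the FOV) are instead characterized by irradiance times aperture area, independent of the FOV.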

  8. Synchronization of video recording and laser pulses including background light suppression

    NASA Technical Reports Server (NTRS)

    Kalshoven, Jr., James E. (Inventor); Tierney, Jr., Michael (Inventor); Dabney, Philip W. (Inventor)

    2004-01-01

    An apparatus for and a method of triggering a pulsed light source, in particular a laser light source, for predictable capture of the source by video equipment. A frame synchronization signal is derived from the video signal of a camera to trigger the laser and position the resulting laser light pulse in the appropriate field of the video frame and during the opening of the electronic shutter, if such shutter is included in the camera. Positioning of the laser pulse in the proper video field allows, after recording, for the viewing of the laser light image with a video monitor using the pause mode on a standard cassette-type VCR. This invention also allows for fine positioning of the laser pulse to fall within the electronic shutter opening. For cameras with externally controllable electronic shutters, the invention provides for background light suppression by increasing shutter speed during the frame in which the laser light image is captured. This results in the laser light appearing in one frame in which the background scene is suppressed with the laser light being unaffected, while in all other frames, the shutter speed is slower, allowing for the normal recording of the background scene. This invention also allows for arbitrary (manual or external) triggering of the laser with full video synchronization and background light suppression.

  9. Removal of anti-Stokes emission background in STED microscopy by FPGA-based synchronous detection

    NASA Astrophysics Data System (ADS)

    Castello, M.; Tortarolo, G.; Coto Hernández, I.; Deguchi, T.; Diaspro, A.; Vicidomini, G.

    2017-05-01

    In stimulated emission depletion (STED) microscopy, the role of the STED beam is to de-excite, via stimulated emission, the fluorophores that have been previously excited by the excitation beam. This condition, together with specific beam intensity distributions, allows obtaining true sub-diffraction spatial resolution images. However, if the STED beam has a non-negligible probability to excite the fluorophores, a strong fluorescent background signal (anti-Stokes emission) reduces the effective resolution. For STED scanning microscopy, different synchronous detection methods have been proposed to remove this anti-Stokes emission background and recover the resolution. However, every method works only for a specific STED microscopy implementation. Here we present a user-friendly synchronous detection method compatible with any STED scanning microscope. It exploits a data acquisition (DAQ) card based on a field-programmable gate array (FPGA), which is progressively used in STED microscopy. In essence, the FPGA-based DAQ card synchronizes the fluorescent signal registration, the beam deflection, and the excitation beam interruption, providing a fully automatic pixel-by-pixel synchronous detection method. We validate the proposed method in both continuous wave and pulsed STED microscope systems.
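The pixel-by-pixel subtraction underlying synchronous detection can be illustrated with a toy example (a simplification of the idea only, not the authors' FPGA implementation): one sub-measurement is taken with both beams on (true signal plus anti-Stokes background) and one with the excitation beam blanked (anti-Stokes background only); their difference leaves the genuine fluorescence signal.

```python
import numpy as np

# Toy synchronous-detection subtraction: interleave "both beams on" and
# "excitation blanked" measurements, then subtract per pixel.
rng = np.random.default_rng(0)
true_signal = rng.poisson(100.0, size=(16, 16)).astype(float)
anti_stokes = rng.poisson(40.0, size=(16, 16)).astype(float)

frame_both = true_signal + anti_stokes  # excitation + STED beams on
frame_sted_only = anti_stokes           # excitation beam blanked

recovered = frame_both - frame_sted_only
assert np.allclose(recovered, true_signal)
```

In practice the two sub-measurements are acquired within the same pixel dwell time, which is why the synchronization of beam blanking, deflection, and photon counting matters.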

  10. Strong field QED in lepton colliders and electron/laser interactions

    NASA Astrophysics Data System (ADS)

    Hartin, Anthony

    2018-05-01

    The studies of strong field particle physics processes in electron/laser interactions and lepton collider interaction points (IPs) are reviewed. These processes are defined by the high intensity of the electromagnetic fields involved and the need to take them into account as fully as possible. Thus, the main theoretical framework considered is the Furry interaction picture within intense field quantum field theory. In this framework, the influence of a background electromagnetic field in the Lagrangian is calculated nonperturbatively, involving exact solutions for quantized charged particles in the background field. These “dressed” particles go on to interact perturbatively with other particles, enabling the background field to play both macroscopic and microscopic roles. Macroscopically, the background field starts to polarize the vacuum, in effect rendering it a dispersive medium. Particles encountering this dispersive vacuum obtain a lifetime, either radiating or decaying into pair particles at a rate dependent on the intensity of the background field. In fact, the intensity of the background field enters into the coupling constant of the strong field quantum electrodynamic Lagrangian, influencing all particle processes. A number of new phenomena occur. Particles gain an intensity-dependent rest mass shift that accounts for their presence in the dispersive vacuum. Multi-photon events involving more than one external field photon occur at each vertex. Higher order processes which exchange a virtual strong field particle resonate via the lifetimes of the unstable strong field states. Two main arenas of strong field physics are reviewed; those occurring in relativistic electron interactions with intense laser beams, and those occurring in the beam-beam physics at the interaction point of colliders. 
This review outlines the theory, describes its significant novel phenomenology and details the experimental schema required to detect strong field effects and the simulation programs required to model them.
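The "intensity of the background field enters the coupling" through the dimensionless parameter ξ = eE/(m_e c ω). A minimal sketch (illustrative laser parameters, not from the review) computes ξ and the associated intensity-dependent effective mass m* = m_e √(1 + ξ²):

```python
import math

# Dimensionless laser intensity parameter xi = e*E/(m_e*c*omega) and the
# intensity-dependent effective mass m* = m_e*sqrt(1 + xi^2).
E_CHARGE = 1.602176634e-19  # C
M_E = 9.1093837015e-31      # kg
C = 2.99792458e8            # m/s
EPS0 = 8.8541878128e-12     # F/m

def xi_parameter(intensity_w_m2, wavelength_m):
    omega = 2 * math.pi * C / wavelength_m
    e_field = math.sqrt(2 * intensity_w_m2 / (EPS0 * C))  # peak plane-wave field
    return E_CHARGE * e_field / (M_E * C * omega)

# Hypothetical 10^19 W/cm^2-class laser at 800 nm
xi = xi_parameter(1e23, 0.8e-6)
m_eff = M_E * math.sqrt(1 + xi**2)  # rest-mass shift in the dispersive vacuum
assert xi > 1                        # strong-field (relativistic) regime
```

For ξ ≳ 1 the dressed-particle (Furry picture) treatment described above becomes essential, since perturbation theory in the background field breaks down.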

  11. Analysis on spectra of hydroacoustic field in sonar cavity of the sandwich elastic wall structure

    NASA Astrophysics Data System (ADS)

    Xuetao, W.; Rui, H.; Weike, W.

    2017-09-01

    In this paper, the characteristics of mechanical self-noise in a sonar array cavity are studied using a parameterized model of a rectangular cavity enclosed by elastic plates. First, the coupled vibration differential equations for single-layer and sandwich elastic wall plates and the internal fluid are derived analytically and solved by the modal method. The spectral characteristics of the acoustic field in rectangular cavities with different elastic wallboard materials are then simulated and analyzed, providing a theoretical reference for the prediction and control of sonar mechanical self-noise. The sandwich plate is discussed as a potential means of controlling background noise inside the dome, which is of theoretical significance for the qualitative prediction, analysis, and control of dome background noise.

  12. Fermi gamma-ray imaging of a radio galaxy.

    PubMed

    Abdo, A A; Ackermann, M; Ajello, M; Atwood, W B; Baldini, L; Ballet, J; Barbiellini, G; Bastieri, D; Baughman, B M; Bechtol, K; Bellazzini, R; Berenji, B; Blandford, R D; Bloom, E D; Bonamente, E; Borgland, A W; Bregeon, J; Brez, A; Brigida, M; Bruel, P; Burnett, T H; Buson, S; Caliandro, G A; Cameron, R A; Caraveo, P A; Casandjian, J M; Cavazzuti, E; Cecchi, C; Celik, O; Chekhtman, A; Cheung, C C; Chiang, J; Ciprini, S; Claus, R; Cohen-Tanugi, J; Colafrancesco, S; Cominsky, L R; Conrad, J; Costamante, L; Cutini, S; Davis, D S; Dermer, C D; de Angelis, A; de Palma, F; Digel, S W; do Couto e Silva, E; Drell, P S; Dubois, R; Dumora, D; Farnier, C; Favuzzi, C; Fegan, S J; Finke, J; Focke, W B; Fortin, P; Fukazawa, Y; Funk, S; Fusco, P; Gargano, F; Gasparrini, D; Gehrels, N; Georganopoulos, M; Germani, S; Giebels, B; Giglietto, N; Giordano, F; Giroletti, M; Glanzman, T; Godfrey, G; Grenier, I A; Grove, J E; Guillemot, L; Guiriec, S; Hanabata, Y; Harding, A K; Hayashida, M; Hays, E; Hughes, R E; Jackson, M S; Jóhannesson, G; Johnson, A S; Johnson, T J; Johnson, W N; Kamae, T; Katagiri, H; Kataoka, J; Kawai, N; Kerr, M; Knödlseder, J; Kocian, M L; Kuss, M; Lande, J; Latronico, L; Lemoine-Goumard, M; Longo, F; Loparco, F; Lott, B; Lovellette, M N; Lubrano, P; Madejski, G M; Makeev, A; Mazziotta, M N; McConville, W; McEnery, J E; Meurer, C; Michelson, P F; Mitthumsiri, W; Mizuno, T; Moiseev, A A; Monte, C; Monzani, M E; Morselli, A; Moskalenko, I V; Murgia, S; Nolan, P L; Norris, J P; Nuss, E; Ohsugi, T; Omodei, N; Orlando, E; Ormes, J F; Paneque, D; Parent, D; Pelassa, V; Pepe, M; Pesce-Rollins, M; Piron, F; Porter, T A; Rainò, S; Rando, R; Razzano, M; Razzaque, S; Reimer, A; Reimer, O; Reposeur, T; Ritz, S; Rochester, L S; Rodriguez, A Y; Romani, R W; Roth, M; Ryde, F; Sadrozinski, H F-W; Sambruna, R; Sanchez, D; Sander, A; Saz Parkinson, P M; Scargle, J D; Sgrò, C; Siskind, E J; Smith, D A; Smith, P D; Spandre, G; Spinelli, P; Starck, J-L; Stawarz, Ł; Strickman, M 
S; Suson, D J; Tajima, H; Takahashi, H; Takahashi, T; Tanaka, T; Thayer, J B; Thayer, J G; Thompson, D J; Tibaldo, L; Torres, D F; Tosti, G; Tramacere, A; Uchiyama, Y; Usher, T L; Vasileiou, V; Vilchez, N; Vitale, V; Waite, A P; Wallace, E; Wang, P; Winer, B L; Wood, K S; Ylinen, T; Ziegler, M; Hardcastle, M J; Kazanas, D

    2010-05-07

    The Fermi Gamma-ray Space Telescope has detected the gamma-ray glow emanating from the giant radio lobes of the radio galaxy Centaurus A. The resolved gamma-ray image shows the lobes clearly separated from the central active source. In contrast to all other active galaxies detected so far in high-energy gamma-rays, the lobe flux constitutes a considerable portion (greater than one-half) of the total source emission. The gamma-ray emission from the lobes is interpreted as inverse Compton-scattered relic radiation from the cosmic microwave background, with additional contribution at higher energies from the infrared-to-optical extragalactic background light. These measurements provide gamma-ray constraints on the magnetic field and particle energy content in radio galaxy lobes, as well as a promising method to probe the cosmic relic photon fields.

  13. Born iterative reconstruction using perturbed-phase field estimates

    PubMed Central

    Astheimer, Jeffrey P.; Waag, Robert C.

    2008-01-01

    A method of image reconstruction from scattering measurements for use in ultrasonic imaging is presented. The method employs distorted-wave Born iteration but does not require using a forward-problem solver or solving large systems of equations. These calculations are avoided by limiting intermediate estimates of medium variations to smooth functions in which the propagated fields can be approximated by phase perturbations derived from variations in a geometric path along rays. The reconstruction itself is formed by a modification of the filtered-backpropagation formula that includes correction terms to account for propagation through an estimated background. Numerical studies that validate the method for parameter ranges of interest in medical applications are presented. The efficiency of this method offers the possibility of real-time imaging from scattering measurements. PMID:19062873
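The key simplification is that, for smooth medium variations, the field can be approximated by a phase accumulated along a straight ray, φ ≈ k₀ ∫ (c₀/c(x) − 1) dx, instead of running a forward solver. A numerical sketch (hypothetical geometry and sound-speed profile):

```python
import numpy as np

# Phase perturbation accumulated along a straight ray through a smooth
# sound-speed variation: phi = k0 * integral (c0/c(x) - 1) dx.
c0 = 1500.0                       # background sound speed, m/s
f = 2.5e6                         # frequency, Hz
k0 = 2 * np.pi * f / c0

x = np.linspace(0.0, 0.04, 2001)  # 4 cm straight ray path
dx = x[1] - x[0]
c = c0 * (1 + 0.01 * np.exp(-((x - 0.02) / 0.005) ** 2))  # smooth 1% bump

integrand = c0 / c - 1.0
# trapezoidal rule
integral = dx * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))
phase_perturbation = k0 * integral

# A faster region (c > c0) advances the phase: the perturbation is negative.
assert phase_perturbation < 0
```

In the reconstruction, such phase estimates replace the forward-problem solution when propagating fields through the current background estimate.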

  14. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, a larger field of view, and geographically induced background changes. In addition, large variations in altitude or stand-off distance cause significant steps in the background count rate, as well as spectral changes due to increased gamma-ray scatter when detecting at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems: a NaI(Tl)-detector-based system and a CsI detector array. The optimization details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring to the specific detectors. The methodology and results for background rejection methods optimized for aerial gamma-ray detection using potassium, uranium, and thorium (KUT) nuisance rejection are also shown. Results indicate that use of realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection from altitude changes and geographically induced steps, such as at land-water interfaces.
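The appeal of spectral-comparison-ratio metrics is that they respond to changes in spectral *shape* rather than gross count rate. A heavily simplified sketch in that spirit (this is NOT the actual NSCRAD formulation, whose windows and nuisance-rejection terms are more elaborate):

```python
import numpy as np

# Simplified spectral-shape anomaly metric: counts in energy windows are
# compared against the fractions expected from the background spectrum.
# Pure background of any magnitude scores zero; a source with a different
# spectral shape perturbs the ratios.
background_shape = np.array([0.5, 0.3, 0.2])  # expected window fractions

def comparison_ratios(window_counts):
    total = window_counts.sum()
    expected = total * background_shape
    # observed-minus-expected in approximate sigma units (Poisson)
    return (window_counts - expected) / np.sqrt(np.maximum(expected, 1.0))

# Pure background at two altitudes: different magnitudes, same shape
low_alt = comparison_ratios(np.array([5000., 3000., 2000.]))
high_alt = comparison_ratios(np.array([500., 300., 200.]))
assert np.allclose(low_alt, 0) and np.allclose(high_alt, 0)

# A source adding counts mostly to the middle window breaks the ratios
with_source = comparison_ratios(np.array([5000., 3600., 2000.]))
assert np.abs(with_source).max() > 5
```

This shape-based behavior is what makes such metrics attractive for aerial platforms, where altitude changes swing the gross count rate without indicating a source.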

  15. A frequency domain linearized Navier-Stokes method including acoustic damping by eddy viscosity using RANS

    NASA Astrophysics Data System (ADS)

    Holmberg, Andreas; Kierkegaard, Axel; Weng, Chenyang

    2015-06-01

    In this paper, a method for including damping of acoustic energy in regions of strong turbulence is derived for a linearized Navier-Stokes method in the frequency domain. The proposed method is validated and analyzed in 2D only, although the formulation is fully presented in 3D. The result is applied in a study of the linear interaction between the acoustic and the hydrodynamic field in a 2D T-junction, subject to grazing flow at Mach 0.1. Part of the acoustic energy at the upstream edge of the junction is shed as harmonically oscillating disturbances, which are conveyed across the shear layer over the junction, where they interact with the acoustic field. As the acoustic waves travel in regions of strong shear, there is a need to include the interaction between the background turbulence and the acoustic field. For this purpose, the oscillation of the background turbulence Reynolds stress, due to the acoustic field, is modeled using an eddy Newtonian model assumption. The time averaged flow is first solved for using RANS along with a k-ε turbulence model. The spatially varying turbulent eddy viscosity is then added to the spatially invariant kinematic viscosity in the acoustic set of equations. The response of the 2D T-junction to an incident acoustic field is analyzed via a plane wave scattering matrix model, and the result is compared to experimental data for a T-junction of rectangular ducts. A strong improvement in the agreement between calculation and experimental data is found when the modification proposed in this paper is implemented. Remaining discrepancies are likely due to inaccuracies in the selected turbulence model, which is known to produce large errors, e.g., for flows with significant rotation, which the grazing flow across the T-junction certainly is. A natural next step is therefore to test the proposed methodology together with more sophisticated turbulence models.

  16. Visualizing and quantifying microtopographic change of dryland landscapes from an unmanned aircraft system

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods: Soil and site stability are key attributes of assessing the health of dryland landscapes because these lands are susceptible to high rates of wind- and water-caused erosion. Field techniques for measuring and monitoring soil erosion in drylands are often labor intensive...

  17. 76 FR 58157 - Shiga Toxin-Producing Escherichia coli

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-20

    .... Comments may be submitted by either of the following methods: Federal eRulemaking Portal: This Web site provides the ability to type short comments directly into the comment field on this Web page or attach a.... Department of Agriculture, (202) 205-0495. SUPPLEMENTARY INFORMATION: Table of Contents Background I. Shiga...

  18. Root traits and soil properties in harvested perennial grassland, annual wheat, and never-tilled annual wheat

    USDA-ARS?s Scientific Manuscript database

    Background and aims: Root functional traits are determinants of soil carbon storage; plant productivity; and ecosystemproperties. However, few studies look at both annual and perennial roots, soil properties, and productivity in the context of field scale agricultural systems. Methods: In Long Term...

  19. Doppler distortion correction based on microphone array and matching pursuit algorithm for a wayside train bearing monitoring system

    NASA Astrophysics Data System (ADS)

    Liu, Xingchen; Hu, Zhiyong; He, Qingbo; Zhang, Shangbin; Zhu, Jun

    2017-10-01

    Doppler distortion and background noise can reduce the effectiveness of wayside acoustic train bearing monitoring and fault diagnosis. This paper proposes a method of combining a microphone array and a matching pursuit algorithm to overcome these difficulties. First, a dictionary is constructed based on the characteristics and mechanism of a far-field assumption. Then, the angle of arrival of the train bearing is acquired by applying matching pursuit to the acoustic array signals. Finally, after obtaining the resampling time series, the Doppler distortion can be corrected, which is convenient for further diagnostic work. Compared with traditional single-microphone Doppler correction methods, the advantages of the presented array method are its robustness to background noise and the fact that it requires almost no pre-measured parameters. Simulation and experimental study show that the proposed method is effective in performing wayside acoustic bearing fault diagnosis.
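The resampling step that removes Doppler distortion can be sketched as follows (hypothetical pass-by geometry; the paper's array-based angle-of-arrival estimation is omitted). Sound emitted at time t_e by a source moving past a wayside microphone arrives at t_r = t_e + r(t_e)/c; interpolating the uniformly sampled microphone record at the arrival times of a uniform emission-time grid undoes the frequency modulation.

```python
import numpy as np

# Doppler-distortion correction by resampling onto emission times.
c, v, r0 = 340.0, 30.0, 2.0                 # sound speed, train speed, offset
t_e = np.linspace(-1.0, 1.0, 40001)         # uniform emission-time grid
t_r = t_e + np.sqrt(r0**2 + (v * t_e)**2) / c  # arrival times (monotonic)

f0 = 500.0
emitted = np.sin(2 * np.pi * f0 * t_e)      # tone emitted by the bearing

# What the microphone stores: samples uniform in *reception* time
t_mic = np.linspace(t_r[0], t_r[-1], 40001)
recorded = np.interp(t_mic, t_r, emitted)   # Doppler-distorted record

# Correction: read the record back at each emission's arrival time
corrected = np.interp(t_r, t_mic, recorded)
assert np.max(np.abs(corrected - emitted)) < 0.05
```

After this resampling, the corrected series is uniform in emission time, so standard envelope and spectral methods for bearing fault diagnosis apply directly.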

  20. Introduction to Geostatistics

    NASA Astrophysics Data System (ADS)

    Kitanidis, P. K.

    1997-05-01

    Introduction to Geostatistics presents practical techniques for engineers and earth scientists who routinely encounter interpolation and estimation problems when analyzing data from field observations. Requiring no background in statistics, and with a unique approach that synthesizes classic and geostatistical methods, this book offers linear estimation methods for practitioners and advanced students. Well illustrated with exercises and worked examples, Introduction to Geostatistics is designed for graduate-level courses in earth sciences and environmental engineering.
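The linear estimation methods the book covers can be illustrated with a minimal ordinary-kriging example (illustrative covariance model and data, not taken from the book):

```python
import numpy as np

# Minimal 1-D ordinary kriging: best linear unbiased estimate under an
# exponential covariance model, with a Lagrange multiplier enforcing that
# the weights sum to one (unbiasedness).
def ordinary_kriging(x_data, y_data, x0, length=1.0):
    cov = lambda h: np.exp(-np.abs(h) / length)
    n = len(x_data)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = cov(x_data[:, None] - x_data[None, :])
    K[n, n] = 0.0
    rhs = np.ones(n + 1)
    rhs[:n] = cov(x0 - x_data)
    weights = np.linalg.solve(K, rhs)[:n]
    return weights @ y_data

x = np.array([0.0, 1.0, 2.5, 4.0])
y = np.array([1.2, 0.7, 1.9, 0.4])

# Ordinary kriging is an exact interpolator: it reproduces the data points.
assert abs(ordinary_kriging(x, y, 1.0) - 0.7) < 1e-9
```

The same linear system, with different covariance (variogram) models, underlies most of the interpolation problems the book addresses.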

  1. Pragmatic mode-sum regularization method for semiclassical black-hole spacetimes

    NASA Astrophysics Data System (ADS)

    Levi, Adam; Ori, Amos

    2015-05-01

    Computation of the renormalized stress-energy tensor is the most serious obstacle in studying the dynamical, self-consistent, semiclassical evaporation of a black hole in 4D. The difficulty arises from the delicate regularization procedure for the stress-energy tensor, combined with the fact that in practice the modes of the field need to be computed numerically. We have developed a new method for numerical implementation of the point-splitting regularization in 4D, applicable to the renormalized stress-energy tensor as well as to ⟨ϕ²⟩_ren, namely the renormalized ⟨ϕ²⟩. So far we have formulated two variants of this method: t-splitting (aimed at stationary backgrounds) and angular splitting (for spherically symmetric backgrounds). In this paper we introduce our basic approach, and then focus on the t-splitting variant, which is the simpler of the two (deferring the angular-splitting variant to a forthcoming paper). We then use this variant, as a first stage, to calculate ⟨ϕ²⟩_ren in Schwarzschild spacetime, for a massless scalar field in the Boulware state. We compare our results to previous ones, obtained by a different method, and find full agreement. We discuss how this approach can be applied (using the angular-splitting variant) to analyze the dynamical self-consistent evaporation of black holes.

  2. Isotropy-violation diagnostics for B-mode polarization foregrounds to the Cosmic Microwave Background

    NASA Astrophysics Data System (ADS)

    Rotti, Aditya; Huffenberger, Kevin

    2016-09-01

    Isotropy-violation statistics can highlight polarized galactic foregrounds that contaminate primordial B-modes in the Cosmic Microwave Background (CMB). We propose a particular isotropy-violation test and apply it to polarized Planck 353 GHz data, constructing a map that indicates B-mode foreground dust power over the sky. We build our main isotropy test in harmonic space via the bipolar spherical harmonic basis, and our method helps us to identify the least-contaminated directions. By this measure, there are regions of low foreground in and around the BICEP field, near the South Galactic Pole, and in the Northern Galactic Hemisphere. There is also a possible foreground feature in the BICEP field. We compare our results to those based on the local power spectrum, which is computed on discs using a version of the method of Planck Int. XXX (2016). The discs method is closely related to our isotropy-violation diagnostic. We pay special care to the treatment of noise, including chance correlations with the foregrounds. Currently we use our isotropy tool to assess the cleanest portions of the sky, but in the future such methods will allow isotropy-based null tests for foreground contamination in maps purported to measure primordial B-modes, particularly in cases of limited frequency coverage.

  3. A research agenda on patient safety in primary care. Recommendations by the LINNEAUS collaboration on patient safety in primary care

    PubMed Central

    Verstappen, Wim; Gaal, Sander; Bowie, Paul; Parker, Diane; Lainer, Miriam; Valderas, Jose M.; Wensing, Michel; Esmail, Aneez

    2015-01-01

    ABSTRACT Background: Healthcare can cause avoidable serious harm to patients. Primary care is not an exception, and the relative lack of research in this area lends urgency to a better understanding of patient safety, the future research agenda and the development of primary care oriented safety programmes. Objective: To outline a research agenda for patient safety improvement in primary care in Europe and beyond. Methods: The LINNEAUS collaboration partners analysed existing research on epidemiology and classification of errors, diagnostic and medication errors, safety culture, and learning for and improving patient safety. We discussed ideas for future research in several meetings, workshops and congresses with LINNEAUS collaboration partners, practising GPs, researchers in this field, and policy makers. Results: This paper summarizes and integrates the outcomes of the LINNEAUS collaboration on patient safety in primary care. It proposes a research agenda on improvement strategies for patient safety in primary care. In addition, it provides background information to help to connect research in this field with practicing GPs and other healthcare workers in primary care. Conclusion: Future research studies should target specific primary care domains, using prospective methods and innovative methods such as patient involvement. PMID:26339841

  4. Detection of S-Nitrosothiols

    PubMed Central

    Diers, Anne R.; Keszler, Agnes; Hogg, Neil

    2015-01-01

    BACKGROUND S-Nitrosothiols have been recognized as biologically-relevant products of nitric oxide that are involved in many of the diverse activities of this free radical. SCOPE OF REVIEW This review serves to discuss current methods for the detection and analysis of protein S-nitrosothiols. The major methods of S-nitrosothiol detection include chemiluminescence-based methods and switch-based methods, each of which comes in various flavors with advantages and caveats. MAJOR CONCLUSIONS The detection of S-nitrosothiols is challenging and prone to many artifacts. Accurate measurements require an understanding of the underlying chemistry of the methods involved and the use of appropriate controls. GENERAL SIGNIFICANCE Nothing is more important to a field of research than robust methodology that is generally trusted. The field of S-Nitrosation has developed such methods but, as S-nitrosothiols are easy to introduce as artifacts, it is vital that current users learn from the lessons of the past. PMID:23988402

  5. A level set method for multiple sclerosis lesion segmentation.

    PubMed

    Zhao, Yue; Guo, Shuxu; Luo, Min; Shi, Xue; Bilello, Michel; Zhang, Shaoxiang; Li, Chunming

    2018-06-01

    In this paper, we present a level set method for multiple sclerosis (MS) lesion segmentation from FLAIR images in the presence of intensity inhomogeneities. We use a three-phase level set formulation of segmentation and bias field estimation to segment MS lesions, normal tissue regions (including GM and WM), CSF, and the background in FLAIR images. To save computational load, we derive a two-phase formulation from the original multi-phase level set formulation to segment the MS lesions and normal tissue regions. The derived method inherits the desirable ability to precisely locate object boundaries of the original level set method, which simultaneously performs segmentation and estimation of the bias field to deal with intensity inhomogeneity. Experimental results demonstrate the advantages of our method over other state-of-the-art methods in terms of segmentation accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Background for protective action recommendations: accidental radioactive contamination of food and animal feeds. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shleien, B.; Schmidt, G.D.; Chiacchierini, R.P.

    This report provides background material for the development of FDA's Protective Action Recommendations: Accidental Radioactive Contamination of Food and Animal Feeds. The rationale, dosimetric and agricultural transport models for the Protective Action Guides are presented, along with information on dietary intake. In addition, the document contains a discussion of field methods of analysis of radionuclides deposited on the ground or contained in milk and herbage. Various protective actions are described and evaluated, and a cost-effectiveness analysis for the recommendations performed.

  7. Traffic Video Image Segmentation Model Based on Bayesian and Spatio-Temporal Markov Random Field

    NASA Astrophysics Data System (ADS)

    Zhou, Jun; Bao, Xu; Li, Dawei; Yin, Yongwen

    2017-10-01

    Traffic video is a kind of dynamic image whose background and foreground change continually, which gives rise to occlusions; in such cases, general-purpose methods have difficulty producing an accurate segmentation. A segmentation algorithm based on Bayesian inference and a spatio-temporal Markov random field (ST-MRF) is put forward. It builds energy function models of the observation field and the label field for a motion image sequence with the Markov property; then, according to Bayes' rule, the interaction of the label field and the observation field, that is, the label field's prior probability combined with the observation field's likelihood, yields the maximum a posteriori estimate of the label field's parameters. The ICM algorithm is used to extract the moving objects, completing the segmentation. Finally, the ST-MRF segmentation method and the Bayesian method combined with ST-MRF were analyzed. Experimental results show that the segmentation time of the Bayesian algorithm combined with ST-MRF is shorter than that of ST-MRF alone and its computational workload is smaller, and that the method achieves a better segmentation effect, especially in heavy-traffic dynamic scenes.
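
    The ICM step at the core of such a pipeline can be sketched generically. The following is a single-frame Potts-MRF version with a Gaussian data term, written as a synchronous update; the energy weights and parameters are illustrative assumptions, not the paper's spatio-temporal energy functions.

```python
import numpy as np

def icm_segment(img, means, beta=2.0, n_iter=5):
    # Iterated Conditional Modes for a Potts MRF: each pixel takes the
    # label minimizing (data term) + beta * (number of 4-neighbors with
    # a different label). Synchronous variant for brevity; textbook ICM
    # updates pixels sequentially.
    labels = np.argmin((img[..., None] - means) ** 2, axis=-1)  # ML init
    for _ in range(n_iter):
        energies = (img[..., None] - means) ** 2  # Gaussian likelihood term
        for k in range(len(means)):
            same = (labels == k).astype(float)
            nbr = np.zeros_like(same)
            nbr[1:, :] += same[:-1, :]
            nbr[:-1, :] += same[1:, :]
            nbr[:, 1:] += same[:, :-1]
            nbr[:, :-1] += same[:, 1:]
            # Missing neighbors at the image border add the same constant
            # to every label, so they do not change the argmin.
            energies[..., k] += beta * (4.0 - nbr)
        labels = np.argmin(energies, axis=-1)
    return labels
```

    A few sweeps typically clean up isolated noise-induced label errors while the data term keeps region boundaries in place.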

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xi; Mou, Xuanqin; Nishikawa, Robert M.

    Purpose: Small calcifications are often the earliest and the main indicator of breast cancer. Dual-energy digital mammography (DEDM) has been considered a promising technique to improve the detectability of calcifications since it can be used to suppress the contrast between adipose and glandular tissues of the breast. X-ray scatter leads to erroneous calculations of the DEDM image. Although the pinhole-array interpolation method can estimate scattered radiation, it requires extra exposures to measure the scatter and apply the correction. The purpose of this work is to design an algorithmic method for scatter correction in DEDM without extra exposures. Methods: In this paper, a scatter correction method for DEDM was developed based on the knowledge that scattered radiation has small spatial variation and that the majority of pixels in a mammogram are noncalcification pixels. The scatter fraction was estimated in the DEDM calculation and the measured scatter fraction was used to remove scatter from the image. The scatter correction method was implemented on a commercial full-field digital mammography system with breast tissue equivalent phantom and calcification phantom. The authors also implemented the pinhole-array interpolation scatter correction method on the system. Phantom results for both methods are presented and discussed. The authors compared the background DE calcification signals and the contrast-to-noise ratio (CNR) of calcifications in three DE calcification images: an image without scatter correction, an image with scatter correction using the pinhole-array interpolation method, and an image with scatter correction using the authors' algorithmic method. Results: The authors' results show that the resultant background DE calcification signal can be reduced. The root-mean-square background DE calcification signal of 1962 μm with scatter-uncorrected data was reduced to 194 μm after scatter correction using the authors' algorithmic method.
The range of background DE calcification signals was reduced by 58% after scatter correction with the algorithmic method. With the scatter-correction algorithm and denoising, the minimum visible calcification size can be reduced from 380 to 280 μm. Conclusions: When the proposed algorithmic scatter correction is applied, the resultant background DE calcification signals can be reduced and the CNR of calcifications can be improved. The method performs as well as or better than the pinhole-array interpolation method in scatter correction for DEDM; moreover, it is convenient and requires no extra exposure to the patient. Although the proposed scatter correction method is effective, it was validated with a 5-cm-thick phantom with calcifications and a homogeneous background; it should be tested on structured backgrounds to gauge its effectiveness more accurately.
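
    The premise that scatter varies slowly across the field suggests a simple estimate-and-subtract scheme, sketched below. This is not the authors' algorithm: the scatter fraction value, the Gaussian kernel width, and the function name are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def correct_scatter(image, scatter_fraction=0.4, sigma=25.0):
    # Scatter is modeled as a heavily low-passed copy of the image scaled
    # by an assumed scatter-to-total fraction, exploiting its small
    # spatial variation; subtracting it approximates the primary signal.
    scatter = scatter_fraction * gaussian_filter(image, sigma)
    return image - scatter
```

    A measured or estimated scatter fraction, as in the paper, would replace the assumed constant here.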

  9. A simple quality assurance test tool for the visual verification of light and radiation field congruence using an electronic portal imaging device and computed radiography

    PubMed Central

    2012-01-01

    Background The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment, so it is imperative that it be congruent with the radiation field. Method A simple quality assurance tool has been designed for rapid testing of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be verified to within 1 mm, satisfying the 2 mm tolerance recommended in American Association of Physicists in Medicine Task Group Report 142. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method of verifying light and radiation field congruence. PMID:22452821
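
    The pass/fail logic of such a congruence test is straightforward; a small helper (hypothetical function and edge representation, with the 2 mm TG-142 tolerance quoted above as the default):

```python
def congruence_ok(light_edges_mm, radiation_edges_mm, tol_mm=2.0):
    # Compare each measured light-field edge position with the matching
    # radiation-field edge; pass if every offset is within tolerance.
    return all(abs(l - r) <= tol_mm
               for l, r in zip(light_edges_mm, radiation_edges_mm))
```

    With the 1 mm verification capability reported above, offsets such as 0.8 mm pass comfortably, while any edge off by more than 2 mm fails the check.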

  10. Consistent compactification of double field theory on non-geometric flux backgrounds

    NASA Astrophysics Data System (ADS)

    Hassler, Falk; Lüst, Dieter

    2014-05-01

    In this paper, we construct non-trivial solutions to the 2D-dimensional field equations of Double Field Theory (DFT) by using a consistent Scherk-Schwarz ansatz. The ansatz identifies 2(D - d) internal directions with a twist U^M_N which is directly connected to the covariant fluxes F_ABC. It exhibits 2(D - d) linearly independent generalized Killing vectors K_I^J and gives rise to a gauged supergravity in d dimensions. We analyze the covariant fluxes and the corresponding gauged supergravity with a Minkowski vacuum. We calculate fluctuations around such vacua and show how they give rise to massive scalar and vector fields with a non-abelian gauge algebra. Because DFT is a background-independent theory, these fields should correspond directly to the string excitations in the corresponding background. For (D - d) = 3 we perform a complete scan of all allowed covariant fluxes and find two different kinds of backgrounds: the single and the double elliptic case. The latter is not T-dual to a geometric background and cannot be transformed into a geometric setting by a field redefinition either. While this background fulfills the strong constraint, its Killing vectors depend on both the coordinates and the winding coordinates, thereby giving a non-geometric patching. This background can therefore not be described in Supergravity or Generalized Geometry.

  11. Comparison of Haemophilus parasuis reference strains and field isolates by using random amplified polymorphic DNA and protein profiles

    PubMed Central

    2012-01-01

    Background Haemophilus parasuis is the causative agent of Glässer’s disease and is a pathogen of swine in high-health status herds. Reports on serotyping of field strains from outbreaks describe that approximately 30% of them are nontypeable and therefore cannot be traced. Molecular typing methods have been used as alternatives to serotyping. This study was done to compare random amplified polymorphic DNA (RAPD) profiles and whole cell protein (WCP) lysate profiles as methods for distinguishing H. parasuis reference strains and field isolates. Results The DNA and WCP lysate profiles of 15 reference strains and 31 field isolates of H. parasuis were analyzed using the Dice and neighbor joining algorithms. The results revealed unique and reproducible DNA and protein profiles among the reference strains and field isolates studied. Simpson’s index of diversity showed significant discrimination between isolates when three 10-mer primers were combined for the RAPD method and also when both the RAPD and WCP lysate typing methods were combined. Conclusions The RAPD profiles seen among the reference strains and field isolates did not appear to change over time, which may reflect a lack of DNA mutations in the genes of the samples. The recent field isolates had different WCP lysate profiles than the reference strains, possibly because the number of passages of the type strains may affect their protein expression. PMID:22703293

  12. [Incident reporting systems in anesthesiology--methods and benefits using the example of PaSOS].

    PubMed

    Rall, Marcus; Reddersen, Silke; Zieger, Jörg; Schädle, Bertram; Hirsch, Patricia; Stricker, Eric; Martin, Jörg; Geldner, Götz; Schleppers, Alexander

    2008-09-01

    Preventing patient harm has been one of the main tasks of the field of anesthesiology from early on. With the introduction of the national German incident reporting system PaSOS, hosted by the German anesthesia society, anesthesiology is again leading the field of patient safety. Important elements, success factors and background information for the introduction of a successful incident reporting system in an organization are described, with examples from PaSOS.

  13. Temperature- and field-dependent characterization of a conductor on round core cable

    NASA Astrophysics Data System (ADS)

    Barth, C.; van der Laan, D. C.; Bagrets, N.; Bayer, C. M.; Weiss, K.-P.; Lange, C.

    2015-06-01

    The conductor on round core (CORC) cable is one of the major high temperature superconductor cable concepts, combining scalability, flexibility, mechanical strength, ease of fabrication and high current density, which makes it a possible candidate conductor for large, high-field magnets. To simulate the boundary conditions of such magnets as well as the temperature dependence of CORC cables, a 1.16 m long sample consisting of 15 SuperPower REBCO tapes, each 4 mm wide, was characterized using the ‘FBI’ (force, field, current) superconductor test facility of the Institute for Technical Physics of the Karlsruhe Institute of Technology. In a five-step investigation, the CORC cable’s performance was determined under different transverse mechanical loads, magnetic background fields and temperatures, as well as its response to swift current changes. In the first step, the sample’s 77 K self-field current was measured in a liquid nitrogen bath. In the second step, the temperature dependence was measured under self-field conditions and compared with extrapolated single-tape data. In the third step, the magnetic background field was repeatedly cycled while measuring the current carrying capability, to determine the impact of transverse Lorentz forces on the CORC cable sample’s performance. In the fourth step, the sample’s current carrying capability was measured at different background fields (2-12 T) and surface temperatures (4.2-51.5 K). Through finite element method simulations, the surface temperatures were converted into average sample temperatures, and the resulting field and temperature dependence was compared with extrapolated single-tape data. In the fifth step, the response of the CORC cable sample to rapid current changes (8.3 kA s-1) was observed with a fast data acquisition system. During these tests the sample's performance remained constant and no degradation was observed. The sample’s measured current carrying capability correlates with that of single tapes, assuming the field and temperature dependence published by the manufacturer.

  14. Polarization models of filamentary molecular clouds.

    NASA Astrophysics Data System (ADS)

    Carlqvist, P.; Kristen, H.

    1997-08-01

    We study numerically the linear polarization and extinction of light from background stars in three types of models of elongated molecular clouds by following the development of the Stokes parameters. The clouds are assumed to be of cylindrical shape and penetrated by a helical magnetic field B. In the first two models we study only the relative magnitude of the polarization, assuming that the polarization is proportional to B^μ, with primarily μ=2. Provided there is no background/foreground polarization present, we find from the cylindrically symmetric Model I that the angle of polarization has a bimodal character, with the polarization being either parallel with or perpendicular to the axis of the filament. For some magnetic-field geometries both angles may exist in one and the same filament. It is concluded that it is not a straightforward task to infer the magnetic-field-line pattern from the polarization pattern. If a background/foreground polarization exists or, as in Model II, the filament is not cylindrically symmetric, the bimodal character of the angle of polarization is lost. By means of Model III we have, using semi-empirical methods based on the Davis-Greenstein mechanism, estimated the absolute degree of polarization in the filamentary molecular cloud L204. It is found that the polarization produced by the model is much less than the polarization observed. We therefore conclude that most of the polarization measured in the L204 cloud is not produced in the cloud itself but is constituted by a large-scale background/foreground polarization.

  15. Dynamics of Plasma Jets and Bubbles Launched into a Transverse Background Magnetic Field

    NASA Astrophysics Data System (ADS)

    Zhang, Yue

    2017-10-01

    A coaxial magnetized plasma gun has been utilized to launch both plasma jets (open B-field) and plasma bubbles (closed B-field) into a transverse background magnetic field in the HelCat (Helicon-Cathode) linear device at the University of New Mexico. These situations may have bearing on fusion plasmas (e.g. plasma injection for tokamak fueling, ELM pacing, or disruption mitigation) and astrophysical settings (e.g. astrophysical jet stability, coronal mass ejections, etc.). The magnetic Reynolds number of the gun plasma is of order 100, so that magnetic advection dominates over magnetic diffusion. The gun plasma ram pressure, ρ_jet v_jet^2, exceeds the background magnetic pressure, B_0^2 / 2μ_0, so that the jet or bubble can easily penetrate the background B-field, B_0. When the gun axial B-field is weak compared to the gun azimuthal field, a current-driven jet is formed with a global helical magnetic configuration. When the transverse background magnetic field is applied, it is observed that the n = 1 kink mode is stabilized, even though magnetic probe measurements show that the safety factor q(a) drops below unity. At the same time, a sheared axial jet velocity is measured. We conclude that the tension force arising from the increasing curvature of the background magnetic field induces the measured sheared flow gradient above the theoretical kink-stabilization threshold, resulting in the emergent kink stabilization of the injected plasma jet. In the case of injected bubbles, spheromak-like plasma formation is verified. However, when the spheromak plasma propagates into the transverse background magnetic field, the typical self-closed, globally symmetric magnetic configuration no longer holds. In the region where the bubble toroidal field opposes the background B-field, the magneto-Rayleigh-Taylor (MRT) instability has been observed. Details of the experimental setup, diagnostics, experimental results and theoretical analysis will be presented.
Supported by the National Science Foundation under Grant No. AST-0613577 and the Army Research Office under Award No. W911NF1510480. This work performed in collaboration with D. Fisher, A. G. Lynn, M Gilmore, and S. C. Hsu.
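
    The penetration criterion quoted above, ram pressure ρ_jet v_jet^2 exceeding magnetic pressure B_0^2/2μ_0, is easy to evaluate numerically (SI units; the example values in the usage below are illustrative, not measurements from this experiment):

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, T*m/A

def penetrates(rho_jet, v_jet, b0):
    # Compare the jet/bubble ram pressure rho*v^2 (Pa) with the
    # background magnetic pressure B0^2 / (2*mu0) (Pa).
    return rho_jet * v_jet ** 2 > b0 ** 2 / (2.0 * MU0)
```

    For instance, a hypothetical 10 km/s jet of mass density 1e-5 kg/m^3 carries a ram pressure of 1000 Pa, easily exceeding the roughly 40 Pa magnetic pressure of a 0.01 T background field.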

  16. Scalar field dark energy with a minimal coupling in a spherically symmetric background

    NASA Astrophysics Data System (ADS)

    Matsumoto, Jiro

    Dark energy models and modified gravity theories have been actively studied, and for some of these models the behavior in the solar system has also been carefully investigated. However, isotropic solutions of the field equations in simple models of dark energy, e.g. the quintessence model without matter coupling, have not been well investigated; one reason is the nonlinearity of the field equations. In this paper, a method to evaluate the solutions of the field equations is constructed, and it is shown that there is a model that can easily pass the solar system tests, whereas there is also a model that is constrained by them.

  17. On the adiabatic limit of Hadamard states

    NASA Astrophysics Data System (ADS)

    Drago, Nicolò; Gérard, Christian

    2017-08-01

    We consider the adiabatic limit of Hadamard states for free quantum Klein-Gordon fields, when the background metric and the field mass are slowly varied from their initial to final values. If the Klein-Gordon field stays massive, we prove that the adiabatic limit of the initial vacuum state is the (final) vacuum state, by extending to the symplectic framework the adiabatic theorem of Avron-Seiler-Yaffe. In cases when only the field mass is varied, using an abstract version of the mode decomposition method we can also consider the case when the initial or final mass vanishes, and the initial state is either a thermal state or a more general Hadamard state.

  18. Fermi Gamma-Ray Imaging of a Radio Galaxy

    DOE PAGES

    Abdo, A. A.; Ackermann, M.; Ajello, M.; ...

    2010-04-01

    The Fermi Gamma-ray Space Telescope has detected the γ-ray glow emanating from the giant radio lobes of the radio galaxy Centaurus A. The resolved γ-ray image shows the lobes clearly separated from the central active source. In contrast to all other active galaxies detected so far in high-energy γ-rays, the lobe flux constitutes a considerable portion (greater than one-half) of the total source emission. The γ-ray emission from the lobes is interpreted as inverse Compton-scattered relic radiation from the cosmic microwave background, with additional contribution at higher energies from the infrared-to-optical extragalactic background light. In conclusion, these measurements provide γ-ray constraints on the magnetic field and particle energy content in radio galaxy lobes, as well as a promising method to probe the cosmic relic photon fields.

  19. Sex Attractants of the Banana Moth, Opogona sacchari Bojer (Lepidoptera: Tineidae): Provisional Identification and Field Evaluation

    USDA-ARS?s Scientific Manuscript database

    BACKGROUND: The banana moth, Opogona sacchari Bojer, is a polyphagous agricultural pest in many tropical areas of the world. The identification of an attractant for male O. sacchari could offer new methods for detection, study and control. RESULTS: A male electroantennographically active compound w...

  20. A Selected Bibliography on International Education.

    ERIC Educational Resources Information Center

    Foreign Policy Association, New York, NY.

    This unannotated bibliography is divided into four major sections: 1) General Background Readings for Teachers; 2) Approaches and Methods; 3) Materials for the Classroom; and, 4) Sources of Information and Materials. It offers a highly selective list of items which provide wide coverage of the field. Included are items on foreign policy, war and…

  1. Ageing and People with Learning Disabilities: In Search of Evidence

    ERIC Educational Resources Information Center

    Walker, Carol

    2015-01-01

    Background: Growing numbers of people with learning disabilities are now living into older age. This study aims to examine the state of knowledge about their lives and the challenges that ageing has for both family carers and policymakers and practitioners. Materials and Methods: The article synthesises existing research in the fields of learning…

  2. What Do GCSE Examiners Think of "Thinking Aloud"? Findings from an Exploratory Study

    ERIC Educational Resources Information Center

    Greatorex, Jackie; Suto, Irenka W. M.

    2008-01-01

    Background: "Thinking aloud" is a well-established method of data collection in education, assessment, and other fields of research. However, while many researchers have reported their views on its usage, the first-hand experiences of research participants have received less attention. Purpose: The aim of this exploratory study was to…

  3. Finding Possibility in Pitfalls: The Role of Permeable Methods Pedagogy in Preservice Teacher Learning

    ERIC Educational Resources Information Center

    Hebard, Heather

    2016-01-01

    Background/context: Tensions between university-based teacher preparation courses and field placements have long been identified as an obstacle to novices' uptake of promising instructional practices. This tension is particularly salient for writing instruction, which continues to receive inadequate attention in K-12 classrooms. More scholarship…

  4. The Effect of Brief Digital Interventions on Attitudes to Intellectual Disability: Results from a Pilot Study

    ERIC Educational Resources Information Center

    Lindau, Natalie; Amin, Tara; Zambon, Amy; Scior, Katrina

    2018-01-01

    Background: Evidence on the effects of contact and education based interventions on attitudes is limited in the intellectual disability field. This study compared the effects of brief interventions with different education, indirect and imagined contact components on lay people's attitudes. Materials and Methods: 401 adult participants were…

  5. The Impact of Hands-On Simulation Laboratories on Teaching of Wireless Communications

    ERIC Educational Resources Information Center

    Chou, Te-Shun; Vanderbye, Aaron

    2017-01-01

    Aim/Purpose: To prepare students with both theoretical knowledge and practical skills in the field of wireless communications. Background: Teaching wireless communications and networking is not an easy task because it involves broad subjects and abstract content. Methodology: A pedagogical method that combined lectures, labs, assignments, exams,…

  6. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, Western blot has been widely used in molecular labs. It is a multistep method that allows the detection and/or quantification of proteins in mixtures ranging from simple to complex. Quantification is a critical step of a Western blot in order to obtain accurate and reproducible results. Given the technical knowledge required for densitometry analysis and the resources typically available, standard office scanners are often used to acquire images of developed Western blot films, and the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method is an affordable, accurate and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
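
    A common way to implement background estimation for a densitometry profile is grey-scale morphological opening, in the spirit of ImageJ's rolling-ball subtraction. The sketch below is an illustrative stand-in, not the authors' method; the window size is an assumed parameter.

```python
import numpy as np
from scipy.ndimage import grey_opening

def subtract_background(profile, window=50):
    # Opening removes any band narrower than `window` samples, leaving a
    # slowly varying baseline estimate; subtracting it isolates the bands.
    background = grey_opening(profile, size=window)
    return profile - background
```

    On a lane profile with a drifting baseline, the subtraction flattens the baseline to near zero while leaving band peaks essentially intact, so band areas can be integrated directly.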

  7. Operator Approach to the Master Equation for the One-Step Process

    NASA Astrophysics Data System (ADS)

    Hnatič, M.; Eferina, E. G.; Korolkova, A. V.; Kulyabov, D. S.; Sevastyanov, L. A.

    2016-02-01

    Background. Viewing probability as an intrinsic property of nature leads researchers to switch from a deterministic to a stochastic description of phenomena. The kinetics of interactions has recently attracted attention because such processes occur in physical, chemical, technical, biological, environmental, economic, and sociological systems. However, there are no general methods for the direct study of the governing master equation. The expansion of the equation in a formal Taylor series (the so-called Kramers-Moyal expansion) is used in the procedure of stochastization of one-step processes. Purpose. This expansion, however, does not eliminate the need to study the master equation itself. Method. It is proposed to use quantum field perturbation theory for statistical systems (the so-called Doi method). Results. This work is a methodological paper that describes the principles of master equation solution based on quantum field perturbation theory methods; a characteristic feature of the work is that it is intelligible to non-specialists in quantum field theory. Conclusions. We show the full equivalence of the operator and combinatorial methods for obtaining and studying the one-step process master equation.
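
    For concreteness, trajectories of a one-step (birth-death) master equation can be sampled exactly with the standard Gillespie algorithm; this is the stochastic-simulation counterpart of the operator methods discussed, not part of the paper, and the rates below are illustrative.

```python
import numpy as np

def gillespie_birth_death(birth, death, n0, t_max, rng):
    # Exact stochastic simulation (Gillespie) of the one-step process
    # with constant birth rate `birth` and per-capita death rate `death`.
    t, n = 0.0, n0
    times, states = [t], [n]
    while t < t_max:
        total = birth + death * n          # total event rate
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)  # exponential waiting time
        if rng.random() < birth / total:   # choose which reaction fires
            n += 1
        else:
            n -= 1
        times.append(t)
        states.append(n)
    return np.array(times), np.array(states)
```

    For constant birth rate λ and per-capita death rate μ, the stationary solution of this master equation is a Poisson distribution with mean λ/μ, which long simulated trajectories reproduce.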

  8. A real-time tracking system of infrared dim and small target based on FPGA and DSP

    NASA Astrophysics Data System (ADS)

    Rong, Sheng-hui; Zhou, Hui-xin; Qin, Han-lin; Wang, Bing-jian; Qian, Kun

    2014-11-01

    A core technology in infrared warning systems is the detection and tracking of dim and small targets against a complicated background; consequently, running the detection algorithm on a hardware platform has high practical value in the military field. In this paper, a real-time detection and tracking system for infrared dim and small targets, built around an FPGA (Field Programmable Gate Array) and a DSP (Digital Signal Processor), is designed, and the corresponding detection and tracking algorithm and signal flow are elaborated. In the first stage, the FPGA obtains the infrared image sequence from the sensor, suppresses background clutter with a mathematical morphology method, and enhances target intensity with a Laplacian-of-Gaussian operator. In the second stage, the DSP obtains both the original image and the filtered image from the FPGA via the video port, segments the target from the filtered image with an adaptive threshold segmentation method, and rejects false targets with a pipeline filter. Experimental results show that the system achieves a high detection rate and a low false alarm rate.
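
    The two filtering stages described above, morphological background suppression followed by Laplacian-of-Gaussian enhancement and adaptive thresholding, can be prototyped offline before committing to FPGA/DSP hardware. A sketch follows; the structuring-element size, LoG scale, and threshold factor are assumed values, not the paper's parameters.

```python
import numpy as np
from scipy.ndimage import grey_opening, gaussian_laplace

def detect_dim_targets(frame, se_size=5, log_sigma=1.5, k=5.0):
    # White top-hat: subtract the morphological opening to suppress
    # slowly varying background clutter while keeping small bright spots.
    tophat = frame - grey_opening(frame, size=se_size)
    # The negated LoG responds positively to small bright blobs.
    enhanced = -gaussian_laplace(tophat, sigma=log_sigma)
    # Adaptive (mean + k*std) threshold segments candidate targets.
    threshold = enhanced.mean() + k * enhanced.std()
    return enhanced > threshold
```

    In a full system, the per-frame candidate mask would feed a temporal (pipeline) filter that rejects detections not persisting across frames.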

  9. Measuring electromagnetic fields (EMF) around wind turbines in Canada: is there a human health concern?

    PubMed Central

    2014-01-01

    Background The past five years have seen considerable expansion of wind power generation in Ontario, Canada. Most recently, worries about exposure to electromagnetic fields (EMF) from wind turbines, and associated electrical transmission, have been raised at public meetings and legal proceedings. These fears have not been based on any actual measurements of EMF exposure surrounding existing projects but appear to follow from internet sources and misunderstanding of the science. Methods The study was carried out at the Kingsbridge 1 Wind Farm located near Goderich, Ontario, Canada. Magnetic field measurements were collected in the proximity of 15 Vestas 1.8 MW wind turbines, two substations, various buried and overhead collector and transmission lines, and nearby homes. Data were collected during three operational scenarios to characterize potential EMF exposure: ‘high wind’ (generating power), ‘low wind’ (drawing power from the grid, but not generating power) and ‘shut off’ (neither drawing, nor generating power). Results Background levels of EMF (0.2 to 0.3 mG) were established by measuring magnetic fields around the wind turbines under the ‘shut off’ scenario. Magnetic field levels detected at the base of the turbines under both the ‘high wind’ and ‘low wind’ conditions were low (mean = 0.9 mG; n = 11) and rapidly diminished with distance, becoming indistinguishable from background within 2 m of the base. Magnetic fields measured 1 m above buried collector lines were also within background (≤ 0.3 mG). Beneath overhead 27.5 kV and 500 kV transmission lines, magnetic field levels of up to 16.5 and 46 mG, respectively, were recorded. These levels also diminished rapidly with distance. None of these sources appeared to influence magnetic field levels at nearby homes located as close as just over 500 m from turbines, where measurements immediately outside of the homes were ≤ 0.4 mG.
Conclusions The results suggest that there is nothing unique to wind farms with respect to EMF exposure; in fact, magnetic field levels in the vicinity of wind turbines were lower than those produced by many common household electrical devices and were well below any existing regulatory guidelines with respect to human health. PMID:24529028

  10. Saliency Detection on Light Field.

    PubMed

    Li, Nianyi; Ye, Jinwei; Ji, Yu; Ling, Haibin; Yu, Jingyi

    2017-08-01

    Existing saliency detection approaches use images as inputs and are sensitive to foreground/background similarities, complex background textures, and occlusions. We explore the problem of using light fields as input for saliency detection. Our technique is enabled by the availability of commercial plenoptic cameras that capture the light field of a scene in a single shot. We show that the unique refocusing capability of light fields provides useful focusness, depths, and objectness cues. We further develop a new saliency detection algorithm tailored for light fields. To validate our approach, we acquire a light field database of a range of indoor and outdoor scenes and generate the ground truth saliency map. Experiments show that our saliency detection scheme can robustly handle challenging scenarios such as similar foreground and background, cluttered background, complex occlusions, etc., and achieve high accuracy and robustness.

  11. [Evaluation of Sugar Content of Huanghua Pear on Trees by Visible/Near Infrared Spectroscopy].

    PubMed

    Liu, Hui-jun; Ying, Yi-bin

    2015-11-01

    A method of ambient light correction is proposed to evaluate the sugar content of Huanghua pears on the tree by visible/near-infrared diffuse reflectance spectroscopy (Vis/NIRS). Because of strong interference from ambient light, it is difficult to collect useful spectra of pears on the tree. In the field, covering the fruit with a bag that blocks ambient light gives better results, but the efficiency is low; instrument corrections with dark and reference spectra help to reduce model error, but they cannot effectively eliminate the interference of ambient light. To reduce this effect, a shutter was attached to the front of the probe. With the shutter open, spot spectra were obtained, on which instrument light and ambient light act at the same time; with the shutter closed, background spectra were obtained, on which only ambient light acts. The ambient light spectrum was then subtracted from the spot spectrum. Prediction models were built by partial least squares (PLS) using data acquired on the tree (before and after ambient light correction) and after harvest. The correlation coefficients (R) are 0.1, 0.69 and 0.924; the root mean square errors of prediction (SEP) are 0.89 °Brix, 0.42 °Brix and 0.27 °Brix; and the ratios of the standard deviation (SD) to the SEP (RPD) are 0.79, 1.69 and 2.58, respectively. The results indicate that the background correction method used in the experiment can efficiently reduce the effect of ambient light on spectral acquisition of Huanghua pears in the field. The method can be used to collect visible/near-infrared spectra of fruit in the field, and may allow visible/near-infrared spectroscopy to play a full role in preharvest management and maturity testing of fruit.
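
    The shutter-based correction reduces to a spectrum subtraction, and the quoted RPD figure of merit is a simple ratio; both can be sketched directly (function names are illustrative):

```python
import numpy as np

def ambient_corrected_spectrum(spot, background):
    # Open-shutter spot spectrum = instrument light + ambient light;
    # closed-shutter background spectrum = ambient light only.
    # Their difference isolates the instrument-illuminated signal.
    return np.asarray(spot, float) - np.asarray(background, float)

def rpd(y_true, y_pred):
    # Ratio of the reference standard deviation (SD) to the standard
    # error of prediction (SEP), the model-quality metric quoted above.
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    sep = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return np.std(y_true, ddof=1) / sep
```

    An RPD above about 2 is commonly taken to indicate a model usable for quantitative prediction, consistent with the corrected on-tree value of 2.58 reported above.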

  12. Evaluating performance in three-dimensional fluorescence microscopy

    PubMed Central

    Murray, John M; Appleton, Paul L; Swedlow, Jason R; Waters, Jennifer C

    2007-01-01

    In biological fluorescence microscopy, image contrast is often degraded by a high background arising from out-of-focus regions of the specimen. This background can be greatly reduced or eliminated by several modes of thick-specimen microscopy, including 3-D deconvolution and confocal techniques. There has been a great deal of interest and some confusion about which of these methods is ‘better’, in principle or in practice. The motivation for the experiments reported here is to establish some rough guidelines for choosing the most appropriate method of microscopy for a given biological specimen. The approach is to compare the efficiency of photon collection, the image contrast and the signal-to-noise ratio achieved by the different methods at equivalent illumination, using a specimen in which the amount of out-of-focus background is adjustable over the range encountered with biological samples. We compared spot scanning confocal, spinning disk confocal and wide-field/deconvolution (WFD) microscopes and find that the ratio of out-of-focus background to in-focus signal can be used to predict which method of microscopy will provide the most useful image. We also find that the precision of measurements of net fluorescence yield is very much lower than expected for all modes of microscopy. Our analysis enabled a clear, quantitative delineation of the appropriate use of different imaging modes relative to the ratio of out-of-focus background to in-focus signal, and defines an upper limit to the useful range of the three most common modes of imaging. PMID:18045334

  13. Further Interpretation of the Relationship between Faunal Community and Seafloor Geology at Southern Hydrate Ridge, Cascadia Margin: Exploring Machine Learning

    NASA Astrophysics Data System (ADS)

    Bigham, K.; Kelley, D. S.; Marburg, A.; Delaney, J. R.

    2017-12-01

    In 2011, high-resolution, georeferenced photomosaics were taken of Einstein's Grotto, an active methane hydrate seep within the field at Southern Hydrate Ridge located 90 km west of Newport, Oregon at a water depth of 800 m. Methods used to analyze the relationships between the seep site, seafloor geology, and the spatial distribution and abundances of microbial and macrofaunal communities at Einstein's Grotto were expanded to three other sites over the 200 by 300 m active seep field. These seeps were documented in the same survey in 2011 conducted by the remotely operated vehicle ROPOS on board the R/V Thompson. Over 10,000 high definition images allowed for the further quantification and characterization of the diversity and structure of the faunal community at this seep field. The new results support the study's initial findings of high variability in the distribution and abundance of seep organisms across the field, with correlation to seafloor geology. The manual classification of organisms was also used to train a series of convolutional neural networks in Nvidia DIGITS and Google Tensorflow environments for automated identification. The developed networks proved highly accurate at background/non-background segmentation (~96%) and slightly less reliable for fauna identification (~89%). This study provides a baseline for the faunal community at the Southern Hydrate Ridge methane seeps and a more efficient computer-assisted method for processing follow-on studies.

  14. How we compute N matters to estimates of mixing in stratified flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arthur, Robert S.; Venayagamoorthy, Subhas K.; Koseff, Jeffrey R.

    Most commonly used models for turbulent mixing in the ocean rely on a background stratification against which turbulence must work to stir the fluid. While this background stratification is typically well defined in idealized numerical models, it is more difficult to capture in observations. Here, a potential discrepancy in ocean mixing estimates due to the chosen calculation of the background stratification is explored using direct numerical simulation data of breaking internal waves on slopes. Two different methods for computing the buoyancy frequency N, one based on a three-dimensionally sorted density field (often used in numerical models) and the other based on locally sorted vertical density profiles (often used in the field), are used to quantify the effect of N on turbulence quantities. It is shown that how N is calculated changes not only the flux Richardson number R_f, which is often used to parameterize turbulent mixing, but also the turbulence activity number, or Gibson number Gi, leading to potential errors in estimates of the mixing efficiency using Gi-based parameterizations.

  15. How we compute N matters to estimates of mixing in stratified flows

    DOE PAGES

    Arthur, Robert S.; Venayagamoorthy, Subhas K.; Koseff, Jeffrey R.; ...

    2017-10-13

    Most commonly used models for turbulent mixing in the ocean rely on a background stratification against which turbulence must work to stir the fluid. While this background stratification is typically well defined in idealized numerical models, it is more difficult to capture in observations. Here, a potential discrepancy in ocean mixing estimates due to the chosen calculation of the background stratification is explored using direct numerical simulation data of breaking internal waves on slopes. Two different methods for computing the buoyancy frequency N, one based on a three-dimensionally sorted density field (often used in numerical models) and the other based on locally sorted vertical density profiles (often used in the field), are used to quantify the effect of N on turbulence quantities. It is shown that how N is calculated changes not only the flux Richardson number R_f, which is often used to parameterize turbulent mixing, but also the turbulence activity number, or Gibson number Gi, leading to potential errors in estimates of the mixing efficiency using Gi-based parameterizations.
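
    The two background-stratification choices can be sketched on a toy density field (assumed grid and values, not the study's DNS data): a single global sort of all parcels versus independent sorts of each vertical column, each followed by N² = -(g/ρ₀) dρ/dz:

```python
import numpy as np

# Sketch of the two background-stratification estimates on a small (z, x) slab.
rng = np.random.default_rng(0)
nz, nx = 16, 4
g, rho0 = 9.81, 1000.0
z = np.linspace(0.0, 1.0, nz)                       # height, increasing upward
rho = rho0 + 10.0 * (1.0 - z)[:, None] + rng.normal(0.0, 0.5, (nz, nx))

# Global sort: fill the domain from the bottom with the densest parcels,
# then average each level (analogue of the 3-D sorted density field).
rho_global = np.sort(rho.ravel())[::-1].reshape(nz, nx).mean(axis=1)

# Local sort: sort each vertical column independently (analogue of sorting
# individual vertical profiles, as is done with field data).
rho_local = np.sort(rho, axis=0)[::-1].mean(axis=1)

# Buoyancy frequency squared for each background estimate.
N2_global = -(g / rho0) * np.gradient(rho_global, z)
N2_local = -(g / rho0) * np.gradient(rho_local, z)

assert np.all(N2_global >= 0) and np.all(N2_local >= 0)  # both profiles are stable
assert not np.allclose(N2_global, N2_local)              # but they differ
```

    Both constructions yield statically stable background profiles, yet they give different N² and hence different mixing estimates, which is the point of the comparison.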

  16. [Electromagnetic field of the mobile phone base station: case study].

    PubMed

    Bieńkowski, Paweł; Zubrzak, Bartłomiej; Surma, Robert

    2011-01-01

    The paper presents changes in the electromagnetic field intensity in a school building and its surroundings after a mobile phone base station was installed on the roof of the school. The comparison of EMF intensity measured before the base station was launched (electromagnetic background measurement) and after starting its operation (two independent control measurements) is discussed. Analyses of the measurements are presented, and the authors also propose a method of adjusting the electromagnetic field distribution in the area of the antennas' side lobes to reduce the EMF level in the base station's proximity. The presented method involves adjusting the antenna inclination (tilt). On the basis of the measurements, it was found that the EMF intensity increased in the building and its surroundings, but the measured values meet, with wide margins, the requirements of the Polish law on environmental protection.

  17. Quantum theory of electromagnetic fields in a cosmological quantum spacetime

    NASA Astrophysics Data System (ADS)

    Lewandowski, Jerzy; Nouri-Zonoz, Mohammad; Parvizi, Ali; Tavakoli, Yaser

    2017-11-01

    The theory of quantum fields propagating on an isotropic cosmological quantum spacetime is reexamined by generalizing the scalar test field to an electromagnetic (EM) vector field. For any given polarization of the EM field on the classical background, the Hamiltonian can be written in the form of the Hamiltonian of a set of decoupled harmonic oscillators, each corresponding to a single mode of the field. In transition from the classical to quantum spacetime background, following the technical procedure given by Ashtekar et al. [Phys. Rev. D 79, 064030 (2009), 10.1103/PhysRevD.79.064030], a quantum theory of the test EM field on an effective (dressed) spacetime emerges. The nature of this emerging dressed geometry is independent of the chosen polarization, but it may depend on the energy of the corresponding field mode. Specifically, when the backreaction of the field on the quantum geometry is negligible (i.e., a test field approximation is assumed), all field modes probe the same effective background independent of the mode's energy. However, when the backreaction of the field modes on the quantum geometry is significant, by employing a Born-Oppenheimer approximation, it is shown that a rainbow (i.e., a mode-dependent) metric emerges. The emergence of this mode-dependent background in the Planck regime may have a significant effect on the creation of quantum particles. The amount of particle production on the dressed background is computed and compared with the familiar results on the classical geometry.

  18. Heat kernel and Weyl anomaly of Schrödinger invariant theory

    NASA Astrophysics Data System (ADS)

    Pal, Sridip; Grinstein, Benjamín

    2017-12-01

    We propose a method inspired by discrete light cone quantization to determine the heat kernel for a Schrödinger field theory (Galilean boost invariant with z = 2 anisotropic scaling symmetry) living in d + 1 dimensions, coupled to a curved Newton-Cartan background, starting from a heat kernel of a relativistic conformal field theory (z = 1) living in d + 2 dimensions. We use this method to show that the Schrödinger field theory of a complex scalar field cannot have any Weyl anomalies. To be precise, we show that the Weyl anomaly A^G_{d+1} for the Schrödinger theory is related to the Weyl anomaly of a free relativistic scalar CFT A^R_{d+2} via A^G_{d+1} = 2πδ(m) A^R_{d+2}, where m is the charge of the scalar field under particle number symmetry. We provide further evidence of the vanishing anomaly by evaluating Feynman diagrams in all orders of perturbation theory. We present an explicit calculation of the anomaly using a regulated Schrödinger operator, without using the null cone reduction technique. We generalize our method to show that a similar result holds for theories with a single time-derivative and with even z > 2.

  19. Non-Destructive Evaluation of the Leaf Nitrogen Concentration by In-Field Visible/Near-Infrared Spectroscopy in Pear Orchards.

    PubMed

    Wang, Jie; Shen, Changwei; Liu, Na; Jin, Xin; Fan, Xueshan; Dong, Caixia; Xu, Yangchun

    2017-03-08

    Non-destructive and timely determination of leaf nitrogen (N) concentration is urgently needed for N management in pear orchards. A two-year field experiment was conducted in a commercial pear orchard with five N application rates: 0 (N0), 165 (N1), 330 (N2), 660 (N3), and 990 (N4) kg·N·ha⁻¹. Mid-portion leaves on the current year's shoots were first measured spectrally and then analyzed for N concentration in the laboratory at 50 and 80 days after full bloom (DAB). Three methods of in-field spectral measurement (25° bare fibre under solar conditions, black background attached to the plant probe, and white background attached to the plant probe) were compared. We also investigated the modelling performances of four chemometric techniques (principal components regression, PCR; partial least squares regression, PLSR; stepwise multiple linear regression, SMLR; and back propagation neural network, BPNN) and three vegetation indices (difference spectral index, normalized difference spectral index, and ratio spectral index). Because of the low correlation of reflectance obtained by the 25° field-of-view method, all of the modelling was performed on the two spectral datasets acquired by the plant probe. Results showed that the best modelling and prediction accuracy were found in the model established by PLSR on spectra measured with a black background. The randomly separated subsets of calibration (n = 1000) and validation (n = 420) of this model resulted in high R² values of 0.86 and 0.85, respectively, as well as a low mean relative error (<6%). Furthermore, a higher coefficient of determination between the leaf N concentration and fruit yield was found for the 50 DAB samplings in both 2015 (R² = 0.77) and 2014 (R² = 0.59). Thus, the leaf N concentration is suggested to be determined at 50 DAB by visible/near-infrared spectroscopy, and the threshold should be 24-27 g/kg.
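
    The figures of merit used in such calibrations are related by RPD = SD/SEP. A small sketch with hypothetical reference and predicted leaf N values (not the study's data), taking SEP as the bias-corrected RMSE of prediction:

```python
import numpy as np

# Hypothetical numbers, just to illustrate how R, SEP, and RPD are computed.
reference = np.array([24.1, 25.3, 26.8, 23.5, 27.2, 25.9])  # leaf N, g/kg
predicted = np.array([24.4, 25.0, 26.5, 23.9, 27.6, 25.5])

r = np.corrcoef(reference, predicted)[0, 1]                 # correlation coefficient
residual = predicted - reference
sep = np.sqrt(np.mean((residual - residual.mean()) ** 2))   # bias-corrected RMSE
rpd = reference.std(ddof=0) / sep                           # ratio of SD to SEP

print(f"R = {r:.3f}, SEP = {sep:.3f} g/kg, RPD = {rpd:.2f}")
```

    An RPD above roughly 2, as reported for the black-background PLSR model, indicates a calibration useful for quantitative prediction.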

  20. TU-F-BRD-01: Biomedical Informatics for Medical Physicists

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, M; Kalet, I; McNutt, T

    Biomedical informatics encompasses a very large domain of knowledge and applications. This broad and loosely defined field can make it difficult to navigate. Physicists often are called upon to provide informatics services and/or to take part in projects involving principles of the field. The purpose of the presentations in this symposium is to help medical physicists gain some knowledge about the breadth of the field and how, in the current clinical and research environment, they can participate and contribute. Three talks have been designed to give an overview from the perspective of physicists and to provide a more in-depth discussion in two areas. One of the primary purposes, and the main subject of the first talk, is to help physicists achieve a perspective about the range of the topics and concepts that fall under the heading of 'informatics'. The approach is to de-mystify topics and jargon and to help physicists find resources in the field should they need them. The other talks explore two areas of biomedical informatics in more depth. The goal is to highlight two domains of intense current interest--databases and models--in enough depth that an adequate background for independent inquiry is achieved. These two areas will serve as good examples of how physicists, using informatics principles, can contribute to oncology practice and research. Learning Objectives: To understand how the principles of biomedical informatics are used by medical physicists. To put the relevant informatics concepts in perspective with regard to biomedicine in general. To use clinical database design as an example of biomedical informatics. To provide a solid background into the problems and issues of the design and use of data and databases in radiation oncology. To use modeling in the service of decision support systems as an example of modeling methods and data use. To provide a background into how uncertainty in our data and knowledge can be incorporated into modeling methods.

  1. Excitation-emission matrix fluorescence spectroscopy in conjunction with multiway analysis for PAH detection in complex matrices.

    PubMed

    Nahorniak, Michelle L; Booksh, Karl S

    2006-12-01

    A field portable, single exposure excitation-emission matrix (EEM) fluorometer has been constructed and used in conjunction with parallel factor analysis (PARAFAC) to determine the sub part per billion (ppb) concentrations of several aqueous polycyclic aromatic hydrocarbons (PAHs), such as benzo(k)fluoranthene and benzo(a)pyrene, in various matrices including aqueous motor oil extract and asphalt leachate. Multiway methods like PARAFAC are essential to resolve the analyte signature from the ubiquitous background in environmental samples. With multiway data and PARAFAC analysis it is shown that reliable concentration determinations can be achieved with minimal standards in spite of the large convoluting fluorescence background signal. Thus, rapid fieldable EEM analyses may prove to be a good screening method for tracking pollutants and prioritizing sampling and analysis by more complete but time consuming and labor intensive EPA methods.
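
    The multiway idea behind PARAFAC can be illustrated with a bare-bones alternating-least-squares decomposition of a three-way array (a generic NumPy sketch, not the instrument's analysis pipeline; real EEM work uses dedicated packages with non-negativity constraints):

```python
import numpy as np

# Minimal alternating-least-squares PARAFAC for a 3-way array X[i,j,k].
def parafac_als(X, rank, n_iter=500, seed=0):
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    for _ in range(n_iter):
        # Each mode update is a linear least-squares problem in that factor.
        Z = np.einsum('jr,kr->jkr', B, C).reshape(J * K, rank)
        A = X.reshape(I, J * K) @ Z @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        Z = np.einsum('ir,kr->ikr', A, C).reshape(I * K, rank)
        B = X.transpose(1, 0, 2).reshape(J, I * K) @ Z @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        Z = np.einsum('ir,jr->ijr', A, B).reshape(I * J, rank)
        C = X.transpose(2, 0, 1).reshape(K, I * J) @ Z @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Synthetic rank-2 "EEM stack": excitation x emission x sample.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (12, 10, 8))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)

A, B, C = parafac_als(X, rank=2)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(X - Xhat) / np.linalg.norm(X)
assert rel_err < 1e-3
```

    The trilinear structure is what lets PARAFAC separate an analyte's excitation-emission signature from a convoluting background, given only a few standards.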

  2. Preconditioner-free Wiener filtering with a dense noise matrix

    NASA Astrophysics Data System (ADS)

    Huffenberger, Kevin M.

    2018-05-01

    This work extends the Elsner & Wandelt (2013) iterative method for efficient, preconditioner-free Wiener filtering to cases in which the noise covariance matrix is dense, but can be decomposed into a sum whose parts are sparse in convenient bases. The new method, which uses multiple messenger fields, reproduces Wiener-filter solutions for test problems, and we apply it to a case beyond the reach of the Elsner & Wandelt (2013) method. We compute the Wiener-filter solution for a simulated Cosmic Microwave Background (CMB) map that contains spatially varying, uncorrelated noise, isotropic 1/f noise, and large-scale horizontal stripes (like those caused by atmospheric noise). We discuss simple extensions that can filter contaminated modes or inverse-noise-filter the data. These techniques help to address complications in the noise properties of maps from current and future generations of ground-based Microwave Background experiments, like Advanced ACTPol, Simons Observatory, and CMB-S4.
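
    The single-messenger iteration that this work extends can be sketched in one dimension with a made-up power spectrum and noise level: the messenger covariance T = τI is simple in both bases, and the iteration alternates between pixel and Fourier space:

```python
import numpy as np

# 1-D toy of the messenger-field Wiener filter (in the style of Elsner & Wandelt):
# signal covariance S is diagonal in Fourier space, noise N in pixel space.
# All sizes and spectra here are invented for illustration.
n = 32
rng = np.random.default_rng(0)
k = np.abs(np.fft.fftfreq(n) * n)
Sk = 100.0 / (1.0 + k) ** 2                 # assumed signal power spectrum
Nd = rng.uniform(0.5, 1.5, n)               # inhomogeneous pixel noise variance
d = rng.standard_normal(n)                  # a data vector

fwd = lambda x: np.fft.fft(x) / np.sqrt(n)  # unitary DFT
inv = lambda X: np.fft.ifft(X) * np.sqrt(n)

tau = Nd.min()                              # messenger covariance T = tau * I
Nbar = Nd - tau
s = np.zeros(n)
for _ in range(300):
    t = (tau * d + Nbar * s) / (tau + Nbar)          # pixel-basis update
    s = inv(Sk * fwd(t) / (Sk + tau)).real           # Fourier-basis update

# Brute-force Wiener solution for comparison: s_wf = S (S + N)^-1 d.
F = np.fft.fft(np.eye(n)) / np.sqrt(n)
Smat = (F.conj().T @ np.diag(Sk) @ F).real
s_wf = Smat @ np.linalg.solve(Smat + np.diag(Nd), d)
assert np.allclose(s, s_wf, atol=1e-8)
```

    The multiple-messenger extension of the paper plays the same game with a noise matrix split into several parts, each sparse in a convenient basis.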

  3. Detection methods for stochastic gravitational-wave backgrounds: a unified treatment

    NASA Astrophysics Data System (ADS)

    Romano, Joseph D.; Cornish, Neil. J.

    2017-04-01

    We review detection methods that are currently in use or have been proposed to search for a stochastic background of gravitational radiation. We consider both Bayesian and frequentist searches using ground-based and space-based laser interferometers, spacecraft Doppler tracking, and pulsar timing arrays; and we allow for anisotropy, non-Gaussianity, and non-standard polarization states. Our focus is on relevant data analysis issues, and not on the particular astrophysical or early Universe sources that might give rise to such backgrounds. We provide a unified treatment of these searches at the level of detector response functions, detection sensitivity curves, and, more generally, at the level of the likelihood function, since the choice of signal and noise models and prior probability distributions are actually what define the search. Pedagogical examples are given whenever possible to compare and contrast different approaches. We have tried to make the article as self-contained and comprehensive as possible, targeting graduate students and new researchers looking to enter this field.
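
    The simplest of the frequentist methods reviewed, the cross-correlation statistic for two detectors, can be sketched as follows (white spectra and invented amplitudes; real searches use optimal filtering and the overlap reduction function):

```python
import numpy as np

# Two detectors see independent noise plus a weak common stochastic "background".
rng = np.random.default_rng(42)
T = 200_000
h = 0.2 * rng.standard_normal(T)            # common stochastic signal
s1 = h + rng.standard_normal(T)             # detector 1: signal + its own noise
s2 = h + rng.standard_normal(T)             # detector 2: signal + its own noise

# Cross-correlation estimator: E[Y] = <h^2>, while the independent noises average away.
Y = np.mean(s1 * s2)
sigma = np.std(s1 * s2, ddof=1) / np.sqrt(T)
snr = Y / sigma

assert snr > 10.0                           # the common signal stands out
```

    The key property, visible even in this toy, is that the detection statistic grows with observation time as sqrt(T), so long integrations dig the background out of the noise.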

  4. General introduction for the “National Field Manual for the Collection of Water-Quality Data”

    USGS Publications Warehouse


    2018-02-28

    Background: As part of its mission, the U.S. Geological Survey (USGS) collects data to assess the quality of our Nation’s water resources. A high degree of reliability and standardization of these data are paramount to fulfilling this mission. Documentation of nationally accepted methods used by USGS personnel serves to maintain consistency and technical quality in data-collection activities. “The National Field Manual for the Collection of Water-Quality Data” (NFM) provides documented guidelines and protocols for USGS field personnel who collect water-quality data. The NFM provides detailed, comprehensive, and citable procedures for monitoring the quality of surface water and groundwater. Topics in the NFM include (1) methods and protocols for sampling water resources, (2) methods for processing samples for analysis of water quality, (3) methods for measuring field parameters, and (4) specialized procedures, such as sampling water for low levels of mercury and organic wastewater chemicals, measuring biological indicators, and sampling bottom sediment for chemistry. Personnel who collect water-quality data for national USGS programs and projects, including projects supported by USGS cooperative programs, are mandated to use protocols provided in the NFM per USGS Office of Water Quality Technical Memorandum 2002.13. Formal training, for example, as provided in the USGS class, “Field Water-Quality Methods for Groundwater and Surface Water,” and field apprenticeships supplement the guidance provided in the NFM and ensure that the data collected are high quality, accurate, and scientifically defensible.

  5. One-loop β-function for an infinite-parameter family of gauge theories

    NASA Astrophysics Data System (ADS)

    Krasnov, Kirill

    2015-03-01

    We continue to study an infinite-parametric family of gauge theories with an arbitrary function of the self-dual part of the field strength as the Lagrangian. The arising one-loop divergences are computed using the background field method. We show that they can all be absorbed by a local redefinition of the gauge field, as well as multiplicative renormalisations of the couplings. Thus, this family of theories is one-loop renormalisable. The infinite set of β-functions for the couplings is compactly stored in a renormalisation group flow for a single function of the curvature. The flow is obtained explicitly.

  6. New type IIB backgrounds and aspects of their field theory duals

    NASA Astrophysics Data System (ADS)

    Caceres, Elena; Macpherson, Niall T.; Núñez, Carlos

    2014-08-01

    In this paper we study aspects of geometries in Type IIA and Type IIB String theory and elaborate on their field theory dual pairs. The backgrounds are associated with reductions to Type IIA of solutions with G 2 holonomy in eleven dimensions. We classify these backgrounds according to their G-structure, perform a non-Abelian T-duality on them and find new Type IIB configurations presenting dynamical SU(2)-structure. We study some aspects of the associated field theories defined by these new backgrounds. Various technical details are clearly spelled out.

  7. Cosmological origin of anomalous radio background

    NASA Astrophysics Data System (ADS)

    Cline, James M.; Vincent, Aaron C.

    2013-02-01

    The ARCADE 2 collaboration has reported a significant excess in the isotropic radio background, whose homogeneity cannot be reconciled with clustered sources. This suggests a cosmological origin prior to structure formation. We investigate several potential mechanisms and show that injection of relativistic electrons through late decays of a metastable particle can give rise to the observed excess radio spectrum through synchrotron emission. However, constraints from the cosmic microwave background (CMB) anisotropy, on injection of charged particles and on the primordial magnetic field, present a challenge. The simplest scenario is with a ≳ 9 GeV particle decaying into e+e- at a redshift of z ~ 5, in a magnetic field of ~5 μG, which exceeds the CMB B-field constraints, unless the field was generated after decoupling. Decays into exotic millicharged particles can alleviate this tension, if they emit synchrotron radiation in conjunction with a sufficiently large background magnetic field of a dark U(1)' gauge field.

  8. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. 
The experimental results show that RBRSE achieves better ROC curves, higher AUC values, and clearer background-anomaly separation than some other state-of-the-art anomaly detection methods, and is easy to implement in practice.
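
    The paired-dataset k-nn scoring step can be illustrated in isolation (a simplified sketch with synthetic data; the RBR segmentation, kernel expansion, and regularization terms are not reproduced here):

```python
import numpy as np

# Score a test pixel by its mean distance to the k nearest background samples:
# pixels far from the background cloud receive high anomaly scores.
def knn_score(x, background, k=5):
    dists = np.sort(np.linalg.norm(background - x, axis=1))
    return dists[:k].mean()

rng = np.random.default_rng(0)
background = rng.normal(0.0, 1.0, (200, 5))        # "robust background" spectra
normal_pixel = rng.normal(0.0, 1.0, 5)             # looks like the background
anomaly_pixel = np.full(5, 6.0)                    # far from the background cloud

assert knn_score(anomaly_pixel, background) > knn_score(normal_pixel, background)
```

    The quality of such a score hinges on the background set being uncontaminated, which is exactly what the robust background regression stage is designed to ensure.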

  9. Finite-element solution to multidimensional multisource electromagnetic problems in the frequency domain using non-conforming meshes

    NASA Astrophysics Data System (ADS)

    Soloveichik, Yury G.; Persova, Marina G.; Domnikov, Petr A.; Koshkina, Yulia I.; Vagin, Denis V.

    2018-03-01

    We propose an approach to solving multisource induction logging problems in multidimensional media. According to the type of induction logging tools, the measurements are performed in the frequency range of 10 kHz to 14 MHz, transmitter-receiver offsets vary in the range of 0.5-8 m or more, and the trajectory length is up to 1 km. For calculating the total field, the primary-secondary field approach is used. The secondary field is calculated with the use of the finite-element method (FEM), irregular non-conforming meshes with local refinements and a direct solver. The approach to constructing basis functions with the continuous tangential components (from Hcurl(Ω)) on the non-conforming meshes from the standard shape vector functions is developed. On the basis of this method, the algorithm of generating global matrices and a vector of the finite-element equation system is proposed. We also propose the method of grouping the logging tool positions, which makes it possible to significantly increase the computational effectiveness. This is achieved due to the compromise between the possibility of using the 1-D background medium, which is very similar to the investigated multidimensional medium for a small group, and the decrease in the number of the finite-element matrix factorizations with the increasing number of tool positions in one group. For calculating the primary field, we propose the method based on the use of FEM. This method is highly effective when the 1-D field is required to be calculated at a great number of points. The use of this method significantly increases the effectiveness of the primary-secondary field approach. The proposed approach makes it possible to perform modelling both in the 2.5-D case (i.e. without taking into account a borehole and/or invasion zone effect) and the 3-D case (i.e. for models with a borehole and invasion zone). 
The accuracy of numerical results obtained with the use of the proposed approach is compared with those obtained by other codes for 1-D and 3-D anisotropic models. The results of this comparison lend support to the validity of our code. We also present numerical results proving the greater effectiveness of the proposed finite-element approach for calculating the 1-D field, in comparison with known codes implementing semi-analytical methods, for the case in which the field is calculated at a large number of points. Additionally, we present numerical results which confirm the accuracy advantages of the automatic choice of a background medium for calculating the 1-D field, as well as the results of 2.5-D modelling for a geoelectrical model with anisotropic layers, a fault and a long tool-movement trajectory with varying dip angle.
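
    The primary-secondary field decomposition can be demonstrated on a 1-D finite-difference analogue (invented conductivities; the paper's vector FEM on non-conforming meshes is far richer): the background operator is solved first, and the secondary problem is driven by the conductivity contrast acting on the primary field:

```python
import numpy as np

# Matrix for -(sigma u')' with homogeneous Dirichlet BCs, sigma on cell midpoints.
def operator(sigma, h):
    m = len(sigma) - 1                      # number of interior nodes
    L = np.zeros((m, m))
    for j in range(m):                      # interior node i = j + 1
        L[j, j] = (sigma[j] + sigma[j + 1]) / h**2
        if j > 0:
            L[j, j - 1] = -sigma[j] / h**2
        if j < m - 1:
            L[j, j + 1] = -sigma[j + 1] / h**2
    return L

n = 64
h = 1.0 / n
x_mid = (np.arange(n) + 0.5) * h
sigma0 = np.ones(n)                                        # 1-D background medium
sigma = sigma0 + 2.0 * ((x_mid > 0.4) & (x_mid < 0.6))     # embedded anomaly
f = np.ones(n - 1)                                         # source term

u_primary = np.linalg.solve(operator(sigma0, h), f)        # background (primary) field
rhs_sec = (operator(sigma0, h) - operator(sigma, h)) @ u_primary
u_secondary = np.linalg.solve(operator(sigma, h), rhs_sec) # contrast-driven part

u_total = np.linalg.solve(operator(sigma, h), f)           # direct solution
assert np.allclose(u_primary + u_secondary, u_total)
```

    The payoff in the multisource setting is that the (cheap) background solve is shared across many tool positions, while only the contrast-driven secondary problem needs the fine local mesh.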

  10. Environmental Exposure and Accelerated Testing of Rubber-to-Metal Vulcanized Bonded Assemblies

    DTIC Science & Technology

    1974-11-01

    by weapon components in the field and to determine the effect of this exposure on the vulcanized bond The purpose is also to duplicate these long term...storage and environmental exposure, and to develop accelerated methods for use in predicting this resistance. BACKGROUND: The most effective method of... the rubber coatings on the M60 machine gun components, the shock isolator and recoil adapter on the CAU 28/A Minigun, rubber pads for all tracked

  11. THE NEW YORK CITY URBAN DISPERSION PROGRAM MARCH 2005 FIELD STUDY: TRACER METHODS AND RESULTS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WATSON, T.B.; HEISER, J.; KALB, P.

    The Urban Dispersion Program March 2005 Field Study tracer releases, sampling, and analytical methods are described in detail. There were two days where tracer releases and sampling were conducted. A total of 16.0 g of six tracers were released during the first test day or Intensive Observation Period (IOP) 1 and 15.7 g during IOP 2. Three types of sampling instruments were used in this study. Sequential air samplers, or SAS, collected six-minute samples, while Brookhaven atmospheric tracer samplers (BATS) and personal air samplers (PAS) collected thirty-minute samples. There were a total of 1300 samples resulting from the two IOPs. Confidence limits in the sampling and analysis method were 20% as determined from 100 duplicate samples. The sample recovery rate was 84%. The integrally averaged 6-minute samples were compared to the 30-minute samples. The agreement was found to be good in most cases. The validity of using a background tracer to calculate sample volumes was examined and also found to have a confidence level of 20%. Methods for improving sampling and analysis are discussed. The data described in this report are available as Excel files. An additional Excel file of quality assured tracer data for use in model validation efforts is also available. The file consists of extensively quality assured BATS tracer data with background concentrations subtracted.

  12. Development of wheelchair caster testing equipment and preliminary testing of caster models

    PubMed Central

    Mhatre, Anand; Ott, Joseph

    2017-01-01

    Background: Because of the adverse environmental conditions present in less-resourced environments (LREs), the World Health Organization (WHO) has recommended that specialised wheelchair test methods may need to be developed to support product quality standards in these environments. A group of experts identified caster test methods as a high priority because of their common failure in LREs, and the insufficiency of existing test methods described in the International Organization for Standardization (ISO) Wheelchair Testing Standards (ISO 7176). Objectives: To develop and demonstrate the feasibility of a caster system test method. Method: Background literature and expert opinions were collected to identify existing caster test methods, caster failures common in LREs and environmental conditions present in LREs. Several conceptual designs for the caster testing method were developed, and through an iterative process using expert feedback, a final concept and a design were developed and a prototype was fabricated. Feasibility tests were conducted by testing a series of caster systems from wheelchairs used in LREs, and failure modes were recorded and compared to anecdotal reports about field failures. Results: The new caster testing system was developed and it provides the flexibility to expose caster systems to typical conditions in LREs. Caster failures such as stem bolt fractures, fork fractures, bearing failures and tire cracking occurred during testing trials and are consistent with field failures. Conclusion: The new caster test system has the capability to incorporate necessary test factors that degrade caster quality in LREs. Future work includes developing and validating a testing protocol that results in failure modes common during wheelchair use in LREs. PMID:29062762

  13. The synthesis of high yield Au nanoplate and optimized optical properties

    NASA Astrophysics Data System (ADS)

    Ni, Yuan; Kan, Caixia; Xu, Juan; Liu, Yang

    2018-02-01

    The applications of Au nanoplates, based on their tunable plasmon properties and the enhanced electromagnetic field at their sharp tips and straight edges, have generated a great deal of interest in recent years, especially in the fields of bio-chemical sensing and imaging. In this review, we focus on the synthesis of nanoscale plate-like structures by multiple synthetic strategies (such as the thermal solution method, seed-mediated method, seedless method, and some greener methods), and explore the corresponding growth mechanisms of the different synthetic approaches. In addition to reviewing the fabrication of Au nanoplates, we also discuss purification strategies that support applications in various fields. Modifying the synthetic method to obtain well-defined nanoplates can tune the optical absorption from the visible to the near-infrared region. Moreover, Au nanoplate dimers (vertex-to-vertex and edge-by-edge assemblies) can induce more specific plasmon properties and stronger localized fields due to interparticle coupling. Compared with 0D quasi-spherical nanoparticles and 1D nanorods, 2D nanoplates can serve as good surface-enhanced Raman scattering (SERS) substrates because of their sharp corners and straight edges. This review provides background information for the controllable synthesis of anisotropic nanoparticles and advances the application of coupled nanostructures.

  14. Preserving Simplecticity in the Numerical Integration of Linear Beam Optics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, Christopher K.

    2017-07-01

    Presented are mathematical tools and methods for the development of numerical integration techniques that preserve the symplectic condition inherent to mechanics. The intended audience is beam physicists with backgrounds in numerical modeling and simulation, with particular attention to beam optics applications. The paper focuses on Lie methods that are inherently symplectic regardless of the integration accuracy order. Section 2 provides the mathematical tools used in the sequel and necessary for the reader to extend the covered techniques. Section 3 places those tools in the context of charged-particle beam optics; in particular, linear beam optics is presented in terms of a Lie algebraic matrix representation. Section 4 presents numerical stepping techniques with particular emphasis on a third-order leapfrog method. Section 5 discusses the modeling of field imperfections with particular attention to the fringe fields of quadrupole focusing magnets. The direct computation of a third-order transfer matrix for a fringe field is shown.

  15. Rocket-triggered lightning strikes and forest fire ignition

    NASA Technical Reports Server (NTRS)

    Fenner, James

    1990-01-01

    The following are presented: (1) background information on the rocket-triggered lightning project at Kennedy Space Center (KSC); (2) a summary of the forecasting problem; (3) the facilities and equipment available for undertaking field experiments at KSC; (4) previous research activity performed; (5) a description of the atmospheric science field laboratory near Mosquito Lagoon on the KSC complex; (6) methods of data acquisition; and (7) present results. New sources of data for the 1990 field experiment include measuring the electric field in the lower few thousand feet of the atmosphere by suspending field measuring devices below a tethered balloon, and measuring the electric field intensity in clouds and in the atmosphere with aircraft. The latter program began in July of 1990. Also, future prospects for both triggered lightning and forest fire research at KSC are listed.

  16. Complete one-loop renormalization of the Higgs-electroweak chiral Lagrangian

    NASA Astrophysics Data System (ADS)

    Buchalla, G.; Catà, O.; Celis, A.; Knecht, M.; Krause, C.

    2018-03-01

    Employing background-field method and super-heat-kernel expansion, we compute the complete one-loop renormalization of the electroweak chiral Lagrangian with a light Higgs boson. Earlier results from purely scalar fluctuations are confirmed as a special case. We also recover the one-loop renormalization of the conventional Standard Model in the appropriate limit.

  17. Evaluating ozone air pollution effects on pines in the western United States

    Treesearch

    Paul R. Miller; Kenneth W. Stolte; Daniel M. Duriscoe; John Pronos

    1996-01-01

    Historical and technical background is provided about ozone air pollution effects on ponderosa (Pinus ponderosa Dougl. ex Laws) and Jeffrey (P. jeffreyi Grev. and Balf.) pines in forests of the western United States. The principal aim is to document the development of field survey methods to be applied to assessment of chronic...

  18. Verb Form Indicates Discourse Segment Type in Biological Research Papers: Experimental Evidence

    ERIC Educational Resources Information Center

    de Waard, Anita; Maat, Henk Pander

    2012-01-01

    Corpus studies suggest that verb tense is a differentiating feature between, on the one hand, text pertaining to experimental results (involving methods and results) and on the other hand, text pertaining to more abstract concepts (i.e. regarding background knowledge in a field, hypotheses, problems or claims). In this paper, we describe a user…

  19. Responses to salinity in invasive cordgrass hybrids and their parental species (Spartina) in a scenario of sea level rise and climate change

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods: Salinity is one of the main abiotic factors in salt marshes. Studies that analyze the salinity tolerance of halophytes may help relate their physiological tolerances to their distribution limits in the field. Climate change-induced sea level rise and higher temperatures...

  20. Incorporating Prototyping and Iteration into Intervention Development: A Case Study of a Dining Hall-Based Intervention

    ERIC Educational Resources Information Center

    McClain, Arianna D.; Hekler, Eric B.; Gardner, Christopher D.

    2013-01-01

    Background: Previous research from the fields of computer science and engineering highlight the importance of an iterative design process (IDP) to create more creative and effective solutions. Objective: This study describes IDP as a new method for developing health behavior interventions and evaluates the effectiveness of a dining hall-based…

  1. Exploring the Development of Existing Sex Education Programmes for People with Intellectual Disabilities: An Intervention Mapping Approach

    ERIC Educational Resources Information Center

    Schaafsma, Dilana; Stoffelen, Joke M. T.; Kok, Gerjo; Curfs, Leopold M. G.

    2013-01-01

    Background: People with intellectual disabilities face barriers that affect their sexual health. Sex education programmes have been developed by professionals working in the field of intellectual disabilities with the aim to overcome these barriers. The aim of this study was to explore the development of these programmes. Methods: Sex education…

  2. Family Quality of Life from the Perspective of Older Parents

    ERIC Educational Resources Information Center

    Jokinen, N. S.; Brown, R. I.

    2005-01-01

    Background: Family quality of life is a relatively new field of study. Research has primarily concentrated on families of children and young adults with intellectual disability (ID). Method: This project explored the concept of family quality of life from the perspective of older parents who had adult children with ID aged 40. Focus groups,…

  3. Quality of Life and its Measurement: Important Principles and Guidelines

    ERIC Educational Resources Information Center

    Verdugo, M. A.; Schalock, R. L.; Keith, K. D.; Stancliffe, R. J.

    2005-01-01

    Background: The importance of the valid assessment of quality of life (QOL) is heightened with the increased use of the QOL construct as a basis for policies and practices in the field of intellectual disability (ID). Method: This article discusses the principles that should guide the measurement process, the major interrogatories (i.e. who, what,…

  4. Experiences of Two Multidisciplinary Team Members of Systemic Consultations in a Community Learning Disability Service

    ERIC Educational Resources Information Center

    Johnson, Clair; Viljoen, Nina

    2017-01-01

    Background: Systemic approaches can be useful in working with people with learning disabilities and their network. The evidence base for these approaches within the field of learning disabilities, however, is currently limited. Materials and Methods: This article presents part of a service evaluation of systemic consultations in a Community…

  5. Progress Towards a Neutral Current $$\\pi^0$$ Cross Section Analysis in the NOvA Near Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowles, Reed; Paley, Jonathan

    The NOvA neutrino experiment measures the properties of neutrinos in order to learn about the universe. To detect the signal neutrino interactions, we must develop methods to identify and isolate background events. This research focused on a specific background interaction, the single-prong neutral current $$\pi^0$$ interaction. To do this, a basic cuts-based analysis was performed, followed by feeding the data into a multivariate analysis package using a boosted decision tree (BDT) algorithm. Using the BDT, a new variable was generated which separates signal and background very efficiently. Further work must still be done to continue improving the performance of the BDT. This research is valuable to the field of neutrino cross section measurements, as this background will always be present in this type of analysis.

  6. Fast Monte Carlo-assisted simulation of cloudy Earth backgrounds

    NASA Astrophysics Data System (ADS)

    Adler-Golden, Steven; Richtsmeier, Steven C.; Berk, Alexander; Duff, James W.

    2012-11-01

    A calculation method has been developed for rapidly synthesizing radiometrically accurate ultraviolet through long-wavelength-infrared spectral imagery of the Earth for arbitrary locations and cloud fields. The method combines cloud-free surface reflectance imagery with cloud radiance images calculated from a first-principles 3-D radiation transport model. The MCScene Monte Carlo code [1-4] is used to build a cloud image library; a data fusion method is incorporated to speed convergence. The surface and cloud images are combined with an upper atmospheric description with the aid of solar and thermal radiation transport equations that account for atmospheric inhomogeneity. The method enables a wide variety of sensor and sun locations, cloud fields, and surfaces to be combined on-the-fly, and provides hyperspectral wavelength resolution with minimal computational effort. The simulations agree very well with much more time-consuming direct Monte Carlo calculations of the same scene.

  7. Sonochemical approaches to enhanced oil recovery.

    PubMed

    Abramov, Vladimir O; Abramova, Anna V; Bayazitov, Vadim M; Altunina, Lyubov K; Gerasin, Artyom S; Pashin, Dmitriy M; Mason, Timothy J

    2015-07-01

    Oil production from wells reduces with time and the well becomes uneconomic unless enhanced oil recovery (EOR) methods are applied. There are a number of methods currently available and each has specific advantages and disadvantages depending on conditions. Currently there is a big demand for new or improved technologies in this field, the hope is that these might also be applicable to wells which have already been the subject of EOR. The sonochemical method of EOR is one of the most promising methods and is important in that it can also be applied for the treatment of horizontal wells. The present article reports the theoretical background of the developed sonochemical technology for EOR in horizontal wells; describes the requirements to the equipment needed to embody the technology. The results of the first field tests of the technology are reported. Copyright © 2014 Elsevier B.V. All rights reserved.

  8. The Uncertainty of Local Background Magnetic Field Orientation in Anisotropic Plasma Turbulence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerick, F.; Saur, J.; Papen, M. von, E-mail: felix.gerick@uni-koeln.de

    In order to resolve and characterize anisotropy in turbulent plasma flows, a proper estimation of the background magnetic field is crucially important. Various approaches to calculating the background magnetic field, ranging from local to globally averaged fields, are commonly used in the analysis of turbulent data. We investigate how the uncertainty in the orientation of a scale-dependent background magnetic field influences the ability to resolve anisotropy. Therefore, we introduce a quantitative measure, the angle uncertainty, that characterizes the uncertainty of the orientation of the background magnetic field that turbulent structures are exposed to. The angle uncertainty can be used as a condition to estimate the ability to resolve anisotropy with certain accuracy. We apply our description to resolve the spectral anisotropy in fast solar wind data. We show that, if the angle uncertainty grows too large, the power of the turbulent fluctuations is attributed to false local magnetic field angles, which may lead to an incorrect estimation of the spectral indices. In our results, an apparent robustness of the spectral anisotropy to false local magnetic field angles is observed, which can be explained by a stronger increase of power for lower frequencies when the scale of the local magnetic field is increased. The frequency-dependent angle uncertainty is a measure that can be applied to any turbulent system.

  9. Super-resolved all-refocused image with a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wang, Xiang; Li, Lin; Hou, Guangqi

    2015-12-01

    This paper proposes an approach to producing super-resolved all-in-focus images with a plenoptic camera. A plenoptic camera can be made by placing a micro-lens array between the lens and the sensor of a conventional camera. This kind of camera captures both the angular and spatial information of the scene in a single shot. A sequence of digitally refocused images, each focused at a different depth, can be produced by processing the 4D light field captured by the plenoptic camera. The number of pixels in a refocused image equals the number of micro-lenses in the array, so the limited micro-lens count yields low-resolution refocused images lacking fine detail. Such lost details, which are often high-frequency information, are important for the in-focus part of a refocused image, so we super-resolve these in-focus parts. An image segmentation method based on random walks, applied to the depth map produced from the 4D light field data, separates the foreground and background in the refocused images, and a focus evaluation function determines which refocused image has the sharpest foreground and which has the sharpest background. We then apply a single-image super-resolution method based on sparse signal representation to the in-focus parts of these selected refocused images. Finally, we obtain the super-resolved all-in-focus image by merging the in-focus background and foreground parts through digital signal processing, preserving more spatial detail in the output images. Our method enhances the resolution of the refocused image, and only the refocused images with the sharpest foreground and background need to be super-resolved.
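
    The final compositing step described above, merging the sharpest-foreground image with the sharpest-background image using a segmentation mask, can be sketched as follows. The images and mask are synthetic stand-ins; the random-walk segmentation and sparse super-resolution stages are not reproduced here.

```python
import numpy as np

# Sketch of mask-based compositing: take foreground pixels from the image
# with the sharpest foreground and the rest from the image with the
# sharpest background. Inputs here are toy arrays, not real light fields.
def merge_all_in_focus(fg_image, bg_image, fg_mask):
    """fg_mask: 1 where the foreground is, 0 elsewhere."""
    return np.where(fg_mask.astype(bool), fg_image, bg_image)

fg = np.full((4, 4), 2.0)    # image with sharp foreground (toy values)
bg = np.zeros((4, 4))        # image with sharp background (toy values)
mask = np.zeros((4, 4))
mask[:2, :] = 1.0            # top half is "foreground"
out = merge_all_in_focus(fg, bg, mask)
```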

  10. Diagnostic techniques for measurement of aerodynamic noise in free field and reverberant environment of wind tunnels

    NASA Technical Reports Server (NTRS)

    El-Sum, H. M. A.; Mawardi, O. K.

    1973-01-01

    Techniques for studying aerodynamic noise generating mechanisms without disturbing the flow, both in a free field and in the reverberant environment of the ARC wind tunnel, were investigated, along with the design and testing of an acoustic antenna with electronic steering control. Topics covered include the acoustic characteristics of a turbojet as a noise source, detection of direct sound from a source in a reverberant background, optical diagnostic methods, and the design characteristics of a high-directivity acoustic antenna. Recommendations for further studies are included.

  11. SYMPATHETIC SOLAR FILAMENT ERUPTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Rui; Liu, Ying D.; Zimovets, Ivan

    2016-08-10

    The 2015 March 15 coronal mass ejection, one of the two that together drove the largest geomagnetic storm of solar cycle 24 so far, was associated with sympathetic filament eruptions. We investigate the relations between the different filaments involved in the eruption. Using multi-wavelength observations and a forced magnetic field extrapolation method, we confirm that a surge-like small-scale filament motion triggered the erupting filament. When the erupting filament moved into an open magnetic field region, it underwent an obvious acceleration process and was accompanied by a C-class flare and the rise of another, larger filament that eventually failed to erupt. We measure the decay index of the background magnetic field, which yields a critical height of 118 Mm. Combining this with a potential field source surface extrapolation method, we analyze the distribution of the large-scale magnetic field, which indicates that the open magnetic field region may provide a favorable condition for the rapid acceleration of F2 and may be related to the largest solar storm. The comparison between the successful and failed filament eruptions suggests that the confining magnetic field plays an important role in the preconditions for an eruption.
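
    The decay index used above, n(h) = -d ln B / d ln h, can be computed numerically from a sampled background-field profile. The sketch below uses an invented field profile, not the event's measured field, and takes 1.5 as the critical value, a commonly quoted torus-instability threshold assumed here for illustration.

```python
import numpy as np

# Finite-difference decay index n(h) = -d ln B / d ln h on a sampled
# background-field profile; np.gradient handles the nonuniform ln(h) axis.
def decay_index(h, B):
    return -np.gradient(np.log(B), np.log(h))

h = np.linspace(10.0, 300.0, 500)       # height in Mm (illustrative grid)
B = 100.0 / (1.0 + (h / 80.0) ** 2)     # made-up field strength profile
n = decay_index(h, B)

# Critical height: first height where n exceeds the assumed threshold 1.5.
# For this profile n = 2u/(1+u) with u=(h/80)^2, crossing at h = 80*sqrt(3).
h_crit = h[np.argmax(n >= 1.5)]
```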

  12. 3D printing in neurosurgery: A systematic review

    PubMed Central

    Randazzo, Michael; Pisapia, Jared M.; Singh, Nickpreet; Thawani, Jayesh P.

    2016-01-01

    Background: The recent expansion of three-dimensional (3D) printing technology into the field of neurosurgery has prompted a widespread investigation of its utility. In this article, we review the current body of literature describing rapid prototyping techniques with applications to the practice of neurosurgery. Methods: An extensive and systematic search of the Compendex, Scopus, and PubMed medical databases was conducted using keywords relating to 3D printing and neurosurgery. Results were manually screened for relevance to applications within the field. Results: Of the search results, 36 articles were identified and included in this review. The articles spanned the various subspecialties of the field including cerebrovascular, neuro-oncologic, spinal, functional, and endoscopic neurosurgery. Conclusions: We conclude that 3D printing techniques are practical and anatomically accurate methods of producing patient-specific models for surgical planning, simulation and training, tissue-engineered implants, and secondary devices. Expansion of this technology may, therefore, contribute to advancing the neurosurgical field from several standpoints. PMID:27920940

  13. Holographic free energy and thermodynamic geometry

    NASA Astrophysics Data System (ADS)

    Ghorai, Debabrata; Gangopadhyay, Sunandan

    2016-12-01

    We obtain the free energy and thermodynamic geometry of holographic superconductors in 2+1 dimensions. The gravitational theory in the bulk, dual to this 2+1-dimensional strongly coupled theory, lives in 3+1 dimensions and is that of a charged AdS black hole together with a massive charged scalar field. The matching method is applied to obtain the behavior of the fields near the horizon, from which the holographic free energy is computed through the gauge/gravity duality. The critical temperature is obtained for a set of values of the matching point of the near-horizon and boundary behavior of the fields, in the probe limit approximation, which neglects the back reaction of the matter fields on the background spacetime geometry. The thermodynamic geometry is then computed from the free energy of the boundary theory. From the divergence of the thermodynamic scalar curvature, the critical temperature is obtained once again. We then compare this result for the critical temperature with that obtained from the matching method.

  14. A multifrequency MUSIC algorithm for locating small inhomogeneities in inverse scattering

    NASA Astrophysics Data System (ADS)

    Griesmaier, Roland; Schmiedecke, Christian

    2017-03-01

    We consider an inverse scattering problem for time-harmonic acoustic or electromagnetic waves with sparse multifrequency far field data-sets. The goal is to localize several small penetrable objects embedded inside an otherwise homogeneous background medium from observations of far fields of scattered waves corresponding to incident plane waves with one fixed incident direction but several different frequencies. We assume that the far field is measured at a few observation directions only. Taking advantage of the smallness of the scatterers with respect to wavelength we utilize an asymptotic representation formula for the far field to design and analyze a MUSIC-type reconstruction method for this setup. We establish lower bounds on the number of frequencies and receiver directions that are required to recover the number and the positions of an ensemble of scatterers from the given measurements. Furthermore we briefly sketch a possible application of the reconstruction method to the practically relevant case of multifrequency backscattering data. Numerical examples are presented to document the potentials and limitations of this approach.
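
    To illustrate the subspace idea behind MUSIC-type reconstruction, here is a minimal sketch of the classical narrowband MUSIC algorithm for direction-of-arrival estimation with a uniform linear array. The paper's multifrequency far-field imaging variant differs in its measurement operator and asymptotic analysis, and every array parameter below is invented for the example.

```python
import numpy as np

# Classical narrowband MUSIC: project steering vectors onto the noise
# subspace of the sample covariance; the pseudo-spectrum peaks where the
# steering vector is (nearly) orthogonal to that subspace.
def music_doa(snapshots, n_sources, spacing=0.5):
    """snapshots: (n_sensors, n_snapshots) complex array data."""
    n_sensors = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]
    eigvals, eigvecs = np.linalg.eigh(R)              # ascending eigenvalues
    En = eigvecs[:, : n_sensors - n_sources]          # noise subspace
    angles = np.linspace(-90.0, 90.0, 721)
    spectrum = np.empty_like(angles)
    for i, theta in enumerate(angles):
        a = np.exp(-2j * np.pi * spacing * np.arange(n_sensors)
                   * np.sin(np.radians(theta)))       # ULA steering vector
        spectrum[i] = 1.0 / np.linalg.norm(En.conj().T @ a) ** 2
    return angles, spectrum

# Simulate two plane waves at -20 and +35 degrees on an 8-element array.
rng = np.random.default_rng(0)
n_sensors, n_snap = 8, 200
true_angles = [-20.0, 35.0]
A = np.stack([np.exp(-2j * np.pi * 0.5 * np.arange(n_sensors)
                     * np.sin(np.radians(t))) for t in true_angles], axis=1)
signals = rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))
noise = 0.05 * (rng.standard_normal((n_sensors, n_snap))
                + 1j * rng.standard_normal((n_sensors, n_snap)))
X = A @ signals + noise
angles, P = music_doa(X, n_sources=2)
```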

  15. Three-dimensional ionospheric tomography reconstruction using the model function approach in Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Wang, Sicheng; Huang, Sixun; Xiang, Jie; Fang, Hanxian; Feng, Jian; Wang, Yu

    2016-12-01

    Ionospheric tomography uses the observed slant total electron content (sTEC) along different satellite-receiver rays to reconstruct the three-dimensional electron density distribution. Because the satellite-receiver geometry provides incomplete measurements, it is a typical ill-posed problem, and how to overcome the ill-posedness remains a crucial research topic. In this paper, the Tikhonov regularization method is used, and the model function approach is applied to determine the optimal regularization parameter. This algorithm not only balances the weights between the sTEC observations and the background electron density field but also converges globally and rapidly. The background error covariance is given by multiplying the background model variance with a location-dependent spatial correlation, and the correlation model is developed using sample statistics from an ensemble of International Reference Ionosphere 2012 (IRI2012) model outputs. Global Navigation Satellite System (GNSS) observations in China are used to present the reconstruction results, and measurements from two ionosondes are used for independent validation. Both the test cases using artificial sTEC observations and actual GNSS sTEC measurements show that the regularization method can effectively improve the background model outputs.
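
    The underlying Tikhonov step, a penalized least-squares fit about a background model, can be sketched in closed form as below. The matrix A, the background x_bg, and the fixed regularization parameter lam are illustrative assumptions; the paper instead selects the parameter adaptively via the model function approach and uses a structured background error covariance.

```python
import numpy as np

# Minimal Tikhonov sketch for an underdetermined linear tomography problem:
# minimize ||A x - b||^2 + lam * ||x - x_bg||^2, solved via normal equations.
def tikhonov_solve(A, b, x_bg, lam):
    n = A.shape[1]
    lhs = A.T @ A + lam * np.eye(n)
    rhs = A.T @ b + lam * x_bg
    return np.linalg.solve(lhs, rhs)

rng = np.random.default_rng(1)
m, n = 20, 50                        # fewer rays than unknowns: ill-posed
A = rng.standard_normal((m, n))      # stand-in ray-geometry matrix
x_true = np.ones(n) + 0.1 * rng.standard_normal(n)
b = A @ x_true                       # synthetic sTEC-like observations
x_bg = np.ones(n)                    # background model (IRI-like prior)
x_hat = tikhonov_solve(A, b, x_bg, lam=1e-2)
```

With small lam, the solution fits the data in the observed subspace and falls back to the background in the unobserved one.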

  16. Percent body fat estimations in college men using field and laboratory methods: A three-compartment model approach

    PubMed Central

    Moon, Jordan R; Tobkin, Sarah E; Smith, Abbie E; Roberts, Michael D; Ryan, Eric D; Dalbo, Vincent J; Lockwood, Chris M; Walter, Ashley A; Cramer, Joel T; Beck, Travis W; Stout, Jeffrey R

    2008-01-01

    Background: Methods used to estimate percent body fat can be classified as a laboratory or field technique. However, the validity of these methods compared to multiple-compartment models has not been fully established. The purpose of this study was to determine the validity of field and laboratory methods for estimating percent fat (%fat) in healthy college-age men compared to the Siri three-compartment model (3C). Methods: Thirty-one Caucasian men (22.5 ± 2.7 yrs; 175.6 ± 6.3 cm; 76.4 ± 10.3 kg) had their %fat estimated by bioelectrical impedance analysis (BIA) using the BodyGram™ computer program (BIA-AK) and population-specific equation (BIA-Lohman), near-infrared interactance (NIR) (Futrex® 6100/XL), four circumference-based military equations [Marine Corps (MC), Navy and Air Force (NAF), Army (A), and Friedl], air-displacement plethysmography (BP), and hydrostatic weighing (HW). Results: All circumference-based military equations (MC = 4.7% fat, NAF = 5.2% fat, A = 4.7% fat, Friedl = 4.7% fat) along with NIR (NIR = 5.1% fat) produced an unacceptable total error (TE). Both laboratory methods produced acceptable TE values (HW = 2.5% fat; BP = 2.7% fat). The BIA-AK and BIA-Lohman field methods produced acceptable TE values (2.1% fat). A significant difference was observed for the MC and NAF equations compared to both the 3C model and HW (p < 0.006). Conclusion: Results indicate that the BP and HW are valid laboratory methods when compared to the 3C model to estimate %fat in college-age Caucasian men. When the use of a laboratory method is not feasible, BIA-AK and BIA-Lohman are acceptable field methods to estimate %fat in this population. PMID:18426582
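
    The total error (TE) statistic used above to judge each method is the RMS deviation of the method's %fat estimates from the criterion (here the 3C model). A sketch with made-up numbers:

```python
import numpy as np

# Total error as commonly defined in body-composition validation studies:
# RMS deviation of a method's estimates from the criterion measure.
def total_error(predicted, criterion):
    predicted = np.asarray(predicted, dtype=float)
    criterion = np.asarray(criterion, dtype=float)
    return float(np.sqrt(np.mean((predicted - criterion) ** 2)))

three_c = [15.2, 18.0, 12.5, 20.1, 16.3]   # criterion %fat (hypothetical)
field = [17.0, 19.5, 14.8, 22.0, 18.1]     # field-method %fat (hypothetical)
te = total_error(field, three_c)
```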

  17. Emergent kink stability of a magnetized plasma jet injected into a transverse background magnetic field

    NASA Astrophysics Data System (ADS)

    Zhang, Yue; Gilmore, Mark; Hsu, Scott C.; Fisher, Dustin M.; Lynn, Alan G.

    2017-11-01

    We report experimental results on the injection of a magnetized plasma jet into a transverse background magnetic field in the HelCat linear plasma device at the University of New Mexico [M. Gilmore et al., J. Plasma Phys. 81(1), 345810104 (2015)]. After the plasma jet leaves the plasma-gun muzzle, a tension force arising from an increasing curvature of the background magnetic field induces in the jet a sheared axial-flow gradient above the theoretical kink-stabilization threshold. We observe that this emergent sheared axial flow stabilizes the n = 1 kink mode in the jet, whereas a kink instability is observed in the jet when there is no background magnetic field present.

  18. Statistical simulations of the dust foreground to cosmic microwave background polarization

    NASA Astrophysics Data System (ADS)

    Vansyngel, F.; Boulanger, F.; Ghosh, T.; Wandelt, B.; Aumont, J.; Bracco, A.; Levrier, F.; Martin, P. G.; Montier, L.

    2017-07-01

    The characterization of the dust polarization foreground to the cosmic microwave background (CMB) is a necessary step toward the detection of the B-mode signal associated with primordial gravitational waves. We present a method to simulate maps of polarized dust emission on the sphere that is similar to the approach used for CMB anisotropies. This method builds on the understanding of Galactic polarization stemming from the analysis of Planck data. It relates the dust polarization sky to the structure of the Galactic magnetic field and its coupling with interstellar matter and turbulence. The Galactic magnetic field is modeled as a superposition of a mean uniform field and a Gaussian random (turbulent) component with a power-law power spectrum of exponent αM. The integration along the line of sight carried out to compute Stokes maps is approximated by a sum over a small number of emitting layers with different realizations of the random component of the magnetic field. The model parameters are constrained to fit the power spectra of dust polarization EE, BB, and TE measured using Planck data. We find that the slopes of the E and B power spectra of dust polarization are matched for αM = -2.5, an exponent close to that measured for total dust intensity but larger than the Kolmogorov exponent - 11/3. The model allows us to compute multiple realizations of the Stokes Q and U maps for different realizations of the random component of the magnetic field, and to quantify the variance of dust polarization spectra for any given sky area outside of the Galactic plane. The simulations reproduce the scaling relation between the dust polarization power and the mean total dust intensity including the observed dispersion around the mean relation. We also propose a method to carry out multifrequency simulations, including the decorrelation measured recently by Planck, using a given covariance matrix of the polarization maps. 
These simulations are well suited to optimize component separation methods and to quantify the confidence with which the dust and CMB B-modes can be separated in present and future experiments. We also provide an astrophysical perspective on our phenomenological modeling of the dust polarization spectra.
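
    One ingredient of the model above, a Gaussian random (turbulent) component with a power-law power spectrum of exponent αM = -2.5, can be sketched in one dimension by Fourier filtering of white noise. The full model works on the sphere with several line-of-sight layers, so this is only an illustration of the spectral-shaping step.

```python
import numpy as np

# 1-D Gaussian random field with power spectrum P(k) ~ k^alpha, synthesized
# by filtering complex white noise with sqrt(P(k)) and inverse-transforming.
def power_law_field(n, alpha, rng):
    k = np.fft.rfftfreq(n)
    amp = np.zeros_like(k)
    amp[1:] = k[1:] ** (alpha / 2.0)      # sqrt of the power spectrum; no DC
    phases = rng.standard_normal(k.size) + 1j * rng.standard_normal(k.size)
    field = np.fft.irfft(amp * phases, n)
    return field / field.std()            # normalize to unit variance

rng = np.random.default_rng(3)
b_turb = power_law_field(4096, -2.5, rng)  # alpha = -2.5 as fit in the paper
```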

  19. Quantitative Assessment of Fat Levels in Caenorhabditis elegans Using Dark Field Microscopy

    PubMed Central

    Fouad, Anthony D.; Pu, Shelley H.; Teng, Shelly; Mark, Julian R.; Fu, Moyu; Zhang, Kevin; Huang, Jonathan; Raizen, David M.; Fang-Yen, Christopher

    2017-01-01

    The roundworm Caenorhabditis elegans is widely used as a model for studying conserved pathways for fat storage, aging, and metabolism. The most broadly used methods for imaging fat in C. elegans require fixing and staining the animal. Here, we show that dark field images acquired through an ordinary light microscope can be used to estimate fat levels in worms. We define a metric based on the amount of light scattered per area, and show that this light scattering metric is strongly correlated with worm fat levels as measured by Oil Red O (ORO) staining across a wide variety of genetic backgrounds and feeding conditions. Dark field imaging requires no exogenous agents or chemical fixation, making it compatible with live worm imaging. Using our method, we track fat storage with high temporal resolution in developing larvae, and show that fat storage in the intestine increases in at least one burst during development. PMID:28404661
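
    The light-scattering metric described above, scattered light per unit area, can be sketched as follows. The fixed-threshold segmentation is an assumed stand-in for the study's segmentation of the worm body.

```python
import numpy as np

# Sketch of a scattered-light-per-area metric on a dark field image:
# total intensity inside the segmented region divided by the region's area.
def scatter_per_area(image, threshold):
    mask = image > threshold          # crude segmentation (assumed stand-in)
    if mask.sum() == 0:
        return 0.0
    return float(image[mask].sum() / mask.sum())

image = np.zeros((10, 10))
image[2:5, 2:5] = 5.0                 # toy "worm" of bright pixels
metric = scatter_per_area(image, threshold=1.0)
```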

  20. Fiber-Optic Magnetometry and Thermometry Using Optically Detected Magnetic Resonance With Nitrogen-Vacancy Centers in Diamond

    NASA Astrophysics Data System (ADS)

    Blakley, Sean Michael

    Nitrogen-vacancy diamond (NVD) quantum sensors are an emerging technology that has shown great promise in areas such as high-resolution thermometry and magnetometry. Optical fibers provide attractive new application paradigms for NVD technology. A detailed description of the fabrication processes associated with the development of novel fiber-optic NVD probes is presented in this work. The demonstrated probes are tested on paradigmatic model systems designed to ascertain their suitability for use in challenging biological environments. Methods employing optically detected magnetic resonance (ODMR) are used to accurately measure and map the temperature distributions of small objects and to demonstrate emergent temperature-dependent phenomena in genetically modified living organisms. These methods are also used to create detailed, high-resolution spatial maps of both magnetic scalar and magnetic vector field distributions of spatially localized weak-field features in the presence of a noisy, high-field background.

  1. Coherent anti-Stokes Raman scattering under electric field stimulation

    NASA Astrophysics Data System (ADS)

    Capitaine, Erwan; Ould Moussa, Nawel; Louot, Christophe; Lefort, Claire; Pagnoux, Dominique; Duclère, Jean-René; Kaneyasu, Junya F.; Kano, Hideaki; Duponchel, Ludovic; Couderc, Vincent; Leproux, Philippe

    2016-12-01

We introduce an experiment using electro-CARS, an electro-optical method based on the combination of ultrabroadband multiplex coherent anti-Stokes Raman scattering (M-CARS) spectroscopy and electric field stimulation. We demonstrate that this method can effectively discriminate the resonant CARS signal from the nonresonant background owing to a phenomenon of molecular orientation in the sample medium. Such molecular orientation is intrinsically related to the induction of an electric dipole moment by the applied static electric field. Evidence of the electro-CARS effect is obtained with a solution of n-alkanes (CnH2n+2, 15 ≤ n ≤ 40), for which an enhancement of the CARS signal-to-noise ratio is achieved in the case of CH2 and CH3 symmetric/asymmetric stretching vibrations. Additionally, an electric-field-induced second-harmonic generation experiment is performed in order to corroborate the orientational organization of molecules due to the electric field excitation. Finally, we use a simple mathematical approach to compare the vibrational information extracted from electro-CARS measurements with spontaneous Raman data and to highlight the impact of electric stimulation on the vibrational signal.

  2. Continuous time-resolved regional methane leak detection with on-line background estimation using a novel combination of dual frequency comb laser spectroscopy and atmospheric inversions

    NASA Astrophysics Data System (ADS)

    Alden, C. B.; Coburn, S.; Wright, R.; Baumann, E.; Cossel, K.; Sweeney, C.; Ghosh, S.; Newbury, N.; Prasad, K.; Coddington, I.; Rieker, G. B.

    2017-12-01

Advances in natural gas extraction technology have led to increased US production and transport activity and, as a consequence, an increased need for monitoring of methane leaks. Current leak detection methods provide time snapshots, not continuous, time-varying estimates of emissions. Most approaches also require specific atmospheric conditions, operators, or the use of a tracer gas, requiring site access. Given the known intermittency of fugitive methane emissions, continuous monitoring is a critical need for emissions mitigation. We present a novel leak detection method that employs dual frequency comb spectrometry to offer continuous, autonomous leak detection and quantification over square-km scale areas. The spectrometer is situated in a field of natural gas pads, and a series of retroreflectors around the field direct light back to a detector. The laser light spans 1620-1680 nm with 0.002 nm line spacing, measuring thousands of individual absorption features from multiple species. The result is high-stability trace gas (here CH4, CO2, and H2O) measurements over long (1 km+) open paths through the atmosphere. Measurements are used in an atmospheric inversion to estimate the time variability of emissions at each location of interest. Importantly, the measurement framework and inversion solve explicitly for background concentrations, which vary rapidly in fields of active oil and gas production. We present the results of controlled-leak field tests in rural Colorado. We demonstrate the ability to locate and size a leak located 1 km away from the spectrometer and varying in strength from 1.5 to 7.7 g/min, resulting in mean atmospheric enhancements of 20 ppb. The inversion correctly identifies when the leak turned on and off over a 24-hour period, and determines the mean leak strength to within 10% of the true controlled rate. We further demonstrate the ability of the system to correctly locate and size the start and end of simultaneous 2.7 to 4.8 g/min leaks from 2 sources in a field of 5 potential leak locations. Finally, we present the results of leak-detection tests in active oil and gas fields in the Denver-Julesburg Basin, where background methane is complex.
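The joint inversion idea above, solving explicitly for background concentrations alongside source rates, can be sketched as a least-squares problem with an extra background column (the sensitivity matrix here is a synthetic stand-in for an atmospheric transport model, and a single scalar background is an assumption made for illustration):

```python
import numpy as np

def invert_with_background(A, y):
    """Least-squares inversion solving jointly for source rates and a
    scalar background concentration.

    A : (n_obs, n_sources) sensitivity matrix (ppb per g/min), an
        illustrative stand-in for an atmospheric transport model.
    y : (n_obs,) observed concentrations (ppb), background included.
    Returns (rates, background). A real system would use time-varying
    backgrounds and regularization; this is a minimal sketch.
    """
    G = np.column_stack([A, np.ones(len(y))])  # last column = background
    sol, *_ = np.linalg.lstsq(G, y, rcond=None)
    return sol[:-1], sol[-1]

# Synthetic test: 2 sources with known rates and a 40 ppb background.
rng = np.random.default_rng(0)
A = rng.uniform(1.0, 5.0, size=(20, 2))
true_rates = np.array([3.0, 1.5])
y = A @ true_rates + 40.0
rates, bg = invert_with_background(A, y)
```

With noise-free synthetic data the least-squares solution recovers the true rates and background essentially exactly; with real measurements the residuals would reflect transport-model and instrument error.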

  3. Amplitudes on plane waves from ambitwistor strings

    NASA Astrophysics Data System (ADS)

    Adamo, Tim; Casali, Eduardo; Mason, Lionel; Nekovar, Stefan

    2017-11-01

    In marked contrast to conventional string theory, ambitwistor strings remain solvable worldsheet theories when coupled to curved background fields. We use this fact to consider the quantization of ambitwistor strings on plane wave metric and plane wave gauge field backgrounds. In each case, the worldsheet model is anomaly free as a consequence of the background satisfying the field equations. We derive vertex operators (in both fixed and descended picture numbers) for gravitons and gluons on these backgrounds from the worldsheet CFT, and study the 3-point functions of these vertex operators on the Riemann sphere. These worldsheet correlation functions reproduce the known results for 3-point scattering amplitudes of gravitons and gluons in gravitational and gauge theoretic plane wave backgrounds, respectively.

  4. [Research on the temperature field detection method of hot forging based on long-wavelength infrared spectrum].

    PubMed

    Zhang, Yu-Cun; Wei, Bin; Fu, Xian-Bin

    2014-02-01

A temperature field detection method based on the long-wavelength infrared spectrum is proposed for hot forging. The method combines primary spectrum pyrometry with a three-stage FP-cavity liquid crystal tunable filter (LCTF). By optimizing the solutions of the three groups of nonlinear equations in the mathematical model of temperature detection, errors are reduced, so the measured results are more objective and accurate. The three-stage FP-cavity LCTF was designed on the principle of crystal birefringence, enabling rapid selection of any wavelength within a certain range and making the response of the temperature-measuring system fast and accurate. As a result, without knowing the emissivity of the hot forging, the method can acquire accurate temperature field information and effectively suppress the background light radiation around the forging and the ambient light that degrade temperature detection accuracy. MATLAB results showed that infrared spectroscopy through the three-stage FP-cavity LCTF meets the design requirements, and experiments verified the feasibility of the temperature measuring method. Compared with a traditional single-band thermal infrared imager, measurement accuracy is improved.

  5. Mitigation strategies against radiation-induced background for space astronomy missions

    NASA Astrophysics Data System (ADS)

    Davis, C. S. W.; Hall, D.; Keelan, J.; O'Farrell, J.; Leese, M.; Holland, A.

    2018-01-01

The Advanced Telescope for High ENergy Astrophysics (ATHENA) mission is a major upcoming space-based X-ray observatory due to be launched in 2028 by ESA, with the purpose of mapping the early universe and observing black holes. Background radiation is expected to constitute a large fraction of the total system noise in the Wide Field Imager (WFI) instrument on ATHENA, and designing an effective system to reduce the background radiation impacting the WFI will be crucial for maximising its sensitivity. Significant background sources are expected to include high energy protons, X-ray fluorescence lines, 'knock-on' electrons and Compton electrons. Due to the variety of the different background sources, multiple shielding methods may be required to achieve maximum sensitivity in the WFI. These techniques may also be of great interest for use in future space-based X-ray experiments. Simulations have been developed to model the effect of a graded-Z shield on the X-ray fluorescence background. In addition, the effect of a 90 nm optical blocking filter on the secondary electron background has been investigated and shown to modify the requirements of any secondary electron shielding that is to be used.

  6. Subjective Quality of Life of People with Intellectual Disabilities: The Role of Emotional Competence on Their Subjective Well-Being

    ERIC Educational Resources Information Center

    Rey, Lourdes; Extremera, Natalio; Duran, Auxiliadora; Ortiz-Tallo, Margarita

    2013-01-01

Background: For decades, the field of quality of life for people with intellectual disabilities has focused on improving external life conditions. However, scarce research has examined the contribution of person-related psychological resources such as emotional competence (EC) to well-being in this population. Materials and Methods: Using…

  7. Social Studies Teacher Education in the Early Twentieth Century: A Historical Inquiry into the Relationship between Teacher Preparation and Curriculum Reform

    ERIC Educational Resources Information Center

    Jacobs, Benjamin M.

    2013-01-01

    Background/Context: The field of social studies education is hardly lacking in historical investigation. The historiography includes sweeping chronicles of longtime struggles over the curriculum as well as case studies of momentous eras, events, policies, trends, and people, with emphases on aims, subject matter, method, and much more. Curiously,…

  8. Hope as a Psychological Resilience Factor in Mothers and Fathers of Children with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Lloyd, T. J.; Hastings, R.

    2009-01-01

    Background: Positive psychology is an area gaining credence within the field of intellectual disability (ID). Hope is one facet of positive psychology that is relatively unstudied in parents of children with ID. In the present study, we explore hope and its relationships with parental well-being in parents of school-aged children with ID. Method:…

  9. Meta-Analysis of the Effectiveness of Individual Intervention in the Controlled Multisensory Environment (Snoezelen[R]) for Individuals with Intellectual Disability

    ERIC Educational Resources Information Center

    Lotan, Meir; Gold, Christian

    2009-01-01

    Background: The Snoezelen[R] is a multisensory intervention approach that has been implemented with various populations. Due to an almost complete absence of rigorous research in this field, the confirmation of this approach as an effective therapeutic intervention is warranted. Method: To evaluate the therapeutic influence of the…

  10. Drifting Continents and Magnetic Fields. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) be used by teachers with little or no previous background in the…

  11. Perceived Sleepiness, Sleep Habits and Sleep Concerns of Public School Teachers, Administrators and Other Personnel

    ERIC Educational Resources Information Center

    Amschler, Denise H.; McKenzie, James F.

    2010-01-01

    Background: Sleep deprivation is a world-wide health concern. Few studies have examined the sleep behaviors of those employed in the education field. Purpose: To describe the sleep habits and concerns of school personnel in a Midwest school corporation. Methods: A cross-sectional survey design was used to collect data about demographics, the…

  12. A field-based characterization of conductivity in areas of minimal alteration: A case example in the Cascades of northwestern United States.

    PubMed

    Cormier, Susan M; Zheng, Lei; Hayslip, Gretchen; Flaherty, Colleen M

    2018-08-15

The concentration of salts in streams is increasing worldwide, making freshwater a declining resource. Developing thresholds for freshwater with low specific conductivity (SC), a measure of dissolved ions in water, may protect high-quality resources that are refugia for aquatic life and that dilute downstream waters. In this case example, methods are illustrated for estimating protective levels for streams with low SC. The Cascades in the Pacific Northwest of the United States of America was selected for the case study because a geophysical model indicated that the SC of freshwater streams was likely to be very low. Also, there was an insufficient range in the SC data to accurately derive a criterion using the 2011 US Environmental Protection Agency field-based extirpation concentration distribution method. Instead, background SC and a regression model were used to estimate chronic and acute SC levels that could extirpate 5% of benthic invertebrate genera. Background SC was estimated at the 25th centile (33 μS/cm) of the measured data and used as the independent variable in a least-squares empirical background-to-criteria (B-C) model. Because no comparison could be made with effect levels estimated from a paired SC and biological data set from the Cascades, the lower 50% prediction limit (PL) was identified as an example chronic water quality criterion (97 μS/cm). The maximum exposure threshold was estimated at the 90th centile SC of streams meeting the chronic SC level; the example acute SC level was 190 μS/cm. Because paired aquatic life and SC data are often sparse, the B-C method is useful for developing SC criteria for other systems with limited data. Published by Elsevier B.V.
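The B-C workflow above (background at the 25th centile, then a least-squares background-to-criteria line) can be sketched as follows; the paired background/criterion training data and the omission of the prediction-limit step are illustrative assumptions:

```python
import numpy as np

def background_sc(measurements):
    """Background specific conductivity as the 25th centile of field data,
    following the approach described in the abstract."""
    return np.percentile(measurements, 25)

def bc_model_predict(bg_known, crit_known, bg_new):
    """Fit a least-squares background-to-criteria (B-C) line from regions
    where both background SC and a derived criterion are known, then
    predict a criterion for a region with only background data. The
    paired training data used below are illustrative, not from the paper.
    """
    slope, intercept = np.polyfit(bg_known, crit_known, 1)
    return slope * bg_new + intercept

# Synthetic example: criteria roughly 3x background in other regions.
bg_known = np.array([20.0, 40.0, 80.0, 150.0])
crit_known = 3.0 * bg_known
sc_data = np.array([25.0, 30.0, 33.0, 33.0, 60.0, 90.0, 120.0])
bg = background_sc(sc_data)
print(bg, bc_model_predict(bg_known, crit_known, bg))
```

A full implementation would also compute the lower 50% prediction limit of the fitted line to obtain the protective (chronic) criterion.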

  13. Three site Higgsless model at one loop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chivukula, R. Sekhar; Simmons, Elizabeth H.; Matsuzaki, Shinya

    2007-04-01

In this paper we compute the one-loop chiral-logarithmic corrections to all O(p^4) counterterms in the three site Higgsless model. The calculation is performed using the background field method for both the chiral and gauge fields, and using Landau gauge for the quantum fluctuations of the gauge fields. The results agree with our previous calculations of the chiral-logarithmic corrections to the S and T parameters in 't Hooft-Feynman gauge. The work reported here includes a complete evaluation of all one-loop divergences in an SU(2)×U(1) nonlinear sigma model, corresponding to an electroweak effective Lagrangian in the absence of custodial symmetry.

  14. Estimation of the chiral magnetic effect considering the magnetic field response of the QGP medium

    NASA Astrophysics Data System (ADS)

    Feng, Sheng-Qin; Ai, Xin; Pei, Lei; Sun, Fei; Zhong, Yang; Yin, Zhong-Bao

    2018-05-01

    The magnetic field plays a major role in searching for the chiral magnetic effect in relativistic heavy-ion collisions. If the lifetime of the magnetic field is too short, as predicted by simulations of the field in vacuum, the chiral magnetic effect will be largely suppressed. However, the lifetime of the magnetic field will become longer when the QGP medium response is considered. We give an estimate of the effect, especially considering the magnetic field response of the QGP medium, and compare it with the experimental results for the background-subtracted correlator H at RHIC and LHC energies. The results show that our method explains the experimental results better at the top RHIC energy than at the LHC energy. Supported by National Natural Science Foundation of China (11747115, 11475068), the CCNU-QLPL Innovation Fund (QLPL2016P01) and the Excellent Youth Foundation of Hubei Scientific Committee (2006ABB036)

  15. Chiral magnetic effect in the presence of electroweak interactions as a quasiclassical phenomenon

    NASA Astrophysics Data System (ADS)

    Dvornikov, Maxim; Semikoz, Victor B.

    2018-03-01

    We elaborate the quasiclassical approach to obtain the modified chiral magnetic effect (CME) in the case when the massless charged fermions interact with electromagnetic fields and the background matter by the electroweak forces. The derivation of the anomalous current along the external magnetic field involves the study of the energy density evolution of chiral particles in parallel electric and magnetic fields. We consider both the particle acceleration by the external electric field and the contribution of the Adler anomaly. The condition of the validity of this method for the derivation of the CME is formulated. We obtain the expression for the electric current along the external magnetic field, which appears to coincide with our previous results based on the purely quantum approach. Our results are compared with the findings of other authors.

  16. Consistency restrictions on maximal electric-field strength in quantum field theory.

    PubMed

    Gavrilov, S P; Gitman, D M

    2008-09-26

Quantum field theory with an external background can be considered a consistent model only if the backreaction is relatively small with respect to the background. To find the corresponding consistency restrictions on an external electric field and its duration in QED and QCD, we analyze the mean energy density of quantized fields for an arbitrary constant electric field E acting during a large but finite time T. Using the corresponding asymptotics with respect to the dimensionless parameter eET^2, one can see that the leading contributions to the energy are due to the creation of particles by the electric field. Assuming that these contributions are small in comparison with the energy density of the electric background, we establish the above-mentioned restrictions, which determine, in fact, the time scales from above of depletion of an electric field due to the backreaction.

  17. NEW OBSERVATION OF FAILED FILAMENT ERUPTIONS: THE INFLUENCE OF ASYMMETRIC CORONAL BACKGROUND FIELDS ON SOLAR ERUPTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Y.; Xu, Z.; Su, J.

    2009-05-01

Failed filament eruptions not associated with a coronal mass ejection (CME) have been observed and reported as evidence for solar coronal field confinement on erupting flux ropes. In those events, each filament eventually returns to its origin on the solar surface. In this Letter, a new observation of two failed filament eruptions is reported which indicates that the mass of a confined filament can be ejected to places far from the original filament channel. The jetlike mass motions in the two failed filament eruptions are thought to be due to the asymmetry of the background coronal magnetic fields with respect to the locations of the filament channels. The asymmetry of the coronal fields is confirmed by an extrapolation based on a potential field model. The obvious imbalance between the positive and negative magnetic flux (with a ratio of 1:3) in the bipolar active region is thought to be the direct cause of the formation of the asymmetric coronal fields. We think that the asymmetry of the background fields can not only influence the trajectories of ejecta, but also provide a relatively stronger confinement for flux rope eruptions than the symmetric background fields do.

  18. Intact skull chronic windows for mesoscopic wide-field imaging in awake mice

    PubMed Central

    Silasi, Gergely; Xiao, Dongsheng; Vanni, Matthieu P.; Chen, Andrew C. N.; Murphy, Timothy H.

    2016-01-01

Background Craniotomy-based window implants are commonly used for microscopic imaging in head-fixed rodents; however, their field of view is typically small and incompatible with mesoscopic functional mapping of cortex. New Method We describe a reproducible and simple procedure for chronic through-bone wide-field imaging in awake, head-fixed mice, providing stable optical access for chronic imaging over large areas of the cortex for months. Results The preparation is produced by applying clear-drying dental cement to the intact mouse skull, followed by a glass coverslip to create a partially transparent imaging surface. Surgery takes about 30 minutes. A single set-screw provides a stable means of attachment for mesoscale assessment without obscuring the cortical field of view. Comparison with Existing Methods We demonstrate the utility of this method by showing seed-pixel functional connectivity maps generated from spontaneous cortical activity of GCaMP6 signals in both awake and anesthetized mice. Conclusions We propose that the intact skull preparation described here may be used for most longitudinal studies that do not require micron-scale resolution and where cortical neural or vascular signals are recorded with intrinsic sensors. PMID:27102043

  19. Auditory evoked field measurement using magneto-impedance sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, K., E-mail: o-kabou@echo.nuee.nagoya-u.ac.jp; Tajima, S.; Song, D.

The magnetic field of the human brain is extremely weak, and it is mostly measured and monitored in the magnetoencephalography method using superconducting quantum interference devices. In this study, in order to measure the weak magnetic field of the brain, we constructed a Magneto-Impedance sensor (MI sensor) system that can cancel out the background noise without any magnetic shield. Based on our previous studies of brain wave measurements, we used two MI sensors in this system for monitoring both cerebral hemispheres. In this study, we recorded and compared the auditory evoked field signals of the subject, including the N100 (or N1) and the P300 (or P3) brain waves. The results suggest that the MI sensor can be applied to brain activity measurement.
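The two-sensor background cancellation can be illustrated with a simple common-mode subtraction (the assumptions that the background is identical in both channels and that the evoked field couples only into one sensor are ours, made for illustration):

```python
import numpy as np

def cancel_background(s1, s2):
    """First-order gradiometer-style cancellation: a uniform background
    field appears identically in both MI sensor channels, so subtracting
    the channels keeps only the local (evoked) signal seen by sensor 1.
    """
    return s1 - s2

t = np.linspace(0.0, 1.0, 1000)
background = 5.0 * np.sin(2 * np.pi * 50 * t)      # e.g. 50 Hz mains noise
evoked = 0.2 * np.exp(-((t - 0.1) / 0.02) ** 2)    # N100-like transient
s1 = background + evoked                           # sensor near the source
s2 = background.copy()                             # reference sensor
recovered = cancel_background(s1, s2)
```

In the real system both sensors record brain activity, so the processing is more involved; the sketch only shows why common-mode background drops out of the difference.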

  20. Implementation of a flow-dependent background error correlation length scale formulation in the NEMOVAR OSTIA system

    NASA Astrophysics Data System (ADS)

    Fiedler, Emma; Mao, Chongyuan; Good, Simon; Waters, Jennifer; Martin, Matthew

    2017-04-01

    OSTIA is the Met Office's Operational Sea Surface Temperature (SST) and Ice Analysis system, which produces L4 (globally complete, gridded) analyses on a daily basis. Work is currently being undertaken to replace the original OI (Optimal Interpolation) data assimilation scheme with NEMOVAR, a 3D-Var data assimilation method developed for use with the NEMO ocean model. A dual background error correlation length scale formulation is used for SST in OSTIA, as implemented in NEMOVAR. Short and long length scales are combined according to the ratio of the decomposition of the background error variances into short and long spatial correlations. The pre-defined background error variances vary spatially and seasonally, but not on shorter time-scales. If the derived length scales applied to the daily analysis are too long, SST features may be smoothed out. Therefore a flow-dependent component to determining the effective length scale has also been developed. The total horizontal gradient of the background SST field is used to identify regions where the length scale should be shortened. These methods together have led to an improvement in the resolution of SST features compared to the previous OI analysis system, without the introduction of spurious noise. This presentation will show validation results for feature resolution in OSTIA using the OI scheme, the dual length scale NEMOVAR scheme, and the flow-dependent implementation.
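The dual length-scale combination and the gradient-based shortening described above can be sketched as below; the exact blending weight and the halving factor at fronts are illustrative assumptions, not the NEMOVAR formulation:

```python
import numpy as np

def effective_length_scale(L_short, L_long, var_short, var_long,
                           sst_gradient, grad_threshold):
    """Blend short and long background-error correlation length scales by
    the ratio of the error-variance decomposition, then shorten the
    result where the background SST gradient marks a front. The linear
    blend and the simple halving at fronts are illustrative choices.
    """
    w = var_short / (var_short + var_long)   # weight toward short scale
    L = w * L_short + (1.0 - w) * L_long
    return np.where(sst_gradient > grad_threshold, 0.5 * L, L)

# Two grid points: quiescent ocean vs. a frontal region (units: km).
grad = np.array([0.01, 0.5])                 # SST gradient, second is a front
L = effective_length_scale(40.0, 400.0, 1.0, 3.0, grad, grad_threshold=0.1)
print(L)   # [310. 155.]
```

The flow dependence enters only through the background SST gradient, so no extra assimilation machinery is needed to shorten scales across fronts.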

  1. Analytical and numerical solutions of the potential and electric field generated by different electrode arrays in a tumor tissue under electrotherapy

    PubMed Central

    2011-01-01

    Background Electrotherapy is a relatively well established and efficient method of tumor treatment. In this paper we focus on analytical and numerical calculations of the potential and electric field distributions inside a tumor tissue in a two-dimensional model (2D-model) generated by means of electrode arrays with shapes of different conic sections (ellipse, parabola and hyperbola). Methods Analytical calculations of the potential and electric field distributions based on 2D-models for different electrode arrays are performed by solving the Laplace equation, meanwhile the numerical solution is solved by means of finite element method in two dimensions. Results Both analytical and numerical solutions reveal significant differences between the electric field distributions generated by electrode arrays with shapes of circle and different conic sections (elliptic, parabolic and hyperbolic). Electrode arrays with circular, elliptical and hyperbolic shapes have the advantage of concentrating the electric field lines in the tumor. Conclusion The mathematical approach presented in this study provides a useful tool for the design of electrode arrays with different shapes of conic sections by means of the use of the unifying principle. At the same time, we verify the good correspondence between the analytical and numerical solutions for the potential and electric field distributions generated by the electrode array with different conic sections. PMID:21943385
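The numerical side of such a calculation can be sketched with a finite-difference relaxation of the Laplace equation (the paper uses finite elements; the grid, electrode layout, and Jacobi iteration here are simplifying assumptions):

```python
import numpy as np

def solve_laplace(V, fixed, iters=5000):
    """Jacobi relaxation for the 2D Laplace equation.

    V     : initial potential grid; entries where `fixed` is True are
            Dirichlet boundary values (the electrodes).
    fixed : boolean mask of fixed-potential nodes.
    """
    V = V.copy()
    for _ in range(iters):
        avg = 0.25 * (np.roll(V, 1, 0) + np.roll(V, -1, 0) +
                      np.roll(V, 1, 1) + np.roll(V, -1, 1))
        V = np.where(fixed, V, avg)   # relax free nodes, keep electrodes
    return V

# Toy "electrode array": left edge at 10 V, right edge at 0 V.
n = 21
V = np.zeros((n, n))
fixed = np.zeros((n, n), dtype=bool)
V[:, 0], fixed[:, 0] = 10.0, True
V[:, -1], fixed[:, -1] = 0.0, True
V = solve_laplace(V, fixed)
Ey, Ex = np.gradient(-V)              # electric field components
```

Conic-section electrode shapes would simply change which nodes of `fixed` are set; the relaxation itself is unchanged, which is why analytical and numerical solutions can be compared on the same geometry.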

  2. Atmospheric effect on classification of finite fields. [satellite-imaged agricultural areas

    NASA Technical Reports Server (NTRS)

    Kaufman, Y. J.; Fraser, R. S.

    1984-01-01

The atmospheric effect on the upward radiance of sunlight scattered from the earth-atmosphere system is strongly influenced by the contrasts between fields and their sizes. In this paper, the radiances above finite fields are computed to simulate radiances measured by a satellite. A simulation case including 11 agricultural fields and four natural fields (water, soil, savannah, and forest) is used to test the effect of field size, background reflectance, and optical thickness of the atmosphere on the classification accuracy. For a given atmospheric turbidity, the atmospheric effect on classification of surface features may be much stronger for nonuniform surfaces than for uniform surfaces. Therefore, the classification accuracy of agricultural fields and urban areas depends not only on the optical characteristics of the atmosphere, but also on the size of the surface elements to be classified and their contrasts. It is concluded that new atmospheric correction methods, which take into account the finite size of the fields, are needed.

  3. Can We Predict CME Deflections Based on Solar Magnetic Field Configuration Alone?

    NASA Astrophysics Data System (ADS)

    Kay, C.; Opher, M.; Evans, R. M.

    2013-12-01

    Accurate space weather forecasting requires knowledge of the trajectory of coronal mass ejections (CMEs), including predicting CME deflections close to the Sun and through interplanetary space. Deflections of CMEs occur due to variations in the background magnetic field or solar wind speed, magnetic reconnection, and interactions with other CMEs. Using our newly developed model of CME deflections due to gradients in the background solar magnetic field, ForeCAT (Kay et al. 2013), we explore the questions: (a) do all simulated CMEs ultimately deflect to the minimum in the background solar magnetic field? (b) does the majority of the deflection occur in the lower corona below 4 Rs? ForeCAT does not include temporal variations in the magnetic field of active regions (ARs), spatial variations in the background solar wind speed, magnetic reconnection, or interactions with other CMEs. Therefore we focus on the effects of the steady state solar magnetic field. We explore two different Carrington Rotations (CRs): CR 2029 (April-May 2005) and CR 2077 (November-December 2008). Little is known about how the density and magnetic field fall with distance in the lower corona. We consider four density models derived from observations (Chen 1996, Mann et al. 2003, Guhathakurta et al. 2006, Leblanc et al. 1996) and two magnetic field models (PFSS and a scaled model). ForeCAT includes drag resulting from both CME propagation and deflection through the background solar wind. We vary the drag coefficient to explore the effect of drag on the deflection at 1 AU.

  4. Use of capillary gas chromatography with negative ion-chemical ionization mass spectrometry for the determination of perfluorocarbon tracers in the atmosphere.

    PubMed

Cooke, K M; Simmonds, P G; Nickless, G; Makepeace, A P

    2001-09-01

A sensitive and selective technique for the quantitative measurement of atmospheric perfluorocarbon trace species at sub-part-per-quadrillion (10^-15) levels is presented. The method utilizes advances in adsorbent enrichment techniques coupled with benchtop capillary gas chromatography and negative ion-chemical ionization mass spectrometry. The development and enhancement of sampling technology for tracer experiments is described, and the results from background measurements and a preliminary field experiment are presented. The overall precision of the analytical method with respect to the preferred tracer for these atmospheric transport studies, perfluoromethylcyclohexane, was +/-1.7%. The background concentrations of perfluorodimethylcyclobutane, perfluoromethylcyclopentane, and perfluoromethylcyclohexane at a remote coastal location (Mace Head, Ireland, 53 degrees N, 10 degrees W) were found to be 2.5 (+/-0.4), 6.8 (+/-1.0), and 5.2 (+/-1.3) fL L^-1, respectively. Background concentrations within an urban conurbation (Bristol, U.K.) were slightly greater at 3.0 (+/-1.5), 8.1 (+/-1.8), and 6.3 (+/-1.1) fL L^-1, respectively.

  5. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    NASA Astrophysics Data System (ADS)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
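The block-wise thresholding and interpolation steps can be sketched as follows (the Canny/tensor-voting edge refinement and the histogram-based classification-error minimization are omitted; each block's threshold is reduced to a min/max midpoint for illustration):

```python
import numpy as np

def blockwise_threshold(image, block=8):
    """Compute a per-block threshold and interpolate it to every pixel,
    a simplified version of the paper's local-thresholding step.
    """
    h, w = image.shape
    by, bx = h // block, w // block
    centers_y = (np.arange(by) + 0.5) * block - 0.5
    centers_x = (np.arange(bx) + 0.5) * block - 0.5
    t = np.empty((by, bx))
    for i in range(by):
        for j in range(bx):
            blk = image[i * block:(i + 1) * block, j * block:(j + 1) * block]
            t[i, j] = 0.5 * (blk.min() + blk.max())   # stand-in threshold rule
    # Interpolate the coarse threshold grid to a per-pixel surface,
    # separably along x then y (values are clamped at the borders).
    tx = np.vstack([np.interp(np.arange(w), centers_x, t[i])
                    for i in range(by)])
    thr = np.column_stack([np.interp(np.arange(h), centers_y, tx[:, j])
                           for j in range(w)])
    return image > thr, thr

# Synthetic example: one bright "nucleus" on a dark background.
img = np.full((16, 16), 10.0)
img[4:12, 4:12] = 100.0
mask, thr = blockwise_threshold(img, block=8)
print(mask.sum())   # 64 foreground pixels
```

In the paper, each block's threshold instead minimizes the foreground/background classification error of saliency-weighted histograms, which is much more robust than the midpoint rule used here.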

  6. A detection method for X-ray images based on wavelet transforms: the case of the ROSAT PSPC.

    NASA Astrophysics Data System (ADS)

    Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.

    1996-02-01

The authors have developed a method based on wavelet transforms (WT) to efficiently detect sources in PSPC X-ray images. The multiscale approach typical of WT can be used to detect sources with a large range of sizes, and to estimate their size and count rate. Significance thresholds for candidate detections (found as local WT maxima) have been derived from a detailed study of the probability distribution of the WT of a locally uniform background. The use of the exposure map allows good detection efficiency to be retained even near PSPC ribs and edges. The algorithm may also be used to obtain upper limits on the count rate of undetected objects. Simulations of realistic PSPC images containing either pure background or background plus sources were used to test the overall algorithm performance, and to assess the frequency of spurious detections (vs. detection threshold) and the algorithm's sensitivity. Actual PSPC images of galaxies and star clusters show the algorithm to perform well even in cases of extended sources and crowded fields.
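The core detection step, local maxima of the wavelet transform above a significance threshold, can be sketched at a single scale (the MAD-based threshold and the fixed scale are simplifying assumptions; the paper derives thresholds from the WT probability distribution of a uniform background and loops over scales):

```python
import numpy as np
from scipy.signal import fftconvolve
from scipy.ndimage import maximum_filter

def mexican_hat(scale, half):
    """2D Marr ("Mexican hat") wavelet kernel, zero-mean by construction."""
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    u = (x ** 2 + y ** 2) / (2.0 * scale ** 2)
    return (1.0 - u) * np.exp(-u)

def detect_sources(image, scale=2.0, nsigma=5.0):
    """Correlate the image with the wavelet, then keep local maxima that
    exceed nsigma times a robust (MAD-based) estimate of the background
    WT fluctuation. Returns (row, col) candidate positions.
    """
    wt = fftconvolve(image, mexican_hat(scale, int(6 * scale)), mode="same")
    sigma = np.median(np.abs(wt - np.median(wt))) / 0.6745
    peaks = (wt == maximum_filter(wt, size=5)) & (wt > nsigma * sigma)
    return np.argwhere(peaks)

# Flat Poisson background with one compact source injected.
rng = np.random.default_rng(1)
img = rng.poisson(2.0, size=(64, 64)).astype(float)
img[30:33, 40:43] += 50.0
print(detect_sources(img))
```

Because the Mexican-hat kernel has zero mean, a locally uniform background maps to WT values fluctuating around zero, which is what makes a simple amplitude threshold on the WT map meaningful.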

  7. Assessing occupational exposure to sea lamprey pesticides

    PubMed Central

    Ceballos, Diana M; Beaucham, Catherine C; Kurtz, Kristine; Musolin, Kristin

    2015-01-01

    Background: Sea lampreys are parasitic fish found in lakes of the United States and Canada. Sea lamprey populations are controlled through manual application of the pesticides 3-trifluoromethyl-4-nitrophenol (TFM) and Bayluscide™ into streams and tributaries. 3-Trifluoromethyl-4-nitrophenol may cause irritation and central nervous system depression, and Bayluscide may cause irritation, dermatitis, blisters, cracking, edema, and allergic skin reactions. Objectives: To assess occupational exposures to sea lamprey pesticides. Methods: We developed a wipe method for evaluating surface and skin contamination with these pesticides. This method was field tested at a biological field station and at a pesticide river application. We also evaluated exposures using control banding tools. Results: We verified TFM surface contamination at the biological station. At the river application, we found surfaces and workers' skin contaminated with pesticides. Conclusion: We recommended minimizing exposures by implementing engineering controls and improved use of personal protective equipment. PMID:25730600

  8. Clinical Case Studies in Psychoanalytic and Psychodynamic Treatment

    PubMed Central

    Willemsen, Jochem; Della Rosa, Elena; Kegerreis, Sue

    2017-01-01

    This manuscript provides a review of the clinical case study within the field of psychoanalytic and psychodynamic treatment. The method has been contested for methodological reasons and because it would contribute to theoretical pluralism in the field. We summarize how the case study method is being applied in different schools of psychoanalysis, and we clarify the unique strengths of this method and areas for improvement. Finally, based on the literature and on our own experience with case study research, we come to formulate nine guidelines for future case study authors: (1) basic information to include, (2) clarification of the motivation to select a particular patient, (3) information about informed consent and disguise, (4) patient background and context of referral or self-referral, (5) patient's narrative, therapist's observations and interpretations, (6) interpretative heuristics, (7) reflexivity and counter-transference, (8) leaving room for interpretation, and (9) answering the research question, and comparison with other cases. PMID:28210235

  9. Joint 3D Inversion of ZTEM Airborne and Ground MT Data with Application to Geothermal Exploration

    NASA Astrophysics Data System (ADS)

    Wannamaker, P. E.; Maris, V.; Kordy, M. A.

    2017-12-01

    ZTEM is an airborne electromagnetic (EM) geophysical technique developed by Geotech Inc. in which naturally propagating EM fields, originating from regional and global lightning discharges (sferics), are measured as a means of inferring subsurface electrical resistivity structure. A helicopter-borne coil platform (bird) measuring the vertical component of magnetic (H) field variations along a flown profile is referenced to a pair of horizontal coils at a fixed location on the ground in order to estimate a tensor H-field transfer function. The ZTEM method is distinct from the traditional magnetotelluric (MT) method in that electric (E) fields are not considered, because of the technological challenge of measuring E-fields in the dielectric air medium. This can lend some non-uniqueness to ZTEM interpretation, because a range of conductivity structures in the earth, depending upon an assumed background earth resistivity model, can fit ZTEM data to within tolerance. MT data do not suffer this particular problem, but they are cumbersome to acquire: they commonly require land-based transport, often in near-roadless areas, and the laying out and burying of electrodes and H coils. The complementary nature of ZTEM and MT logistics and resolution has motivated development of schemes to acquire appropriate amounts of each data type in a single survey and to produce an earth image through joint inversion. In particular, consideration is given to surveys where only sparse MT soundings are needed to drastically reduce the non-uniqueness associated with background uncertainty while straining logistics minimally. Synthetic and field data are analysed using 2D and 3D finite element platforms developed for this purpose. Results to date suggest that dense ZTEM surveys can indeed provide detailed heterogeneous model images with large-scale averages constrained by a modest number of MT soundings. Further research is needed to determine the allowable degree of MT sparseness and the relative weighting of the two data sets in joint inversion.

  10. Cosine problem in EPRL/FK spinfoam model

    NASA Astrophysics Data System (ADS)

    Vojinović, Marko

    2014-01-01

    We calculate the classical limit effective action of the EPRL/FK spinfoam model of quantum gravity coupled to matter fields. By employing the standard QFT background field method adapted to the spinfoam setting, we find that the model has many different classical effective actions. Most notably, these include the ordinary Einstein-Hilbert action coupled to matter, but also an action which describes antigravity. All those multiple classical limits appear as a consequence of the fact that the EPRL/FK vertex amplitude has cosine-like large spin asymptotics. We discuss some possible ways to eliminate the unwanted classical limits.
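
    Schematically (in our own notation, following the standard large-spin asymptotics literature), the cosine behavior responsible for the multiple classical limits can be written as

```latex
A_v \;\sim\; N_j \cos\!\big(S_{\mathrm{Regge}}\big)
    \;=\; \frac{N_j}{2}\left(e^{\,i S_{\mathrm{Regge}}} + e^{\,-i S_{\mathrm{Regge}}}\right)
    \qquad (j \to \infty),
```

    so each vertex amplitude contributes two stationary phases of opposite sign, and products over vertices generate the mixture of classical effective actions (Einstein-Hilbert and its "antigravity" partner) described in the abstract.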

  11. Ground state structure of random magnets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bastea, S.; Duxbury, P.M.

    1998-10-01

    Using exact optimization methods, we find all of the ground states of (±h) random-field Ising magnets (RFIM) and of dilute antiferromagnets in a field (DAFF). The degenerate ground states are usually composed of isolated clusters (two-level systems) embedded in a frozen background. We calculate the paramagnetic response (sublattice response) and the ground state entropy for the RFIM (DAFF) due to these clusters. In both two and three dimensions there is a broad regime in which these quantities are strictly positive, even at irrational values of h/J (J is the exchange constant). © 1998 The American Physical Society.

  12. [Psychiatric Rehabilitation - From the Linear Continuum Approach Towards Supported Inclusion].

    PubMed

    Richter, Dirk; Hertig, Res; Hoffmann, Holger

    2016-11-01

    Background: For many decades, psychiatric rehabilitation in the German-speaking countries has followed a conventional linear continuum approach. Methods: Recent developments in important fields related to psychiatric rehabilitation (the UN Convention on the Rights of Persons with Disabilities, rehabilitation theory, empirical research) are reviewed. Results: Common to all developments in the reviewed fields are the principles of choice, autonomy, and social inclusion. These principles contradict the conventional linear continuum approach. Conclusions: The linear continuum approach of psychiatric rehabilitation should be replaced by the "supported inclusion" approach. © Georg Thieme Verlag KG Stuttgart · New York.

  13. 4 CFR 202.1 - Description.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... background suggesting the American Flag; upon a blue field, which fills background space above the Eagle's..., monochromatic version of the seal in which the above-described blue field and red-and-gold stripes are replaced...

  14. 4 CFR 202.1 - Description.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... background suggesting the American Flag; upon a blue field, which fills background space above the Eagle's..., monochromatic version of the seal in which the above-described blue field and red-and-gold stripes are replaced...

  15. 4 CFR 202.1 - Description.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... background suggesting the American Flag; upon a blue field, which fills background space above the Eagle's..., monochromatic version of the seal in which the above-described blue field and red-and-gold stripes are replaced...

  16. Covariant effective action for a Galilean invariant quantum Hall system

    NASA Astrophysics Data System (ADS)

    Geracie, Michael; Prabhu, Kartik; Roberts, Matthew M.

    2016-09-01

    We construct effective field theories for gapped quantum Hall systems coupled to background geometries with local Galilean invariance, i.e., Bargmann spacetimes. Along with an electromagnetic field, these backgrounds include the effects of curved Galilean spacetimes, including torsion and a gravitational field, allowing us to study charge, energy, stress and mass currents within a unified framework. A shift symmetry specific to single-constituent theories constrains the effective action to couple to an effective background gauge field and spin connection that is solved for by a self-consistent equation, providing a manifestly covariant extension of Hoyos and Son's improvement terms to arbitrary order in m.

  17. Method and apparatus for predicting the direction of movement in machine vision

    NASA Technical Reports Server (NTRS)

    Lawton, Teri B. (Inventor)

    1992-01-01

    A computer-simulated cortical network is presented. The network is capable of computing the visibility of shifts in the direction of movement. Additionally, the network can compute the following: (1) the magnitude of the position difference between the test and background patterns; (2) localized contrast differences at different spatial scales analyzed by computing temporal gradients of the difference and sum of the outputs of paired even- and odd-symmetric bandpass filters convolved with the input pattern; and (3) the direction of a test pattern moved relative to a textured background. The direction of movement of an object in the field of view of a robotic vision system is detected in accordance with nonlinear Gabor function algorithms. The movement of objects relative to their background is used to infer the 3-dimensional structure and motion of object surfaces.

  18. Probabilistic segmentation and intensity estimation for microarray images.

    PubMed

    Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro

    2006-01-01

    We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.
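
    The background-plus-intensity idea rests on mixture modeling. As a minimal sketch (our own simplification, far simpler than the paper's Markov random field model), an EM fit of a two-component Gaussian mixture to pixel intensities already yields per-pixel posterior probabilities of being foreground:

```python
# Sketch: EM for a two-component 1-D Gaussian mixture over pixel intensities.
# Component 0 plays the role of background, component 1 of foreground.
import math

def em_two_gaussians(xs, iters=50):
    mu = [min(xs), max(xs)]          # crude initialization
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        resp = []
        for x in xs:
            ps = [pi[k] / math.sqrt(2 * math.pi * var[k])
                  * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = ps[0] + ps[1]
            resp.append([p / s for p in ps])
        # M-step: re-estimate weights, means, and (floored) variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu, var, pi
```

    Estimating both components jointly is what avoids the logical error mentioned above: foreground can never be estimated below the background it sits on.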

  19. Dynamics of Magnetized Plasma Jets and Bubbles Launched into a Background Magnetized Plasma

    NASA Astrophysics Data System (ADS)

    Wallace, B.; Zhang, Y.; Fisher, D. M.; Gilmore, M.

    2016-10-01

    The propagation of dense magnetized plasma, either collimated with mainly azimuthal B-field (jet) or toroidal with closed B-field (bubble), in a background plasma occurs in a number of solar and astrophysical cases. Such cases include coronal mass ejections moving in the background solar wind and extragalactic radio lobes expanding into the extragalactic medium. Understanding the detailed MHD behavior is crucial for correctly modeling these events. In order to further the understanding of such systems, we are investigating the injection of dense magnetized jets and bubbles into a lower density background magnetized plasma using a coaxial plasma gun and a background helicon or cathode plasma. In both jet and bubble cases, the MHD dynamics are found to be very different when launched into background plasma or magnetic field, as compared to vacuum. In the jet case, it is found that the inherent kink instability is stabilized by velocity shear developed due to added magnetic tension from the background field. In the bubble case, rather than directly relaxing to a minimum energy Taylor state (spheromak) as in vacuum, there is an expansion asymmetry and the bubble becomes Rayleigh-Taylor unstable on one side. Recent results will be presented. Work supported by the Army Research Office Award No. W911NF1510480.

  20. Magnetar giant flares in multipolar magnetic fields. I. Fully and partially open eruptions of flux ropes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Lei; Yu, Cong, E-mail: muduri@shao.ac.cn, E-mail: cyu@ynao.ac.cn

    2014-04-01

    We propose a catastrophic eruption model for the enormous energy release of magnetars during giant flares, in which a toroidal and helically twisted flux rope is embedded within a force-free magnetosphere. The flux rope stays in stable equilibrium states initially and evolves quasi-statically. Upon the loss of equilibrium, the flux rope cannot sustain the stable equilibrium states and erupts catastrophically. During the process, the magnetic energy stored in the magnetosphere is rapidly released as the result of destabilization of global magnetic topology. The magnetospheric energy that could be accumulated is of vital importance for the outbursts of magnetars. We carefully establish the fully open fields and partially open fields for various boundary conditions at the magnetar surface and study the relevant energy thresholds. By investigating the magnetic energy accumulated at the critical catastrophic point, we find that it is possible to drive fully open eruptions for dipole-dominated background fields. Nevertheless, it is hard to generate fully open magnetic eruptions for multipolar background fields. Given the observational importance of the multipolar magnetic fields in the vicinity of the magnetar surface, it would be worthwhile to explore the possibility of the alternative eruption approach in multipolar background fields. Fortunately, we find that flux ropes may give rise to partially open eruptions in the multipolar fields, which involve only partial opening of background fields. The energy release fractions are greater for cases with central-arcaded multipoles than those with central-caved multipoles that emerged in background fields. Eruptions would fail only when the centrally caved multipoles become extremely strong.

  1. Estimation of the dilution field near a marine outfall by using effluent turbidity as an environmental tracer and comparison with dye tracer data.

    PubMed

    Pecly, José Otavio Goulart

    2018-01-01

    The alternative use of effluent turbidity to determine the dilution field of a domestic marine outfall located off the city of Rio de Janeiro was evaluated through field work comprising fluorescent dye tracer injection and tracking with simultaneous monitoring of sea water turbidity. A preliminary laboratory assessment was carried out with a sample of the outfall effluent, whose turbidity was measured by the nephelometric method before and during a serial dilution process. During the field campaign, the dye tracer was monitored with field fluorometers and the turbidity was observed with an optical backscattering sensor interfaced to an OEM data acquisition system. About 4,000 samples were gathered, covering an area of 3 km × 3 km near the outfall diffusers. At the far field, where a drift towards the coastline was observed, the effluent plume was adequately labeled by the dye tracer. The turbidity plume was biased due to the high and variable background turbidity of sea water. After processing the turbidity dataset with a baseline detrending method, the turbidity plume showed high correlation with the dye tracer plume in the near dilution field. However, dye tracer remains more robust than effluent turbidity.

  2. From field schools and the lecture hall to online: Hands-on teaching based on the real science experience worldwide for MOOCs ?

    NASA Astrophysics Data System (ADS)

    Huettmann, F.

    2015-12-01

    University teaching is among the most difficult teaching tasks, because it involves presenting front-line research to students from diverse backgrounds, a precious human resource for the future, while applying the latest teaching styles and navigating many institutional obstacles. Here I present 15 years of experience from teaching in field schools, in the classroom, and with pedagogical methods such as traditional top-down teaching, inquiry-based learning, eLearning, and flipped classrooms. I contrast these with teaching in the Massive Open Online Course (MOOC) style. I review the pros and cons of all these teaching methods and provide an outlook, taking class evaluations, cost models, and the satisfaction of students, teachers, the university, and the wider good into account.

  3. An adaptive tensor voting algorithm combined with texture spectrum

    NASA Astrophysics Data System (ADS)

    Wang, Gang; Su, Qing-tang; Lü, Gao-huan; Zhang, Xiao-feng; Liu, Yu-huan; He, An-zhi

    2015-01-01

    An adaptive tensor voting algorithm combined with texture spectrum is proposed. The image texture spectrum is used to obtain the adaptive scale parameter of the voting field. The texture information then modifies both the attenuation coefficient and the attenuation field, so that the algorithm creates more significant and correct structures in the original image according to human visual perception. At the same time, the proposed method improves the edge extraction quality, which includes efficiently decreasing flocculent regions and making the image clearer. In an experiment on extracting pavement cracks, the original pavement image is processed by the proposed method combined with a significant-curve-feature threshold procedure, and the resulting image displays the faint crack signals submerged in the complicated background efficiently and clearly.

  4. Conducting Field Research in a Primary School Setting: Methodological Considerations for Maximizing Response Rates, Data Quality and Quantity

    ERIC Educational Resources Information Center

    Trapp, Georgina; Giles-Corti, Billie; Martin, Karen; Timperio, Anna; Villanueva, Karen

    2012-01-01

    Background: Schools are an ideal setting in which to involve children in research. Yet for investigators wishing to work in these settings, there are few method papers providing insights into working efficiently in this setting. Objective: The aim of this paper is to describe the five strategies used to increase response rates, data quality and…

  5. Bayesian spatial prediction of the site index in the study of the Missouri Ozark Forest Ecosystem Project

    Treesearch

    Xiaoqian Sun; Zhuoqiong He; John Kabrick

    2008-01-01

    This paper presents a Bayesian spatial method for analysing the site index data from the Missouri Ozark Forest Ecosystem Project (MOFEP). Based on ecological background and availability, we select three variables, the aspect class, the soil depth and the land type association as covariates for analysis. To allow great flexibility of the smoothness of the random field,...

  6. Impact of the Arizona NExSS Winter School on Interdisciplinary Knowledge and Attitudes

    NASA Astrophysics Data System (ADS)

    Huff, Cierra; Burnam-Fink, Michael; Desch, Steven; Apai, Dániel

    2018-01-01

    The Nexus for Exoplanet System Science (NExSS) is a NASA-funded research coordination network whose focus is on investigating exoplanet diversity and devising strategies for searching for life on exoplanets. The fields of exoplanets and astrobiology are inherently highly interdisciplinary. Progress in these fields demands that researchers with various scientific backgrounds understand the issues and techniques of allied fields of study, including the tools and approaches used to solve different problems, as well as their limitations. In 2016, the NExSS teams at Arizona State University (ASU) and University of Arizona (UA) hosted 32 graduate students and postdoctoral researchers from various scientific backgrounds for one week at the Arizona NExSS Winter School. To bridge the gaps between fields and promote interdisciplinarity, students participated in lessons, field trips, hands-on activities, and a capstone proposal-writing activity. To assess the impact of the School on knowledge and attitudes about other fields, we administered a pre- and post-School questionnaire designed using the Impact Analysis Method of Davis & Scalice (2015). The results show that all participants gained knowledge at the School, especially in areas outside their primary field of study. The questionnaire revealed interesting differences in attitudes as well. When asked whether the geochemistry of Earth without life is predictable, planetary scientists were more likely than average to say yes, and geologists were more likely than average to say no. Their attitudes had converged after participation in the School. These results demonstrate that the Arizona NExSS Winter School was impactful not just in the knowledge gained, but in the interdisciplinary attitudes of students.

  7. Chemical-exchange-sensitive MRI of amide, amine and NOE at 9.4 T versus 15.2 T.

    PubMed

    Chung, Julius Juhyun; Choi, Wonmin; Jin, Tao; Lee, Jung Hee; Kim, Seong-Gi

    2017-09-01

    Chemical exchange (CE)-sensitive MRI benefits greatly from stronger magnetic fields; however, field effects on CE-sensitive imaging have not yet been studied well in vivo. We have compared CE-sensitive Z-spectra and maps obtained at the fields of 9.4 T and 15.2 T in phantoms and rats with off-resonance chemical-exchange-sensitive spin lock (CESL), which is similar to conventional chemical exchange saturation transfer. At higher fields, the background peak at water resonance has less spread and the exchange rate relative to chemical shift decreases, thus CESL intensity is dependent on B0. For the in vivo amide and nuclear Overhauser enhancement (NOE) composite resonances of rat brains, intensities were similar for both magnetic fields, but effective amide proton transfer and NOE values obtained with three-point quantification or a curve fitting method were larger at 15.2 T due to the reduced spread of attenuation at the direct water resonance. When using intermediate exchange-sensitive irradiation parameters, the amine proton signal was 65% higher at 15.2 T than at 9.4 T due to a reduced ratio of exchange rate to chemical shift. In summary, increasing magnetic field provides enhancements to CE-sensitive signals in the intermediate exchange regime and reduces contamination from background signals in the slow exchange regime. Consequently, ultrahigh magnetic field is advantageous for CE-sensitive MRI, especially for amine and hydroxyl protons. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Rogue waves in the Davey-Stewartson I equation.

    PubMed

    Ohta, Yasuhiro; Yang, Jianke

    2012-09-01

    General rogue waves in the Davey-Stewartson-I equation are derived by the bilinear method. It is shown that the simplest (fundamental) rogue waves are line rogue waves which arise from the constant background with a line profile and then disappear into the constant background again. It is also shown that multirogue waves describe the interaction of several fundamental rogue waves. These multirogue waves also arise from the constant background and then decay back to it, but in the intermediate times, interesting curvy wave patterns appear. However, higher-order rogue waves exhibit different dynamics. Specifically, only part of the wave structure in the higher-order rogue waves rises from the constant background and then retreats back to it, and this transient wave possesses patterns such as parabolas. But the other part of the wave structure comes from the far distance as a localized lump, which decelerates to the near field and interacts with the transient rogue wave, and is then reflected back and accelerates to the large distance again.

  9. Dissipation function and adaptive gradient reconstruction based smoke detection in video

    NASA Astrophysics Data System (ADS)

    Li, Bin; Zhang, Qiang; Shi, Chunlei

    2017-11-01

    A method for smoke detection in video is proposed. The camera monitoring the scene is assumed to be stationary. Within the atmospheric scattering model, the dissipation function is the transmissivity between the background objects in the scene and the camera. The dark channel prior and a fast bilateral filter are used to estimate the dissipation function, which is a function only of the depth of field. Based on the dissipation function, the visual background extractor (ViBe) can be used to detect smoke, owing to smoke's motion characteristics, along with other moving targets. Since smoke has semi-transparent parts, the background covered by these parts can be recovered adaptively by solving a Poisson equation. The similarity between the recovered parts and the original background parts at the same position is calculated by normalized cross correlation (NCC), with the original background's value taken from the frame nearest to the current frame. Parts with high similarity are considered smoke.
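
    The dark-channel-prior step can be sketched as follows (a simplified version of He et al.'s estimator; the patch size, `omega` weight, and atmospheric light value are illustrative assumptions, and the bilateral-filter refinement is omitted):

```python
# Sketch of the dark channel prior: the dark channel of a patch is the
# minimum intensity over all color channels and pixels in the patch, and
# the transmissivity estimate is t(x) = 1 - omega * dark_channel(I / A).

def dark_channel(image, patch=3):
    """image: H x W x 3 nested lists in [0, 1]; returns H x W dark channel."""
    h, w = len(image), len(image[0])
    r = patch // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = []
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        vals.extend(image[yy][xx])  # all 3 channels
            out[y][x] = min(vals)
    return out

def transmission(image, atmospheric=1.0, omega=0.95, patch=3):
    """Per-pixel transmissivity ("dissipation function") estimate."""
    dc = dark_channel(image, patch)
    return [[1.0 - omega * v / atmospheric for v in row] for row in dc]
```

    Haze-free patches contain at least one dark value, so the dark channel is near zero there and the estimated transmissivity near one; smoke raises the dark channel and lowers the transmissivity.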

  10. Compensating for magnetic field inhomogeneity in multigradient-echo-based MR thermometry.

    PubMed

    Simonis, Frank F J; Petersen, Esben T; Bartels, Lambertus W; Lagendijk, Jan J W; van den Berg, Cornelis A T

    2015-03-01

    MR thermometry (MRT) is a noninvasive method for measuring temperature that can potentially be used for radio frequency (RF) safety monitoring. This application requires measuring absolute temperature. In this study, a multigradient-echo (mGE) MRT sequence was used for that purpose. A drawback of this sequence, however, is that its accuracy is affected by background gradients. In this article, we present a method to minimize this effect and to improve absolute temperature measurements using MRI. By determining background gradients using a B0 map or by combining data acquired with two opposing readout directions, the error can be removed in a homogenous phantom, thus improving temperature maps. All scans were performed on a 3T system using ethylene glycol-filled phantoms. Background gradients were varied, and one phantom was uniformly heated to validate both compensation approaches. Independent temperature recordings were made with optical probes. Errors correlated closely to the background gradients in all experiments. Temperature distributions showed a much smaller standard deviation when the corrections were applied (0.21°C vs. 0.45°C) and correlated well with thermo-optical probes. The corrections offer the possibility to measure RF heating in phantoms more precisely. This allows mGE MRT to become a valuable tool in RF safety assessment. © 2014 Wiley Periodicals, Inc.
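
    A toy model (ours, not the paper's sequence code) of the two-opposing-readouts correction: a background gradient adds an apparent frequency shift of opposite sign in the two readout polarities, so averaging the two measurements cancels the error before converting the proton-resonance-frequency shift to temperature. The -0.01 ppm/°C coefficient is the standard PRF value, the 127.7 MHz default is a nominal 3 T Larmor frequency, and the function name is ours:

```python
# Sketch: average the apparent frequency shifts from two opposing readout
# polarities (background-gradient error cancels), then convert the PRF
# shift to a temperature change in degrees Celsius.

def temperature_estimate(shift_pos_hz, shift_neg_hz,
                         ppm_per_deg=-0.01, f0_hz=127.7e6):
    shift_hz = 0.5 * (shift_pos_hz + shift_neg_hz)   # error term cancels
    return shift_hz / (ppm_per_deg * 1e-6 * f0_hz)   # Hz -> degrees C
```

    For instance, a true shift of -12.77 Hz corrupted by a ±3 Hz gradient-induced error gives readings of -9.77 Hz and -15.77 Hz, whose average still recovers a 10 °C change.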

  11. A new method of inshore ship detection in high-resolution optical remote sensing images

    NASA Astrophysics Data System (ADS)

    Hu, Qifeng; Du, Yaling; Jiang, Yunqiu; Ming, Delie

    2015-10-01

    Ships are important targets in both military and civilian water transportation, so their detection is of great significance. In the military field, automatic ship detection can be used to monitor ship activity in enemy harbors and maritime areas, and then to analyze enemy naval power. In the civilian field, automatic ship detection can be used to monitor harbor traffic and illegal behaviors such as illegal fishing, smuggling, and piracy. In recent years, research on ship detection has mainly concentrated on three categories: forward-looking infrared images, downward-looking SAR images, and optical remote sensing images with a sea background. Little research has been done on ship detection in optical remote sensing images with a harbor background, because the gray-scale and texture features of ships are similar to those of the coast in high-resolution optical remote sensing images. In this paper, we put forward an effective harbor ship target detection method. First, to overcome the shortcomings of the traditional difference method for obtaining a histogram valley as the segmentation threshold, we propose an iterative histogram valley segmentation method which separates the harbor and ships from the water quite well. Second, as berthed ships in optical remote sensing images usually produce discontinuous harbor edges, we use the Hough transform to extract harbor edges: lines are first detected by the Hough transform, and lines with similar slopes are then connected into a new line, yielding continuous harbor edges. By performing a secondary segmentation on the result of the land-and-sea separation, we eventually obtain the ships. Finally, we calculate the aspect ratio of the ROIs, thereby removing targets that are not ships. The experimental results show that our method has good robustness and can tolerate a certain degree of noise and occlusion.
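
    The iterative threshold-selection step can be sketched in the spirit of the classic isodata (Ridler-Calvard) scheme. The details below are our own hedged reconstruction, not necessarily the authors' exact iteration: start from the mean intensity and repeatedly set the threshold to the midpoint of the two class means until it stabilizes.

```python
# Sketch of iterative threshold selection (isodata-style): the fixed point
# of this iteration sits between the two intensity populations, i.e. at a
# valley of a bimodal histogram.

def iterative_threshold(pixels, eps=0.5):
    t = sum(pixels) / len(pixels)          # initialize at the global mean
    while True:
        low = [p for p in pixels if p <= t]
        high = [p for p in pixels if p > t]
        if not low or not high:            # degenerate: one class only
            return t
        t_new = 0.5 * (sum(low) / len(low) + sum(high) / len(high))
        if abs(t_new - t) < eps:           # converged
            return t_new
        t = t_new
```

    On a strongly bimodal image (dark water, bright land and ships) the iteration converges in a few steps to a cut between the two modes.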

  12. Gravitational signature of Schwarzschild black holes in dynamical Chern-Simons gravity

    NASA Astrophysics Data System (ADS)

    Molina, C.; Pani, Paolo; Cardoso, Vitor; Gualtieri, Leonardo

    2010-06-01

    Dynamical Chern-Simons gravity is an extension of general relativity in which the gravitational field is coupled to a scalar field through a parity-violating Chern-Simons term. In this framework, we study perturbations of spherically symmetric black hole spacetimes, assuming that the background scalar field vanishes. Our results suggest that these spacetimes are stable, and small perturbations die away as a ringdown. However, in contrast to standard general relativity, the gravitational waveforms are also driven by the scalar field. Thus, the gravitational oscillation modes of black holes carry imprints of the coupling to the scalar field. This is a smoking gun for Chern-Simons theory and could be tested with gravitational-wave detectors, such as LIGO or LISA. For negative values of the coupling constant, ghosts are known to arise, and we explicitly verify their appearance numerically. Our results are validated using both time evolution and frequency domain methods.

  13. Gravitational signature of Schwarzschild black holes in dynamical Chern-Simons gravity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Molina, C.; Pani, Paolo; Cardoso, Vitor

    2010-06-15

    Dynamical Chern-Simons gravity is an extension of general relativity in which the gravitational field is coupled to a scalar field through a parity-violating Chern-Simons term. In this framework, we study perturbations of spherically symmetric black hole spacetimes, assuming that the background scalar field vanishes. Our results suggest that these spacetimes are stable, and small perturbations die away as a ringdown. However, in contrast to standard general relativity, the gravitational waveforms are also driven by the scalar field. Thus, the gravitational oscillation modes of black holes carry imprints of the coupling to the scalar field. This is a smoking gun for Chern-Simons theory and could be tested with gravitational-wave detectors, such as LIGO or LISA. For negative values of the coupling constant, ghosts are known to arise, and we explicitly verify their appearance numerically. Our results are validated using both time evolution and frequency domain methods.

  14. En route to Background Independence: Broken split-symmetry, and how to restore it with bi-metric average actions

    NASA Astrophysics Data System (ADS)

    Becker, D.; Reuter, M.

    2014-11-01

    The most momentous requirement a quantum theory of gravity must satisfy is Background Independence, necessitating in particular an ab initio derivation of the arena in which all non-gravitational physics takes place, namely spacetime. Using the background field technique, this requirement translates into the condition of an unbroken split-symmetry connecting the (quantized) metric fluctuations to the (classical) background metric. If the regularization scheme used violates split-symmetry during the quantization process, it is mandatory to restore it in the end at the level of observable physics. In this paper we present a detailed investigation of split-symmetry breaking and restoration within the Effective Average Action (EAA) approach to Quantum Einstein Gravity (QEG), with a special emphasis on the Asymptotic Safety conjecture. In particular we demonstrate for the first time in a non-trivial setting that the two key requirements of Background Independence and Asymptotic Safety can be satisfied simultaneously. Carefully disentangling fluctuation and background fields, we employ a 'bi-metric' ansatz for the EAA and project the flow generated by its functional renormalization group equation on a truncated theory space spanned by two separate Einstein-Hilbert actions for the dynamical and the background metric, respectively. A new powerful method is used to derive the corresponding renormalization group (RG) equations for the Newton and cosmological constants, both in the dynamical and the background sector. We classify and analyze their solutions in detail, determine their fixed point structure, and identify an attractor mechanism which turns out to be instrumental in the split-symmetry restoration.
We show that there exists a subset of RG trajectories which are both asymptotically safe and split-symmetry restoring: in the ultraviolet they emanate from a non-Gaussian fixed point, and in the infrared they lose all symmetry-violating contributions inflicted on them by the non-invariant functional RG equation. As an application, we compute the scale-dependent spectral dimension which governs the fractal properties of the effective QEG spacetimes at the bi-metric level. Earlier tests of the Asymptotic Safety conjecture almost exclusively employed 'single-metric truncations', which are blind towards the difference between quantum and background fields. We explore in detail under which conditions they can be reliable, and we discuss how the single-metric based picture of Asymptotic Safety needs to be revised in the light of the new results. We conclude that the next generation of truncations for quantitatively precise predictions (of critical exponents, for instance) is bound to be of the bi-metric type.

  15. Feasibility, acceptability and clinical utility of the Cultural Formulation Interview: mixed-methods results from the DSM-5 international field trial.

    PubMed

    Lewis-Fernández, Roberto; Aggarwal, Neil Krishan; Lam, Peter C; Galfalvy, Hanga; Weiss, Mitchell G; Kirmayer, Laurence J; Paralikar, Vasudeo; Deshpande, Smita N; Díaz, Esperanza; Nicasio, Andel V; Boiler, Marit; Alarcón, Renato D; Rohlof, Hans; Groen, Simon; van Dijk, Rob C J; Jadhav, Sushrut; Sarmukaddam, Sanjeev; Ndetei, David; Scalco, Monica Z; Bassiri, Kavoos; Aguilar-Gaxiola, Sergio; Ton, Hendry; Westermeyer, Joseph; Vega-Dienstmaier, Johann M

    2017-04-01

    Background There is a need for clinical tools to identify cultural issues in diagnostic assessment. Aims To assess the feasibility, acceptability and clinical utility of the DSM-5 Cultural Formulation Interview (CFI) in routine clinical practice. Method Mixed-methods evaluation of field trial data from six countries. The CFI was administered to diagnostically diverse psychiatric out-patients during a diagnostic interview. In post-evaluation sessions, patients and clinicians completed debriefing qualitative interviews and Likert-scale questionnaires. The duration of CFI administration and the full diagnostic session were monitored. Results Mixed-methods data from 318 patients and 75 clinicians found the CFI feasible, acceptable and useful. Clinician feasibility ratings were significantly lower than patient ratings and other clinician-assessed outcomes. After administering one CFI, however, clinician feasibility ratings improved significantly and subsequent interviews required less time. Conclusions The CFI was included in DSM-5 as a feasible, acceptable and useful cultural assessment tool. © The Royal College of Psychiatrists 2017.

  16. Quantification of cuttlefish (Sepia officinalis) camouflage: a study of color and luminance using in situ spectrometry.

    PubMed

    Akkaynak, Derya; Allen, Justine J; Mäthger, Lydia M; Chiao, Chuan-Chin; Hanlon, Roger T

    2013-03-01

    Cephalopods are renowned for their ability to adaptively camouflage on diverse backgrounds. Sepia officinalis camouflage body patterns have been characterized spectrally in the laboratory but not in the field due to the challenges of dynamic natural light fields and the difficulty of using spectrophotometric instruments underwater. To assess cuttlefish color match in their natural habitats, we studied the spectral properties of S. officinalis and their backgrounds on the Aegean coast of Turkey using point-by-point in situ spectrometry. Fifteen spectrometry datasets were collected from seven cuttlefish; radiance spectra from animal body components and surrounding substrates were measured at depths shallower than 5 m. We quantified luminance and color contrast of cuttlefish components and background substrates in the eyes of hypothetical di- and trichromatic fish predators. Additionally, we converted radiance spectra to sRGB color space to simulate their in situ appearance to a human observer. Within the range of natural colors at our study site, cuttlefish closely matched the substrate spectra in a variety of body patterns. Theoretical calculations showed that this effect might be more pronounced at greater depths. We also showed that a non-biological method ("Spectral Angle Mapper"), commonly used for spectral shape similarity assessment in the field of remote sensing, shows moderate correlation to biological measures of color contrast. This performance is comparable to that of a traditional measure of spectral shape similarity, hue and chroma. This study is among the first to quantify color matching of camouflaged cuttlefish in the wild.
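
    The "Spectral Angle Mapper" measure referenced above compares only the shape of two spectra: each spectrum is treated as a vector and the angle between the vectors is computed, which makes the measure insensitive to overall brightness. A minimal sketch (the spectra below are invented, not measured radiances):

```python
import math

def spectral_angle(a, b):
    """Spectral Angle Mapper: angle (radians) between two spectra
    treated as vectors; 0 means identical spectral shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    # clamp against floating-point overshoot before acos
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# A spectrum and a 2x-brighter copy of itself: same shape, so SAM ~ 0.
cuttlefish = [0.12, 0.18, 0.25, 0.22, 0.15]
substrate  = [0.24, 0.36, 0.50, 0.44, 0.30]
print(spectral_angle(cuttlefish, substrate))  # ~0 (same spectral shape)
```

This brightness invariance is exactly why SAM correlates only moderately with luminance-sensitive biological contrast measures.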

  17. Merging Multi-model CMIP5/PMIP3 Past-1000 Ensemble Simulations with Tree Ring Proxy Data by Optimal Interpolation Approach

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Luo, Yong; Xing, Pei; Nie, Suping; Tian, Qinhua

    2015-04-01

    Two sets of gridded annual-mean surface air temperature over the Northern Hemisphere for the past millennium were constructed employing the optimal interpolation (OI) method, so as to merge the tree-ring proxy records with simulations from CMIP5 (the fifth phase of the Climate Model Intercomparison Project). The OI algorithm takes into account the uncertainties in both the proxy reconstructions and the model simulations. To better preserve physically coordinated features and the spatial-temporal completeness of climate variability in the seven model realizations, we perform Empirical Orthogonal Function (EOF) analysis to truncate the ensemble mean field, which serves as the first guess (background field) for OI. A total of 681 temperature-sensitive tree-ring chronologies were collected and screened from the International Tree Ring Data Bank (ITRDB) and the Past Global Changes (PAGES-2k) project. First, two methods (variance matching and linear regression) were employed to calibrate each tree-ring chronology against instrumental data (CRUTEM4v). In addition, we removed the bias of both the background field and the proxy records relative to the instrumental dataset. Second, a time-varying background error covariance matrix (B) and a static "observation" error covariance matrix (R) were calculated for the OI framework. In our scheme, the matrix B was calculated locally, and "observation" error covariances were partially retained in R (the covariance between pairs of tree-ring sites very close to each other was counted), departing from the traditional assumption that R is diagonal. Comparison of our results shows that the regionally averaged series are not sensitive to the choice of calibration method. Quantile-quantile plots indicate that the regional climatologies based on both methods agree better with the PAGES-2k regional reconstruction during the 20th-century warming period than during the Little Ice Age (LIA).
A larger volcanic cooling response over Asia and Europe in the recent millennium is detected in our datasets than is revealed in the regional reconstructions from the PAGES-2k network. Verification experiments show that the merging approach reconciles the proxy data and the model ensemble simulations in an optimal way, with smaller errors than either alone. Further research is needed to improve the error estimation.
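
    The optimal interpolation analysis described above has a closed form: the background x_b is corrected by the innovation y - H x_b, weighted by the error covariances B and R. A toy two-gridpoint sketch (all numbers invented) showing how the covariance in B spreads a single proxy observation to a neighbouring grid point:

```python
import numpy as np

def oi_update(xb, y, H, B, R):
    """Optimal interpolation / BLUE analysis:
    xa = xb + B H^T (H B H^T + R)^(-1) (y - H xb)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain matrix
    return xb + K @ (y - H @ xb)

xb = np.array([0.0, 0.0])            # background (model ensemble mean)
y  = np.array([1.0])                 # proxy "observation"
H  = np.array([[1.0, 0.0]])          # observe the first grid point only
B  = np.array([[1.0, 0.5],
               [0.5, 1.0]])          # background error covariance
R  = np.array([[1.0]])               # observation error covariance
xa = oi_update(xb, y, H, B, R)
print(xa)  # first point moves halfway to y; second gets 0.25 via B
```

With equal B and R variances the analysis splits the difference at the observed point, and the off-diagonal term of B carries a quarter of the increment to the unobserved point.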

  18. Landau problem with time dependent mass in time dependent electric and harmonic background fields

    NASA Astrophysics Data System (ADS)

    Lawson, Latévi M.; Avossevou, Gabriel Y. H.

    2018-04-01

    The spectrum of a Hamiltonian describing the dynamics of a Landau particle with time-dependent mass and frequency undergoing the influence of a uniform time-dependent electric field is obtained. The configuration space wave function of the model is expressed in terms of the generalised Laguerre polynomials. To diagonalize the time-dependent Hamiltonian, we employ the Lewis-Riesenfeld method of invariants. To this end, we introduce a unitary transformation in the framework of the algebraic formalism to construct the invariant operator of the system and then to obtain the exact solution of the Hamiltonian. We recover the solutions of the ordinary Landau problem in the absence of the electric and harmonic fields for a constant particle mass.

  19. Laboratory Experiments on Propagating Plasma Bubbles into Vacuum, Vacuum Magnetic Field, and Background Plasmas

    NASA Astrophysics Data System (ADS)

    Lynn, Alan G.; Zhang, Yue; Gilmore, Mark; Hsu, Scott

    2014-10-01

    We discuss the dynamics of plasma ``bubbles'' as they propagate through a variety of background media. These bubbles are formed by a pulsed coaxial gun with an externally applied magnetic field. Bubble parameters are typically ne ~ 10^20 m^-3, Te ~ 5-10 eV, and Ti ~ 10-15 eV. The structure of the bubbles can range from unmagnetized jet-like structures to spheromak-like structures with complex magnetic flux surfaces. Some of the background media the bubbles interact with are vacuum, vacuum with magnetic field, and other magnetized plasmas. These bubbles exhibit different qualitative behavior depending on coaxial gun parameters such as gas species, gun current, and gun bias magnetic field. Their behavior also depends on the parameters of the background they propagate through. Multi-frame fast camera imaging and magnetic probe data are used to characterize the bubble evolution under various conditions.

  20. Spinorial Geometry and Supergravity

    NASA Astrophysics Data System (ADS)

    Gillard, Joe

    2006-08-01

    In the main part of this thesis, we present the foundations and initial results of the Spinorial Geometry formalism for solving Killing spinor equations. This method can be used for any supergravity theory, although we largely focus on D=11 supergravity. The D=5 case is investigated in an appendix. The exposition provides a comprehensive introduction to the formalism, and contains background material on the complex spin representations which, it is hoped, will provide a useful bridge between the mathematical literature and our methods. Many solutions to the D=11 Killing spinor equations are presented, and the consequences for the spacetime geometry are explored in each case. Also in this thesis, we consider another class of supergravity solutions, namely heterotic string backgrounds with (2,0) world-sheet supersymmetry. We investigate the consequences of taking alpha-prime corrections into account in the field equations, in order to remain consistent with anomaly cancellation, while requiring that spacetime supersymmetry is preserved.

  1. Detection of bremsstrahlung radiation of 90Sr-90Y for emergency lung counting.

    PubMed

    Ho, A; Hakmana Witharana, S S; Jonkmans, G; Li, L; Surette, R A; Dubeau, J; Dai, X

    2012-09-01

    This study explores the possibility of developing a field-deployable 90Sr detector for rapid lung counting in emergency situations. The detection of the beta-emitters 90Sr and its daughter 90Y inside the human lung via bremsstrahlung radiation was performed using a 3″ × 3″ NaI(Tl) crystal detector and a polyethylene-encapsulated source to emulate human lung tissue. The simulation results show that this method is a viable technique for detecting 90Sr with a minimum detectable activity (MDA) of 1.07 × 10^4 Bq, using a realistic dual-shielded detector system in a 0.25 µGy h^-1 background field for a 100-s scan. The MDA is sufficiently sensitive to meet the requirement for emergency lung counting of Type S 90Sr intake. The experimental data were verified using Monte Carlo calculations, including an estimate for internal bremsstrahlung, and an optimisation of the detector geometry was performed. Optimisations in background reduction techniques and in the electronic acquisition systems are suggested.
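
    The minimum detectable activity quoted above is conventionally computed with the Currie formula, MDA = (2.71 + 4.65·sqrt(B)) / (ε·t), where B is the background counts accumulated in counting time t and ε the detection efficiency. A hedged sketch; the background counts and efficiency below are illustrative assumptions, not values from this study:

```python
import math

def currie_mda(bkg_counts, efficiency, count_time_s):
    """Currie minimum detectable activity (Bq) at 95% confidence:
    MDA = (2.71 + 4.65 * sqrt(B)) / (eff * t)."""
    return (2.71 + 4.65 * math.sqrt(bkg_counts)) / (efficiency * count_time_s)

# Hypothetical 100-s scan: 400 background counts, 0.1% net
# bremsstrahlung detection efficiency (both made-up numbers).
mda = currie_mda(bkg_counts=400.0, efficiency=1e-3, count_time_s=100.0)
print(mda)  # ~957 Bq for these assumed inputs
```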

  2. Functional determinants of radial operators in AdS2

    NASA Astrophysics Data System (ADS)

    Aguilera-Damia, Jeremías; Faraggi, Alberto; Zayas, Leopoldo Pando; Rathee, Vimal; Silva, Guillermo A.

    2018-06-01

    We study the zeta-function regularization of functional determinants of Laplace and Dirac-type operators in two-dimensional Euclidean AdS2 space. More specifically, we consider the ratio of determinants between an operator in the presence of background fields with circular symmetry and the free operator in which the background fields are absent. By Fourier-transforming the angular dependence, one obtains an infinite number of one-dimensional radial operators, the determinants of which are easy to compute. The summation over modes is then treated with care so as to guarantee that the result coincides with the two-dimensional zeta-function formalism. The method relies on some well-known techniques to compute functional determinants using contour integrals and the construction of the Jost function from scattering theory. Our work generalizes some known results in flat space. The extension to conformal AdS2 geometries is also considered. We provide two examples, one bosonic and one fermionic, borrowed from the spectrum of fluctuations of the holographic 1/4-BPS latitude Wilson loop.
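
    One-dimensional radial determinants of the kind described above can be evaluated without computing a single eigenvalue, for instance by the Gel'fand-Yaglom method: for Dirichlet operators on [0, L], det(-d²/dx² + V)/det(-d²/dx²) = u(L)/L, where u solves u'' = V(x)·u with u(0) = 0, u'(0) = 1. The sketch below is not the paper's AdS2 computation; it only illustrates the 1D technique on a constant potential, where the exact answer sinh(mL)/(mL) is known:

```python
import math

def det_ratio(V, L, n=20000):
    """Gel'fand-Yaglom: det(-d2/dx2 + V)/det(-d2/dx2) on [0, L] with
    Dirichlet conditions equals u(L)/L, where u'' = V(x) u, u(0) = 0,
    u'(0) = 1 (the free solution is u0(x) = x). RK4 integration."""
    h = L / n
    x, u, up = 0.0, 0.0, 1.0
    for _ in range(n):
        # RK4 step for the first-order system (u, u')' = (u', V u)
        k1u, k1p = up, V(x) * u
        k2u, k2p = up + 0.5 * h * k1p, V(x + 0.5 * h) * (u + 0.5 * h * k1u)
        k3u, k3p = up + 0.5 * h * k2p, V(x + 0.5 * h) * (u + 0.5 * h * k2u)
        k4u, k4p = up + h * k3p, V(x + h) * (u + h * k3u)
        u += h * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        up += h * (k1p + 2 * k2p + 2 * k3p + k4p) / 6.0
        x += h
    return u / L

# Constant potential V = m^2: exact ratio is sinh(mL)/(mL).
m, L = 2.0, 1.0
print(det_ratio(lambda x: m * m, L), math.sinh(m * L) / (m * L))  # both ~1.81343
```

The Jost-function construction used in the paper plays the same role as u here: the determinant ratio is read off from a single solution of the homogeneous ODE.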

  3. Allowing for Slow Evolution of Background Plasma in the 3D FDTD Plasma, Sheath, and Antenna Model

    NASA Astrophysics Data System (ADS)

    Smithe, David; Jenkins, Thomas; King, Jake

    2015-11-01

    We are working to include a slow-time evolution capability for the previously static background plasma parameters in the 3D finite-difference time-domain (FDTD) plasma and sheath model used to model ICRF antennas in fusion plasmas. A key aspect of this is SOL-density time evolution driven by ponderomotive rarefaction from the strong fields in the vicinity of the antenna. We demonstrate and benchmark a Scalar Ponderomotive Potential method, based on local field amplitudes, which is included in the 3D simulation, and present a more advanced Tensor Ponderomotive Potential approach, to be employed in future work, which should improve the physical fidelity in the highly anisotropic environment of the SOL. Finally, we demonstrate and benchmark slow-time (non-linear) evolution of the RF sheath, and include realistic collisional effects from the neutral gas. Support from US DOE Grants DE-FC02-08ER54953, DE-FG02-09ER55006.

  4. Lens models and magnification maps of the six Hubble Frontier Fields clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Traci L.; Sharon, Keren; Bayliss, Matthew B.

    2014-12-10

    We present strong-lensing models as well as mass and magnification maps for the cores of the six Hubble Space Telescope (HST) Frontier Fields galaxy clusters. Our parametric lens models are constrained by the locations and redshifts of multiple image systems of lensed background galaxies. We use a combination of photometric redshifts and spectroscopic redshifts of the lensed background sources obtained by us (for A2744 and AS1063), collected from the literature, or kindly provided by the lensing community. Using our results, we (1) compare the derived mass distribution of each cluster to its light distribution, (2) quantify the cumulative magnification power of the HST Frontier Fields clusters, (3) describe how our models can be used to estimate the magnification and image multiplicity of lensed background sources at all redshifts and at any position within the cluster cores, and (4) discuss systematic effects and caveats resulting from our modeling methods. We specifically investigate the effect of the use of spectroscopic and photometric redshift constraints on the uncertainties of the resulting models. We find that the photometric redshift estimates of lensed galaxies are generally in excellent agreement with spectroscopic redshifts, where available. However, the flexibility associated with relaxed redshift priors may cause the complexity of large-scale structure that is needed to account for the lensing signal to be underestimated. Our findings thus underline the importance of spectroscopic arc redshifts, or tight photometric redshift constraints, for high precision lens models. All products from our best-fit lens models (magnification, convergence, shear, deflection field) and model simulations for estimating errors are made available via the Mikulski Archive for Space Telescopes.

  5. Ion and impurity transport in turbulent, anisotropic magnetic fields

    NASA Astrophysics Data System (ADS)

    Negrea, M.; Petrisor, I.; Isliker, H.; Vogiannou, A.; Vlahos, L.; Weyssow, B.

    2011-08-01

    We investigate ion and impurity transport in turbulent, possibly anisotropic, magnetic fields. The turbulent magnetic field is modeled as a correlated stochastic field, with Gaussian distribution function and prescribed spatial auto-correlation function, superimposed onto a strong background field. The (running) diffusion coefficients of ions are determined in the three-dimensional environment, using two alternative methods, the semi-analytical decorrelation trajectory (DCT) method, and test-particle simulations. In a first step, the results of the test-particle simulations are compared with and used to validate the results obtained from the DCT method. For this purpose, a drift approximation was made in slab geometry, and relatively good qualitative agreement between the DCT method and the test-particle simulations was found. In a second step, the ion species He, Be, Ne and W, all assumed to be fully ionized, are considered under ITER-like conditions, and the scaling of their diffusivities is determined with respect to varying levels of turbulence (varying Kubo number), varying degrees of anisotropy of the turbulent structures and atomic number. In a third step, the test-particle simulations are repeated without drift approximation, directly using the Lorentz force, first in slab geometry, in order to assess the finite Larmor radius effects, and second in toroidal geometry, to account for the geometric effects. It is found that both effects are important, most prominently the effects due to toroidal geometry and the diffusivities are overestimated in slab geometry by an order of magnitude.
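
    The "running" diffusion coefficient extracted from such test-particle simulations is the ensemble mean-square displacement divided by 2t per dimension, D(t) = ⟨Δx²⟩/(2t). A toy sketch with an uncorrelated 1D random walk standing in for the particle orbits (the actual study integrates drift or Lorentz-force orbits in the stochastic magnetic field):

```python
import random

def running_diffusion(n_particles=5000, n_steps=100, step=1.0, seed=1):
    """Running diffusion coefficient D(t) = <dx^2> / (2 t) from an
    ensemble of 1D random walkers (unit time step). For an
    uncorrelated walk this converges to step^2 / 2."""
    random.seed(seed)
    pos = [0.0] * n_particles
    D = []
    for t in range(1, n_steps + 1):
        for i in range(n_particles):
            pos[i] += step if random.random() < 0.5 else -step
        msd = sum(x * x for x in pos) / n_particles  # mean-square displacement
        D.append(msd / (2.0 * t))
    return D

D = running_diffusion()
print(D[-1])  # ~0.5, the diffusive limit step^2 / 2
```

In the correlated turbulent case D(t) is not flat: its transient behaviour is precisely what distinguishes the Kubo-number regimes studied above.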

  6. A sea-land segmentation algorithm based on multi-feature fusion for a large-field remote sensing image

    NASA Astrophysics Data System (ADS)

    Li, Jing; Xie, Weixin; Pei, Jihong

    2018-03-01

    Sea-land segmentation is one of the key technologies of sea-target detection in remote sensing images. At present, existing algorithms suffer from low accuracy, low universality and poor automation. This paper puts forward a sea-land segmentation algorithm based on multi-feature fusion for large-field remote sensing images with islands removed. Firstly, the coastline data is extracted and all land area is labeled using the geographic information in the large-field remote sensing image. Secondly, three features (local entropy, local texture and local gradient mean) are extracted in the sea-land border area and combined into a 3D feature vector. A multi-Gaussian model is then adopted to describe the 3D feature vectors of the sea background at the edge of the coastline. Based on this multi-Gaussian sea-background model, the sea and land pixels near the coastline are classified more precisely. Finally, the coarse segmentation result and the fine segmentation result are fused to obtain an accurate sea-land segmentation. Subjective visual comparison of the experimental results shows that the proposed method has high segmentation accuracy, wide applicability and strong anti-disturbance ability.
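
    The classification step described above can be sketched as follows: build the 3D feature vector (local entropy, a texture proxy, local gradient mean) for each window, fit a Gaussian model of the sea background, and score new windows by Mahalanobis distance. The window size, texture definition and synthetic images below are illustrative assumptions, not the paper's exact choices:

```python
import numpy as np

def local_features(img, i, j, w=3):
    """3D feature vector for the (2w+1)x(2w+1) window centred at (i, j):
    [histogram entropy, std (a simple texture proxy), mean |gradient|]."""
    win = img[i - w:i + w + 1, j - w:j + w + 1]
    hist, _ = np.histogram(win, bins=16, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = float(-np.sum(p * np.log2(p)))
    gy, gx = np.gradient(win)
    return np.array([entropy, float(win.std()),
                     float(np.mean(np.hypot(gx, gy)))])

def mahalanobis2(f, mu, cov):
    """Squared Mahalanobis distance of feature f from the sea model."""
    d = f - mu
    return float(d @ np.linalg.inv(cov) @ d)

# Fit the Gaussian sea-background model on windows of a smooth "sea" image.
rng = np.random.default_rng(0)
sea = np.clip(0.3 + 0.02 * rng.standard_normal((64, 64)), 0.0, 1.0)
feats = np.array([local_features(sea, i, j)
                  for i in range(8, 56, 4) for j in range(8, 56, 4)])
mu, cov = feats.mean(axis=0), np.cov(feats.T) + 1e-6 * np.eye(3)

land = np.clip(0.3 + 0.2 * rng.standard_normal((16, 16)), 0.0, 1.0)  # rough patch
d_sea = mahalanobis2(local_features(sea, 32, 32), mu, cov)
d_land = mahalanobis2(local_features(land, 8, 8), mu, cov)
print(d_sea, d_land)  # the rough patch scores far from the sea model
```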

  7. Covariant effective action for a Galilean invariant quantum Hall system

    DOE PAGES

    Geracie, Michael; Prabhu, Kartik; Roberts, Matthew M.

    2016-09-16

    Here, we construct effective field theories for gapped quantum Hall systems coupled to background geometries with local Galilean invariance, i.e. Bargmann spacetimes. Along with an electromagnetic field, these backgrounds include the effects of curved Galilean spacetimes, including torsion and a gravitational field, allowing us to study charge, energy, stress and mass currents within a unified framework. A shift symmetry specific to single-constituent theories constrains the effective action to couple to an effective background gauge field and spin connection that is solved for by a self-consistent equation, providing a manifestly covariant extension of Hoyos and Son’s improvement terms to arbitrary order in m.

  8. Scattering on plane waves and the double copy

    NASA Astrophysics Data System (ADS)

    Adamo, Tim; Casali, Eduardo; Mason, Lionel; Nekovar, Stefan

    2018-01-01

    Perturbatively around flat space, the scattering amplitudes of gravity are related to those of Yang–Mills by colour-kinematic duality, under which gravitational amplitudes are obtained as the ‘double copy’ of the corresponding gauge theory amplitudes. We consider the question of how to extend this relationship to curved scattering backgrounds, focusing on certain ‘sandwich’ plane waves. We calculate the 3-point amplitudes on these backgrounds and find that a notion of double copy remains in the presence of background curvature: graviton amplitudes on a gravitational plane wave are the double copy of gluon amplitudes on a gauge field plane wave. This is non-trivial in that it requires a non-local replacement rule for the background fields and the momenta and polarization vectors of the fields scattering on the backgrounds. It must also account for new ‘tail’ terms arising from scattering off the background. These encode a memory effect in the scattering amplitudes, which naturally double copies as well.

  9. Entanglement Entropy of the Six-Dimensional Horowitz-Strominger Black Hole

    NASA Astrophysics Data System (ADS)

    Li, Huai-Fan; Zhang, Sheng-Li; Wu, Yue-Qin; Ren, Zhao

    By using the entanglement entropy method, the statistical entropy of the Bose and Fermi fields in a thin film is calculated and the Bekenstein-Hawking entropy of the six-dimensional Horowitz-Strominger black hole is obtained. Here, the Bose and Fermi fields are entangled with the quantum states in the six-dimensional Horowitz-Strominger black hole and the fields are outside of the horizon. The divergence of the brick-wall model is avoided, without any cutoff, by the new equation of state density obtained with the generalized uncertainty principle. The calculation implies that the high-density quantum states near the event horizon are strongly correlated with the quantum states in the black hole. The black hole entropy is a quantum effect. It is an intrinsic characteristic of space-time. The ultraviolet cutoff in the brick-wall model is unreasonable. The generalized uncertainty principle should be considered in the high-energy quantum field near the event horizon. Using the quantum statistical method, we directly calculate the partition function of the Bose and Fermi fields under the background of the six-dimensional black hole. The difficulty in solving the wave equations of various particles is overcome.

  10. Two-Camera Acquisition and Tracking of a Flying Target

    NASA Technical Reports Server (NTRS)

    Biswas, Abhijit; Assad, Christopher; Kovalik, Joseph M.; Pain, Bedabrata; Wrigley, Chris J.; Twiss, Peter

    2008-01-01

    A method and apparatus have been developed to solve the problem of automated acquisition and tracking, from a location on the ground, of a luminous moving target in the sky. The method involves the use of two electronic cameras: (1) a stationary camera having a wide field of view, positioned and oriented to image the entire sky; and (2) a camera that has a much narrower field of view (a few degrees wide) and is mounted on a two-axis gimbal. The wide-field-of-view stationary camera is used to initially identify the target against the background sky. So that the approximate position of the target can be determined, pixel locations on the image-detector plane in the stationary camera are calibrated with respect to azimuth and elevation. The approximate target position is used to initially aim the gimballed narrow-field-of-view camera in the approximate direction of the target. Next, the narrow-field-of-view camera locks onto the target image, and thereafter the gimbals are actuated as needed to maintain lock and thereby track the target with precision greater than that attainable by use of the stationary camera.
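
    The pixel-to-(azimuth, elevation) calibration of the stationary camera can be illustrated, over a small patch of sky, by fitting an affine map to reference points by least squares. A real all-sky camera needs a nonlinear fisheye model; the affine fit below (with synthetic calibration data) only sketches the lookup step:

```python
import numpy as np

def fit_affine(pixels, angles):
    """Least-squares fit of [az, el] ~ A @ [px, py, 1]."""
    X = np.hstack([pixels, np.ones((len(pixels), 1))])
    A, *_ = np.linalg.lstsq(X, angles, rcond=None)
    return A

def pixel_to_azel(A, px, py):
    """Map a pixel location to (azimuth, elevation) via the fitted map."""
    return np.array([px, py, 1.0]) @ A

# Synthetic calibration points: az = 0.1*px + 10, el = -0.1*py + 80 (deg).
pix = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 50]], float)
ang = np.column_stack([0.1 * pix[:, 0] + 10.0, -0.1 * pix[:, 1] + 80.0])
A = fit_affine(pix, ang)
print(pixel_to_azel(A, 40, 60))  # ~[14, 74] degrees
```

The recovered angles are what the system would hand to the gimbal controller as the initial pointing for the narrow-field camera.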

  11. On the origin of diverse aftershock mechanisms following the 1989 Loma Prieta earthquake

    USGS Publications Warehouse

    Kilb, Debi; Ellis, M.; Gomberg, J.; Davis, S.

    1997-01-01

    We test the hypothesis that the origin of the diverse suite of aftershock mechanisms following the 1989 M 7.1 Loma Prieta, California, earthquake is related to the post-main-shock static stress field. We use a 3-D boundary-element algorithm to calculate static stresses, combined with a Coulomb failure criterion to calculate conjugate failure planes at aftershock locations. The post-main-shock static stress field is taken as the sum of a pre-existing stress field and changes in stress due to the heterogeneous slip across the Loma Prieta rupture plane. The background stress field is assumed to be either a simple shear parallel to the regional trend of the San Andreas fault or approximately fault-normal compression. A suite of synthetic aftershock mechanisms from the conjugate failure planes is generated and quantitatively compared (allowing for uncertainties in both mechanism parameters and earthquake locations) to well-constrained mechanisms reported in the US Geological Survey Northern California Seismic Network catalogue. We also compare calculated rakes with observed rakes by resolving the calculated stress tensor onto observed focal-mechanism nodal planes, assuming either plane to be a likely rupture plane. Various permutations of the assumed background stress field, frictional coefficients of aftershock fault planes, methods of comparison, etc. explain between 52 and 92 per cent of the aftershock mechanisms. However, we can explain a similar proportion of mechanisms by comparing a randomly reordered catalogue with the various suites of synthetic aftershocks. The inability to duplicate aftershock mechanisms reliably on a one-to-one basis is probably a function of the combined uncertainties in models of main-shock slip distribution, the background stress field, and aftershock locations.
In particular we show theoretically that any specific main-shock slip distribution and a reasonable background stress field are able to generate a highly variable suite of failure planes such that quite different aftershock mechanisms may be expected to occur within a kilometre or less of each other. This scale of variability is less than the probable location error of aftershock earthquakes in the Loma Prieta region. We successfully duplicate a measure of the variability in the mechanisms of the entire suite of aftershocks. If static stress changes are responsible for the generation of aftershock mechanisms, we are able to place quantitative constraints on the level of stress that must have existed in the upper crust prior to the Loma Prieta rupture. This stress level appears to be too low to generate the average slip across the main-shock rupture plane. Possible reasons for this result range from incorrect initial assumptions of homogeneity in the background stress field, friction and fault geometry to driving stresses that arise from deeper in the crust or upper mantle. Alternatively, aftershock focal mechanisms may be determined by processes other than, or in addition to, static stress changes, such as pore-pressure changes or dynamic stresses.
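
    The Coulomb failure criterion used above scores a candidate plane by dCFS = d(shear) + mu' * d(normal): the stress change resolved as shear on the plane plus an effective friction coefficient times the normal-stress change (tension positive); positive values move the plane toward failure. A minimal 2D sketch with illustrative numbers, not Loma Prieta values:

```python
import math

def coulomb_stress_change(sxx, syy, sxy, strike_deg, mu=0.4):
    """Resolve a 2D stress-change tensor onto a plane with the given
    strike (tension positive); returns (shear, normal, delta_CFS),
    where delta_CFS = shear + mu * normal. mu is an effective friction
    coefficient (pore-pressure effects folded in)."""
    t = math.radians(strike_deg)
    s = (math.cos(t), math.sin(t))    # unit vector along the plane
    n = (-math.sin(t), math.cos(t))   # unit normal to the plane
    tx = sxx * n[0] + sxy * n[1]      # traction T_i = sigma_ij n_j
    ty = sxy * n[0] + syy * n[1]
    normal = tx * n[0] + ty * n[1]
    shear = tx * s[0] + ty * s[1]
    return shear, normal, shear + mu * normal

# Pure shear loading a plane parallel to the shear direction:
sh, nm, cfs = coulomb_stress_change(sxx=0.0, syy=0.0, sxy=1.0, strike_deg=0.0)
print(sh, cfs)  # shear 1.0, delta_CFS 1.0: the plane is pushed toward failure
```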

  12. Percent body fat estimations in college women using field and laboratory methods: a three-compartment model approach

    PubMed Central

    Moon, Jordan R; Hull, Holly R; Tobkin, Sarah E; Teramoto, Masaru; Karabulut, Murat; Roberts, Michael D; Ryan, Eric D; Kim, So Jung; Dalbo, Vincent J; Walter, Ashley A; Smith, Abbie T; Cramer, Joel T; Stout, Jeffrey R

    2007-01-01

    Background Methods used to estimate percent body fat can be classified as a laboratory or field technique. However, the validity of these methods compared to multiple-compartment models has not been fully established. This investigation sought to determine the validity of field and laboratory methods for estimating percent fat (%fat) in healthy college-age women compared to the Siri three-compartment model (3C). Methods Thirty Caucasian women (21.1 ± 1.5 yrs; 164.8 ± 4.7 cm; 61.2 ± 6.8 kg) had their %fat estimated by BIA using the BodyGram™ computer program (BIA-AK) and population-specific equation (BIA-Lohman), NIR (Futrex® 6100/XL), a quadratic (SF3JPW) and linear (SF3WB) skinfold equation, air-displacement plethysmography (BP), and hydrostatic weighing (HW). Results All methods produced acceptable total error (TE) values compared to the 3C model. Both laboratory methods produced similar TE values (HW, TE = 2.4%fat; BP, TE = 2.3%fat) when compared to the 3C model, though a significant constant error (CE) was detected for HW (1.5%fat, p ≤ 0.006). The field methods produced acceptable TE values ranging from 1.8 – 3.8 %fat. BIA-AK (TE = 1.8%fat) yielded the lowest TE among the field methods, while BIA-Lohman (TE = 2.1%fat) and NIR (TE = 2.7%fat) produced lower TE values than both skinfold equations (TE > 2.7%fat) compared to the 3C model. Additionally, the SF3JPW %fat estimation equation resulted in a significant CE (2.6%fat, p ≤ 0.007). Conclusion Data suggest that the BP and HW are valid laboratory methods when compared to the 3C model to estimate %fat in college-age Caucasian women. When the use of a laboratory method is not feasible, NIR, BIA-AK, BIA-Lohman, SF3JPW, and SF3WB are acceptable field methods to estimate %fat in this population. PMID:17988393
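
    The total error (TE) and constant error (CE) statistics used above are, respectively, the root-mean-square and the mean of the differences between a method's %fat estimates and the criterion 3C values. A minimal sketch with invented paired data:

```python
import math

def constant_error(pred, ref):
    """CE: mean signed difference (the method's bias)."""
    return sum(p - r for p, r in zip(pred, ref)) / len(pred)

def total_error(pred, ref):
    """TE: square root of the mean squared difference vs. the criterion."""
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(pred))

ref  = [22.0, 25.0, 28.0, 31.0]   # criterion (3C model) %fat, invented
pred = [23.0, 24.0, 30.0, 31.0]   # field-method estimates, invented
print(constant_error(pred, ref), total_error(pred, ref))  # 0.5, ~1.22
```

TE folds bias and scatter together, which is why a method can show a significant CE while still posting an acceptable TE, as with hydrostatic weighing above.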

  13. Near-Roadway Emission of Reactive Nitrogen Compounds and Other Non-Criteria Pollutants at a Southern California Freeway Site

    NASA Astrophysics Data System (ADS)

    Moss, J. A.; Baum, M.; Castonguay, A. E.; Aguirre, V., Jr.; Pesta, A.; Fanter, R. K.; Anderson, M.

    2015-12-01

Emission control systems in light-duty motor vehicles (LDMVs) have played an important role in improving regional air quality by dramatically reducing the concentrations of criteria pollutants (carbon monoxide, hydrocarbons, and nitrogen oxides) in exhaust emissions. However, unintended side-reactions occurring on the surface of three-way catalysts may lead to the emission of a number of non-criteria pollutants whose identity and emission rates are poorly understood. A series of near-roadway field studies conducted between 2009 and 2015 has investigated LDMV emissions of these pollutants with unprecedented depth of coverage, including reactive nitrogen compounds (NH3, amines, HCN, HONO, and HNO3), organic peroxides, and carbonyl compounds (aldehydes, ketones, and carboxylic acids). Methods to collect these pollutants (using mist chambers, annular denuders, impingers, and solid-phase cartridges) and to quantify their concentrations (by GC-MS, LC-MS/MS, IC, and colorimetry) were developed and validated in the laboratory and the field. These methods were subsequently used in near-roadway field studies in which the concentrations of the target compounds, integrated over 1-4 hour blocks, were measured at the edge of a freeway and at a background site 140 m from the roadway. Concentrations followed a steep decreasing gradient from the freeway to the background site. Emission factors (pollutant mass emitted per mass of fuel consumed) were calculated by carbon mass balance, using the difference in concentration between the freeway and background sites for the emitted pollutant and CO2 as a measure of the carbon mass in the vehicle exhaust. The significance of these results will be discussed in terms of emissions inventories in the South Coast Air Basin of California, emission trends at this site over the period 2009-2015, and, for NH3, emission measurements conducted by our group and others over the period 2000-2015.
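The fuel-based carbon-balance calculation described above can be sketched as follows. This is a simplified version that counts only CO2 carbon (neglecting CO and hydrocarbon carbon) and uses an assumed fuel carbon fraction; the concentration increments are hypothetical, not measured values from these studies:

```python
# Simplified fuel-based emission factor by carbon mass balance:
# EF_P (g pollutant per kg fuel) ~ w_C * (dC_P / dC_carbon) * 1000,
# where d denotes the freeway-minus-background concentration increment.
W_C = 0.85          # kg carbon per kg gasoline (typical value; an assumption)
MW_CO2, MW_C = 44.01, 12.011

def emission_factor(dP_ug_m3, dCO2_ug_m3):
    """dP: pollutant increment (ug/m3); dCO2: CO2 increment (ug/m3).
    Returns g pollutant emitted per kg fuel burned."""
    carbon_ug_m3 = dCO2_ug_m3 * (MW_C / MW_CO2)  # carbon mass carried by CO2
    return (dP_ug_m3 / carbon_ug_m3) * W_C * 1000.0

# Hypothetical NH3 increment of 0.6 ug/m3 against a 2000 ug/m3 CO2 increment:
ef_nh3 = emission_factor(0.6, 2000.0)
```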

  14. A quantum kinematics for asymptotically flat gravity

    NASA Astrophysics Data System (ADS)

    Campiglia, Miguel; Varadarajan, Madhavan

    2015-07-01

    We construct a quantum kinematics for asymptotically flat gravity based on the Koslowski-Sahlmann (KS) representation. The KS representation is a generalization of the representation underlying loop quantum gravity (LQG) which supports, in addition to the usual LQG operators, the action of ‘background exponential operators’, which are connection dependent operators labelled by ‘background’ su(2) electric fields. KS states have, in addition to the LQG state label corresponding to one dimensional excitations of the triad, a label corresponding to a ‘background’ electric field that describes three dimensional excitations of the triad. Asymptotic behaviour in quantum theory is controlled through asymptotic conditions on the background electric fields that label the states and the background electric fields that label the operators. Asymptotic conditions on the triad are imposed as conditions on the background electric field state label while confining the LQG spin net graph labels to compact sets. We show that KS states can be realised as wave functions on a quantum configuration space of generalized connections and that the asymptotic behaviour of each such generalized connection is determined by that of the background electric fields which label the background exponential operators. Similar to the spatially compact case, the Gauss law and diffeomorphism constraints are then imposed through group averaging techniques to obtain a large sector of gauge invariant states. It is shown that this sector supports a unitary action of the group of asymptotic rotations and translations and that, as anticipated by Friedman and Sorkin, for appropriate spatial topology, this sector contains states that display fermionic behaviour under 2π rotations.

  15. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    NASA Astrophysics Data System (ADS)

    Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.

    2013-04-01

Most of the measurement strategies suggested at the international level for assessing workplace exposure to nanomaterials rely on devices that measure airborne particle concentrations in real time (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time-resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature, ranging from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method for analysing time-resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, we used particle number concentration data from a workplace study that investigated the potential for inhalation exposure during cleanout operations, by sandpapering, of a reactor producing nanocomposite thin films. In this workplace study, the background issue was addressed through near-field and far-field approaches, and several size-integrated and time-resolved devices were used. The analysis presented here focuses only on data obtained with two handheld condensation particle counters: one measured at the source of the released particles while the other measured the far field in parallel. The Bayesian probabilistic approach allows probabilistic modelling of the data series, with the observed task modelled in the form of probability distributions.
The probability distributions derived from time-resolved data obtained at the source can then be compared with those obtained in the far field, leading to a quantitative estimate of the airborne particles released at the source while the task is performed. Beyond the results obtained, this exploratory study indicates that such analysis requires specific experience in statistics.
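A minimal sketch of this kind of near-field/far-field comparison, assuming a simple Normal model with a flat prior on each mean (not necessarily the authors' exact formulation; all concentration values are synthetic):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-s particle number concentrations (#/cm^3); values hypothetical.
near = rng.normal(12000, 1500, size=300)   # at the source during the task
far  = rng.normal(9000, 1200, size=300)    # far-field, same period

def posterior_mean_diff(a, b, n_draws=20000):
    """Posterior draws of mu_a - mu_b under a flat prior and a Normal
    likelihood (variances estimated from the data, for simplicity)."""
    mu_a = rng.normal(a.mean(), a.std(ddof=1) / np.sqrt(len(a)), n_draws)
    mu_b = rng.normal(b.mean(), b.std(ddof=1) / np.sqrt(len(b)), n_draws)
    return mu_a - mu_b

diff = posterior_mean_diff(near, far)
release = diff.mean()                      # estimated source contribution
prob_positive = (diff > 0).mean()          # P(task adds particles over background)
```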

  16. In-line interferometer for broadband near-field scanning optical spectroscopy.

    PubMed

    Brauer, Jens; Zhan, Jinxin; Chimeh, Abbas; Korte, Anke; Lienau, Christoph; Gross, Petra

    2017-06-26

    We present and investigate a novel approach towards broad-bandwidth near-field scanning optical spectroscopy based on an in-line interferometer for homodyne mixing of the near field and a reference field. In scattering-type scanning near-field optical spectroscopy, the near-field signal is usually obscured by a large amount of unwanted background scattering from the probe shaft and the sample. Here we increase the light reflected from the sample by a semi-transparent gold layer and use it as a broad-bandwidth, phase-stable reference field to amplify the near-field signal in the visible and near-infrared spectral range. We experimentally demonstrate that this efficiently suppresses the unwanted background signal in monochromatic near-field measurements. For rapid acquisition of complete broad-bandwidth spectra we employ a monochromator and a fast line camera. Using this fast acquisition of spectra and the in-line interferometer we demonstrate the measurement of pure near-field spectra. The experimental observations are quantitatively explained by analytical expressions for the measured optical signals, based on Fourier decomposition of background and near field. The theoretical model and in-line interferometer together form an important step towards broad-bandwidth near-field scanning optical spectroscopy.

  17. Numerical simulation of large-scale field-aligned current generation from finite-amplitude magnetosonic waves

    NASA Technical Reports Server (NTRS)

    Yamauchi, M.

    1994-01-01

    A two-dimensional numerical simulation of finite-amplitude magnetohydrodynamic (MHD) magnetosonic waves is performed under a finite-velocity background convection condition. Isothermal cases are considered for simplicity. External dissipation is introduced by assuming that the field-aligned currents are generated in proportion to the accumulated charges. The simulation results are as follows: Paired field-aligned currents are found from the simulated waves. The flow directions of these field-aligned currents depend on the angle between the background convection and the wave normal, and hence two pairs of field-aligned currents are found from a bowed wave if we look at the overall structure. The majority of these field-aligned currents are closed within each pair rather than between two wings. These features are not observed under slow background convection. The result could be applied to the cusp current system and the substorm current system.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

This report was prepared at the request of the Lawrence Livermore Laboratory (LLL) to provide background information for analyzing soil-structure interaction by the frequency-independent impedance function approach. LLL is conducting such analyses as part of its seismic review of selected operating plants under the Systematic Evaluation Program for the US Nuclear Regulatory Commission. The analytical background and basic assumptions of the impedance function theory are briefly reviewed, and the role of radiation damping in soil-structure interaction analysis is discussed. The validity of modeling soil-structure interaction by using frequency-independent functions is evaluated based on data from several field tests. Finally, the recommended procedures for performing soil-structure interaction analyses are discussed with emphasis on the modal superposition method.

  19. TORO II: A finite element computer program for nonlinear quasi-static problems in electromagnetics: Part 1, Theoretical background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gartling, D.K.

The theoretical and numerical background for the finite element computer program, TORO II, is presented in detail. TORO II is designed for the multi-dimensional analysis of nonlinear, electromagnetic field problems described by the quasi-static form of Maxwell's equations. A general description of the boundary value problems treated by the program is presented. The finite element formulation and the associated numerical methods used in TORO II are also outlined. Instructions for the use of the code are documented in SAND96-0903; examples of problems analyzed with the code are also provided in the user's manual. 24 refs., 8 figs.

  20. Magnetic Fields Recorded by Chondrules Formed in Nebular Shocks

    NASA Astrophysics Data System (ADS)

    Mai, Chuhong; Desch, Steven J.; Boley, Aaron C.; Weiss, Benjamin P.

    2018-04-01

    Recent laboratory efforts have constrained the remanent magnetizations of chondrules and the magnetic field strengths to which the chondrules were exposed as they cooled below their Curie points. An outstanding question is whether the inferred paleofields represent the background magnetic field of the solar nebula or were unique to the chondrule-forming environment. We investigate the amplification of the magnetic field above background values for two proposed chondrule formation mechanisms, large-scale nebular shocks and planetary bow shocks. Behind large-scale shocks, the magnetic field parallel to the shock front is amplified by factors of ∼10–30, regardless of the magnetic diffusivity. Therefore, chondrules melted in these shocks probably recorded an amplified magnetic field. Behind planetary bow shocks, the field amplification is sensitive to the magnetic diffusivity. We compute the gas properties behind a bow shock around a 3000 km radius planetary embryo, with and without atmospheres, using hydrodynamics models. We calculate the ionization state of the hot, shocked gas, including thermionic emission from dust, thermal ionization of gas-phase potassium atoms, and the magnetic diffusivity due to Ohmic dissipation and ambipolar diffusion. We find that the diffusivity is sufficiently large that magnetic fields have already relaxed to background values in the shock downstream where chondrules acquire magnetizations, and that these locations are sufficiently far from the planetary embryos that chondrules should not have recorded a significant putative dynamo field generated on these bodies. We conclude that, if melted in planetary bow shocks, chondrules probably recorded the background nebular field.
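The parallel-field amplification can be illustrated with standard shock jump conditions: under ideal-MHD flux freezing, the field component parallel to the shock front is amplified by the density compression ratio, which is capped near 4 for an adiabatic γ = 5/3 shock but grows as M² for an isothermal (radiative) shock, the regime that yields the ∼10–30× amplification quoted above. A sketch (the Mach numbers are illustrative):

```python
def adiabatic_compression(mach, gamma=5.0 / 3.0):
    """Rankine-Hugoniot density jump rho2/rho1 for an adiabatic shock."""
    m2 = mach ** 2
    return (gamma + 1.0) * m2 / ((gamma - 1.0) * m2 + 2.0)

def isothermal_compression(mach):
    """Isothermal (strongly radiative) shock: rho2/rho1 = M^2."""
    return mach ** 2

# Flux freezing: the field component parallel to the shock front is
# amplified by the same factor as the density.
for M in (3.0, 5.0):
    b_adia = adiabatic_compression(M)   # approaches 4 for gamma = 5/3
    b_iso = isothermal_compression(M)   # 9, 25: the ~10-30x regime
```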

  1. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

Small visible optical space target detection is one of the key issues in research on long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of noise introduced by the imaging device itself, and random noise and background motion further increase the difficulty of detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. Firstly, we establish a reasonable statistical model of the visible optical space image. Secondly, we extract SIFT (Scale-Invariant Feature Transform) features from the image frames, calculate the transform relationship between them, and use it to compensate for motion of the whole field of view. Thirdly, stars are removed using an interframe difference method, and a segmentation threshold separating candidate targets from noise is found using Otsu's method. Finally, a statistical quantity is computed at every pixel position to judge whether a target is present. Theoretical analysis shows the relationship between false-alarm probability and detection probability at different SNRs. Experimental results show that this method can detect targets efficiently, even when a target passes in front of stars.

  2. Buoyant Helical Twin-Axial Wire Antenna

    DTIC Science & Technology

    2016-11-15

    300169 1 of 9 BUOYANT HELICAL TWIN-AXIAL WIRE ANTENNA CROSS REFERENCE TO OTHER PATENT APPLICATIONS [0001] This application is a divisional...Wire Antenna ” by the inventor, David A. Tonn. STATEMENT OF GOVERNMENT INTEREST [0002] The invention described herein may be manufactured and used by...BACKGROUND OF THE INVENTION (1) Field of the Invention [0003] The present invention is directed to a linear antenna for dual frequencies and a method for

  3. Bringing a Perspective from Outside the Field: A Commentary on Davis et al.'s (2010) Use of a Modified Regression Discontinuity Design to Evaluate a Gifted Program

    ERIC Educational Resources Information Center

    Adelson, Jill L.; Kelcey, Benjamin

    2016-01-01

    In this commentary of "Evaluating the Gifted Program of an Urban School District Using a Modified Regression Discontinuity Design" by Davis, Engberg, Epple, Sieg, and Zimmer, we examine the background of the study, critique the methods used, and discuss the results and implications. The study used a fuzzy regression discontinuity design…

  4. Proof of factorization using background field method of QCD

    NASA Astrophysics Data System (ADS)

    Nayak, Gouranga C.

    2010-02-01

The factorization theorem plays a central role at high energy colliders in studying standard model and beyond-standard-model physics. The proof of the factorization theorem was given by Collins, Soper and Sterman, to all orders in perturbation theory, using a diagrammatic approach. One might wonder whether such a proof can be obtained through symmetry considerations at the lagrangian level. In this paper we provide such a proof.

  5. Proof of factorization using background field method of QCD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nayak, Gouranga C.

The factorization theorem plays a central role at high energy colliders in studying standard model and beyond-standard-model physics. The proof of the factorization theorem was given by Collins, Soper and Sterman, to all orders in perturbation theory, using a diagrammatic approach. One might wonder whether such a proof can be obtained through symmetry considerations at the lagrangian level. In this paper we provide such a proof.

  6. A vessel segmentation method for multi-modality angiographic images based on multi-scale filtering and statistical models.

    PubMed

    Lu, Pei; Xia, Jun; Li, Zhicheng; Xiong, Jing; Yang, Jian; Zhou, Shoujun; Wang, Lei; Chen, Mingyang; Wang, Cheng

    2016-11-08

Accurate segmentation of blood vessels plays an important role in the computer-aided diagnosis and interventional treatment of vascular diseases. Statistical methods are an important component of effective vessel segmentation; however, several limitations degrade their performance, namely dependence on the image modality, uneven contrast media, bias fields, and overlapping intensity distributions of object and background. In addition, the mixture models of these statistical methods are constructed relying on the characteristics of the image histograms, so it is challenging for traditional methods to handle vessel segmentation across multi-modality angiographic images. To overcome these limitations, a flexible segmentation method with a fixed mixture model has been proposed for various angiography modalities. Our method consists of three main parts. Firstly, a multi-scale filtering algorithm is applied to the original images to enhance vessels and suppress noise; as a result, the filtered data acquire a new statistical characteristic. Secondly, a mixture model formed by three probability distributions (two exponential distributions and one Gaussian distribution) is built to fit the histogram curve of the filtered data, with the expectation-maximization (EM) algorithm used for parameter estimation. Finally, a three-dimensional (3D) Markov random field (MRF) is employed to improve the accuracy of pixel-wise classification and posterior probability estimation. To quantitatively evaluate the performance of the proposed method, two phantoms simulating blood vessels with different tubular structures and noise were devised. Meanwhile, four clinical angiographic data sets from different human organs were used to qualitatively validate the method.
To further test the performance, comparisons between the proposed method and traditional ones were conducted on two different brain magnetic resonance angiography (MRA) data sets. The phantom results were satisfying: noise was greatly suppressed, the percentage of misclassified voxels (the segmentation error ratio) was no more than 0.3%, and the Dice similarity coefficients (DSCs) were above 94%. In the opinion of clinical vascular specialists, the vessels in the various data sets were extracted with high accuracy, since complete vessel trees were extracted while few non-vessel and background voxels were falsely classified as vessel. In the comparison experiments, the proposed method showed superior accuracy and robustness in extracting vascular structures from multi-modality angiographic images with complicated background noise. The experimental results demonstrate that the proposed method is applicable to various angiographic data, mainly because the constructed mixture probability model can uniformly classify the vessel object from the multi-scale-filtered data of various angiography images. The advantages of the proposed method lie in the following aspects: firstly, it can extract vessels from poor-quality angiography, since the multi-scale filtering algorithm improves vessel intensity under conditions such as uneven contrast media and bias fields; secondly, it performs well on multi-modality angiographic images despite various signal noises; and thirdly, it achieves better accuracy and robustness than traditional methods. Overall, these traits indicate that the proposed method has significant potential for clinical application.
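The histogram-fitting step can be sketched with a small EM loop. This is a generic EM for a two-Exponential-plus-Gaussian mixture on synthetic data; the component parameters, initial guesses, and data are assumptions for illustration, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "filtered response" data: two Exponentials stand in for dark
# background and mid-range noise, one Gaussian for the bright vessels.
data = np.concatenate([
    rng.exponential(2.0, 6000),
    rng.exponential(15.0, 3000),
    rng.normal(120.0, 10.0, 1000),
])

def exp_pdf(x, lam):
    return lam * np.exp(-lam * x)

def norm_pdf(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

# Initial guesses (assumed; a real implementation would seed from the histogram)
w = np.array([0.5, 0.3, 0.2])
lam1, lam2, mu, sig = 1.0, 0.05, 100.0, 20.0

for _ in range(200):                        # EM iterations
    # E-step: responsibility of each component for each sample
    pdfs = np.vstack([exp_pdf(data, lam1), exp_pdf(data, lam2),
                      norm_pdf(data, mu, sig)])
    joint = w[:, None] * pdfs
    resp = joint / joint.sum(axis=0)
    # M-step: closed-form updates for weights and component parameters
    nk = resp.sum(axis=1)
    w = nk / len(data)
    lam1 = nk[0] / (resp[0] * data).sum()
    lam2 = nk[1] / (resp[1] * data).sum()
    mu = (resp[2] * data).sum() / nk[2]
    sig = np.sqrt((resp[2] * (data - mu) ** 2).sum() / nk[2])

# Vessel probability per sample: posterior of the Gaussian component
vessel_prob = resp[2]
```

In the paper's pipeline this per-voxel posterior would then be refined by the 3D MRF before final classification.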

  7. Gamma/Hadron Separation for the HAWC Observatory

    NASA Astrophysics Data System (ADS)

    Gerhardt, Michael J.

The High-Altitude Water Cherenkov (HAWC) Observatory is a gamma-ray observatory sensitive to gamma rays from 100 GeV to 100 TeV with an instantaneous field of view of ~2 sr. It is located on the Sierra Negra plateau in Mexico at an elevation of 4,100 m and began full operation in March 2015. The purpose of the detector is to study relativistic particles produced by interstellar and intergalactic objects such as pulsars, supernova remnants, molecular clouds, and black holes. To achieve optimal angular resolution, energy reconstruction, and cosmic-ray background suppression for the extensive air showers detected by HAWC, good timing and charge calibration are crucial, as is optimization of quality cuts on background suppression variables. Additions to the HAWC timing calibration, in particular automated calibration quality checks, and a new method for background suppression using a multivariate analysis are presented in this thesis.

  8. Improved background suppression in ¹H MAS NMR using composite pulses.

    PubMed

    Odedra, Smita; Wimperis, Stephen

    2012-08-01

    A well known feature of ¹H MAS NMR spectroscopy, particularly of solids where the concentration of ¹H nuclei is low, is the presence in the spectrum of a significant broad "background" signal arising from ¹H nuclei that are outside the MAS rotor and radiofrequency coil, probably located on the surfaces of the static components of the probehead. A popular method of suppressing this unwanted signal is the "depth pulse" method, consisting of a 90° pulse followed by one or two 180° pulses that are phase cycled according to the "Exorcycle" scheme, which removes signal associated with imperfect 180° pulses. Consequently, only spins in the centre of the radiofrequency coil contribute to the ¹H MAS spectrum, while those experiencing a low B₁ field outside the coil are suppressed. Although very effective at removing background signal from the spectrum, one drawback with this approach is that significant loss of the desired signal from the sample also occurs. Here we investigate the ¹H background suppression problem and, in particular, the use of novel antisymmetric passband composite pulses to replace the simple pulses in a depth pulse experiment. We show that it is possible to improve the intensity of the ¹H signals of interest while still maintaining effective background suppression. We expect that these results will be relevant to ¹H MAS NMR studies of, for example, nominally perdeuterated biological samples or nominally anhydrous inorganic materials. Copyright © 2012 Elsevier Inc. All rights reserved.
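The B₁ selectivity of the depth-pulse scheme can be sketched numerically, using the textbook result that a θ-(2θ)ⁿ sequence with Exorcycle phase cycling retains signal proportional to sin^(2n+1)θ, where θ is the actual flip angle of the nominal 90° pulse (the flip angles below are illustrative):

```python
import numpy as np

def depth_pulse_signal(theta, n_refocus=2):
    """Relative signal after a theta-(2*theta)^n depth-pulse sequence with
    Exorcycle phase cycling: sin(theta)**(2n+1) (textbook result)."""
    return np.sin(theta) * np.sin(theta) ** (2 * n_refocus)

# Nominal 90 deg at the coil centre vs. a weak-B1 region outside the coil
theta_outside = np.pi / 18        # 10 deg: B1 about 1/9 of nominal
sig_single = np.sin(theta_outside)                 # single-pulse experiment
sig_depth = depth_pulse_signal(theta_outside)      # depth-pulse experiment
suppression = sig_single / sig_depth               # extra background rejection
```

This also makes the stated drawback visible: any deviation of θ from 90° inside the sample costs sin^(2n+1)θ of the wanted signal, which is what the antisymmetric passband composite pulses aim to mitigate.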

  9. Improved background suppression in 1H MAS NMR using composite pulses

    NASA Astrophysics Data System (ADS)

    Odedra, Smita; Wimperis, Stephen

    2012-08-01

    A well known feature of 1H MAS NMR spectroscopy, particularly of solids where the concentration of 1H nuclei is low, is the presence in the spectrum of a significant broad "background" signal arising from 1H nuclei that are outside the MAS rotor and radiofrequency coil, probably located on the surfaces of the static components of the probehead. A popular method of suppressing this unwanted signal is the "depth pulse" method, consisting of a 90° pulse followed by one or two 180° pulses that are phase cycled according to the "Exorcycle" scheme, which removes signal associated with imperfect 180° pulses. Consequently, only spins in the centre of the radiofrequency coil contribute to the 1H MAS spectrum, while those experiencing a low B1 field outside the coil are suppressed. Although very effective at removing background signal from the spectrum, one drawback with this approach is that significant loss of the desired signal from the sample also occurs. Here we investigate the 1H background suppression problem and, in particular, the use of novel antisymmetric passband composite pulses to replace the simple pulses in a depth pulse experiment. We show that it is possible to improve the intensity of the 1H signals of interest while still maintaining effective background suppression. We expect that these results will be relevant to 1H MAS NMR studies of, for example, nominally perdeuterated biological samples or nominally anhydrous inorganic materials.

  10. The Far-Field Hubble Constant

    NASA Astrophysics Data System (ADS)

    Lauer, Tod

    1995-07-01

    We request deep, near-IR (F814W) WFPC2 images of five nearby Brightest Cluster Galaxies (BCG) to calibrate the BCG Hubble diagram by the Surface Brightness Fluctuation (SBF) method. Lauer & Postman (1992) show that the BCG Hubble diagram measured out to 15,000 km s^-1 is highly linear. Calibration of the Hubble diagram zeropoint by SBF will thus yield an accurate far-field measure of H_0 based on the entire volume within 15,000 km s^-1, thus circumventing any strong biases caused by local peculiar velocity fields. This method of reaching the far field is contrasted with those using distance ratios between Virgo and Coma, or any other limited sample of clusters. HST is required as the ground-based SBF method is limited to <3,000 km s^-1. The high spatial resolution of HST allows precise measurement of the SBF signal at large distances, and allows easy recognition of globular clusters, background galaxies, and dust clouds in the BCG images that must be removed prior to SBF detection. The proposing team developed the SBF method, the first BCG Hubble diagram based on a full-sky, volume-limited BCG sample, played major roles in the calibration of WFPC and WFPC2, and are conducting observations of local galaxies that will validate the SBF zeropoint (through GTO programs). This work uses the SBF method to tie both the Cepheid and Local Group giant-branch distances generated by HST to the large scale Hubble flow, which is most accurately traced by BCGs.
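Once the absolute fluctuation magnitude M̄ is calibrated, the SBF distance and the resulting far-field H_0 follow from the ordinary distance modulus. A sketch with entirely hypothetical numbers, not the proposal's targets:

```python
# SBF in one line: the fluctuation variance per pixel scales as 1/d^2, so the
# apparent fluctuation magnitude m_bar behaves like an ordinary standard
# candle once the absolute magnitude M_bar is calibrated (e.g. via Cepheids).
def sbf_distance_mpc(m_bar, M_bar):
    """Distance (Mpc) from the SBF distance modulus m_bar - M_bar."""
    mu = m_bar - M_bar
    return 10.0 ** ((mu + 5.0) / 5.0) / 1.0e6   # pc -> Mpc

# Hypothetical numbers: apparent m_bar = 32.8, calibrated M_bar = -1.7
d = sbf_distance_mpc(32.8, -1.7)
# A far-field H_0 then follows from the BCG recession velocity: H_0 = v / d.
v = 5600.0                                       # km/s, hypothetical
H0 = v / d                                       # km/s/Mpc
```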

  11. Quantum gravity in the Southern Cone Conference. Proceedings. Conference, Bariloche (Argentina), 7 - 10 Jan 1998.

    NASA Astrophysics Data System (ADS)

    1999-04-01

    The following topics are discussed: Black hole formation by canonical dynamics of gravitating shells; canonical quantum gravity; Vassiliev invariants; midisuperspace models; quantum spacetime; large-N limit of superconformal field theories and supergravity; world-volume fields and background coupling of branes; gauge enhancement and chirality changes in nonperturbative orbifold models; chiral p-forms; formally renormalizable gravitationally self-interacting string models; gauge supergravities for all odd dimensions; black hole radiation and S-matrix; primordial black holes; fluctuations in a thermal field and dissipation of a black hole spacetime in far-field limit; adiabatic interpretation of particle creation in a de Sitter universe; nonequilibrium dynamics of quantum fields in inflationary cosmology; magnetic fields in the early Universe; classical regime of a quantum universe obtained through a functional method; decoherence and correlations in semiclassical cosmology; fluid of primordial fluctuations; causal statistical mechanics calculation of initial cosmic entropy and quantum gravity prospects and black hole-D-brane correspondence.

  12. Plane-parallel waves as duals of the flat background III: T-duality with torsionless B-field

    NASA Astrophysics Data System (ADS)

    Hlavatý, Ladislav; Petr, Ivo; Petrásek, Filip

    2018-04-01

    By addition of non-zero, but torsionless B-field, we expand the classification of (non-)Abelian T-duals of the flat background in four dimensions with respect to 1, 2, 3 and 4D subgroups of the Poincaré group. We discuss the influence of the additional B-field on the process of dualization, and identify essential parts of the torsionless B-field that cannot in general be eliminated by coordinate or gauge transformation of the dual background. These effects are demonstrated using particular examples. Due to their physical importance, we focus on duals whose metrics represent plane-parallel (pp-)waves. Besides the previously found metrics, we find new pp-waves depending on parameters originating from the torsionless B-field. These pp-waves are brought into their standard forms in Brinkmann and Rosen coordinates.

  13. Conservation laws and stress-energy-momentum tensors for systems with background fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gratus, Jonathan, E-mail: j.gratus@lancaster.ac.uk; The Cockcroft Institute, Daresbury Laboratory, Warrington WA4 4AD; Obukhov, Yuri N., E-mail: yo@thp.uni-koeln.de

    2012-10-15

This article attempts to delineate the roles played by non-dynamical background structures and Killing symmetries in the construction of stress-energy-momentum tensors generated from a diffeomorphism invariant action density. An intrinsic coordinate independent approach puts into perspective a number of spurious arguments that have historically led to the main contenders, viz the Belinfante-Rosenfeld stress-energy-momentum tensor derived from a Noether current and the Einstein-Hilbert stress-energy-momentum tensor derived in the context of Einstein's theory of general relativity. Emphasis is placed on the role played by non-dynamical background (phenomenological) structures that discriminate between properties of these tensors, particularly in the context of electrodynamics in media. These tensors are used to construct conservation laws in the presence of Killing Lie-symmetric background fields. Highlights: • The role of background fields in diffeomorphism invariant actions is demonstrated. • Interrelations between different stress-energy-momentum tensors are emphasised. • The Abraham and Minkowski electromagnetic tensors are discussed in this context. • Conservation laws in the presence of nondynamic background fields are formulated. • The discussion is facilitated by the development of a new variational calculus.

  14. CNR considerations for rapid real-time MRI tumor tracking in radiotherapy hybrid devices: Effects of B₀ field strength

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wachowicz, K., E-mail: keith.wachowicz@albertaheal

    2016-08-15

Purpose: This work examines the subject of contrast-to-noise ratio (CNR), specifically between tumor and tissue background, and its dependence on the MRI field strength, B₀. This examination is motivated by the recent interest and developments in MRI/radiotherapy hybrids, where real-time imaging can be used to guide treatment beams. The ability to distinguish a tumor from background tissue is of primary importance in this field, and this work seeks to elucidate the complex relationship between CNR and B₀ that is too often assumed to be purely linear. Methods: Experimentally based models of B₀-dependent relaxation for various tumor and normal tissues from the literature were used in conjunction with signal equations for MR sequences suitable for rapid real-time imaging to develop field-dependent predictions for CNR. These CNR models were developed for liver, lung, breast, glioma, and kidney tumors for spoiled gradient-echo, balanced steady-state free precession (bSSFP), and single-shot half-Fourier fast spin echo sequences. Results: Because of the pattern in which the relaxation properties of tissues vary with B₀ (specifically the T₁ time), CNR at lower fields was always improved relative to a purely linear dependence. Further, at some tumor sites, the CNR at lower fields was comparable to, or sometimes higher than, that at higher fields (e.g., bSSFP CNR for glioma, kidney, and liver tumors). Conclusions: In terms of CNR, lower B₀ fields have been shown to perform as well as or better than higher fields for some tumor sites owing to superior T₁ contrast. At other sites this effect was less pronounced, reversing the CNR advantage. This complex relationship between CNR and B₀ reveals both low and high magnetic fields as viable options for tumor tracking in MRI/radiotherapy hybrids.
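The qualitative argument can be illustrated with a toy spoiled-gradient-echo model. The Ernst signal equation is standard, but the T₁ power-law exponent and tissue T₁ values below are assumptions chosen for illustration, not the paper's fitted tissue models:

```python
import numpy as np

# Toy model: CNR between tumor and background for a spoiled gradient echo.
# Signal ~ B0 * SPGR(T1(B0)); T1 grows with B0 (assumed power law), which
# erodes the naive linear CNR gain at high field.
def spgr(t1, tr=0.005, alpha=np.deg2rad(10)):
    """Steady-state spoiled gradient-echo signal (Ernst equation), in M0 units."""
    e1 = np.exp(-tr / t1)
    return np.sin(alpha) * (1 - e1) / (1 - np.cos(alpha) * e1)

def cnr(b0, t1_tumor_1t, t1_bg_1t, exponent=0.35):
    """Relative CNR: linear-in-B0 polarization times the signal difference.
    T1(B0) = T1(1T) * B0**exponent is an assumed, illustrative power law."""
    s_t = spgr(t1_tumor_1t * b0 ** exponent)
    s_b = spgr(t1_bg_1t * b0 ** exponent)
    return b0 * abs(s_t - s_b)

# Hypothetical T1 values at 1 T (seconds):
cnr_low, cnr_high = cnr(0.5, 1.2, 0.8), cnr(3.0, 1.2, 0.8)
ratio = cnr_high / cnr_low            # well below the naive 6x linear gain
```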

  15. Nonrelativistic fluids on scale covariant Newton-Cartan backgrounds

    NASA Astrophysics Data System (ADS)

    Mitra, Arpita

    2017-12-01

    The nonrelativistic covariant framework for fields is extended to investigate fields and fluids on scale covariant curved backgrounds. The scale covariant Newton-Cartan background is constructed using the localization of space-time symmetries of nonrelativistic fields in flat space. Following this, we provide a Weyl covariant formalism that can be used to study scale invariant fluids. Taking ideal fluids as an example, we describe their thermodynamic and hydrodynamic properties and explicitly demonstrate that they satisfy the local second law of thermodynamics. As a further application, we consider the low energy description of Hall fluids. Specifically, we find that the gauge fields for scale transformations lead to corrections of the Wen-Zee and Berry phase terms contained in the effective action.

  16. Quantum estimation of parameters of classical spacetimes

    NASA Astrophysics Data System (ADS)

    Downes, T. G.; van Meter, J. R.; Knill, E.; Milburn, G. J.; Caves, C. M.

    2017-11-01

    We describe a quantum limit to the measurement of classical spacetimes. Specifically, we formulate a quantum Cramér-Rao lower bound for estimating the single parameter in any one-parameter family of spacetime metrics. We employ the locally covariant formulation of quantum field theory in curved spacetime, which allows for a manifestly background-independent derivation. The result is an uncertainty relation that applies to all globally hyperbolic spacetimes. Among other examples, we apply our method to the detection of gravitational waves with the electromagnetic field as a probe, as in laser-interferometric gravitational-wave detectors. Other applications are discussed, from terrestrial gravimetry to cosmology.
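    For reference, the single-parameter quantum Cramér-Rao bound underlying such an uncertainty relation takes the standard form (for an unbiased estimator of a parameter θ from N independent probes):

    ```latex
    \operatorname{Var}(\hat{\theta}) \;\ge\; \frac{1}{N\, F_Q(\theta)}
    ```

    where F_Q(θ) is the quantum Fisher information of the family of probe states; the bound derived in the paper is a field-theoretic, curved-spacetime version of this generic relation.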

  17. Inverse random source scattering for the Helmholtz equation in inhomogeneous media

    NASA Astrophysics Data System (ADS)

    Li, Ming; Chen, Chuchu; Li, Peijun

    2018-01-01

    This paper is concerned with an inverse random source scattering problem in an inhomogeneous background medium. The wave propagation is modeled by the stochastic Helmholtz equation with the source driven by additive white noise. The goal is to reconstruct the statistical properties of the random source such as the mean and variance from the boundary measurement of the radiated random wave field at multiple frequencies. Both the direct and inverse problems are considered. We show that the direct problem has a unique mild solution by a constructive proof. For the inverse problem, we derive Fredholm integral equations, which connect the boundary measurement of the radiated wave field with the unknown source function. A regularized block Kaczmarz method is developed to solve the ill-posed integral equations. Numerical experiments are included to demonstrate the effectiveness of the proposed method.
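    As a minimal illustration of the Kaczmarz idea, here is a plain relaxed row-action sweep on a small dense system; this is a sketch of the generic iteration, not the paper's regularized block variant for the ill-posed Fredholm integral equations:

    ```python
    import numpy as np

    def kaczmarz(A, b, sweeps=200, relax=1.0):
        """Relaxed Kaczmarz iteration for A x = b.

        Each step projects the iterate toward the hyperplane of one row.
        For ill-posed systems, under-relaxation and early stopping act as
        regularization; a block variant would update groups of rows jointly.
        """
        m, n = A.shape
        x = np.zeros(n)
        for _ in range(sweeps):
            for i in range(m):
                a = A[i]
                x += relax * (b[i] - a @ x) / (a @ a) * a
        return x

    # Consistent toy system with known solution (1, -1)
    A = np.array([[2.0, 1.0], [1.0, 3.0]])
    b = A @ np.array([1.0, -1.0])
    x = kaczmarz(A, b)
    ```

    For a consistent system the iterates converge to the solution; the paper's setting additionally requires stabilization against noise in the boundary data.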

  18. Infrared Thermography Approach for Effective Shielding Area of Field Smoke Based on Background Subtraction and Transmittance Interpolation.

    PubMed

    Tang, Runze; Zhang, Tonglai; Chen, Yongpeng; Liang, Hao; Li, Bingyang; Zhou, Zunning

    2018-05-06

    Effective shielding area is a crucial indicator for evaluating the infrared smoke-obscuring effectiveness on the battlefield. The conventional methods for assessing the shielding area of a smoke screen are time-consuming and labor-intensive, in addition to lacking precision. Therefore, an efficient and convincing technique for testing the effective shielding area of a smoke screen has great potential benefit for smoke-screen applications in field trials. In this study, a thermal infrared sensor with a mid-wavelength infrared (MWIR) range of 3 to 5 μm was first used to capture images of the target scene, through clear air as well as through obscuring smoke, at regular intervals. Background subtraction, as used in motion detection, was then applied to obtain the contour of the smoke cloud at each frame. The smoke transmittance at each pixel within the smoke contour was interpolated based on the data collected from the image. Finally, the effective shielding area of the smoke was calculated based on the accumulation of effectively shielding pixel points. One advantage of this approach is that it uses only one thermal infrared sensor, without any additional equipment in the field trial, which significantly improves efficiency and convenience. Experiments have been carried out to demonstrate that this approach can determine the effective shielding area of field infrared smoke both practically and efficiently.
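    The final pixel-counting step might look schematically as follows. Everything here (the threshold values, the pixel footprint, and the crude ratio-based transmittance) is a hypothetical simplification of the paper's procedure, which also involves contour extraction and interpolation:

    ```python
    import numpy as np

    def effective_shielding_area(clear, smoked, tau_threshold=0.1, pixel_area_m2=0.25):
        """Toy sketch: estimate effective shielding area from two MWIR frames.

        - background subtraction flags pixels where the smoke frame departs
          from the clear frame;
        - a crude per-pixel transmittance is the smoke/clear intensity ratio;
        - pixels below `tau_threshold` count as effectively shielded.

        All numeric parameters are illustrative, not values from the paper.
        """
        tau = smoked / np.maximum(clear, 1e-6)              # crude transmittance map
        smoke_mask = np.abs(smoked - clear) > 0.05 * clear  # background subtraction
        effective = smoke_mask & (tau < tau_threshold)
        return effective.sum() * pixel_area_m2
    ```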

  19. Particle production in a gravitational wave background

    NASA Astrophysics Data System (ADS)

    Jones, Preston; McDougall, Patrick; Singleton, Douglas

    2017-03-01

    We study the possibility that massless particles, such as photons, are produced by a gravitational wave. That such a process should occur is implied by tree-level Feynman diagrams such as two gravitons turning into two photons, i.e., g + g → γ + γ. Here we calculate the rate at which a gravitational wave creates a massless scalar field. This is done by placing the scalar field in the background of a plane gravitational wave and calculating the 4-current of the scalar field. Even in the vacuum limit of the scalar field it has a nonzero vacuum expectation value (similar to what occurs in the Higgs mechanism) and a nonzero current. We associate this with the production of scalar field quanta by the gravitational field. This effect has potential consequences for the attenuation of gravitational waves, since the massless field is being produced at the expense of the gravitational field. This is related to the time-dependent Schwinger effect, but with the electric field replaced by the gravitational wave background and the electron/positron field quanta replaced by massless scalar "photons." Since the produced scalar quanta are massless, there is no exponential suppression as occurs in the Schwinger effect due to the electron mass.

  20. Large-scale 3D inversion of marine controlled source electromagnetic data using the integral equation method

    NASA Astrophysics Data System (ADS)

    Zhdanov, M. S.; Cuma, M.; Black, N.; Wilson, G. A.

    2009-12-01

    The marine controlled-source electromagnetic (MCSEM) method has become widely used in offshore oil and gas exploration. Interpretation of MCSEM data is still a very challenging problem, especially if one would like to take into account the realistic 3D structure of the subsurface. The inversion of MCSEM data is complicated by the fact that the EM response of a hydrocarbon-bearing reservoir is very weak in comparison with the background EM fields generated by an electric dipole transmitter in complex geoelectrical structures formed by a conductive sea-water layer and the terranes beneath it. In this paper, we present a review of the recent developments in the area of large-scale 3D EM forward modeling and inversion. Our approach is based on a new integral form of Maxwell's equations allowing for an inhomogeneous background conductivity, which results in a numerically effective integral representation for the 3D EM field. This representation provides an efficient tool for the solution of 3D EM inverse problems. To obtain a robust inverse model of the conductivity distribution, we apply regularization based on a focusing stabilizing functional, which allows for the recovery of models with both smooth and sharp geoelectrical boundaries. The method is implemented in a fully parallel computer code, which makes it possible to run large-scale 3D inversions on grids with millions of inversion cells. This new technique can be effectively used for active EM detection and monitoring of subsurface targets.

  1. Influence of temperature fluctuations on infrared limb radiance: a new simulation code

    NASA Astrophysics Data System (ADS)

    Rialland, Valérie; Chervet, Patrick

    2006-08-01

    Airborne infrared limb-viewing detectors may be used as surveillance sensors to detect dim military targets. These systems' performance is limited by the inhomogeneous background in the sensor field of view, which strongly impacts target detection probability. This background clutter, which results from small-scale fluctuations of temperature, density, or pressure, must therefore be analyzed and modeled. Few existing codes are able to model atmospheric structures and their impact on limb-observed radiance. SAMM-2 (SHARC-4 and MODTRAN4 Merged), the Air Force Research Laboratory (AFRL) background radiance code, can be used to predict the radiance fluctuation resulting from a normalized temperature fluctuation, as a function of the line of sight. Various realizations of cluttered backgrounds can then be computed, based on these transfer functions and on a stochastic temperature field. The existing SIG (SHARC Image Generator) code was designed to compute the cluttered background that would be observed from a space-based sensor. Unfortunately, this code was not able to compute accurate scenes as seen by an airborne sensor, especially for lines of sight close to the horizon. Recently, we developed a new code, called BRUTE3D, adapted to our configuration. This approach is based on a method originally developed in the SIG model. The BRUTE3D code makes use of a three-dimensional grid of temperature fluctuations and of the SAMM-2 transfer functions to synthesize an image of radiance fluctuations according to sensor characteristics. This paper details the working principles of the code and presents some output results. The effects of small-scale temperature fluctuations on infrared limb radiance as seen by an airborne sensor are highlighted.
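    The synthesis step, as described, amounts to weighting a stochastic temperature-fluctuation grid by precomputed transfer functions and summing along each line of sight. A schematic with invented grid shapes and an invented range-decay weight (neither taken from SAMM-2 or BRUTE3D):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 3-D grid of fractional temperature fluctuations,
    # indexed (elevation, azimuth, range); 2% RMS is an arbitrary choice.
    dT = 0.02 * rng.standard_normal((16, 32, 64))

    # Hypothetical per-voxel transfer weight dL/dT, decaying with range
    # along the line of sight (stand-in for the SAMM-2 transfer functions).
    H = np.exp(-np.arange(64) / 20.0)

    # Radiance-fluctuation image: weighted sum over range for every pixel.
    dL = (dT * H).sum(axis=-1)   # shape (elevation, azimuth)
    ```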

  2. A Review of Subsequence Time Series Clustering

    PubMed Central

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  3. Analysis of aggregate impact factor inflation in ophthalmology.

    PubMed

    Caramoy, Albert; Korwitz, Ulrich; Eppelin, Anita; Kirchhof, Bernd; Fauser, Sascha

    2013-01-01

    To analyze the aggregate impact factor (AIF) in ophthalmology, its inflation rate, and its relation to other subject fields. A retrospective database review of all subject fields in the Journal Citation Reports (JCR), Science edition. Citation data, AIF, number of journals, and citations from the years 2003-2011 were analyzed. Data were retrieved from the JCR. Future trends were calculated using a linear regression method. The AIF varies considerably between subjects. It also shows an inflation rate, which varies annually. The AIF inflation rate in ophthalmology was not as high as the background AIF inflation rate. The AIF inflation rate caused the AIF to increase annually. Ignoring these variations in the AIF between years and between fields makes the AIF inappropriate as a bibliometric tool. Copyright © 2012 S. Karger AG, Basel.
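    The linear-regression trend estimate mentioned in the methods can be sketched with an ordinary least-squares fit; the AIF values below are illustrative numbers, not the study's data:

    ```python
    import numpy as np

    # Hypothetical aggregate-impact-factor values per JCR year
    years = np.array([2003.0, 2005.0, 2007.0, 2009.0, 2011.0])
    aif = np.array([2.10, 2.25, 2.41, 2.55, 2.70])

    # Degree-1 polyfit returns (slope, intercept); the slope is the
    # annual AIF inflation, and extrapolation gives the future trend.
    slope, intercept = np.polyfit(years, aif, 1)
    aif_2013 = slope * 2013.0 + intercept   # linear-trend forecast
    ```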

  4. A review of subsequence time series clustering.

    PubMed

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  5. Remediation of metal-contaminated land for plant cultivation in the Arctic/subarctic region

    NASA Astrophysics Data System (ADS)

    Kikuchi, Ryunosuke; Gorbacheva, Tamara T.; Ferreira, Carla S.

    2017-04-01

    Hazardous activities and industries involve the use, storage, or disposal of hazardous substances. These substances can sometimes contaminate the soil, which can remain contaminated for many years. The metals can have severe effects on ecosystems. In the Arctic/subarctic regions, the Kola Peninsula (66-70°N and 28°30'-41°30'E) in Russia is one of the most seriously polluted regions: close to the nickel-copper smelters, the deposition of metal pollutants has severely damaged the soil and ground vegetation, resulting in a desert area. An area of 10-15 km around the smelters on the Kola Peninsula is today dry, sandy, and stony ground. A great amount of financial aid is usually required to recover the land. Considering cost performance, a pilot-scale (4 ha) field test was carried out to investigate how to apply municipal sewage sludge for rehabilitation of degraded land near the Ni-Cu smelter complex on the Kola Peninsula. The field test for soil rehabilitation was performed while smelting activities were ongoing; thus, the survey fields were suffering from pollution emitted by the metallurgical industry, and may continue to suffer in the future. After composting of the sewage sludge, an artificial substratum made from the compost was introduced to the test field for polluted-land remediation, and willows, birches, and grasses were planted on the substratum. The following remarkable differences in pollution load were observed between the background field and the rehabilitation test field (i.e., polluted land): (i) the annual precipitation amount of SO4(2-) (5668 g/ha) in the rehabilitation test field was over 5 times greater than that in the background field; (ii) the Pb amount (1.5 g/ha) was 29 times greater; (iii) the Co amount (10.9 g/ha) was 54 times greater; (iv) the Cu amount (752 g/ha) was over 600 times greater; and (v) the Ni amount (448 g/ha) was over 1,000 times greater. The lost vegetation is being restored by the formation of an artificial substratum made from sewage sludge compost. Essentially, sewage sludge is a solid waste; however, the obtained data imply that sewage sludge is a helpful raw material for land remediation even under a harsh climate, nutrient-poor soil, and a metal-pollution load. The test results presented in this abstract are a good example of how to combine nature conservation (remediation and maintenance) with recycling of resources (sewage sludge).

  6. Researching on the process of remote sensing video imagery

    NASA Astrophysics Data System (ADS)

    Wang, He-rao; Zheng, Xin-qi; Sun, Yi-bo; Jia, Zong-ren; Wang, He-zhan

    Low-altitude remotely sensed video imagery from unmanned air vehicles has the advantages of high resolution, ease of acquisition, and real-time access, and has been widely used in mapping, target identification, and other fields in recent years. However, because of operating-condition limitations, the video images are unstable, the targets move fast, and the shooting background is complex, so processing such video images is difficult. In other fields, especially computer vision, research on video images is more extensive, which is very helpful for processing low-altitude remotely sensed imagery. Based on this, this paper analyzes and summarizes a large body of video image processing work in different fields, covering research purposes, data sources, and the pros and cons of each technique. The paper then explores the technical methods most suitable for processing low-altitude remote sensing video imagery.

  7. Non-rigid precession of magnetic stars

    NASA Astrophysics Data System (ADS)

    Lander, S. K.; Jones, D. I.

    2017-06-01

    Stars are, generically, rotating and magnetized objects with a misalignment between their magnetic and rotation axes. Since a magnetic field induces a permanent distortion to its host, it provides effective rigidity even to a fluid star, leading to bulk stellar motion that resembles free precession. This bulk motion is, however, accompanied by induced interior velocity and magnetic field perturbations, which are oscillatory on the precession time-scale. Extending previous work, we show that these quantities are described by a set of second-order perturbation equations featuring cross-terms scaling with the product of the magnetic and centrifugal distortions to the star. For the case of a background toroidal field, we reduce these to a set of differential equations in radial functions, and find a method for their solution. The resulting magnetic field and velocity perturbations show complex multipolar structure and are strongest towards the centre of the star.

  8. Phylomemetic patterns in science evolution--the rise and fall of scientific fields.

    PubMed

    Chavalarias, David; Cointet, Jean-Philippe

    2013-01-01

    We introduce an automated method for the bottom-up reconstruction of the cognitive evolution of science, based on big data issued from digital libraries and modeled as lineage relationships between scientific fields. We refer to these dynamic structures as phylomemetic networks, or phylomemies, by analogy with biological evolution; and we show that they exhibit strong regularities, with clearly identifiable phylomemetic patterns. Some structural properties of the scientific fields (in particular their density), which are defined independently of the phylomemy reconstruction, are clearly correlated with their status and their fate in the phylomemy (such as their age or their short-term survival). Within the framework of a quantitative epistemology, this approach raises the question of predictability for science evolution, and sketches a prototypical life cycle of the scientific fields: an increase of their cohesion after their emergence, and the renewal of their conceptual background through branching or merging events, before decaying when their density gets too low.

  9. Background Independence and Duality Invariance in String Theory.

    PubMed

    Hohm, Olaf

    2017-03-31

    Closed string theory exhibits an O(D,D) duality symmetry on tori, which in double field theory is manifest before compactification. I prove that to first order in α' there is no manifestly background independent and duality invariant formulation of bosonic string theory in terms of a metric, b-field, and dilaton. To this end I use O(D,D) invariant second order perturbation theory around flat space to show that the unique background independent candidate expression for the gauge algebra at order α' is inconsistent with the Jacobi identity. A background independent formulation exists instead for frame variables subject to α'-deformed frame transformations (generalized Green-Schwarz transformations). Potential applications for curved backgrounds, as in cosmology, are discussed.

  10. Self-force via m-mode regularization and 2+1D evolution. II. Scalar-field implementation on Kerr spacetime

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolan, Sam R.; Barack, Leor; Wardell, Barry

    2011-10-15

    This is the second in a series of papers aimed at developing a practical time-domain method for self-force calculations in Kerr spacetime. The key elements of the method are (i) removal of a singular part of the perturbation field with a suitable analytic 'puncture' based on the Detweiler-Whiting decomposition, (ii) decomposition of the perturbation equations in azimuthal (m-)modes, taking advantage of the axial symmetry of the Kerr background, (iii) numerical evolution of the individual m-modes in 2+1 dimensions with a finite-difference scheme, and (iv) reconstruction of the physical self-force from the mode sum. Here we report an implementation of the method to compute the scalar-field self-force along circular equatorial geodesic orbits around a Kerr black hole. This constitutes a first time-domain computation of the self-force in Kerr geometry. Our time-domain code reproduces the results of a recent frequency-domain calculation by Warburton and Barack, but has the added advantage of being readily adaptable to include the backreaction from the self-force in a self-consistent manner. In a forthcoming paper, the third in the series, we apply our method to the gravitational self-force (in the Lorenz gauge).

  11. Near-field electromagnetic holography for high-resolution analysis of network interactions in neuronal tissue

    PubMed Central

    Kjeldsen, Henrik D.; Kaiser, Marcus; Whittington, Miles A.

    2015-01-01

    Background: Brain function is dependent upon the concerted, dynamical interactions between a great many neurons distributed over many cortical subregions. Current methods of quantifying such interactions are limited by consideration only of single direct or indirect measures of a subsample of all neuronal population activity. New method: Here we present a new derivation of the electromagnetic analogy to near-field acoustic holography allowing high-resolution, vectored estimates of interactions between sources of electromagnetic activity that significantly improves this situation. In vitro voltage potential recordings were used to estimate pseudo-electromagnetic energy flow vector fields, current and energy source densities, and energy dissipation in reconstruction planes at depth into the neural tissue, parallel to the recording plane of the microelectrode array. Results: The properties of the reconstructed near-field estimate allowed the use of super-resolution techniques to increase the imaging resolution beyond that of the microelectrode array, and facilitated a novel approach to estimating causal relationships between activity in neocortical subregions. Comparison with existing methods: The holographic nature of the reconstruction method allowed significantly better estimation of the fine spatiotemporal detail of neuronal population activity, compared with interpolation alone, beyond the spatial resolution of the electrode arrays used. Pseudo-energy flow vector mapping was possible with high temporal precision, allowing a near-realtime estimate of causal interaction dynamics. Conclusions: Basic near-field electromagnetic holography provides a powerful means to increase spatial resolution from electrode array data with careful choice of spatial filters and distance to the reconstruction plane. More detailed approaches may provide the ability to volumetrically reconstruct activity patterns in neuronal tissue, but the ability to extract vectored data with the method presented already permits the study of dynamic causal interactions without bias from any prior assumptions on anatomical connectivity. PMID:26026581

  12. Research on infrared dim-point target detection and tracking under sea-sky-line complex background

    NASA Astrophysics Data System (ADS)

    Dong, Yu-xing; Li, Yan; Zhang, Hai-bo

    2011-08-01

    Target detection and tracking technology in infrared imagery is an important part of modern military defense systems. Detection and recognition of infrared dim-point targets under complex backgrounds is a difficult and challenging research topic of important strategic value. The main objects detected by a carrier-borne infrared vigilance system are sea-skimming aircraft and missiles. Because the vigilance system has a wide field of view, the target usually lies within sea clutter, which makes detection and recognition very difficult. Traditional point-target detection algorithms exist, such as the adaptive background prediction method. When the background has a dispersion-decreasing structure, these traditional algorithms are useful; but when the background has large gray-level gradients, such as at the sea-sky line or on sea waves, the false-alarm rate rises in these local areas and satisfactory results cannot be obtained. Because a dim-point target itself has no obvious geometric or texture features, in our opinion the detection of dim-point targets in an image is, mathematically, a problem of singularity analysis; from the image-processing perspective, the key problem is the identification of isolated singularities in the image. The essence of dim-point target detection is therefore a separation of target and background according to their different singularity characteristics. The image from an infrared sensor is usually accompanied by various kinds of noise, caused by the complicated background or by the sensor itself, and this noise can affect target detection and tracking. Therefore, the purpose of image preprocessing is to reduce the effects of noise, raise the SNR of the image, and increase the contrast between target and background.
    According to the characteristics of low-altitude sea-skimming infrared small targets, a median filter is used to eliminate noise and improve the signal-to-noise ratio; a multi-point, multi-level vertical Sobel algorithm is then used to detect the sea-sky line, so that sea and sky can be segmented in the image. Finally, a centroid tracking method is used to capture and trace the target. This method has been successfully used to track targets under complex sea-sky backgrounds.
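    The processing chain described (edge response to locate the sea-sky line, then centroid tracking) can be sketched as follows. This is a simplified single-kernel stand-in for the multi-point, multi-level algorithm, with the median-filter step omitted for brevity:

    ```python
    import numpy as np

    def vertical_sobel(img):
        """3x3 Sobel response emphasizing horizontal edges (e.g. the sea-sky line)."""
        k = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)
        h, w = img.shape
        out = np.zeros((h, w))
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                out[y, x] = np.sum(k * img[y - 1:y + 2, x - 1:x + 2])
        return np.abs(out)

    def sea_sky_line_row(img):
        """Row with the strongest summed horizontal-edge response."""
        return int(np.argmax(vertical_sobel(img).sum(axis=1)))

    def centroid(mask):
        """Centroid of a detected target mask, for frame-to-frame tracking."""
        ys, xs = np.nonzero(mask)
        return ys.mean(), xs.mean()
    ```

    On a synthetic frame with a uniform "sky" above a brighter "sea", the Sobel row sums peak at the boundary, which is then used to restrict target search to the relevant region.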

  13. En route to Background Independence: Broken split-symmetry, and how to restore it with bi-metric average actions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, D., E-mail: BeckerD@thep.physik.uni-mainz.ded; Reuter, M., E-mail: reuter@thep.physik.uni-mainz.de

    2014-11-15

    The most momentous requirement a quantum theory of gravity must satisfy is Background Independence, necessitating in particular an ab initio derivation of the arena all non-gravitational physics takes place in, namely spacetime. Using the background field technique, this requirement translates into the condition of an unbroken split-symmetry connecting the (quantized) metric fluctuations to the (classical) background metric. If the regularization scheme used violates split-symmetry during the quantization process, it is mandatory to restore it in the end at the level of observable physics. In this paper we present a detailed investigation of split-symmetry breaking and restoration within the Effective Average Action (EAA) approach to Quantum Einstein Gravity (QEG), with a special emphasis on the Asymptotic Safety conjecture. In particular we demonstrate for the first time in a non-trivial setting that the two key requirements of Background Independence and Asymptotic Safety can be satisfied simultaneously. Carefully disentangling fluctuation and background fields, we employ a 'bi-metric' ansatz for the EAA and project the flow generated by its functional renormalization group equation on a truncated theory space spanned by two separate Einstein–Hilbert actions for the dynamical and the background metric, respectively. A new powerful method is used to derive the corresponding renormalization group (RG) equations for the Newton and cosmological constants, both in the dynamical and the background sector. We classify and analyze their solutions in detail, determine their fixed point structure, and identify an attractor mechanism which turns out to be instrumental in the split-symmetry restoration.
    We show that there exists a subset of RG trajectories which are both asymptotically safe and split-symmetry restoring: in the ultraviolet they emanate from a non-Gaussian fixed point, and in the infrared they lose all symmetry-violating contributions inflicted on them by the non-invariant functional RG equation. As an application, we compute the scale-dependent spectral dimension which governs the fractal properties of the effective QEG spacetimes at the bi-metric level. Earlier tests of the Asymptotic Safety conjecture almost exclusively employed 'single-metric truncations' which are blind to the difference between quantum and background fields. We explore in detail under which conditions they can be reliable, and we discuss how the single-metric based picture of Asymptotic Safety needs to be revised in light of the new results. We conclude that the next generation of truncations for quantitatively precise predictions (of critical exponents, for instance) is bound to be of the bi-metric type. Highlights: • The Asymptotic Safety scenario in quantum gravity is explored. • A bi-metric generalization of the Einstein–Hilbert truncation is investigated. • We find that Background Independence can coexist with Asymptotic Safety. • RG trajectories restoring (background-quantum) split-symmetry are constructed. • The degree of validity of single-metric truncations is critically assessed.

  14. New wrinkles on black hole perturbations: Numerical treatment of acoustic and gravitational waves

    NASA Astrophysics Data System (ADS)

    Tenyotkin, Valery

    2009-06-01

    This thesis develops two main topics. A full relativistic calculation of quasinormal modes of an acoustic black hole is carried out. The acoustic black hole is formed by a perfect, inviscid, relativistic, ideal gas that is spherically accreting onto a Schwarzschild black hole. The second major part is the calculation of sourceless vector (electromagnetic) and tensor (gravitational) covariant field evolution equations for perturbations on a Schwarzschild background using the relatively recent [Special characters omitted.] decomposition method. Scattering calculations are carried out in Schwarzschild coordinates for electromagnetic and gravitational cases as validation of the method and the derived equations.

  15. Hawking radiation from charged black holes via gauge and gravitational anomalies.

    PubMed

    Iso, Satoshi; Umetsu, Hiroshi; Wilczek, Frank

    2006-04-21

    Extending the method of Robinson and Wilczek, we show that in order to avoid a breakdown of general covariance and gauge invariance at the quantum level, the total flux of charge and energy in each outgoing partial wave of a charged quantum field in a Reissner-Nordström black hole background must be equal to that of a (1 + 1)-dimensional blackbody at the Hawking temperature with the appropriate chemical potential.

  16. U.S. Air Force Families with Young Children Who Have Special Needs

    DTIC Science & Technology

    2005-06-22

    least indirectly related (MFRI, 2004). These are the areas that have been identified as influencing performance in the military, affecting retention... methods, and families served. One area of commonality is the emergence of these fields from the same historical and cultural background, discussed next... place limits on entitlements, and delegate significant roles to states and localities, resulting in great inequities. Yet there are areas which can

  17. Prevalence of Obesity, Binge Eating, and Night Eating in a Cross-Sectional Field Survey of 6-Year-Old Children and Their Parents in a German Urban Population

    ERIC Educational Resources Information Center

    Lamerz, Andreas; Kuepper-Nybelen, Jutta; Bruning, Nicole; Wehle, Christine; Trost-Brinkhues, Gabriele; Brenner, Hermann; Hebebrand, Johannes; Herpertz-Dahlmann, Beate

    2005-01-01

    Background: To assess the prevalence of obesity, obesity-related binge eating, non-obesity-related binge eating, and night eating in five- to six-year-old children and to examine the impact of parental eating disturbances. Methods: When 2020 children attended their obligatory health exam prior to school entry in the city of Aachen, Germany, 1979…

  18. An Exploration of the Differing Perceptions of Problem-Based Learning (PBL) from Students and Facilitators of Diverse Cultural Backgrounds, in the Fields of Theological and Nursing Education

    ERIC Educational Resources Information Center

    Fung, Nancy L. Y.

    2013-01-01

    Theological education has not widely utilized the PBL approach and there is very little research examining the utility of PBL in theological education. Lectures are currently the preferred teaching method in theological education, however, it is recognized that there is a need for a more holistic approach. As theological education is used in both…

  19. Background radiation measurements at high power research reactors

    NASA Astrophysics Data System (ADS)

    Ashenfelter, J.; Balantekin, B.; Baldenegro, C. X.; Band, H. R.; Barclay, G.; Bass, C. D.; Berish, D.; Bowden, N. S.; Bryan, C. D.; Cherwinka, J. J.; Chu, R.; Classen, T.; Davee, D.; Dean, D.; Deichert, G.; Dolinski, M. J.; Dolph, J.; Dwyer, D. A.; Fan, S.; Gaison, J. K.; Galindo-Uribarri, A.; Gilje, K.; Glenn, A.; Green, M.; Han, K.; Hans, S.; Heeger, K. M.; Heffron, B.; Jaffe, D. E.; Kettell, S.; Langford, T. J.; Littlejohn, B. R.; Martinez, D.; McKeown, R. D.; Morrell, S.; Mueller, P. E.; Mumm, H. P.; Napolitano, J.; Norcini, D.; Pushin, D.; Romero, E.; Rosero, R.; Saldana, L.; Seilhan, B. S.; Sharma, R.; Stemen, N. T.; Surukuchi, P. T.; Thompson, S. J.; Varner, R. L.; Wang, W.; Watson, S. M.; White, B.; White, C.; Wilhelmi, J.; Williams, C.; Wise, T.; Yao, H.; Yeh, M.; Yen, Y.-R.; Zhang, C.; Zhang, X.; Prospect Collaboration

    2016-01-01

    Research reactors host a wide range of activities that make use of the intense neutron fluxes generated at these facilities. Recent interest in performing measurements with relatively low event rates, e.g. reactor antineutrino detection, at these facilities necessitates a detailed understanding of background radiation fields. Both reactor-correlated and naturally occurring background sources are potentially important, even at levels well below those of importance for typical activities. Here we describe a comprehensive series of background assessments at three high-power research reactors, including γ-ray, neutron, and muon measurements. For each facility we describe the characteristics and identify the sources of the background fields encountered. The general understanding gained of background production mechanisms and their relationship to facility features will prove valuable for the planning of any sensitive measurement conducted therein.

  20. Generation of Rising-tone Chorus in a Two-dimensional Mirror Field by Using the General Curvilinear PIC Code

    NASA Astrophysics Data System (ADS)

    Ke, Y.; Gao, X.; Lu, Q.; Wang, X.; Wang, S.

    2017-12-01

Recently, the generation of rising-tone chorus has been implemented with one-dimensional (1-D) particle-in-cell (PIC) simulations in an inhomogeneous background magnetic field, where both the propagation of waves and the motion of electrons are simply forced to be parallel to the background magnetic field. We have developed a two-dimensional (2-D) general curvilinear PIC simulation code and successfully reproduced rising-tone chorus waves excited from an anisotropic electron distribution in a 2-D mirror field. Our simulation results show that whistler waves are mainly generated around the magnetic equator and continue to grow as they propagate toward higher-latitude regions. The rising-tone chorus waves are formed off the magnetic equator and propagate quasi-parallel to the background magnetic field with a finite wave normal angle. Because of this propagation effect, the wave normal angle of the chorus waves increases as they propagate toward higher-latitude regions along a sufficiently curved field line. The chirping rate of the chorus waves is found to be larger along field lines closer to the middle field line of the mirror field.

  1. 7 CFR 1944.404 - Eligibility.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATIONS (CONTINUED) HOUSING Self-Help Technical Assistance Grants § 1944.404 Eligibility. To receive a... background and experience with proven ability to perform responsibly in the field of mutual self-help or... in the field of mutual self-help; or (2) Be sponsored by an organization with background experience...

  2. BRST-BFV analysis of anomalies in bosonic string theory interacting with background gravitational field

    NASA Astrophysics Data System (ADS)

    Buchbinder, I. L.; Mistchuk, B. R.; Pershin, V. D.

    1995-02-01

    A general BRST-BFV analysis of the anomaly in string theory coupled to background fields is carried out. An exact equation for the c-valued symbol of the anomaly operator is found and the structure of its solution is studied.

  3. Evanescent-wave and ambient chiral sensing by signal-reversing cavity ringdown polarimetry.

    PubMed

    Sofikitis, Dimitris; Bougas, Lykourgos; Katsoprinakis, Georgios E; Spiliotis, Alexandros K; Loppinet, Benoit; Rakitzis, T Peter

    2014-10-02

Detecting and quantifying chirality is important in fields ranging from analytical and biological chemistry to pharmacology and fundamental physics: it can aid drug design and synthesis, contribute to protein structure determination, and help detect parity violation of the weak force. Recent developments employ microwaves, femtosecond pulses, superchiral light or photoionization to determine chirality, yet the most widely used methods remain the traditional methods of measuring circular dichroism and optical rotation. However, these signals are typically very weak against larger time-dependent backgrounds. Cavity-enhanced optical methods can be used to amplify weak signals by passing them repeatedly through an optical cavity, and two-mirror cavities achieving up to 10^5 cavity passes have enabled absorption and birefringence measurements with record sensitivities. But chiral signals cancel when passing back and forth through a cavity, while the ubiquitous spurious linear birefringence background is enhanced. Even when intracavity optics overcome these problems, absolute chirality measurements remain difficult and sometimes impossible. Here we use a pulsed-laser bowtie cavity ringdown polarimeter with counter-propagating beams to enhance chiral signals by a factor equal to the number of cavity passes (typically >10^3); to suppress the effects of linear birefringence by means of a large induced intracavity Faraday rotation; and to effect rapid signal reversals by reversing the Faraday rotation and subtracting signals from the counter-propagating beams. These features allow absolute chiral signal measurements in environments where background subtraction is not feasible: we determine optical rotation from α-pinene vapour in open air, and from maltodextrin and fructose solutions in the evanescent wave produced by total internal reflection at a prism surface. The limits of the present polarimeter, when using a continuous-wave laser locked to a stable, high-finesse cavity, should match the sensitivity of linear birefringence measurements (3 × 10^-13 radians), which is several orders of magnitude more sensitive than current chiral detection limits and is expected to transform chiral sensing in many fields.

  4. Moving branes in the presence of background tachyon fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rezaei, Z., E-mail: z.rezaei@aut.ac.ir; Kamani, D., E-mail: kamani@aut.ac.ir

    2011-12-15

We compute the boundary state associated with a moving Dp-brane in the presence of the open string tachyon field as a background field. The effect of the tachyon condensation on the boundary state is discussed. It leads to a boundary state associated with a lower-dimensional moving D-brane or a stationary instantonic D-brane. The former originates from condensation along the spatial directions and the latter comes from the temporal direction of the D-brane worldvolume. Using the boundary state, we also study the interaction amplitude between two arbitrary Dp1- and Dp2-branes. The long-range behavior of the amplitude is investigated, demonstrating an obvious deviation from the conventional form due to the presence of the background tachyon field.

  5. Temperature-and field dependent characterization of a twisted stacked-tape cable

    NASA Astrophysics Data System (ADS)

    Barth, C.; Takayasu, M.; Bagrets, N.; Bayer, C. M.; Weiss, K.-P.; Lange, C.

    2015-04-01

The twisted stacked-tape cable (TSTC) is one of the major high temperature superconductor cable concepts, combining scalability, ease of fabrication, and high current density, making it a possible candidate as a conductor for large-scale magnets. To simulate the boundary conditions of such magnets as well as the temperature dependence of TSTCs, a 1.16 m long sample consisting of 40 SuperPower REBCO tapes, each 4 mm wide, is characterized using the ‘FBI’ (force-field-current) superconductor test facility of the Institute for Technical Physics of the Karlsruhe Institute of Technology. In a first step, the magnetic background field is cycled while measuring the current carrying capabilities to determine the impact of Lorentz forces on the TSTC sample performance. In the first field cycle, the critical current of the TSTC sample is tested up to 12 T. A significant Lorentz load of up to 65.6 kN m^-1 at the maximal magnetic background field of 12 T results in an 11.8% irreversible degradation of the current carrying capabilities. The degradation saturates (critical cable current of 5.46 kA at 4.2 K and 12 T background field) and does not increase in subsequent field cycles. In a second step, the sample is characterized at different background fields (4-12 T) and surface temperatures (4.2-37.8 K) utilizing the variable temperature insert of the ‘FBI’ test facility. In a third step, the performance along the length of the sample is determined at 77 K, self-field. A 15% degradation is observed in the central part of the sample, which was within the high-field region of the magnet during the in-field measurements.
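As a back-of-envelope aside (our arithmetic, not part of the record), the quoted Lorentz load follows directly from the force per unit length on a current-carrying conductor, F/L = I·B:

```python
# Back-of-envelope check of the quoted Lorentz load (ours, not the paper's):
# the force per unit length on a conductor carrying current I transverse to
# a magnetic field B is F/L = I * B.
current = 5.46e3                      # A, critical cable current (4.2 K, 12 T)
field = 12.0                          # T, background field
force_per_length = current * field    # N/m
print(force_per_length / 1e3)         # ~65.5 kN/m, close to the quoted 65.6 kN/m
```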

  6. Gamma neutron assay method and apparatus

    DOEpatents

    Cole, J.D.; Aryaeinejad, R.; Greenwood, R.C.

    1995-01-03

    The gamma neutron assay technique is an alternative method to standard safeguards techniques for the identification and assaying of special nuclear materials in a field or laboratory environment, as a tool for dismantlement and destruction of nuclear weapons, and to determine the isotopic ratios for a blend-down program on uranium. It is capable of determining the isotopic ratios of fissionable material from the spontaneous or induced fission of a sample to within approximately 0.5%. This is based upon the prompt coincidence relationships that occur in the fission process and the proton conservation and quasi-conservation of nuclear mass (A) that exists between the two fission fragments. The system is used in both passive (without an external neutron source) and active (with an external neutron source) mode. The apparatus consists of an array of neutron and gamma-ray detectors electronically connected to determine coincident events. The method can also be used to assay radioactive waste which contains fissile material, even in the presence of a high background radiation field. 7 figures.

  7. Gamma neutron assay method and apparatus

    DOEpatents

    Cole, Jerald D.; Aryaeinejad, Rahmat; Greenwood, Reginald C.

    1995-01-01

The gamma neutron assay technique is an alternative method to standard safeguards techniques for the identification and assaying of special nuclear materials in a field or laboratory environment, as a tool for dismantlement and destruction of nuclear weapons, and to determine the isotopic ratios for a blend-down program on uranium. It is capable of determining the isotopic ratios of fissionable material from the spontaneous or induced fission of a sample to within approximately 0.5%. This is based upon the prompt coincidence relationships that occur in the fission process and the proton conservation and quasi-conservation of nuclear mass (A) that exists between the two fission fragments. The system is used in both passive (without an external neutron source) and active (with an external neutron source) mode. The apparatus consists of an array of neutron and gamma-ray detectors electronically connected to determine coincident events. The method can also be used to assay radioactive waste which contains fissile material, even in the presence of a high background radiation field.

  8. CB4-03: An Eye on the Future: A Review of Data Virtualization Techniques to Improve Research Analytics

    PubMed Central

    Richter, Jack; McFarland, Lela; Bredfeldt, Christine

    2012-01-01

    Background/Aims Integrating data across systems can be a daunting process. The traditional method of moving data to a common location, mapping fields with different formats and meanings, and performing data cleaning activities to ensure valid and reliable integration across systems can be both expensive and extremely time consuming. As the scope of needed research data increases, the traditional methodology may not be sustainable. Data Virtualization provides an alternative to traditional methods that may reduce the effort required to integrate data across disparate systems. Objective Our goal was to survey new methods in data integration, cloud computing, enterprise data management and virtual data management for opportunities to increase the efficiency of producing VDW and similar data sets. Methods Kaiser Permanente Information Technology (KPIT), in collaboration with the Mid-Atlantic Permanente Research Institute (MAPRI) reviewed methodologies in the burgeoning field of Data Virtualization. We identified potential strengths and weaknesses of new approaches to data integration. For each method, we evaluated its potential application for producing effective research data sets. Results Data Virtualization provides opportunities to reduce the amount of data movement required to integrate data sources on different platforms in order to produce research data sets. Additionally, Data Virtualization also includes methods for managing “fuzzy” matching used to match fields known to have poor reliability such as names, addresses and social security numbers. These methods could improve the efficiency of integrating state and federal data such as patient race, death, and tumors with internal electronic health record data. Discussion The emerging field of Data Virtualization has considerable potential for increasing the efficiency of producing research data sets. 
An important next step will be to develop a proof of concept project that will help us understand the benefits and drawbacks of these techniques.

  9. Real time standoff gas detection and environmental monitoring with LWIR hyperspectral imager

    NASA Astrophysics Data System (ADS)

    Prel, Florent; Moreau, Louis; Lavoie, Hugo; Bouffard, François; Thériault, Jean-Marc; Vallieres, Christian; Roy, Claude; Dubé, Denis

    2012-10-01

MR-i is a dual-band Hyperspectral Imaging Spectro-radiometer. This field instrument generates spectral datacubes in the MWIR and LWIR. MR-i is modular and can be configured in different ways. One of its configurations is optimized for standoff measurements of gases in differential mode. In this mode, the instrument is equipped with a dual-input telescope to perform optical background subtraction. The resulting signal is the difference between the spectral radiances entering the two input ports, so the signal from the background is automatically removed from the signal of the target of interest. The spectral range of this configuration extends into the VLWIR (cut-off near 14 μm) to take full advantage of the LW atmospheric window.

  10. Hawkes process model with a time-dependent background rate and its application to high-frequency financial data.

    PubMed

    Omi, Takahiro; Hirata, Yoshito; Aihara, Kazuyuki

    2017-07-01

    A Hawkes process model with a time-varying background rate is developed for analyzing the high-frequency financial data. In our model, the logarithm of the background rate is modeled by a linear model with a relatively large number of variable-width basis functions, and the parameters are estimated by a Bayesian method. Our model can capture not only the slow time variation, such as in the intraday seasonality, but also the rapid one, which follows a macroeconomic news announcement. By analyzing the tick data of the Nikkei 225 mini, we find that (i) our model is better fitted to the data than the Hawkes models with a constant background rate or a slowly varying background rate, which have been commonly used in the field of quantitative finance; (ii) the improvement in the goodness-of-fit to the data by our model is significant especially for sessions where considerable fluctuation of the background rate is present; and (iii) our model is statistically consistent with the data. The branching ratio, which quantifies the level of the endogeneity of markets, estimated by our model is 0.41, suggesting the relative importance of exogenous factors in the market dynamics. We also demonstrate that it is critically important to appropriately model the time-dependent background rate for the branching ratio estimation.
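As an illustrative aside (a minimal sketch in our own notation; the record's basis-function background model and Bayesian estimation procedure are not reproduced), a Hawkes intensity with an exponential kernel and a time-varying background rate can be written as:

```python
import numpy as np

# Minimal Hawkes intensity sketch: lambda(t) = mu(t) + sum over past events
# of alpha * beta * exp(-beta * (t - t_i)). With this parameterization the
# kernel integrates to alpha, so alpha is the branching ratio.
def hawkes_intensity(t, events, mu, alpha=0.41, beta=2.0):
    past = events[events < t]
    return mu(t) + alpha * beta * np.exp(-beta * (t - past)).sum()

# Hypothetical slowly varying background (one sinusoidal basis function
# standing in for the paper's many variable-width basis functions).
mu = lambda t: np.exp(0.1 * np.sin(2.0 * np.pi * t / 10.0))

events = np.array([1.0, 1.5, 4.0])
lam = hawkes_intensity(5.0, events, mu)   # background plus excitation terms
```

Here alpha = 0.41 echoes the branching ratio reported in the abstract; all other numbers are arbitrary illustration values.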

  11. Hawkes process model with a time-dependent background rate and its application to high-frequency financial data

    NASA Astrophysics Data System (ADS)

    Omi, Takahiro; Hirata, Yoshito; Aihara, Kazuyuki

    2017-07-01

    A Hawkes process model with a time-varying background rate is developed for analyzing the high-frequency financial data. In our model, the logarithm of the background rate is modeled by a linear model with a relatively large number of variable-width basis functions, and the parameters are estimated by a Bayesian method. Our model can capture not only the slow time variation, such as in the intraday seasonality, but also the rapid one, which follows a macroeconomic news announcement. By analyzing the tick data of the Nikkei 225 mini, we find that (i) our model is better fitted to the data than the Hawkes models with a constant background rate or a slowly varying background rate, which have been commonly used in the field of quantitative finance; (ii) the improvement in the goodness-of-fit to the data by our model is significant especially for sessions where considerable fluctuation of the background rate is present; and (iii) our model is statistically consistent with the data. The branching ratio, which quantifies the level of the endogeneity of markets, estimated by our model is 0.41, suggesting the relative importance of exogenous factors in the market dynamics. We also demonstrate that it is critically important to appropriately model the time-dependent background rate for the branching ratio estimation.

  12. Analysis of variation in calibration curves for Kodak XV radiographic film using model-based parameters.

    PubMed

    Hsu, Shu-Hui; Kulasekere, Ravi; Roberson, Peter L

    2010-08-05

Film calibration is time-consuming when dose accuracy is essential across a range of photon scatter environments. This study uses the single-target single-hit model of film response to fit the calibration curves as a function of calibration method, processor condition, field size and depth. Kodak XV film was irradiated perpendicular to the beam axis in a solid water phantom. Standard calibration films (one dose point per film) were irradiated at 90 cm source-to-surface distance (SSD) for various doses (16-128 cGy), depths (0.2, 0.5, 1.5, 5, 10 cm) and field sizes (5 × 5, 10 × 10 and 20 × 20 cm²). The 8-field calibration method (eight dose points per film) was used as a reference for each experiment, taken at 95 cm SSD and 5 cm depth. The delivered doses were measured using an Attix parallel plate chamber for improved accuracy of dose estimation in the buildup region. Three fitting methods with one to three dose points per calibration curve were investigated for the field sizes of 5 × 5, 10 × 10 and 20 × 20 cm². The inter-day variations of the model parameters (background, saturation and slope) were 1.8%, 5.7%, and 7.7% (1σ) using the 8-field method. The saturation parameter ratio of standard to 8-field curves was 1.083 ± 0.005. The slope parameter ratio of standard to 8-field curves ranged from 0.99 to 1.05, depending on field size and depth; it decreases with increasing depth below 0.5 cm for the three field sizes and increases with increasing depth above 0.5 cm. A calibration curve with one to three dose points fitted with the model can achieve 2% accuracy in film dosimetry for various irradiation conditions. The proposed fitting methods may reduce workload while providing energy dependence correction in radiographic film dosimetry. This study is limited to radiographic XV film with a Lumisys scanner.
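For orientation only, a minimal sketch of a single-target single-hit response and its inverse (our notation and hypothetical parameter values, not the study's data): net optical density rises toward a saturation value as dose increases, and the inverse reads dose back off a calibration curve.

```python
import numpy as np

# Sketch of a single-target single-hit film response (our notation):
# optical density saturates exponentially with dose.
def single_hit(dose, background, saturation, slope):
    return background + saturation * (1.0 - np.exp(-slope * dose))

def dose_from_density(od, background, saturation, slope):
    # Inverse of the model, used to read dose back off a calibration curve.
    return -np.log(1.0 - (od - background) / saturation) / slope

params = dict(background=0.2, saturation=2.5, slope=0.01)   # hypothetical
doses = np.array([16.0, 32.0, 64.0, 128.0])                 # cGy
od = single_hit(doses, **params)
recovered = dose_from_density(od, **params)                 # round-trips to doses
```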

  13. Image analysis for skeletal evaluation of carpal bones

    NASA Astrophysics Data System (ADS)

    Ko, Chien-Chuan; Mao, Chi-Wu; Lin, Chi-Jen; Sun, Yung-Nien

    1995-04-01

The assessment of bone age is an important field in pediatric radiology. It provides very important information for the treatment and prediction of skeletal growth in a developing child. Various computerized algorithms for automatically assessing skeletal growth have been reported, most of which attempt to analyze phalangeal growth. The most fundamental step in these automatic measurement methods is image segmentation, which extracts bones from soft tissue and background. These automatic segmentation methods for hand radiographs can roughly be categorized into two main approaches: edge-based and region-based methods. This paper presents a region-based carpal-bone segmentation approach. It is organized into four stages: contrast enhancement, moment-preserving thresholding, morphological processing, and region-growing labeling.

  14. A Comparison of Signal Enhancement Methods for Extracting Tonal Acoustic Signals

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.

    1998-01-01

    The measurement of pure tone acoustic pressure signals in the presence of masking noise, often generated by mean flow, is a continual problem in the field of passive liner duct acoustics research. In support of the Advanced Subsonic Technology Noise Reduction Program, methods were investigated for conducting measurements of advanced duct liner concepts in harsh, aeroacoustic environments. This report presents the results of a comparison study of three signal extraction methods for acquiring quality acoustic pressure measurements in the presence of broadband noise (used to simulate the effects of mean flow). The performance of each method was compared to a baseline measurement of a pure tone acoustic pressure 3 dB above a uniform, broadband noise background.
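As a hedged illustration of one generic tone-extraction approach (ours; the report's three compared methods are not identified here), lock-in style demodulation at the known tone frequency recovers a pure tone's amplitude and phase from uncorrelated broadband noise:

```python
import numpy as np

# Lock-in style extraction of a known pure tone from broadband noise
# (a generic sketch, not one of the report's specific methods).
def extract_tone(signal, fs, f0):
    t = np.arange(signal.size) / fs
    ref = np.exp(-2j * np.pi * f0 * t)
    c = 2.0 * np.mean(signal * ref)      # complex tone estimate A * e^{j phi}
    return np.abs(c), np.angle(c)

fs, f0 = 8192.0, 1000.0                  # sample rate and tone frequency (Hz)
t = np.arange(int(fs)) / fs              # one second of data
rng = np.random.default_rng(1)
x = 0.5 * np.cos(2 * np.pi * f0 * t + 0.3) + rng.normal(0.0, 1.0, t.size)
amp, phase = extract_tone(x, fs, f0)     # recovers ~0.5 and ~0.3 rad
```

Averaging over N samples suppresses the uncorrelated noise contribution by roughly sqrt(2/N), which is the sense in which masking noise is averaged away.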

  15. Robust active noise control in the loadmaster area of a military transport aircraft.

    PubMed

    Kochan, Kay; Sachau, Delf; Breitbach, Harald

    2011-05-01

The active noise control (ANC) method is based on the superposition of a disturbance noise field with a second anti-noise field using loudspeakers and error microphones. This method can be used to reduce the noise level inside the cabin of a propeller aircraft. However, during the design process of an ANC system, extensive measurements of transfer functions are necessary to optimize the loudspeaker and microphone positions. Sometimes the transducer positions have to be tailored according to the optimization results to achieve sufficient noise reduction. The purpose of this paper is to introduce a controller design method for such narrow-band ANC systems. The method can be seen as an extension of common transducer placement optimization procedures. In the presented method, individual weighting parameters for the loudspeakers and microphones are used; with this procedure, the tailoring of the transducer positions is replaced by the adjustment of controller parameters. Moreover, the ANC system is robust because the uncertainties are considered during the optimization of the controller parameters. The paper describes the necessary theoretical background for the method and demonstrates its efficiency in an acoustical mock-up of a military transport aircraft.

  16. A new method to unveil embedded stellar clusters

    NASA Astrophysics Data System (ADS)

    Lombardi, Marco; Lada, Charles J.; Alves, João

    2017-11-01

In this paper we present a novel method to identify and characterize stellar clusters deeply embedded in a dark molecular cloud. The method is based on measuring stellar surface density in wide-field infrared images using star counting techniques. It takes advantage of the differing H-band luminosity functions (HLFs) of field stars and young stellar populations and is able to statistically classify each star in an image as a member of either the background stellar population or a young stellar population projected on or near the cloud. Moreover, the technique corrects for the effects of differential extinction toward each individual star. We have tested this method against simulations as well as observations. In particular, we have applied the method to 2MASS point sources observed in the Orion A and B complexes, and the results compare very well with those obtained from deep Spitzer and Chandra observations, where the presence of infrared excess or X-ray emission directly determines membership status for every star. Additionally, our method identifies unobscured clusters, and a low-resolution version of the Orion stellar surface density map clearly shows the relatively unobscured and diffuse OB 1a and 1b sub-groups and provides useful insights into their spatial distribution.
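The statistical-membership idea can be caricatured as follows (a toy sketch with hypothetical surface densities; the paper's HLF-based, extinction-corrected likelihood is far richer):

```python
import numpy as np

# Toy statistical membership: given the local total stellar surface density
# and the expected field-star density, the density excess attributable to a
# young population sets the membership probability of a star at that spot.
def membership_probability(sigma_total, sigma_field):
    excess = np.clip(sigma_total - sigma_field, 0.0, None)
    return excess / sigma_total

sigma_total = np.array([2.0, 5.0, 20.0])   # stars/arcmin^2, hypothetical
sigma_field = np.array([2.0, 2.0, 2.0])    # expected background level
p = membership_probability(sigma_total, sigma_field)   # [0.0, 0.6, 0.9]
```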

  17. Dim target trajectory-associated detection in bright earth limb background

    NASA Astrophysics Data System (ADS)

    Chen, Penghui; Xu, Xiaojian; He, Xiaoyu; Jiang, Yuesong

    2015-09-01

The intense emission of the earth limb in the sensor's field of view contributes strongly to the observation images. Because of the low signal-to-noise ratio (SNR), detecting small targets in an earth-limb background is a challenge, especially detecting point-like targets from a single frame. To improve target detection, track before detection (TBD) based on the frame sequence is performed. In this paper, a new technique is proposed to determine target-associated trajectories, which jointly carries out background removal, maximum value projection (MVP), and the Hough transform. The background of the bright earth limb in the observation images is removed according to its profile characteristics. For a moving target, the corresponding pixels in the MVP image shift approximately regularly in time sequence, and the target trajectory is determined by the Hough transform according to the pixel characteristics of the target, clutter, and noise. Compared with traditional frame-by-frame methods, determining associated trajectories from the MVP reduces the computational load. Numerical simulations are presented to demonstrate the effectiveness of the proposed approach.
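A minimal sketch of the MVP-plus-Hough idea (our simplification, with a toy moving target; the paper's limb background removal and clutter handling are omitted):

```python
import numpy as np

# Collapse a background-subtracted frame stack with a maximum value
# projection (MVP); a point target moving steadily across the frames then
# traces a line in the MVP image, which a Hough accumulator picks out.
def mvp(frames):
    return frames.max(axis=0)              # pixel-wise max over the sequence

def hough_lines(img, threshold, n_theta=180):
    h, w = img.shape
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(h, w)))
    acc = np.zeros((2 * diag, n_theta), dtype=np.int64)
    for y, x in zip(*np.nonzero(img > threshold)):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1   # one vote per angle
    return acc, thetas, diag

# Toy sequence: a target moving one pixel per frame along a diagonal.
frames = np.zeros((8, 32, 32))
for k in range(8):
    frames[k, 5 + k, 5 + k] = 1.0
acc, thetas, diag = hough_lines(mvp(frames), threshold=0.5)
rho_idx, theta_idx = np.unravel_index(acc.argmax(), acc.shape)
```

All eight target pixels are collinear, so they pile into a single accumulator peak (8 votes) near θ = 3π/4, ρ = 0, which is the trajectory association step in miniature.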

  18. Single and tandem Fabry-Perot etalons as solar background filters for lidar.

    PubMed

    McKay, J A

    1999-09-20

Atmospheric lidar is difficult in daylight because of sunlight scattered into the receiver field of view. In this research, methods for the design and performance analysis of Fabry-Perot etalons as solar background filters are presented. The factor by which the signal-to-background ratio is enhanced is defined as a measure of the performance of the etalon as a filter. Equations for evaluating this parameter are presented for single-, double-, and triple-etalon filter systems. The role of reflective coupling between etalons is examined and shown to substantially reduce the contributions of the second and third etalons to the filter performance. Attenuators placed between the etalons can improve the filter performance at modest cost to the signal transmittance. The principal parameter governing the performance of the etalon filters is the etalon defect finesse. Practical limitations on etalon plate smoothness and parallelism cause the defect finesse to be relatively low, especially in the ultraviolet, and this sets upper limits on the capability of tandem etalon filters to suppress the solar background at tolerable cost to the signal.
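For a single lossless etalon, a signal-to-background enhancement of the kind the paper defines can be estimated from the Airy transmission function; a hedged numerical sketch in our own notation:

```python
import numpy as np

# Airy transmission of a lossless Fabry-Perot etalon versus round-trip
# phase delta, and the factor by which it improves the signal-to-background
# ratio for a narrow-line signal sitting on a transmission peak against a
# spectrally flat (solar) background.
def airy_transmission(delta, finesse):
    F = (2.0 * finesse / np.pi) ** 2       # coefficient of finesse
    return 1.0 / (1.0 + F * np.sin(delta / 2.0) ** 2)

def sbr_enhancement(finesse, n=20000):
    # Signal: peak transmission = 1 (lossless); background: band-averaged
    # transmission over one free spectral range.
    delta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return 1.0 / airy_transmission(delta, finesse).mean()

enhancement = sbr_enhancement(30.0)        # ~19.1 for finesse 30
```

Analytically the band-averaged transmission is 1/sqrt(1 + F), so the enhancement approaches 2·finesse/π at high finesse; plate-defect and absorption losses, central to the paper's conclusions, are ignored in this sketch.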

  19. Infrared small target detection in heavy sky scene clutter based on sparse representation

    NASA Astrophysics Data System (ADS)

    Liu, Depeng; Li, Zhengzhou; Liu, Bing; Chen, Wenhao; Liu, Tianmei; Cao, Lei

    2017-09-01

A novel infrared small target detection method based on sparse representation of sky clutter and target is proposed in this paper to cope with the uncertainty in representing clutter and target. The sky background clutter is described by a fractal random field and is perceived and eliminated via sparse representation over a fractal background over-complete dictionary (FBOD). The infrared small target signal is modeled by a generalized Gaussian intensity model and is expressed by a generalized Gaussian target over-complete dictionary (GGTOD), which can describe small targets more efficiently than traditional structured dictionaries. The infrared image is decomposed on the union of the FBOD and GGTOD, and the sparse representation energies of the target signal and the background clutter on the GGTOD differ so distinctly that this energy is adopted to distinguish target from clutter. Experiments are conducted, and the results show that the proposed approach improves small target detection performance, especially under heavy clutter, because background clutter can be efficiently perceived and suppressed by the FBOD while the varying target can be represented accurately by the GGTOD.

  20. The effect of finite field size on classification and atmospheric correction

    NASA Technical Reports Server (NTRS)

    Kaufman, Y. J.; Fraser, R. S.

    1981-01-01

The atmospheric effect on the upward radiance of sunlight scattered from the Earth-atmosphere system is strongly influenced by the contrasts between fields and by their sizes. For a given atmospheric turbidity, the atmospheric effect on classification of surface features is much stronger for nonuniform surfaces than for uniform surfaces. Therefore, the classification accuracy for agricultural fields and urban areas depends not only on the optical characteristics of the atmosphere, but also on the sizes of the surface fields. In some cases, atmospheric corrections that do not account for the nonuniformity of the surface have only a slight effect on the classification accuracy; in other cases the classification accuracy decreases. The radiances above finite fields were computed to simulate radiances measured by a satellite. A simulation case including 11 agricultural fields and four natural fields (water, soil, savannah, and forest) was used to test the effect of field size, background reflectance, and the optical thickness of the atmosphere on classification accuracy. It is concluded that new atmospheric correction methods, which take into account the finite size of the fields, have to be developed to significantly improve the classification accuracy.

  1. Basics of Bayesian methods.

    PubMed

    Ghosh, Sujit K

    2010-01-01

    Bayesian methods are rapidly becoming popular tools for making statistical inference in various fields of science including biology, engineering, finance, and genetics. One of the key aspects of Bayesian inferential method is its logical foundation that provides a coherent framework to utilize not only empirical but also scientific information available to a researcher. Prior knowledge arising from scientific background, expert judgment, or previously collected data is used to build a prior distribution which is then combined with current data via the likelihood function to characterize the current state of knowledge using the so-called posterior distribution. Bayesian methods allow the use of models of complex physical phenomena that were previously too difficult to estimate (e.g., using asymptotic approximations). Bayesian methods offer a means of more fully understanding issues that are central to many practical problems by allowing researchers to build integrated models based on hierarchical conditional distributions that can be estimated even with limited amounts of data. Furthermore, advances in numerical integration methods, particularly those based on Monte Carlo methods, have made it possible to compute the optimal Bayes estimators. However, there is a reasonably wide gap between the background of the empirically trained scientists and the full weight of Bayesian statistical inference. Hence, one of the goals of this chapter is to bridge the gap by offering elementary to advanced concepts that emphasize linkages between standard approaches and full probability modeling via Bayesian methods.
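    The prior-plus-likelihood update described above has a closed form for conjugate families; a minimal sketch using the Beta-Binomial pair, with invented numbers for illustration:

    ```python
    # Conjugate Beta-Binomial update: prior knowledge (pseudo-counts a, b) is
    # combined with observed data via the likelihood to give the posterior.
    def beta_binomial_update(a_prior, b_prior, successes, trials):
        """Return the Beta posterior parameters and the posterior mean."""
        a_post = a_prior + successes
        b_post = b_prior + (trials - successes)
        mean = a_post / (a_post + b_post)
        return a_post, b_post, mean

    # Prior: expert judgment worth 2 successes and 2 failures.
    # Data: 7 successes in 10 trials.
    a, b, m = beta_binomial_update(2, 2, 7, 10)
    print(a, b, round(m, 3))  # posterior Beta(9, 5), mean ~0.643
    ```

    The posterior mean sits between the prior mean (0.5) and the sample proportion (0.7), weighted by their effective sample sizes, which is the "coherent combination" of scientific and empirical information the abstract refers to.
    
    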

  2. [The psychiatric aspects of animal assisted therapy].

    PubMed

    Bánszky, Noémi; Kardos, Edina; Rózsa, Linda; Gerevich, József

    2012-01-01

    Animal assisted therapy is a well-known preventive and interventive method carried out with the contribution of specially trained animals and professionals. One of its main fields of indication is psychiatry. The purpose of this summary is to give a literature-based overview of the background, possible uses, and effectiveness of animal assisted therapy. It asks whether this therapeutic method can effectively ease the symptoms of specific psychiatric diseases and in which fields it can be used most effectively. The literature indicates that therapy supported by animals can provide effective help in various psychiatric settings, including prevention, intervention, and rehabilitation, regardless of age. It is mostly used in cases of depression, anxiety, addiction, schizophrenia, and autism spectrum disorder. Beyond these, it can also be used effectively in the rehabilitation of victims of sexual abuse, especially children, and can play a role in the re-socialization of maladapted adolescents and adults, for example through farm therapy. Experience shows that animal assisted therapies are effective in the following areas: improving social and communication skills, easing anxiety, improving mood, supporting independent living, and improving empathic skills.

  3. Stability of flat spacetime in quantum gravity

    NASA Astrophysics Data System (ADS)

    Jordan, R. D.

    1987-12-01

    In a previous paper, a modified effective-action formalism was developed which produces equations satisfied by the expectation value of the field, rather than the usual in-out average. Here this formalism is applied to a quantized scalar field in a background which is a small perturbation from Minkowski spacetime. The one-loop effective field equation describes the back reaction of created particles on the gravitational field, and is calculated in this paper to linear order in the perturbation. In this way we rederive an equation first found by Horowitz using completely different methods. This equation possesses exponentially growing solutions, so we confirm Horowitz's conclusion that flat spacetime is unstable in this approximation to the theory. The new derivation shows that the field equation is just as useful as the one-loop approximation to the in-out equation, contrary to earlier arguments. However, the instability suggests that the one-loop approximation cannot be trusted for gravity. These results are compared with the corresponding situation in QED and QCD.

  4. Axial segmentation of lungs CT scan images using canny method and morphological operation

    NASA Astrophysics Data System (ADS)

    Noviana, Rina; Febriani; Rasal, Isram; Lubis, Eva Utari Cintamurni

    2017-08-01

    Segmentation is a very important topic in digital image processing, found in various fields of image analysis, particularly medical imaging. Axial segmentation of lung CT scans is beneficial in diagnosing abnormalities and in surgery planning, as it allows every section of the lungs to be examined. The segmentation results can be used to detect the presence of nodules. The methods used in this analysis are image cropping, image binarization, Canny edge detection, and morphological operations. Image cropping is done to separate the lung area, which is the region of interest (ROI). Binarization generates a binary image with two grey levels, black and white, separating the ROI from the rest of the CT scan image. The Canny method is used for edge detection, and morphological operations are applied to smooth the lung edges. The segmentation methodology shows good results, obtaining a very smooth edge. Moreover, the image background can also be removed to isolate the main focus, the lungs.
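    The binarization, edge-detection, and morphological-smoothing steps can be sketched in plain NumPy; this illustration substitutes a simple neighbourhood edge rule for the Canny detector and uses a synthetic round region rather than real CT data:

    ```python
    import numpy as np

    def dilate(mask):
        """Binary dilation with a 3x3 square structuring element (pure NumPy)."""
        p = np.pad(mask, 1)  # pads with False
        out = np.zeros_like(mask)
        h, w = mask.shape
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                out |= p[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        return out

    def erode(mask):
        """Binary erosion via complementation (image border treated as inside)."""
        return ~dilate(~mask)

    # Synthetic 2-D "scan": a bright rounded region (the ROI) on a dark background.
    yy, xx = np.mgrid[0:64, 0:64]
    img = ((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2).astype(float)
    img += 0.05 * np.random.default_rng(1).standard_normal(img.shape)  # noise

    # 1) Binarization: a global threshold separates the ROI from the background.
    binary = img > 0.5
    # 2) Morphological closing (dilate, then erode) smooths the ragged boundary.
    closed = erode(dilate(binary))
    # 3) Edge map: mask pixels whose neighbourhood leaves the mask
    #    (a cheap stand-in for the Canny detector used in the paper).
    edges = closed & ~erode(closed)
    print(int(binary.sum()), int(closed.sum()), int(edges.sum()))
    ```

    Closing never removes foreground pixels, so the smoothed region contains the original one; the edge map is the one-pixel ring around the smoothed region.
    
    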

  5. Principles of PET/MR Imaging.

    PubMed

    Disselhorst, Jonathan A; Bezrukov, Ilja; Kolb, Armin; Parl, Christoph; Pichler, Bernd J

    2014-06-01

    Hybrid PET/MR systems have rapidly progressed from the prototype stage to systems that are increasingly being used in the clinics. This review provides an overview of developments in hybrid PET/MR systems and summarizes the current state of the art in PET/MR instrumentation, correction techniques, and data analysis. The strong magnetic field requires considerable changes in the manner by which PET images are acquired and has led, among others, to the development of new PET detectors, such as silicon photomultipliers. During more than a decade of active PET/MR development, several system designs have been described. The technical background of combined PET/MR systems is explained and related challenges are discussed. The necessity for PET attenuation correction required new methods based on MR data. Therefore, an overview of recent developments in this field is provided. Furthermore, MR-based motion correction techniques for PET are discussed, as integrated PET/MR systems provide a platform for measuring motion with high temporal resolution without additional instrumentation. The MR component in PET/MR systems can provide functional information about disease processes or brain function alongside anatomic images. Against this background, we point out new opportunities for data analysis in this new field of multimodal molecular imaging. © 2014 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  6. Fast words boundaries localization in text fields for low quality document images

    NASA Astrophysics Data System (ADS)

    Ilin, Dmitry; Novikov, Dmitriy; Polevoy, Dmitry; Nikolaev, Dmitry

    2018-04-01

    The paper examines the problem of precise localization of word boundaries in document text zones. Document processing on a mobile device consists of document localization, perspective correction, localization of individual fields, finding words in separate zones, segmentation, and recognition. While capturing an image with a mobile digital camera under uncontrolled conditions, digital noise, perspective distortions, or glares may occur. Further document processing is complicated by the specifics of documents: layout elements, complex backgrounds, static text, document security elements, and a variety of text fonts. Moreover, the problem of word boundary localization has to be solved at runtime on a mobile CPU with limited computing capabilities. At the moment, there are several groups of methods optimized for different conditions. Methods for scanned printed text are fast but limited to images of high quality. Methods for text in the wild have an excessively high computational complexity and are thus hardly suitable for running on mobile devices as part of a mobile document recognition system. The method presented in this paper solves a more specialized problem than finding text in natural images. It uses local features, a sliding window, and a lightweight neural network in order to achieve an optimal speed-precision ratio. The algorithm takes 12 ms per field on an ARM processor of a mobile device. The error rate for boundary localization on a test sample of 8000 fields is 0.3

  7. genRE: A Method to Extend Gridded Precipitation Climatology Data Sets in Near Real-Time for Hydrological Forecasting Purposes

    NASA Astrophysics Data System (ADS)

    van Osnabrugge, B.; Weerts, A. H.; Uijlenhoet, R.

    2017-11-01

    To enable operational flood forecasting and drought monitoring, reliable and consistent methods for precipitation interpolation are needed. Such methods need to deal with the deficiencies of sparse operational real-time data compared to quality-controlled offline data sources used in historical analyses. In particular, often only a fraction of the measurement network reports in near real-time. For this purpose, we present an interpolation method, generalized REGNIE (genRE), which makes use of climatological monthly background grids derived from existing gridded precipitation climatology data sets. We show how genRE can be used to mimic and extend climatological precipitation data sets in near real-time using (sparse) real-time measurement networks in the Rhine basin upstream of the Netherlands (approximately 160,000 km2). In the process, we create a 1.2 × 1.2 km transnational gridded hourly precipitation data set for the Rhine basin. Precipitation gauge data are collected, spatially interpolated for the period 1996-2015 with genRE and inverse-distance squared weighting (IDW), and then evaluated on the yearly and daily time scale against the HYRAS and EOBS climatological data sets. Hourly fields are compared qualitatively with RADOLAN radar-based precipitation estimates. Two sources of uncertainty are evaluated: station density and the impact of different background grids (HYRAS versus EOBS). The results show that the genRE method successfully mimics climatological precipitation data sets (HYRAS/EOBS) over daily, monthly, and yearly time frames. We conclude that genRE is a good interpolation method of choice for real-time operational use. genRE has the largest added value over IDW for cases with a low real-time station density and a high-resolution background grid.
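    As we read it, the genRE idea amounts to expressing sparse real-time observations relative to a climatological background grid, interpolating those ratios, and re-scaling the grid; a toy NumPy sketch under that assumption, with invented coordinates and a uniform wet anomaly:

    ```python
    import numpy as np

    def idw(xy_stations, values, xy_grid, power=2.0):
        """Inverse-distance-squared weighting of station values onto grid points."""
        d = np.linalg.norm(xy_grid[:, None, :] - xy_stations[None, :, :], axis=2)
        w = 1.0 / np.maximum(d, 1e-9) ** power
        return (w * values).sum(axis=1) / w.sum(axis=1)

    # Climatological monthly background grid (invented west-east gradient)
    # and a sparse real-time network of three stations.
    grid_xy = np.array([[x, y] for x in range(5) for y in range(5)], float)
    background = 50.0 + 5.0 * grid_xy[:, 0]          # mm/month
    stations = np.array([[0.0, 0.0], [4.0, 4.0], [2.0, 1.0]])
    station_bg = 50.0 + 5.0 * stations[:, 0]         # background at stations
    station_obs = 1.2 * station_bg                   # a uniformly wet month (+20%)

    # Interpolate station-to-climatology ratios, then re-scale the background.
    ratios = station_obs / station_bg
    field = idw(stations, ratios, grid_xy) * background
    print(np.allclose(field, 1.2 * background))
    ```

    Because the anomaly is expressed multiplicatively against the background, the interpolated field inherits the climatology's spatial structure even where no station reports, which is the stated advantage over plain IDW when station density is low.
    
    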

  8. The Impacts of Heating Strategy on Soil Moisture Estimation Using Actively Heated Fiber Optics.

    PubMed

    Dong, Jianzhi; Agliata, Rosa; Steele-Dunne, Susan; Hoes, Olivier; Bogaard, Thom; Greco, Roberto; van de Giesen, Nick

    2017-09-13

    Several recent studies have highlighted the potential of Actively Heated Fiber Optics (AHFO) for high resolution soil moisture mapping. In AHFO, the soil moisture can be calculated from the cumulative temperature (Tcum), the maximum temperature (Tmax), or the soil thermal conductivity determined from the cooling phase after heating (λ). This study investigates the performance of the Tcum, Tmax and λ methods for different heating strategies, i.e., differences in the duration and input power of the applied heat pulse. The aim is to compare the three approaches and to determine which is best suited to field applications where the power supply is limited. Results show that increasing the input power of the heat pulses makes it easier to differentiate between dry and wet soil conditions, which leads to an improved accuracy. Results suggest that if the power supply is limited, the heating strength is insufficient for the λ method to yield accurate estimates. Generally, the Tcum and Tmax methods have similar accuracy. If the input power is limited, increasing the heat pulse duration can improve the accuracy of the AHFO method for both of these techniques. In particular, extending the heating duration can significantly increase the sensitivity of Tcum to soil moisture. Hence, the Tcum method is recommended when the input power is limited. Finally, results also show that up to 50% of the cable temperature change during the heat pulse can be attributed to soil background temperature, i.e., soil temperature changed by the net solar radiation. A method is proposed to correct this background temperature change. Without correction, soil moisture information can be completely masked by the background temperature error.
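    The cumulative-temperature measure and a background-drift correction can be sketched as follows; the heated-cable response curve and the reference-section correction below are illustrative assumptions, not the authors' exact formulation:

    ```python
    import numpy as np

    t = np.arange(181.0)                   # seconds; a 3-minute heat pulse
    rise = 2.0 * np.log1p(t / 10.0)        # idealized heated-cable warming, K
    drift = 0.01 * t                       # solar background warming, K
    measured = rise + drift                # what the heated fiber records

    # Estimate the drift slope from an unheated reference section, which sees
    # only the background warming, and remove the linear trend.
    slope = np.polyfit(t, drift, 1)[0]
    corrected = measured - slope * t

    dt = 1.0
    t_cum_raw = measured.sum() * dt        # Tcum without correction
    t_cum_corr = corrected.sum() * dt      # Tcum after drift removal
    t_cum_true = rise.sum() * dt           # Tcum of the heating signal alone
    print(t_cum_raw - t_cum_true, t_cum_corr - t_cum_true)
    ```

    Even a modest 0.01 K/s background trend biases the uncorrected Tcum by far more than the heating signal's sensitivity to moisture, which is why the correction matters.
    
    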

  9. The Impacts of Heating Strategy on Soil Moisture Estimation Using Actively Heated Fiber Optics

    PubMed Central

    Dong, Jianzhi; Agliata, Rosa; Steele-Dunne, Susan; Hoes, Olivier; Bogaard, Thom; Greco, Roberto; van de Giesen, Nick

    2017-01-01

    Several recent studies have highlighted the potential of Actively Heated Fiber Optics (AHFO) for high resolution soil moisture mapping. In AHFO, the soil moisture can be calculated from the cumulative temperature (Tcum), the maximum temperature (Tmax), or the soil thermal conductivity determined from the cooling phase after heating (λ). This study investigates the performance of the Tcum, Tmax and λ methods for different heating strategies, i.e., differences in the duration and input power of the applied heat pulse. The aim is to compare the three approaches and to determine which is best suited to field applications where the power supply is limited. Results show that increasing the input power of the heat pulses makes it easier to differentiate between dry and wet soil conditions, which leads to an improved accuracy. Results suggest that if the power supply is limited, the heating strength is insufficient for the λ method to yield accurate estimates. Generally, the Tcum and Tmax methods have similar accuracy. If the input power is limited, increasing the heat pulse duration can improve the accuracy of the AHFO method for both of these techniques. In particular, extending the heating duration can significantly increase the sensitivity of Tcum to soil moisture. Hence, the Tcum method is recommended when the input power is limited. Finally, results also show that up to 50% of the cable temperature change during the heat pulse can be attributed to soil background temperature, i.e., soil temperature changed by the net solar radiation. A method is proposed to correct this background temperature change. Without correction, soil moisture information can be completely masked by the background temperature error. PMID:28902141

  10. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation. Volume 4; Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols; Revised

    NASA Technical Reports Server (NTRS)

    Mueller, J. L. (Editor); Fargion, Giuletta S. (Editor); McClain, Charles R. (Editor); Pegau, Scott; Zaneveld, J. Ronald V.; Mitchell, B. Gregg; Kahru, Mati; Wieland, John; Stramska, Malgorzat

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 (Mueller and Fargion 2002, Volumes 1 and 2) is entirely superseded by the six volumes of Revision 4 listed above.

  11. Establishing Substantial Equivalence: Metabolomics

    NASA Astrophysics Data System (ADS)

    Beale, Michael H.; Ward, Jane L.; Baker, John M.

    Modern ‘metabolomic’ methods allow us to compare levels of many structurally diverse compounds in an automated fashion across a large number of samples. This technology is ideally suited to screening of populations of plants, including trials where the aim is the determination of unintended effects introduced by GM. A number of metabolomic methods have been devised for the determination of substantial equivalence. We have developed a methodology, using [1H]-NMR fingerprinting, for metabolomic screening of plants and have applied it to the study of substantial equivalence of field-grown GM wheat. We describe here the principles and detail of that protocol as applied to the analysis of flour generated from field plots of wheat. Particular emphasis is given to the downstream data processing and comparison of spectra by multivariate analysis, from which conclusions regarding metabolome changes due to the GM can be assessed against the background of natural variation due to environment.

  12. Plasmonic photoluminescence for recovering native chemical information from surface-enhanced Raman scattering

    PubMed Central

    Lin, Kai-Qiang; Yi, Jun; Zhong, Jin-Hui; Hu, Shu; Liu, Bi-Ju; Liu, Jun-Yang; Zong, Cheng; Lei, Zhi-Chao; Wang, Xiang; Aizpurua, Javier; Esteban, Rubén; Ren, Bin

    2017-01-01

    Surface-enhanced Raman scattering (SERS) spectroscopy has attracted tremendous interests as a highly sensitive label-free tool. The local field produced by the excitation of localized surface plasmon resonances (LSPRs) dominates the overall enhancement of SERS. Such an electromagnetic enhancement is unfortunately accompanied by a strong modification in the relative intensity of the original Raman spectra, which highly distorts spectral features providing chemical information. Here we propose a robust method to retrieve the fingerprint of intrinsic chemical information from the SERS spectra. The method is established based on the finding that the SERS background originates from the LSPR-modulated photoluminescence, which contains the local field information shared also by SERS. We validate this concept of retrieval of intrinsic fingerprint information in well controlled single metallic nanoantennas of varying aspect ratios. We further demonstrate its unambiguity and generality in more complicated systems of tip-enhanced Raman spectroscopy (TERS) and SERS of silver nanoaggregates. PMID:28348368

  13. Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4, Volume IV: Inherent Optical Properties: Instruments, Characterizations, Field Measurements and Data Analysis Protocols

    NASA Technical Reports Server (NTRS)

    Mueller, J. L.; Fargion, G. S.; McClain, C. R. (Editor); Pegau, S.; Zanefeld, J. R. V.; Mitchell, B. G.; Kahru, M.; Wieland, J.; Stramska, M.

    2003-01-01

    This document stipulates protocols for measuring bio-optical and radiometric data for the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities and algorithm development. The document is organized into 6 separate volumes as Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 4. Volume I: Introduction, Background, and Conventions; Volume II: Instrument Specifications, Characterization and Calibration; Volume III: Radiometric Measurements and Data Analysis Methods; Volume IV: Inherent Optical Properties: Instruments, Characterization, Field Measurements and Data Analysis Protocols; Volume V: Biogeochemical and Bio-Optical Measurements and Data Analysis Methods; Volume VI: Special Topics in Ocean Optics Protocols and Appendices. The earlier version of Ocean Optics Protocols for Satellite Ocean Color Sensor Validation, Revision 3 is entirely superseded by the six volumes of Revision 4 listed above.

  14. Scalar field vacuum expectation value induced by gravitational wave background

    NASA Astrophysics Data System (ADS)

    Jones, Preston; McDougall, Patrick; Ragsdale, Michael; Singleton, Douglas

    2018-06-01

    We show that a massless scalar field in a gravitational wave background can develop a non-zero vacuum expectation value. We draw comparisons to the generation of a non-zero vacuum expectation value for a scalar field in the Higgs mechanism and with the dynamical Casimir vacuum. We propose that this vacuum expectation value, generated by a gravitational wave, can be connected with particle production from gravitational waves and may have consequences for the early Universe where scalar fields are thought to play an important role.

  15. Extracting harmonic signal from a chaotic background with local linear model

    NASA Astrophysics Data System (ADS)

    Li, Chenlong; Su, Liyun

    2017-02-01

    In this paper, the problems of blind detection and estimation of a harmonic signal in a strong chaotic background are analyzed, and new methods based on the local linear (LL) model are put forward. The LL model has been exhaustively researched and successfully applied to fitting and forecasting chaotic signals in many fields, and we substantially enlarge its modeling capacity. Firstly, we predict the short-term chaotic signal and obtain the fitting error based on the LL model. We then detect frequencies in the fitting error by periodogram; a previously unaddressed property of the fitting error is proposed, which ensures that the detected frequencies match those of the harmonic signal. Secondly, we establish a two-layer LL model to estimate the deterministic harmonic signal in a strong chaotic background. To perform this estimation simply and effectively, we develop an efficient backfitting algorithm to select and optimize the parameters, which are hard to search exhaustively. In the method, based on the sensitivity of chaotic motion to initial values, the minimum fitting error criterion is used as the objective function to estimate the parameters of the two-layer LL model. Simulation shows that the two-layer LL model and its estimation technique have appreciable flexibility in modeling the deterministic harmonic signal in different chaotic backgrounds (Lorenz, Henon and Mackey-Glass (M-G) equations). Specifically, the harmonic signal can be extracted well at low SNR, and the developed backfitting algorithm converges within 3-5 iterations.
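    The first step (local linear prediction, then a periodogram of the fitting error) can be sketched in NumPy; the Henon map stands in for the chaotic background, and the embedding dimension, neighbour count, and harmonic amplitude are illustrative choices, not the paper's tuned values:

    ```python
    import numpy as np

    # Chaotic background: the Henon map's x-coordinate.
    n = 2000
    x = np.zeros(n); y = np.zeros(n); x[0] = y[0] = 0.1
    for i in range(1, n):
        x[i] = 1.0 - 1.4 * x[i - 1] ** 2 + y[i - 1]
        y[i] = 0.3 * x[i - 1]

    f0 = 0.17                                            # cycles per step
    s = x + 0.4 * np.sin(2 * np.pi * f0 * np.arange(n))  # harmonic in chaos

    # One-step local linear (LL) prediction: for each delay vector, fit an
    # affine model on its nearest neighbours and predict the next sample.
    m, k = 2, 30                                         # embedding dim., neighbours
    emb = np.stack([s[i:n - m + i] for i in range(m)], axis=1)
    target = s[m:]
    pred = np.empty_like(target)
    for j in range(len(target)):
        d = np.linalg.norm(emb - emb[j], axis=1)
        d[j] = np.inf                                    # exclude the point itself
        nb = np.argsort(d)[:k]
        A = np.hstack([emb[nb], np.ones((k, 1))])        # affine local model
        coef, *_ = np.linalg.lstsq(A, target[nb], rcond=None)
        pred[j] = np.hstack([emb[j], 1.0]) @ coef

    err = target - pred                                  # fitting error
    # Periodogram of the fitting error: the chaotic part is largely predicted
    # away, so the harmonic component stands out against a flattened floor.
    P = np.abs(np.fft.rfft(err - err.mean())) ** 2
    freqs = np.fft.rfftfreq(len(err))
    print(freqs[np.argmax(P)])
    ```

    The LL predictor absorbs most of the chaotic variance, so spectral peaks in the residual are candidates for the hidden harmonic's frequency.
    
    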

  16. C2-C6 background hydrocarbon concentrations monitored at a roof top and green park site, in Dublin City centre.

    PubMed

    O'Donoghue, R T; Broderick, B M

    2007-09-01

    A 5 week monitoring campaign was carried out in Dublin City centre to establish which site gave a more accurate background city-centre estimate: a roof-top or a green field site. This background represents a conservative estimate of hydrocarbon exposure in Dublin City centre, useful for quantifying health effects related to this form of pollution and for establishing a local background relative to the four surrounding main roads when the wind blows toward each road with the background receptor upwind. Over the entire monitoring campaign, the lowest concentrations and relative standard deviations were observed at the green field site, regardless of time of day or meteorological effects.

  17. Precise orbit determination based on raw GPS measurements

    NASA Astrophysics Data System (ADS)

    Zehentner, Norbert; Mayer-Gürr, Torsten

    2016-03-01

    Precise orbit determination is an essential part of most scientific satellite missions. Highly accurate knowledge of the satellite position is used to geolocate measurements of the onboard sensors. For applications in the field of gravity field research, the position itself can be used as an observation. In this context, kinematic orbits of low Earth orbiters (LEO) are widely used because they do not include a priori information about the gravity field. The limiting factor for the gravity field accuracy achievable through LEO positions is the orbit accuracy. We make use of raw Global Positioning System (GPS) observations to estimate the kinematic satellite positions. The method is based on the principles of precise point positioning. Systematic influences are reduced by modeling and correcting for all known error sources. Remaining effects, such as the ionospheric influence on signal propagation, are either unknown or not known to a sufficient level of accuracy and are therefore modeled as unknown parameters in the estimation process. Although this reduces the redundancy in the adjustment, the resulting improvement in orbit accuracy leads to a better gravity field estimation. This paper describes our orbit determination approach and its mathematical background. Some examples of real data applications highlight the feasibility of orbit determination based on raw GPS measurements. Its suitability for gravity field estimation is presented in a second step.

  18. An Intelligent Architecture Based on Field Programmable Gate Arrays Designed to Detect Moving Objects by Using Principal Component Analysis

    PubMed Central

    Bravo, Ignacio; Mazo, Manuel; Lázaro, José L.; Gardel, Alfredo; Jiménez, Pedro; Pizarro, Daniel

    2010-01-01

    This paper presents a complete implementation of the Principal Component Analysis (PCA) algorithm in Field Programmable Gate Array (FPGA) devices applied to high rate background segmentation of images. The classical sequential execution of different parts of the PCA algorithm has been parallelized. This parallelization has led to the specific development and implementation in hardware of the different stages of PCA, such as computation of the correlation matrix, matrix diagonalization using the Jacobi method and subspace projections of images. On the application side, the paper presents a motion detection algorithm, also entirely implemented on the FPGA, and based on the developed PCA core. This consists of dynamically thresholding the differences between the input image and the one obtained by expressing the input image using the PCA linear subspace previously obtained as a background model. The proposal achieves a high ratio of processed images (up to 120 frames per second) and high quality segmentation results, with a completely embedded and reliable hardware architecture based on commercial CMOS sensors and FPGA devices. PMID:22163406
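    The PCA background model can be sketched in NumPy (the FPGA pipeline itself is not reproduced); the scene, subspace dimension, and threshold below are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Learn a PCA subspace from background frames, express a new frame in that
    # subspace, and threshold the reconstruction difference to detect motion.
    h, w, n_frames = 16, 16, 40
    base = np.outer(np.linspace(0, 1, h), np.linspace(1, 0.5, w))  # static scene
    frames = np.stack([(base + 0.01 * rng.standard_normal((h, w))).ravel()
                       for _ in range(n_frames)])

    mean = frames.mean(axis=0)
    U, S, Vt = np.linalg.svd(frames - mean, full_matrices=False)
    basis = Vt[:3]                          # top principal components

    # New frame: background plus a small bright moving object.
    test = base.copy()
    test[4:7, 4:7] += 0.8
    v = test.ravel() - mean
    recon = basis.T @ (basis @ v)           # projection onto the PCA subspace
    diff = np.abs(v - recon).reshape(h, w)  # what the subspace cannot explain
    motion = diff > 0.3                     # fixed stand-in for the paper's
                                            # dynamic threshold
    print(int(motion.sum()))
    ```

    Background pixels are reconstructed almost exactly by the subspace, so only the object survives the threshold; the paper performs the same projection and differencing in parallel hardware.
    
    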

  19. Numerical simulation of the generation of turbulence from cometary ion pick-up

    NASA Technical Reports Server (NTRS)

    Goldstein, M. L.; Roberts, D. A.; Matthaeus, W. H.

    1987-01-01

    Observations of magnetic field fluctuations near Comet Halley have revealed a rapid development of a Kolmogoroff-like turbulence spectrum extending from below 0.01 Hz to above 0.1 Hz. Spectra obtained far from the comet have a strong peak in power near the Doppler-shifted ion-cyclotron frequency of singly ionized water. Closer to the comet, the spectrum at higher frequencies is enhanced in power level over the background solar wind spectrum by approximately an order of magnitude. The equations of incompressible MHD are solved using a two-dimensional 256 x 256 mode spectral method code to simulate this spectral evolution as an inertial range turbulent cascade. The initial conditions contained a constant magnetic field and a single coherent wave mode at a low wave number. The solar wind turbulence was modeled by a background noise spectrum having a Kolmogoroff spectral index. The coherent mode decayed into an inertial range spectrum with Kolmogoroff slope within a few eddy-turnover times. Both the time scale and the increase in power level of the turbulence seen in the simulation are in accord with the Giotto observations.

  20. An intelligent architecture based on Field Programmable Gate Arrays designed to detect moving objects by using Principal Component Analysis.

    PubMed

    Bravo, Ignacio; Mazo, Manuel; Lázaro, José L; Gardel, Alfredo; Jiménez, Pedro; Pizarro, Daniel

    2010-01-01

    This paper presents a complete implementation of the Principal Component Analysis (PCA) algorithm in Field Programmable Gate Array (FPGA) devices applied to high rate background segmentation of images. The classical sequential execution of different parts of the PCA algorithm has been parallelized. This parallelization has led to the specific development and implementation in hardware of the different stages of PCA, such as computation of the correlation matrix, matrix diagonalization using the Jacobi method and subspace projections of images. On the application side, the paper presents a motion detection algorithm, also entirely implemented on the FPGA, and based on the developed PCA core. This consists of dynamically thresholding the differences between the input image and the one obtained by expressing the input image using the PCA linear subspace previously obtained as a background model. The proposal achieves a high ratio of processed images (up to 120 frames per second) and high quality segmentation results, with a completely embedded and reliable hardware architecture based on commercial CMOS sensors and FPGA devices.

  1. Functional determinants of radial operators in AdS2

    DOE PAGES

    Aguilera-Damia, Jeremías; Faraggi, Alberto; Zayas, Leopoldo Pando; ...

    2018-06-01

    We study the zeta-function regularization of functional determinants of Laplace and Dirac-type operators in two-dimensional Euclidean AdS2 space. More specifically, we consider the ratio of determinants between an operator in the presence of background fields with circular symmetry and the free operator in which the background fields are absent. By Fourier-transforming the angular dependence, one obtains an infinite number of one-dimensional radial operators, the determinants of which are easy to compute. The summation over modes is then treated with care so as to guarantee that the result coincides with the two-dimensional zeta-function formalism. The method relies on some well-known techniques to compute functional determinants using contour integrals and the construction of the Jost function from scattering theory. Our work generalizes some known results in flat space. The extension to conformal AdS2 geometries is also considered. We provide two examples, one bosonic and one fermionic, borrowed from the spectrum of fluctuations of the holographic 1/4-BPS latitude Wilson loop.

  2. Perturbative study of the QCD phase diagram for heavy quarks at nonzero chemical potential: Two-loop corrections

    NASA Astrophysics Data System (ADS)

    Maelger, J.; Reinosa, U.; Serreau, J.

    2018-04-01

    We extend a previous investigation [U. Reinosa et al., Phys. Rev. D 92, 025021 (2015), 10.1103/PhysRevD.92.025021] of the QCD phase diagram with heavy quarks in the context of background field methods by including the two-loop corrections to the background field effective potential. The nonperturbative dynamics in the pure-gauge sector is modeled by a phenomenological gluon mass term in the Landau-DeWitt gauge-fixed action, which results in an improved perturbative expansion. We investigate the phase diagram at nonzero temperature and (real or imaginary) chemical potential. Two-loop corrections yield an improved agreement with lattice data as compared to the leading-order results. We also compare with the results of nonperturbative continuum approaches. We further study the equation of state as well as the thermodynamic stability of the system at two-loop order. Finally, using simple thermodynamic arguments, we show that the behavior of the Polyakov loops as functions of the chemical potential complies with their interpretation in terms of quark and antiquark free energies.

  3. Nyx: Adaptive mesh, massively-parallel, cosmological simulation code

    NASA Astrophysics Data System (ADS)

    Almgren, Ann; Beckner, Vince; Friesen, Brian; Lukic, Zarija; Zhang, Weiqun

    2017-12-01

    The Nyx code solves the equations of compressible hydrodynamics on an adaptive grid hierarchy coupled with an N-body treatment of dark matter. The gas dynamics in Nyx use a finite-volume methodology on an adaptive set of 3-D Eulerian grids; dark matter is represented as discrete particles moving under the influence of gravity. Particles are evolved via a particle-mesh method, using a Cloud-in-Cell deposition/interpolation scheme. Both baryonic and dark matter contribute to the gravitational field. In addition, Nyx includes physics for accurately modeling the intergalactic medium; in the optically thin limit and assuming ionization equilibrium, the code calculates the heating and cooling processes of primordial-composition gas in an ionizing ultraviolet background radiation field.
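    The Cloud-in-Cell deposition step named above splits each particle's mass between its nearest grid points with linear weights. A minimal one-dimensional illustration (assuming a periodic, cell-centred grid; Nyx itself works on 3-D adaptive grids):

```python
import numpy as np

def cic_deposit(positions, masses, ngrid, box):
    """1D Cloud-in-Cell: each particle's mass is shared between its two
    nearest grid points, linearly weighted by distance (periodic box assumed)."""
    rho = np.zeros(ngrid)
    dx = box / ngrid
    for x, m in zip(positions, masses):
        s = x / dx - 0.5              # position in units of cell centres
        i = int(np.floor(s))
        w = s - i                     # fractional distance to the right point
        rho[i % ngrid] += m * (1.0 - w)
        rho[(i + 1) % ngrid] += m * w
    return rho / dx                   # mass per unit length

# a particle exactly on a cell centre, and one between two centres
rho = cic_deposit([0.5, 3.2], [1.0, 2.0], ngrid=8, box=8.0)
```

Interpolation of the mesh force back to the particles uses the same linear weights, which is what makes the particle-mesh scheme momentum-conserving.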

  4. Quantum nature of the big bang.

    PubMed

    Ashtekar, Abhay; Pawlowski, Tomasz; Singh, Parampreet

    2006-04-14

    Some long-standing issues concerning the quantum nature of the big bang are resolved in the context of homogeneous isotropic models with a scalar field. Specifically, the known results on the resolution of the big-bang singularity in loop quantum cosmology are significantly extended as follows: (i) the scalar field is shown to serve as an internal clock, thereby providing a detailed realization of the "emergent time" idea; (ii) the physical Hilbert space, Dirac observables, and semiclassical states are constructed rigorously; (iii) the Hamiltonian constraint is solved numerically to show that the big bang is replaced by a big bounce. Thanks to the nonperturbative, background-independent methods, and unlike in other approaches, the quantum evolution is deterministic across the deep Planck regime.

  5. Wide-Field Imaging of Single-Nanoparticle Extinction with Sub-nm2 Sensitivity

    NASA Astrophysics Data System (ADS)

    Payne, Lukas M.; Langbein, Wolfgang; Borri, Paola

    2018-03-01

    We report on a highly sensitive wide-field imaging technique for quantitative measurement of the optical extinction cross section σext of single nanoparticles. The technique is simple and high speed, and it enables the simultaneous acquisition of hundreds of nanoparticles for statistical analysis. Using rapid referencing, fast acquisition, and a deconvolution analysis, a shot-noise-limited sensitivity down to 0.4 nm2 is achieved. Measurements on a set of individual gold nanoparticles of 5 nm diameter using this method yield σext=(10.0 ±3.1 ) nm2, which is consistent with theoretical expectations and well above the background fluctuations of 0.9 nm2 .
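    The extinction measurement described above amounts to integrating the transmission deficit over the particle's image. A minimal sketch of that estimator (the paper's rapid referencing and deconvolution steps are omitted, and the pixel size below is a made-up value):

```python
import numpy as np

def extinction_cross_section(I, I_ref, pixel_area):
    """Wide-field extinction: sigma_ext = sum over pixels of (1 - T) * A_pixel,
    with transmission T = I / I_ref. Illustrative estimator only; the paper
    additionally uses rapid referencing and a deconvolution analysis."""
    T = I / I_ref
    return pixel_area * np.sum(1.0 - T)

# synthetic example: a particle removing 1% of the light over 4 pixels,
# each pixel 10 x 10 nm^2 in the sample plane
I_ref = np.ones((16, 16))
I = I_ref.copy()
I[7:9, 7:9] -= 0.01
sigma = extinction_cross_section(I, I_ref, pixel_area=100.0)   # in nm^2
```

Because the estimator sums small deficits over many pixels, its noise floor is set by shot noise in I and I_ref, which is why referencing and acquisition speed dominate the attainable sensitivity.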

  6. Background Noise Analysis in a Few-Photon-Level Qubit Memory

    NASA Astrophysics Data System (ADS)

    Mittiga, Thomas; Kupchak, Connor; Jordaan, Bertus; Namazi, Mehdi; Nolleke, Christian; Figeroa, Eden

    2014-05-01

    We have developed an Electromagnetically Induced Transparency based polarization qubit memory. The device is composed of a dual-rail probe field polarization setup collinear with an intense control field to store and retrieve any arbitrary polarization state by addressing a Λ-type energy level scheme in a 87Rb vapor cell. To achieve a signal-to-background ratio at the few-photon level sufficient for polarization tomography of the retrieved state, the intense control field is filtered out through an etalon filtering system. We have developed an analytical model predicting the influence of the signal-to-background ratio on the fidelities and compared it to experimental data. Experimentally measured global fidelities have been found to follow the theoretical prediction closely as the signal-to-background ratio decreases. These results suggest the plausibility of employing room-temperature memories to store photonic qubits at the single-photon level and for future applications in long-distance quantum communication schemes.
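    The record does not reproduce the paper's analytical model, but the qualitative dependence of fidelity on signal-to-background ratio (SBR) can be illustrated with a simple assumed model in which the background is completely unpolarized:

```python
def qubit_fidelity(sbr):
    """Illustrative model (an assumption, not the paper's formula): the
    retrieved state is the signal qubit mixed with unpolarized background,
    rho = p |psi><psi| + (1 - p) I/2,  with  p = SBR / (1 + SBR).
    The fidelity is then F = <psi|rho|psi> = p + (1 - p) / 2."""
    p = sbr / (1.0 + sbr)
    return p + 0.5 * (1.0 - p)
```

Any model of this kind gives F = 1/2 at zero SBR (pure background) and F approaching 1 as the background is filtered away, which is the trend the measured global fidelities follow.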

  7. Clinical Reasoning in Massage Therapy

    PubMed Central

    LeMoon, Kim

    2008-01-01

    Background: Clinical reasoning has long been a valuable tool for health care practitioners, but it has been under-researched in the field of massage therapy. Case reports have been a useful method for exploring the clinical reasoning process in various fields of manual therapy and can provide a model for similar research in the field of massage therapy. A diagnostically challenging case concerning a client with low back pain serves as a guideline for examining the clinical reasoning process of a massage therapist. Methods: A two-part methodology was employed: (1) a client profile and (2) a reflective inquiry. The inquiry included questions pertaining to beliefs about health problems; beliefs about the mechanisms of pain; medical conditions that could explain the client’s symptoms; knowledge of the client’s anatomy, assessment, and treatment choices; observations made during treatment; extent of experience in treating similar problems; and ability to recognize clinical patterns. Results: The clinical reasoning process of a massage therapist contributed to a differential diagnosis, which provided an explanation for the client’s symptoms and led to a satisfactory treatment resolution. Conclusion: The present report serves as an example of the value of clinical reasoning in the field of massage therapy, and of the need for expanded research into its methods and applications. The results of such research could be beneficial in teaching the clinical reasoning process at both the introductory and the advanced levels of massage therapy education. PMID:21589814

  8. A Program of Research on Microfabrication Techniques for VLSI Magnetic Devices.

    DTIC Science & Technology

    1982-10-01

    contribution to the implantation-induced uniaxial anisotropy field change. BACKGROUND Magnetic garnet films are grown by liquid phase epitaxy (LPE) on non...a single crystal, non-magnetic garnet substrate by the liquid phase epitaxy (LPE) method. These thin films, usually one to three microns in thickness...microscopy. Experimental Procedures Films of (SmYGdTm)3Ca0a.Fe4.6012 garnet were grown by liquid phase epitaxy (LPE) on gadolinium-gallium garnet (GGG

  9. On the two-loop divergences of the 2-point hypermultiplet supergraphs for 6D, N = (1 , 1) SYM theory

    NASA Astrophysics Data System (ADS)

    Buchbinder, I. L.; Ivanov, E. A.; Merzlikin, B. S.; Stepanyantz, K. V.

    2018-03-01

    We consider 6D, N = (1 , 1) supersymmetric Yang-Mills theory formulated in N = (1 , 0) harmonic superspace and analyze the structure of the two-loop divergences in the hypermultiplet sector. Using the N = (1 , 0) superfield background field method we study the two-point supergraphs with the hypermultiplet legs and prove that their total contribution to the divergent part of effective action vanishes off shell.

  10. Enhancing STEM Education through Cubesats: Using Satellite Integration as a Teaching Tool at a Non-Tech School

    NASA Astrophysics Data System (ADS)

    Bernardes, S.; Cotten, D. L.

    2016-12-01

    University-based satellite programs have been successfully used as a platform for teaching STEM-related fields, bringing tremendous benefits to graduate and undergraduate education. Considering their infrastructure and curricula, tech schools have traditionally been considered logical candidates for hosting such programs. More recently, with the dissemination of small-satellite initiatives, non-tech schools have been presented with the opportunity of developing satellite design and implementation programs. This work reports on the experiences and challenges associated with implementing a satellite program at the University of Georgia (UGA), a non-tech university. With funding from the Air Force Research Laboratory's (AFRL) University Nanosat Program (UNP) and NASA's Undergraduate Student Instrument Project (USIP), a team of undergraduates at UGA has recently been tasked with building two small satellites and helping to create a Small Satellite Research Laboratory (SSRL) at the university. Unique features of the satellite program at UGA include its team of students from a broad range of backgrounds and departments (Engineering, Computer Science, Art, Business, and Geography) and the previous exposure of many of these students to synergistic technologies, including Arduino and unmanned aerial systems. We show how informal exposure to those technologies and the willingness of students to focus on areas outside their field of study can benefit the implementation of satellite programs. In this regard, we report on methods and techniques used to find and recruit driven and knowledgeable students to work in a fast-paced field such as satellite system integration. We show how students and faculty from multiple departments have collaborated to reach a common, far-reaching goal and describe our proposed methods to evaluate and measure educational goals based around SSRL and its projects.
We also present the challenges associated with the lack of a developed engineering program, including our solutions to a shortage of equipment and expertise in building satellite systems and a satellite laboratory. Finally, we describe our outreach methods, including K-12 outreach, and share our experiences and successes in finding industry partners, given the absence of a background in the field and of prior collaborations.

  11. Myocardial strains from 3D displacement encoded magnetic resonance imaging

    PubMed Central

    2012-01-01

    Background The ability to measure and quantify myocardial motion and deformation provides a useful tool to assist in the diagnosis, prognosis and management of heart disease. The recent development of magnetic resonance imaging methods, such as harmonic phase analysis of tagging and displacement encoding with stimulated echoes (DENSE), make detailed non-invasive 3D kinematic analyses of human myocardium possible in the clinic and for research purposes. A robust analysis method is required, however. Methods We propose to estimate strain using a polynomial function which produces local models of the displacement field obtained with DENSE. Given a specific polynomial order, the model is obtained as the least squares fit of the acquired displacement field. These local models are subsequently used to produce estimates of the full strain tensor. Results The proposed method is evaluated on a numerical phantom as well as in vivo on a healthy human heart. The evaluation showed that the proposed method produced accurate results and showed low sensitivity to noise in the numerical phantom. The method was also demonstrated in vivo by assessment of the full strain tensor and to resolve transmural strain variations. Conclusions Strain estimation within a 3D myocardial volume based on polynomial functions yields accurate and robust results when validated on an analytical model. The polynomial field is capable of resolving the measured material positions from the in vivo data, and the obtained in vivo strains values agree with previously reported myocardial strains in normal human hearts. PMID:22533791
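    The strain-estimation step described in Methods can be sketched directly: fit a polynomial model to the measured displacements by least squares, differentiate it to obtain the displacement gradient, and form the Green-Lagrange strain tensor. A first-order (affine) version of the local fit, using synthetic data in place of DENSE measurements:

```python
import numpy as np

def green_lagrange_strain(points, disp):
    """Fit each displacement component with a first-order polynomial in
    (x, y, z) by least squares, then form the Green-Lagrange strain
    E = (F^T F - I) / 2 with F = I + grad(u). The paper uses local fits of
    selectable order; only the affine case is sketched here."""
    X = np.c_[np.ones(len(points)), points]        # design matrix [1, x, y, z]
    coef, *_ = np.linalg.lstsq(X, disp, rcond=None)
    A = coef[1:].T                                 # A[j, k] = d u_j / d x_k
    F = np.eye(3) + A
    return 0.5 * (F.T @ F - np.eye(3))

# synthetic affine displacement field u = A x with a known gradient
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, (50, 3))
A_true = np.array([[0.02, 0.01, 0.0],
                   [0.0, -0.01, 0.005],
                   [0.0,  0.0,  0.03]])
u = pts @ A_true.T
E = green_lagrange_strain(pts, u)
```

Higher polynomial orders trade noise robustness for the ability to resolve spatial strain variations, e.g. the transmural gradients mentioned in the abstract.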

  12. Impact of Neutral Boundary-Layer Turbulence on Wind-Turbine Wakes: A Numerical Modelling Study

    NASA Astrophysics Data System (ADS)

    Englberger, Antonia; Dörnbrack, Andreas

    2017-03-01

    The wake characteristics of a wind turbine in a turbulent boundary layer under neutral stratification are investigated systematically by means of large-eddy simulations. A methodology to maintain the turbulence of the background flow for simulations with open horizontal boundaries, without the necessity of the permanent import of turbulence data from a precursor simulation, was implemented in the geophysical flow solver EULAG. These requirements are fulfilled by applying the spectral energy distribution of a neutral boundary layer in the wind-turbine simulations. A detailed analysis of the wake response towards different turbulence levels of the background flow results in a more rapid recovery of the wake for a higher level of turbulence. A modified version of the Rankine-Froude actuator disc model and the blade element momentum method are tested as wind-turbine parametrizations resulting in a strong dependence of the near-wake wind field on the parametrization, whereas the far-wake flow is fairly insensitive to it. The wake characteristics are influenced by the two considered airfoils in the blade element momentum method up to a streamwise distance of 14 D ( D = rotor diameter). In addition, the swirl induced by the rotation has an impact on the velocity field of the wind turbine even in the far wake. Further, a wake response study reveals a considerable effect of different subgrid-scale closure models on the streamwise turbulent intensity.

  13. Unified field theories, the early big bang, and the microwave background paradox

    NASA Technical Reports Server (NTRS)

    Stecker, F. W.

    1979-01-01

    It is suggested that a superunified field theory incorporating gravity and possessing asymptotic freedom could provide a solution to the paradox of the isotropy of the universal 3K background radiation. Thermal equilibrium could be established in this context through interactions occurring in a temporally indefinite preplanckian era.

  14. Cross-correlation cosmography with intensity mapping of the neutral hydrogen 21 cm emission

    NASA Astrophysics Data System (ADS)

    Pourtsidou, A.; Bacon, D.; Crittenden, R.

    2015-11-01

    The cross-correlation of a foreground density field with two different background convergence fields can be used to measure cosmographic distance ratios and constrain dark energy parameters. We investigate the possibility of performing such measurements using a combination of optical galaxy surveys and neutral hydrogen (HI) intensity mapping surveys, with emphasis on the performance of the planned Square Kilometre Array (SKA). Using HI intensity mapping to probe the foreground density tracer field and/or the background source fields has the advantage of excellent redshift resolution and a longer lever arm achieved by using the lensing signal from high redshift background sources. Our results show that, for our best SKA-optical configuration of surveys, a constant equation of state for dark energy can be constrained to ≃8 % for a sky coverage fsky=0.5 and assuming a σ (ΩDE)=0.03 prior for the dark energy density parameter. We also show that using the cosmic microwave background as the second source plane is not competitive, even when considering a COrE-like satellite.
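    The cosmographic distance ratio being measured can be written down explicitly. A sketch for a flat ΛCDM background (illustrative parameter values; in flat space D(zl, zs) is proportional to χ(zs) − χ(zl), so the (1+z) factors cancel inside each bracket of the ratio):

```python
import numpy as np

def comoving(z, om=0.3, n=2000):
    """Flat LCDM comoving distance chi(z) in units of the Hubble distance c/H0,
    by trapezoidal integration of 1/E(z)."""
    zs = np.linspace(0.0, z, n)
    integrand = 1.0 / np.sqrt(om * (1.0 + zs) ** 3 + (1.0 - om))
    return np.sum(0.5 * (integrand[1:] + integrand[:-1])) * (zs[1] - zs[0])

def distance_ratio(zl, zs1, zs2, om=0.3):
    """Cosmographic ratio R = [D(zl,zs2)/D(zs2)] / [D(zl,zs1)/D(zs1)];
    each bracket reduces to (chi_s - chi_l) / chi_s in flat space."""
    chi_l = comoving(zl, om)
    chi_1, chi_2 = comoving(zs1, om), comoving(zs2, om)
    return ((chi_2 - chi_l) / chi_2) / ((chi_1 - chi_l) / chi_1)

# lens plane at z=0.5, two background source planes at z=1 and z=2
R = distance_ratio(zl=0.5, zs1=1.0, zs2=2.0)
```

The sensitivity of R to dark energy enters through the expansion rate E(z) in the integrand, which is why the long lever arm from high-redshift HI sources helps.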

  15. Nature of magnetization and lateral spin-orbit interaction in gated semiconductor nanowires.

    PubMed

    Karlsson, H; Yakimenko, I I; Berggren, K-F

    2018-05-31

    Semiconductor nanowires are interesting candidates for the realization of spintronics devices. In this paper we study electronic states and effects of lateral spin-orbit coupling (LSOC) in a one-dimensional asymmetrically biased nanowire using the Hartree-Fock method with a Dirac interaction. We have shown that spin polarization can be triggered by LSOC at finite source-drain bias, as a result of numerical noise representing a random magnetic field due to wiring or a random background magnetic field such as the Earth's magnetic field. The electrons spontaneously arrange into spin rows in the wire due to electron interactions, leading to a finite spin polarization. The direction of polarization is, however, random at zero source-drain bias. We have found that LSOC has an effect on the orientation of the spin rows only when a source-drain bias is applied.

  16. Nature of magnetization and lateral spin–orbit interaction in gated semiconductor nanowires

    NASA Astrophysics Data System (ADS)

    Karlsson, H.; Yakimenko, I. I.; Berggren, K.-F.

    2018-05-01

    Semiconductor nanowires are interesting candidates for the realization of spintronics devices. In this paper we study electronic states and effects of lateral spin–orbit coupling (LSOC) in a one-dimensional asymmetrically biased nanowire using the Hartree–Fock method with a Dirac interaction. We have shown that spin polarization can be triggered by LSOC at finite source-drain bias, as a result of numerical noise representing a random magnetic field due to wiring or a random background magnetic field such as the Earth's magnetic field. The electrons spontaneously arrange into spin rows in the wire due to electron interactions, leading to a finite spin polarization. The direction of polarization is, however, random at zero source-drain bias. We have found that LSOC has an effect on the orientation of the spin rows only when a source-drain bias is applied.

  17. Feature-Free Activity Classification of Inertial Sensor Data With Machine Vision Techniques: Method, Development, and Evaluation

    PubMed Central

    O'Reilly, Martin; Whelan, Darragh; Caulfield, Brian; Ward, Tomas E

    2017-01-01

    Background Inertial sensors are one of the most commonly used sources of data for human activity recognition (HAR) and exercise detection (ED) tasks. The time series produced by these sensors are generally analyzed through numerical methods. Machine learning techniques such as random forests or support vector machines are popular in this field for classification efforts, but they need to be supported through the isolation of a potentially large number of additionally crafted features derived from the raw data. This feature preprocessing step can involve nontrivial digital signal processing (DSP) techniques. However, in many cases, the researchers interested in this type of activity recognition problem do not possess the necessary technical background for this feature-set development. Objective The study aimed to present a novel application of established machine vision methods to provide interested researchers with an easier entry path into the HAR and ED fields. This can be achieved by removing the need for deep DSP skills through the use of transfer learning, namely by using a pretrained convolutional neural network (CNN) developed for machine vision purposes for the exercise classification effort. The new method should simply require researchers to generate plots of the signals that they would like to build classifiers with, store them as images, and then place them in folders according to their training label before retraining the network. Methods We applied a CNN, an established machine vision technique, to the task of ED. Tensorflow, a high-level framework for machine learning, was used to facilitate infrastructure needs. Simple time series plots generated directly from accelerometer and gyroscope signals are used to retrain an openly available neural network (Inception), originally developed for machine vision tasks.
Data from 82 healthy volunteers, performing 5 different exercises while wearing a lumbar-worn inertial measurement unit (IMU), were collected. The ability of the proposed method to automatically classify the exercise being completed was assessed using this dataset. For comparative purposes, classification using the same dataset was also performed using the more conventional approach of feature extraction and classification using random forest classifiers. Results With the collected dataset and the proposed method, the different exercises could be recognized with a 95.89% (3827/3991) accuracy, which is competitive with current state-of-the-art techniques in ED. Conclusions The high level of accuracy attained with the proposed approach indicates that the waveform morphologies in the time-series plots for each of the exercises are sufficiently distinct among the participants to allow the use of machine vision approaches. The use of high-level machine learning frameworks, coupled with the novel use of machine vision techniques instead of complex manually crafted features, may facilitate access to research in the HAR field for individuals without extensive digital signal processing or machine learning backgrounds. PMID:28778851
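    The core trick of the method above is converting a 1-D sensor trace into an image that a machine-vision CNN can classify. The paper renders matplotlib plots and retrains Inception with TensorFlow; the following dependency-free sketch shows only the signal-to-image conversion step:

```python
import numpy as np

def signal_to_image(signal, height=64, width=64):
    """Rasterize a 1-D sensor signal into a binary image: resample to the
    image width, then light up one pixel per column at the row corresponding
    to the signal value (row 0 = maximum). An illustrative stand-in for the
    paper's matplotlib plot images."""
    sig = np.interp(np.linspace(0, len(signal) - 1, width),
                    np.arange(len(signal)), signal)
    lo, hi = sig.min(), sig.max()
    rows = ((hi - sig) / (hi - lo + 1e-12) * (height - 1)).astype(int)
    img = np.zeros((height, width), dtype=np.uint8)
    img[rows, np.arange(width)] = 1
    return img

# a synthetic "accelerometer" trace; in the workflow, each exercise label
# would get its own folder of such images before retraining the network
t = np.linspace(0, 4 * np.pi, 500)
img = signal_to_image(np.sin(t))
```

Images produced this way can be written to labelled folders and fed to any standard transfer-learning pipeline, which is the entry path the authors propose for researchers without DSP backgrounds.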

  18. Promoting Translational Research Among Movement Science, Occupational Science, and Occupational Therapy.

    PubMed

    Sainburg, Robert L; Liew, Sook-Lei; Frey, Scott H; Clark, Florence

    2017-01-01

    Integration of research in the fields of neural control of movement and biomechanics (collectively referred to as movement science) with the field of human occupation directly benefits both areas of study. Specifically, incorporating many of the quantitative scientific methods and analyses employed in movement science can help accelerate the development of rehabilitation-relevant research in occupational therapy (OT) and occupational science (OS). Reciprocally, OT and OS, which focus on the performance of everyday activities (occupations) to promote health and well-being, provide theoretical frameworks to guide research on the performance of actions in the context of social, psychological, and environmental factors. Given both fields' mutual interest in the study of movement as it relates to health and disease, the authors posit that combining OS and OT theories and principles with the theories and methods in movement science may lead to new, impactful, and clinically relevant knowledge. The first step is to ensure that individuals with OS or OT backgrounds are academically prepared to pursue advanced study in movement science. In this article, the authors propose 2 strategies to address this need.

  19. Electric line source illumination of a chiral cylinder placed in another chiral background medium

    NASA Astrophysics Data System (ADS)

    Aslam, M.; Saleem, A.; Awan, Z. A.

    2018-05-01

    An electric line source illumination of a chiral cylinder embedded in a chiral background medium is considered. The field expressions inside and outside the chiral cylinder have been derived using the wave-field decomposition approach. The effects of various chiral cylinders, chiral background media, and source locations upon the scattering gain pattern have been investigated. It is observed that the chiral background reduces the backward scattering gain as compared to the free-space background for a dielectric cylinder. It is also shown that moving the line source away from the cylinder reduces the backward scattering gain for a chiral cylinder placed in a chiral background under some specific conditions. A unique phenomenon of reduced scattering gain has been observed at a specific observation angle for a chiral cylinder placed in a chiral background with the electric line source located at one free-space wavelength. An isotropic scattering gain pattern is observed for a chiral nihility background provided that the cylinder is of chiral or chiral-nihility type. It is also observed that this isotropic behaviour is independent of the background and cylinder chirality.

  20. Evolution of the Magnetic Field during Chondrule Formation in Planetary Bow Shocks

    NASA Astrophysics Data System (ADS)

    Mai, Chuhong; Desch, Steven; Boley, Aaron C.

    2016-10-01

    Recent laboratory efforts (Fu et al., 2014, 2015) have constrained the remanent magnetizations of chondrules and the magnetic field strengths they were exposed to as they cooled below their Curie points. An outstanding question is whether these fields represent the background magnetic field of the solar nebula or were unique to the chondrule-forming environment. We investigate the amplification of the magnetic field above background values in a planetary bow shock, which is one proposed mechanism for chondrule formation. We use a hydrodynamic code to model the temperature and pressure around a 3000 km-radius planetary embryo as it moves supersonically through the nebula gas. We calculate the ionization of hot, shocked gas considering thermionic emission of electrons and ions from grains and thermal ionization of potassium. We calculate the magnetic diffusion rate, including Ohmic dissipation and ambipolar diffusion (assuming a magnetic field strength comparable to 0.5 G). We compute the steady-state magnetic field in the bow shock and find that behind the planet the field is amplified, but everywhere else it quickly diffuses out of the shocked region and recovers the background value. We consider the trajectories taken by chondrules behind the shock and present likely values of the magnetic field amplification experienced by chondrules as they cool after melting in the shock.

  1. Harmonic demodulation and minimum enhancement factors in field-enhanced near-field optical microscopy.

    PubMed

    Scarpettini, A F; Bragas, A V

    2015-01-01

    Field-enhanced scanning optical microscopy relies on the design and fabrication of plasmonic probes, which have to provide optical and chemical contrast at the nanoscale. In order to do so, the scattered light containing the near-field information recorded in a field-enhanced scanning optical microscopy experiment has to surpass the background light, always present due to multiple interferences between the macroscopic probe and sample. In this work, we show that when the probe-sample distance is modulated with very low amplitude, the higher the harmonic of demodulation, the better the ratio between the near-field signal and the interferometric background. The choice of working at a given harmonic n is dictated by the experiment, when the signal at the n + 1 harmonic falls below the experimental noise. We demonstrate that the optical contrast comes from the nth derivative of the near-field scattering, amplified by the interferometric background. By modelling the far and near field we calculate the probe-sample approach curves, which fit the experimental ones very well. After collecting a large amount of experimental data for different probes and samples, we conclude with a table of the minimum enhancement factors needed to obtain optical contrast with field-enhanced scanning optical microscopy. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
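    The link between harmonic demodulation and derivatives of the near-field response can be checked numerically: modulate the distance, extract the Fourier components at harmonics of the modulation frequency, and compare with the Taylor expansion. A sketch, assuming an arbitrary test response f(z) = z² for which the harmonic content is exact:

```python
import numpy as np

def harmonic_amplitudes(f, z0, A, nharm=4, nsamp=4096):
    """Modulate the probe-sample distance z(t) = z0 + A*cos(wt) and return the
    signal amplitude at harmonics 1..nharm of w (lock-in style, via FFT).
    For small A the n-th harmonic is proportional to the n-th derivative of f."""
    t = np.arange(nsamp) * 2.0 * np.pi / nsamp     # one modulation period
    s = f(z0 + A * np.cos(t))
    c = np.fft.rfft(s) / nsamp
    return 2.0 * np.abs(c[1:nharm + 1])

# f(z) = z^2: exact harmonics are 2*z0*A at n=1 and A^2/2 at n=2, zero beyond,
# i.e. proportional to f'(z0) and f''(z0) as the demodulation argument predicts
amps = harmonic_amplitudes(lambda z: z * z, z0=3.0, A=0.1)
```

A sharply nonlinear near-field response contributes to all harmonics, while the slowly varying interferometric background dies off quickly with n, which is why higher-harmonic demodulation improves the signal-to-background ratio.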

  2. Magnetic field in expanding quark-gluon plasma

    NASA Astrophysics Data System (ADS)

    Stewart, Evan; Tuchin, Kirill

    2018-04-01

    Intense electromagnetic fields are created in the quark-gluon plasma by the external ultrarelativistic valence charges. The time evolution and the strength of this field are strongly affected by the electrical conductivity of the plasma. Yet, it has recently been observed that the effect of the magnetic field on the plasma flow is small. We compute the effect of plasma flow on magnetic field and demonstrate that it is less than 10%. These observations indicate that the plasma hydrodynamics and the dynamics of electromagnetic field decouple. Thus, it is a very good approximation, on the one hand, to study QGP in the background electromagnetic field generated by external sources and, on the other hand, to investigate the dynamics of magnetic field in the background plasma. We also argue that the wake induced by the magnetic field in plasma is negligible.

  3. Multiphoton amplitude in a constant background field

    NASA Astrophysics Data System (ADS)

    Ahmad, Aftab; Ahmadiniaz, Naser; Corradini, Olindo; Kim, Sang Pyo; Schubert, Christian

    2018-01-01

    In this contribution, we present our recent compact master formulas for the multiphoton amplitudes of a scalar propagator in a constant background field using the worldline formulation of quantum field theory. The constant field has been included nonperturbatively, which is crucial for strong external fields. A possible application is the scattering of photons by electrons in a strong magnetic field, a process that has been a subject of great interest since the discovery of astrophysical objects like radio pulsars, which provide evidence that magnetic fields of the order of 10^12 G are present in nature. The presence of a strong external field leads to a strong deviation from the classical scattering amplitudes. We explicitly work out the Compton scattering amplitude in a magnetic field, which is a process of potential relevance for astrophysics. Our final result is compact and suitable for numerical integration.

  4. Photoacoustic infrared spectroscopy for conducting gas tracer tests and measuring water saturations in landfills.

    PubMed

    Jung, Yoojin; Han, Byunghyun; Mostafid, M Erfan; Chiu, Pei; Yazdani, Ramin; Imhoff, Paul T

    2012-02-01

    Gas tracer tests can be used to determine gas flow patterns within landfills, quantify volatile contaminant residence time, and measure water within refuse. While gas chromatography (GC) has been traditionally used to analyze gas tracers in refuse, photoacoustic spectroscopy (PAS) might allow real-time measurements with reduced personnel costs and greater mobility and ease of use. Laboratory and field experiments were conducted to evaluate the efficacy of PAS for conducting gas tracer tests in landfills. Two tracer gases, difluoromethane (DFM) and sulfur hexafluoride (SF(6)), were measured with a commercial PAS instrument. Relative measurement errors were invariant with tracer concentration but influenced by background gas: errors were 1-3% in landfill gas but 4-5% in air. Two partitioning gas tracer tests were conducted in an aerobic landfill, and limits of detection (LODs) were 3-4 times larger for DFM with PAS versus GC due to temporal changes in background signals. While higher LODs can be compensated by injecting larger tracer mass, changes in background signals increased the uncertainty in measured water saturations by up to 25% over comparable GC methods. PAS has distinct advantages over GC with respect to personnel costs and ease of use, although for field applications GC analyses of select samples are recommended to quantify instrument interferences. Copyright © 2011 Elsevier Ltd. All rights reserved.
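    Partitioning gas tracer tests infer water saturation from the retardation of a water-soluble tracer (here DFM) relative to a conservative one (SF6). A sketch using the standard textbook relation, stated here as an assumption since the paper's exact formulation and uncertainty analysis are not given in the record:

```python
def water_saturation(R, H):
    """Water saturation from a partitioning gas tracer test. Textbook relation
    (an assumption, not necessarily the paper's exact formula): the partitioning
    tracer is retarded relative to a conservative tracer by
        R = 1 + S_w / ((1 - S_w) * H),
    where H is the dimensionless Henry constant (C_gas / C_water).
    Inverting for S_w gives S_w = (R-1)H / (1 + (R-1)H)."""
    x = (R - 1.0) * H
    return x / (1.0 + x)
```

Because S_w depends on (R - 1), the uncertainty in the measured tracer arrival times (and hence in R) propagates directly into the saturation estimate, which is the mechanism behind the up-to-25% increase the authors report for PAS versus GC.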

  5. Trends in tungsten coil atomic spectrometry

    NASA Astrophysics Data System (ADS)

    Donati, George L.

    Renewed interest in electrothermal atomic spectrometric methods based on tungsten coil atomizers is a consequence of a worldwide increasing demand for fast, inexpensive, sensitive, and portable analytical methods for trace analysis. In this work, tungsten coil atomic absorption spectrometry (WCAAS) and tungsten coil atomic emission spectrometry (WCAES) are used to determine several different metals and even a non-metal at low levels in different samples. Improvements in instrumentation and new strategies to reduce matrix effects and background signals are presented. Investigation of the main factors affecting both WCAAS and WCAES analytical signals points to the importance of a reducing, high-temperature gas phase in the processes leading to atomic cloud generation. Some more refractory elements such as V and Ti were determined for the first time by double tungsten coil atomic emission spectrometry (DWCAES). The higher temperatures provided by two atomizers in DWCAES also allowed the detection of Ag, Cu, and Sn emission signals for the first time. Simultaneous determination of several elements by WCAES in relatively complex sample matrices was possible after a simple acid extraction. The results show the potential of this method as an alternative to more traditional, expensive methods for fast, more effective analyses and applications in the field. The development of a new metallic atomization cell is also presented. Lower limits of detection in both WCAAS and WCAES determinations were obtained due to factors such as better control of the background signal and a smaller, more isothermal system that keeps the atomic cloud concentrated in the optical path for a longer period of time. Tungsten coil-based methods are especially well suited to applications requiring low sample volume, low cost, sensitivity, and portability. Both WCAAS and WCAES have great commercial potential in fields as diverse as archeology and industrial quality control.
They are simple, inexpensive, effective methods for trace metal determinations in several different samples, representing an important asset in today's analytical chemistry.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dong; Liu, Rui; Wang, Yuming

    We studied the background field for 60 two-ribbon flares of M-class and above during 2011–2015. These flares are categorized into two groups, i.e., eruptive and confined flares, based on whether or not a flare is associated with a coronal mass ejection. The background field of the source active regions is approximated by a potential field extrapolated from the B_z component of vector magnetograms provided by the Helioseismic and Magnetic Imager. We calculated the decay index n of the background field above the flaring polarity inversion line, and defined a critical height h_crit corresponding to the theoretical threshold (n_crit = 1.5) of the torus instability. We found that h_crit is approximately half of the distance between the centroids of opposite polarities in active regions and that the distribution of h_crit is bimodal: it is significantly higher for confined flares than for eruptive ones. The decay index increases monotonically with increasing height for 86% (84%) of the eruptive (confined) flares but displays a saddle-like profile for the rest, 14% (16%), which are found exclusively in active regions of multipolar field configuration. Moreover, n at the saddle bottom is significantly smaller in confined flares than in eruptive ones. These results highlight the critical role of the background field in regulating the eruptive behavior of two-ribbon flares.
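
    The decay-index analysis described above can be sketched numerically: n = -d ln B / d ln h, with h_crit the first height at which n reaches 1.5. A minimal illustration, assuming a toy field profile (a single dipole buried at a hypothetical depth d, not an HMI-extrapolated potential field):

```python
import numpy as np

def decay_index(h, B):
    """Decay index n = -d ln B / d ln h on a sampled field-strength profile."""
    return -np.gradient(np.log(B), np.log(h))

def critical_height(h, B, n_crit=1.5):
    """First height where n reaches the torus-instability threshold n_crit,
    linearly interpolated between the bracketing samples."""
    n = decay_index(h, B)
    above = np.where(n >= n_crit)[0]
    if len(above) == 0:
        return None
    i = above[0]
    if i == 0:
        return h[0]
    h0, h1, n0, n1 = h[i - 1], h[i], n[i - 1], n[i]
    return h0 + (n_crit - n0) * (h1 - h0) / (n1 - n0)

# Toy profile: a dipole buried at depth d gives B ~ (h + d)^-3, for which
# n = 3h / (h + d), so the critical height is exactly h = d.
d = 20.0  # hypothetical depth in Mm
h = np.linspace(1.0, 100.0, 2000)
B = (h + d) ** -3.0
print(round(critical_height(h, B), 1))  # 20.0
```

    The bimodality reported in the paper would then appear as two clusters in h_crit values computed this way over the flare sample.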

  7. Four-wavelength lidar evaluation of particle characteristics and aerosol densities

    NASA Astrophysics Data System (ADS)

    Uthe, E. E.; Livingston, J. M.; Delateur, S. A.; Nielsen, N. B.

    1985-06-01

    The SRI International four-wavelength (0.53, 1.06, 3.8, 10.6 micron) lidar system was used during the SNOW-ONE-B and Smoke Week XI/SNOW-TWO field experiments to validate its capabilities in assessing obscurant optical and physical properties. The lidar viewed along a horizontal path terminated by a passive reflector. Data examples were analyzed in terms of time-dependent transmission, wavelength dependence of optical depth, and range-resolved extinction coefficients. Three methods were used to derive extinction data from the lidar signatures: the target method, the Klett method, and the experimental data method. The results of the field and analysis programs are reported in the journal and conference papers appended to this report, which include: "Comparison Study of Lidar Extinction Methods" (submitted to Applied Optics); "Error Analysis of Lidar Solution Techniques for Range-Resolved Extinction Coefficients Based on Observational Data" (Smoke/Obscurants Symposium 9); "Four-Wavelength Lidar Measurements from Smoke Week 6/SNOW-TWO" (Smoke/Obscurants Symposium 8); "SNOW-ONE-B Multiple-Wavelength Lidar Measurements" (Snow Symposium 3); and "Lidar Applications for Obscurant Evaluations" (Smoke/Obscurants Symposium 7). The report also provides a summary of background work leading to this project and of project results.

  8. A new four-step hierarchy method for combined assessment of groundwater quality and pollution.

    PubMed

    Zhu, Henghua; Ren, Xiaohua; Liu, Zhizheng

    2017-12-28

    A new four-step hierarchy method was constructed and applied to evaluate the groundwater quality and pollution of the Dagujia River Basin. The assessment index system is divided into four types: field test indices, common inorganic chemical indices, inorganic toxicology indices, and trace organic indices. Background values of the common inorganic chemical and inorganic toxicology indices were estimated with the cumulative-probability curve method, and the results showed that the background values of Mg2+ (51.1 mg L-1), total hardness (TH) (509.4 mg L-1), and NO3- (182.4 mg L-1) are all higher than the corresponding grade III values of the Quality Standard for Groundwater, indicating that they were poor indicators and therefore were not included in the groundwater quality assessment. The quality assessment results showed that the field test indices were mainly classified as grade II, accounting for 60.87% of wells sampled. The common inorganic chemical and inorganic toxicology indices were both mostly in the range of grade III, whereas the trace organic indices were predominantly classified as grade I. The variabilities and excess ratios of the indices were also calculated and evaluated. Spatial distributions showed that groundwater with poor quality indices was mainly located in the northeast of the basin, which correlates well with seawater intrusion. Additionally, the pollution assessment revealed that groundwater in well 44 was classified as "moderately polluted," wells 5 and 8 were "lightly polluted," and the other wells were classified as "unpolluted."

  9. An Evaluation on Iran International Public Health Summer School in Relation to its Efficacy Based on Participants' Experience and Opinions.

    PubMed

    Parnia, Aidin; Yamani, Nikoo; Zamani, Ahmadreza; Badihian, Shervin; Manouchehri, Navid; Fakhri, Maryam

    2017-01-01

    A serious challenge in educating health staff for public health is making the field appear encouraging enough to persuade them to learn about it, while implementing new educational methods and innovative approaches. The Iran International Public Health Summer School (IPHS) made an effort to provide medical sciences students with an opportunity to become familiar with and involved in public health. This study intended to evaluate the efficacy of this event. This cross-sectional study was performed in March-April 2015 with the help of an electronic self-administered questionnaire filled out by 49 Iranian participants 6 months after IPHS2014. The questionnaire assessed the main goals in seven domains: interest, activities, and general knowledge in the field of public health; general skills; educational methods; educational and executive schedules; and general satisfaction. Average scores of all domains were >3 (the scale midpoint), and all were statistically significant. The highest average score belonged to educational methods (3.92) and the lowest was calculated for the item regarding participants' activities in public health (3.5). No significant difference was found between the positive answers of individuals who were interested or active in public health prior to the event and those who had no background. We believe IPHS was a unique instance of public health education in Iran. Considering the level of success of this program in reaching its goals for students both with and without any previous background in public health, it is recommended as a general model to be replicated in other developing countries.

  10. Particle Streak Anemometry: A New Method for Proximal Flow Sensing from Aircraft

    NASA Astrophysics Data System (ADS)

    Nichols, T. W.

    Accurate sensing of relative air flow direction from fixed-wing small unmanned aircraft (sUAS) is challenging with existing multi-hole pitot-static and vane systems. Sub-degree direction accuracy is generally not available on such systems, and disturbances to the local flow field, induced by the airframe, introduce an additional error source. An optical imaging approach to make a relative air velocity measurement with high directional accuracy is presented. Optical methods offer the capability to make a proximal measurement in undisturbed air outside of the local flow field without the need to place sensors on vulnerable probes extended ahead of the aircraft. Current imaging flow analysis techniques for laboratory use rely on relatively thin imaged volumes, sophisticated hardware, and intensity thresholding in low-background conditions. A new method is derived and assessed using a particle streak imaging technique that can be implemented with low-cost commercial cameras and illumination systems, and can function in imaged volumes of arbitrary depth with complex background signal. The new technique, referred to as particle streak anemometry (PSA) (to differentiate it from particle streak velocimetry, which makes a field measurement rather than a single bulk flow measurement), utilizes a modified Canny edge detection algorithm with connected component analysis and principal component analysis to detect streak ends in complex imaging conditions. A linear solution for the air velocity direction is then implemented with a random sample consensus (RANSAC) solution approach. A single-DOF non-linear, non-convex optimization problem is then solved for the air speed through an iterative approach. The technique was tested through simulation and wind tunnel tests, yielding angular accuracies under 0.2 degrees, superior to the performance of existing commercial systems. Air speed error standard deviations varied from 1.6 to 2.2 m/s depending on the implementation technique. While air speed sensing is secondary to accurate flow direction measurement, the air speed results were in line with those of commercial pitot-static systems at low speeds.
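
    The RANSAC direction step described above can be illustrated in isolation. Below is a minimal sketch on synthetic streak vectors; the single-sample hypothesis scheme, 2° inlier threshold, and eigenvector refit are illustrative assumptions, not the thesis's actual pipeline:

```python
import numpy as np

def ransac_direction(vectors, n_iter=200, thresh_deg=2.0, seed=None):
    """Estimate the dominant unit direction of 2-D streak vectors with
    RANSAC: hypothesize one vector, count angular inliers, then refit on
    the best consensus set via the principal eigenvector of its scatter."""
    rng = np.random.default_rng(seed)
    units = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    cos_thresh = np.cos(np.radians(thresh_deg))
    best = None
    for _ in range(n_iter):
        cand = units[rng.integers(len(units))]
        # |dot| handles streaks recorded with either end first
        inliers = np.abs(units @ cand) >= cos_thresh
        if best is None or inliers.sum() > best.sum():
            best = inliers
    u = units[best]
    _, vecs = np.linalg.eigh(u.T @ u)
    return vecs[:, -1]  # eigenvector of the largest eigenvalue

rng = np.random.default_rng(0)
true_dir = np.array([np.cos(0.3), np.sin(0.3)])  # flow direction at 0.3 rad
good = true_dir * rng.uniform(5, 15, (80, 1))    # streaks along the flow
clutter = rng.normal(0.0, 10.0, (20, 2))         # outlier streaks
est = ransac_direction(np.vstack([good, clutter]), seed=1)
print(round(np.degrees(np.arctan2(est[1], est[0])) % 180.0, 1))  # ~17.2
```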

  11. Investigation of MHD Instabilities in Jets and Bubbles Using a Compact Coaxial Plasma Gun in a Background Magnetized Plasma

    NASA Astrophysics Data System (ADS)

    Zhang, Y.; Fisher, D. M.; Wallace, B.; Gilmore, M.; Hsu, S. C.

    2016-10-01

    A compact coaxial plasma gun is employed for experimental investigation of launching plasma into a lower-density background magnetized plasma. Experiments are being conducted in the linear device HelCat at UNM. Four distinct operational regimes with qualitatively different dynamics are identified from fast CCD camera images. For the regime I plasma jet formation, a global helical magnetic configuration is determined from B-dot probe array data. The m = 1 kink instability is also observed and verified. Furthermore, when the jet propagates into a background magnetic field, a jet with longer length and lifetime is formed. Axial shear flow caused by the background magnetic tension force contributes to the increased stability of the jet body. In regime II, a spheromak-like plasma bubble formation is identified when the gun plasma is injected into vacuum. In contrast, when the bubble propagates into a background magnetic field, the closed magnetic field configuration no longer holds and a Rayleigh-Taylor instability develops on the lateral side. Detailed experimental data and analysis will be presented for these cases.

  12. Serial single molecule electron diffraction imaging: diffraction background of superfluid helium droplets

    NASA Astrophysics Data System (ADS)

    Zhang, Jie; He, Yunteng; Lei, Lei; Alghamdi, Maha; Oswalt, Andrew; Kong, Wei

    2017-08-01

    In an effort to solve the crystallization problem in crystallography, we have been engaged in developing a method termed "serial single molecule electron diffraction imaging" (SS-EDI). The unique features of SS-EDI are superfluid helium droplet cooling and field-induced orientation: together the two features constitute a molecular goniometer. Unfortunately, the helium atoms surrounding the sample molecule also contribute to a diffraction background. In this report, we analyze the properties of a superfluid helium droplet beam and its doping statistics, and demonstrate the feasibility of overcoming the background issue by using the velocity slip phenomenon of a pulsed droplet beam. Electron diffraction profiles and pair correlation functions of ferrocene-monomer-doped droplets and iodine-nanocluster-doped droplets are presented. The timing of the pulsed electron gun and the effective doping efficiency under different dopant pressures can both be controlled for size selection. This work removes any doubt about the effectiveness of superfluid helium droplets in SS-EDI, thereby advancing the "proof-of-concept" effort one step further.

  13. Probing the Intergalactic Magnetic Field with the Anisotropy of the Extragalactic Gamma-Ray Background

    NASA Technical Reports Server (NTRS)

    Venters, T. M.; Pavlidou, V.

    2012-01-01

    The intergalactic magnetic field (IGMF) may leave an imprint on the anisotropy properties of the extragalactic gamma-ray background through its effect on electromagnetic cascades triggered by interactions between very high energy photons and the extragalactic background light. A strong IGMF will deflect secondary particles produced in these cascades and will thus tend to isotropize lower energy cascade photons, thereby inducing a modulation in the anisotropy energy spectrum of the gamma-ray background. Here we present a simple, proof-of-concept calculation of the magnitude of this effect and demonstrate that the two extreme cases (zero IGMF, and an IGMF strong enough to completely isotropize cascade photons) would be separable with ten years of Fermi observations for reasonable model parameters of the gamma-ray background. The anisotropy energy spectrum of the Fermi gamma-ray background could thus be used as a probe of the IGMF strength.

  14. Probing the Intergalactic Magnetic Field with the Anisotropy of the Extragalactic Gamma-ray Background

    NASA Technical Reports Server (NTRS)

    Venters, T. M.; Pavlidou, V.

    2013-01-01

    The intergalactic magnetic field (IGMF) may leave an imprint on the angular anisotropy of the extragalactic gamma-ray background through its effect on electromagnetic cascades triggered by interactions between very high energy photons and the extragalactic background light. A strong IGMF will deflect secondary particles produced in these cascades and will thus tend to isotropize lower energy cascade photons, thereby inducing a modulation in the anisotropy energy spectrum of the gamma-ray background. Here we present a simple, proof-of-concept calculation of the magnitude of this effect and demonstrate that current Fermi data already seem to prefer nonnegligible IGMF values. The anisotropy energy spectrum of the Fermi gamma-ray background could thus be used as a probe of the IGMF strength.

  15. Generation of rising-tone chorus in a two-dimensional mirror field by using the general curvilinear PIC code

    NASA Astrophysics Data System (ADS)

    Ke, Yangguang; Gao, Xinliang; Lu, Quanming; Wang, Xueyi; Wang, Shui

    2017-08-01

    Recently, the generation of rising-tone chorus has been implemented with one-dimensional (1-D) particle-in-cell (PIC) simulations in an inhomogeneous background magnetic field, where both the propagation of the waves and the motion of the electrons are simply forced to be parallel to the background magnetic field. In this paper, we have developed a two-dimensional (2-D) general curvilinear PIC simulation code and successfully reproduced rising-tone chorus waves excited from an anisotropic electron distribution in a 2-D mirror field. Our simulation results show that whistler waves are mainly generated around the magnetic equator and grow continuously during their propagation toward higher-latitude regions. The rising-tone chorus waves are observed off the magnetic equator and propagate quasi-parallel to the background magnetic field with wave normal angles smaller than 25°. Due to this propagation effect, the wave normal angle of the chorus waves increases during their propagation toward higher-latitude regions along a sufficiently curved field line. The chirping rate of the chorus waves is found to be larger along a field line with smaller curvature.

  16. Field background odour should be taken into account when formulating a pest attractant based on plant volatiles

    PubMed Central

    Cai, Xiaoming; Bian, Lei; Xu, Xiuxiu; Luo, Zongxiu; Li, Zhaoqun; Chen, Zongmao

    2017-01-01

    Attractants for pest monitoring and control can be developed based on plant volatiles. Previously, we showed that the tea leafhopper (Empoasca onukii) preferred grapevine, peach plant, and tea plant odours to clean air. In this research, we formulated three blends whose attractiveness to leafhoppers was similar to that of peach, grapevine, and tea plant volatiles; these blends were composed of (Z)-3-hexenyl acetate, (E)-ocimene, (E)-4,8-dimethyl-1,3,7-nonatriene, benzaldehyde, and ethyl benzoate. Based on these five compounds, we developed two attractants, formula-P and formula-G. The component distinguishing formula-P from tea plant volatiles was benzaldehyde, and that distinguishing formula-G was ethyl benzoate. These two compounds played a role in attracting leafhoppers. In laboratory assays, the two attractants were more attractive to the leafhoppers than tea plant volatiles, and had a similar level of attractiveness to each other. However, the leafhoppers were not attracted to formula-P in the field. A high concentration of benzaldehyde was detected in the background odour of the tea plantations. In laboratory tests, benzaldehyde at the field concentration was attractive to leafhoppers. Our results indicate that the field background odour can interfere with a point-releasing attractant when their components overlap, and that a successful attractant must differ from the field background odour. PMID:28150728

  17. Planck intermediate results: XXXIII. Signature of the magnetic field geometry of interstellar filaments in dust polarization maps

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ade, P. A. R.; Aghanim, N.; Alves, M. I. R.

    2016-02-09

    Planck observations at 353 GHz provide the first fully sampled maps of the polarized dust emission towards interstellar filaments and their backgrounds (i.e., the emission observed in the surroundings of the filaments). The data allow us to determine the intrinsic polarization properties of the filaments and therefore to provide insight into the structure of their magnetic field (B). In this paper, we present the polarization maps of three nearby (several parsecs long) star-forming filaments of moderate column density (N_H about 10^22 cm^-2): Musca, B211, and L1506. These three filaments are detected above the background in both total and polarized dust emission. We use the spatial information to separate the Stokes I, Q, and U of the filaments from those of their backgrounds, an essential step in measuring the intrinsic polarization fraction (p) and angle (ψ) of each emission component. We find that the polarization angles in the three filaments (ψ_fil) are coherent along their lengths and not the same as in their backgrounds (ψ_bg). The differences between ψ_fil and ψ_bg are 12° and 54° for Musca and L1506, respectively, and only 6° in the case of B211. These differences for Musca and L1506 are larger than the dispersions of ψ, both along the filaments and in their backgrounds. The observed changes of ψ are direct evidence of variations in the orientation of the plane-of-the-sky (POS) projection of the magnetic field. As in previous studies, we find a decrease of several per cent in p with N_H from the backgrounds to the crests of the filaments. We show that the bulk of the drop in p within the filaments cannot be explained by random fluctuations of the orientation of the magnetic field, because they are too small (σ_ψ < 10°). We recognize the degeneracy between the dust alignment efficiency (by, e.g., radiative torques) and the structure of the B-field in causing variations in p, but we argue that the decrease in p from the backgrounds to the filaments results in part from depolarization associated with the 3D structure of the B-field: both its orientation in the POS and with respect to the POS. We do not resolve the inner structure of the filaments, but at the smallest scales accessible with Planck (~0.2 pc), the observed changes of ψ and p hold information on the magnetic field structure within the filaments. Finally, they show that both the mean field and its fluctuations in the filaments are different from those of their backgrounds, which points to a coupling between the matter and the B-field in the filament formation process.

  18. Interaction of the branes in the presence of the background fields: The dynamical, nonintersecting, perpendicular, wrapped-fractional configuration

    NASA Astrophysics Data System (ADS)

    Maghsoodi, Elham; Kamani, Davoud

    2017-05-01

    We shall obtain the interaction of the Dp1- and Dp2-branes in the toroidal-orbifold space-time T^n × ℝ^(1,d-n-5) × ℂ^2/ℤ_2. The configuration of the branes is nonintersecting, perpendicular, moving-rotating, and wrapped-fractional with background fields. For this, we calculate the bosonic boundary state corresponding to a dynamical fractional-wrapped Dp-brane in the presence of the Kalb-Ramond field, a U(1) gauge potential, and an open string tachyon field. The long-range behavior of the interaction amplitude will be extracted.

  19. Whistler mode refraction in highly nonuniform magnetic fields

    NASA Astrophysics Data System (ADS)

    Urrutia, J. M.; Stenzel, R.

    2016-12-01

    In a large laboratory plasma, the propagation of whistler modes is measured in highly nonuniform magnetic fields created by current-carrying wires. Ray tracing is not applicable since the wavelength and the gradient scale length are comparable. The waves are excited with a loop antenna near the wire. The antenna launches an m = 1 helicon mode in a uniform plasma. The total magnetic field consists of a weak uniform background field and the nearly circular field of a straight wire across the background field. A circular loop produces 3D null points and a 2D null line. The resulting whistler wave propagation will be shown. It is relevant to whistler mode propagation in space plasmas near magnetic null points, small flux ropes, lunar crustal magnetic fields, and active wave injection experiments.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meddens, Marjolein B. M.; Liu, Sheng; Finnegan, Patrick S.

    Here, we have developed a method for performing light-sheet microscopy with a single high numerical aperture lens by integrating reflective side walls into a microfluidic chip. These 45° side walls generate light-sheet illumination by reflecting a vertical light-sheet into the focal plane of the objective. Light-sheet illumination of cells loaded in the channels increases image quality in diffraction-limited imaging via reduction of out-of-focus background light. Single molecule super-resolution is also improved by the decreased background, resulting in better localization precision and decreased photo-bleaching, leading to more accepted localizations overall and higher quality images. Moreover, 2D and 3D single molecule super-resolution data can be acquired faster by taking advantage of the increased illumination intensities in the focused light-sheet as compared to wide field.

  1. Behavior analysis of video object in complicated background

    NASA Astrophysics Data System (ADS)

    Zhao, Wenting; Wang, Shigang; Liang, Chao; Wu, Wei; Lu, Yang

    2016-10-01

    This paper aims to achieve robust behavior recognition of video objects against complicated backgrounds. Features of the video object are described and modeled according to the depth information of three-dimensional video. Multi-dimensional eigenvectors are constructed and used to process the high-dimensional data. Stable object tracking in complex scenes can be achieved with multi-feature-based behavior analysis, so as to obtain the motion trail. Subsequently, effective behavior recognition of the video object is obtained according to the decision criteria. Moreover, both the real-time performance of the algorithms and the accuracy of the analysis are greatly improved. The theory and methods for the behavior analysis of video objects in real scenes put forward by this project have broad application prospects and important practical significance in security, counter-terrorism, military, and many other fields.

  2. Dim target detection method based on salient graph fusion

    NASA Astrophysics Data System (ADS)

    Hu, Ruo-lan; Shen, Yi-yan; Jiang, Jun

    2018-02-01

    Dim target detection is a key problem in the digital image processing field. With the development of multi-spectrum imaging sensors, it has become a trend to improve the performance of dim target detection by fusing information from different spectral images. In this paper, a dim target detection method based on salient graph fusion is proposed. In the method, a multi-direction Gabor filter and a multi-scale contrast filter are combined to construct a salient graph from each digital image. Then, a maximum-salience fusion strategy is designed to fuse the salient graphs from the different spectral images. A top-hat filter is used to detect the dim target from the fused salient graph. Experimental results show that the proposed method improves the probability of target detection and reduces the probability of false alarm on clutter background images.
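
    As a rough sketch of the fusion and detection stages (maximum-salience fusion followed by a white top-hat filter), the example below pulls a dim point target out of multi-band clutter; the naive morphology, image sizes, and noise levels are illustrative assumptions rather than the paper's Gabor/contrast saliency construction:

```python
import numpy as np

def grey_opening(img, k):
    """Naive grey-scale opening (erosion then dilation) with a k-by-k window."""
    pad = k // 2
    def sliding(im, op):
        p = np.pad(im, pad, mode="edge")
        out = np.empty_like(im)
        for i in range(im.shape[0]):
            for j in range(im.shape[1]):
                out[i, j] = op(p[i:i + k, j:j + k])
        return out
    return sliding(sliding(img, np.min), np.max)

def top_hat(img, k=5):
    """White top-hat: image minus its opening; keeps small bright structures."""
    return img - grey_opening(img, k)

def fuse_max(maps):
    """Maximum-salience fusion across spectral bands."""
    return np.maximum.reduce(maps)

rng = np.random.default_rng(0)
bands = [rng.normal(0.0, 0.05, (64, 64)) for _ in range(3)]
bands[1][30, 40] += 1.0  # dim point target visible in one band only
salient = fuse_max([top_hat(b) for b in bands])
i, j = np.unravel_index(np.argmax(salient), salient.shape)
print(int(i), int(j))  # 30 40
```

    The opening suppresses structures smaller than the window, so the top-hat residue is large only at the point target, and the maximum fusion preserves a detection present in any single band.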

  3. Validation of radiative transfer computation with Monte Carlo method for ultra-relativistic background flow

    NASA Astrophysics Data System (ADS)

    Ishii, Ayako; Ohnishi, Naofumi; Nagakura, Hiroki; Ito, Hirotaka; Yamada, Shoichi

    2017-11-01

    We developed a three-dimensional radiative transfer code for an ultra-relativistic background flow-field by using the Monte Carlo (MC) method in the context of gamma-ray burst (GRB) emission. To obtain reliable simulation results in the coupled computation of MC radiation transport with relativistic hydrodynamics that can reproduce GRB emission, we validated the radiative transfer computation in the ultra-relativistic regime and assessed the appropriate simulation conditions. The radiative transfer code was validated through two test calculations: (1) computing in different inertial frames and (2) computing in flow-fields with discontinuous and smeared shock fronts. The simulation results for the angular distribution and spectrum were compared among three different inertial frames and were in good agreement with each other. If the time duration for updating the flow-field was sufficiently small to resolve a mean free path of a photon into ten steps, the results were thoroughly converged. The spectrum computed in the flow-field with a discontinuous shock front obeyed a power law in frequency whose index was positive in the range from 1 to 10 MeV. The number of photons on the high-energy side decreased with the smeared shock front because the photons were less scattered immediately behind the shock wave due to the small electron number density. A large optical depth near the shock front was needed for obtaining high-energy photons through bulk Compton scattering. Even the one-dimensional structure of the shock wave could affect the results of the radiation transport computation. Although we examined the effect of the shock structure on the emitted spectrum with a large number of cells, it is hard to employ so many computational cells per dimension in multi-dimensional simulations. Therefore, a further investigation with a smaller number of cells is required for obtaining realistic high-energy photons in multi-dimensional computations.
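
    The first validation test, computing in different inertial frames, ultimately rests on Lorentz-transforming each photon's direction and frequency between the lab and comoving frames. A minimal consistency check using the standard aberration and Doppler relations (a generic sketch, not code from the paper):

```python
import numpy as np

def boost(mu, nu, beta):
    """Transform a photon's direction cosine mu and frequency nu into a
    frame moving with speed beta (in units of c) along the axis:
    relativistic aberration plus Doppler shift."""
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
    nu_p = gamma * nu * (1.0 - beta * mu)
    mu_p = (mu - beta) / (1.0 - beta * mu)
    return mu_p, nu_p

# Round trip: boosting into the comoving frame and back must be the
# identity, so frame-independent results are at least self-consistent.
beta = 0.999                 # ultra-relativistic background flow
mu, nu = 0.3, 1.0e20         # lab-frame direction cosine and frequency (Hz)
mu_c, nu_c = boost(mu, nu, beta)
mu_back, nu_back = boost(mu_c, nu_c, -beta)
print(bool(np.isclose(mu_back, mu)), bool(np.isclose(nu_back, nu)))  # True True
```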

  4. The status of augmented reality in laparoscopic surgery as of 2016.

    PubMed

    Bernhardt, Sylvain; Nicolau, Stéphane A; Soler, Luc; Doignon, Christophe

    2017-04-01

    This article provides a comprehensive review of all the different methods proposed in the literature concerning augmented reality in intra-abdominal minimally invasive surgery (also known as laparoscopic surgery). A solid background on surgical augmented reality is first provided in order to support the survey. Then, the various methods of laparoscopic augmented reality, as well as their key tasks, are categorized in order to better grasp the current landscape of the field. Finally, the various issues gathered from these reviewed approaches are organized in order to outline the remaining challenges of augmented reality in laparoscopic surgery. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Quantum Approach to Informatics

    NASA Astrophysics Data System (ADS)

    Stenholm, Stig; Suominen, Kalle-Antti

    2005-08-01

    An essential overview of quantum information. Information, whether inscribed as a mark on a stone tablet or encoded as a magnetic domain on a hard drive, must be stored in a physical object and thus made subject to the laws of physics. Traditionally, information processing such as computation occurred in a framework governed by the laws of classical physics. However, information can also be stored and processed using the states of matter described by non-classical quantum theory. Understanding this quantum information, a fundamentally different type of information, has been a major project of physicists and information theorists in recent years, and recent experimental research has started to yield promising results. Quantum Approach to Informatics fills the need for a concise introduction to this burgeoning new field, offering an intuitive approach for readers in both the physics and information science communities, as well as in related fields. Only a basic background in quantum theory is required, and the text keeps the focus on bringing this theory to bear on contemporary informatics. Instead of proofs and other highly formal structures, detailed examples present the material, making this a uniquely accessible introduction to quantum informatics. Topics covered include: an introduction to quantum information and the qubit; concepts and methods of quantum theory important for informatics; the application of information concepts to quantum physics; quantum information processing and computing; quantum gates; error correction using quantum-based methods; and physical realizations of quantum computing circuits. A helpful and economical resource for understanding this exciting new application of quantum theory to informatics, Quantum Approach to Informatics provides students and researchers in physics and information science, as well as other interested readers with some scientific background, with an essential overview of the field.

  6. Reliable detection of fluence anomalies in EPID-based IMRT pretreatment quality assurance using pixel intensity deviations

    PubMed Central

    Gordon, J. J.; Gardner, J. K.; Wang, S.; Siebers, J. V.

    2012-01-01

    Purpose: This work uses repeat images of intensity modulated radiation therapy (IMRT) fields to quantify fluence anomalies (i.e., delivery errors) that can be reliably detected in electronic portal images used for IMRT pretreatment quality assurance. Methods: Repeat images of 11 clinical IMRT fields are acquired on a Varian Trilogy linear accelerator at energies of 6 MV and 18 MV. Acquired images are corrected for output variations and registered to minimize the impact of linear accelerator and electronic portal imaging device (EPID) positioning deviations. Detection studies are performed in which rectangular anomalies of various sizes are inserted into the images. The performance of detection strategies based on pixel intensity deviations (PIDs) and gamma indices is evaluated using receiver operating characteristic analysis. Results: Residual differences between registered images are due to interfraction positional deviations of jaws and multileaf collimator leaves, plus imager noise. Positional deviations produce large intensity differences that degrade anomaly detection. Gradient effects are suppressed in PIDs using gradient scaling. Background noise is suppressed using median filtering. In the majority of images, PID-based detection strategies can reliably detect fluence anomalies of ≥5% in ∼1 mm2 areas and ≥2% in ∼20 mm2 areas. Conclusions: The ability to detect small dose differences (≤2%) depends strongly on the level of background noise. This in turn depends on the accuracy of image registration, the quality of the reference image, and field properties. The longer term aim of this work is to develop accurate and reliable methods of detecting IMRT delivery errors and variations. The ability to resolve small anomalies will allow the accuracy of advanced treatment techniques, such as image guided, adaptive, and arc therapies, to be quantified. PMID:22894421
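
    The PID-style check described above (fractional deviation from a reference image, median filtering to suppress uncorrelated noise, then thresholding) can be sketched as follows; the array sizes, noise level, and 2% threshold are illustrative assumptions rather than the paper's calibrated EPID pipeline:

```python
import numpy as np

def median_filter(img, k=3):
    """Naive k-by-k median filter to suppress uncorrelated pixel noise."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(p[i:i + k, j:j + k])
    return out

def anomaly_mask(measured, reference, pid_thresh=0.02):
    """Flag pixels whose median-filtered fractional pixel intensity
    deviation (PID) exceeds pid_thresh."""
    pid = (measured - reference) / reference
    return np.abs(median_filter(pid)) >= pid_thresh

rng = np.random.default_rng(0)
reference = np.ones((48, 48))
measured = reference + rng.normal(0.0, 0.005, reference.shape)
measured[20:24, 20:24] += 0.05  # a 5% fluence anomaly over a small area
mask = anomaly_mask(measured, reference)
print(int(mask.sum()), bool(mask[21, 21]))  # 12 True
```

    The 3x3 median keeps only pixels whose neighborhood is mostly anomalous, which is why the flagged region is slightly smaller than the inserted 4x4 anomaly.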

  7. High Sensitivity Detection of Broadband Acoustic Vibration Using Optical Demodulation Method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhen

    Measuring high-frequency acoustic vibrations is of fundamental interest for revealing the intrinsic dynamic characteristics of a broad range of systems, such as the growth of a fetus, blood flow in human palms, and vibrations of carbon nanotubes. However, acoustic wave detection capability is limited by the detection bandwidth and sensitivity of commonly used piezoelectric ultrasound detectors. To overcome these limitations, this thesis explores optical demodulation methods for highly sensitive detection of broadband acoustic vibration. First, a transparent optical ultrasonic detector was developed using a micro-ring resonator (MRR) made of soft polymeric materials. It outperforms traditional piezoelectric detectors with broader detection bandwidth, miniaturized size, and wide angular sensitivity. Its ease of integration into a photoacoustic microscopy system has greatly improved imaging resolution. A theoretical framework was developed to establish a quantitative understanding of its unique distance- and angle-dependent detection characteristics and was subsequently validated experimentally. This framework fully accounts for the trade-offs between axial and lateral resolution, working distance, and field of view, and provides a guideline for optimizing imaging performance in a wide range of biological and clinical applications. The MRR-based ultrasonic detector was further integrated into confocal fluorescence microscopy to realize simultaneous imaging of fluorescence and optical absorption of the retinal pigment epithelium, achieving multi-contrast imaging at the sub-cellular level. The need to resolve fine details of biological specimens beyond the diffraction limit further motivated the development of an optically demodulated ultrasonic detection method based on near-field scanning optical microscopy (NSOM). The nano-focusing probe was developed for adiabatic focusing of surface plasmon polaritons to the probe apex with high energy efficiency, and suppression of background noise was accomplished through harmonic demodulation. Collectively, this system delivers an intense near-field illumination source while effectively suppressing the background signal from far-field scattering, and thus allows quantitative mapping of the local evanescent field with enhanced contrast and improved resolution. The performance of the developed NSOM system was validated through experimental measurements of the surface plasmon polariton mode. This new NSOM system enables optically demodulated ultrasound detection at nanoscale spatial resolution. Using it to detect the ultrasound signal within the acoustic near field led to the successful experimental demonstration of sub-surface photoacoustic imaging of buried objects with sub-diffraction-limited resolution and high sensitivity. This new ultrasound detection method holds promise for super-resolution ultrasound imaging.

  8. Alfven waves in spiral interplanetary field

    NASA Technical Reports Server (NTRS)

    Whang, Y. C.

    1973-01-01

    A theoretical study is presented of Alfven waves in the spiral interplanetary magnetic field. The Alfven waves under consideration are arbitrary, large-amplitude, non-monochromatic, microscale waves of any polarization. They are superposed on a mesoscale background flow of thermally anisotropic plasma. Using the WKB approximation, an analytical solution for the amplitude vectors is obtained as a function of the background flow properties: density, velocity, Alfven speed, thermal anisotropy, and the spiral angle. The necessary condition for the validity of the WKB solution is discussed. The intensity of fluctuations is calculated as a function of heliocentric distance. The relative intensity of fluctuations, compared with the magnitude of the background field, has its maximum in the region near 1 AU; outside this region, the solar wind is less turbulent.

  9. Klein-Gordon oscillator with position-dependent mass in the rotating cosmic string spacetime

    NASA Astrophysics Data System (ADS)

    Wang, Bing-Qian; Long, Zheng-Wen; Long, Chao-Yun; Wu, Shu-Rui

    2018-02-01

    A spinless particle coupled covariantly to a uniform magnetic field parallel to the string in the background of the rotating cosmic string is studied. The energy levels of the electrically charged particle subject to the Klein-Gordon oscillator are analyzed. Afterwards, we consider the case of position-dependent mass and show how these energy levels depend on the parameters of the problem. Remarkably, in this special case the Klein-Gordon oscillator coupled covariantly to a homogeneous magnetic field with position-dependent mass in the rotating cosmic string background behaves similarly to the Klein-Gordon equation with a Coulomb-type configuration in a rotating cosmic string background in the presence of an external magnetic field.

  10. Multipactor susceptibility on a dielectric with a bias dc electric field and a background gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang Peng; Lau, Y. Y.; Franzi, Matthew

    2011-05-15

    We use Monte Carlo simulations and analytical calculations to derive the condition for the onset of multipactor discharge on a dielectric surface at various combinations of the bias dc electric field, rf electric field, and background pressures of noble gases, such as argon. It is found that the presence of a tangential bias dc electric field on the dielectric surface lowers the magnitude of rf electric field threshold to initiate multipactor, therefore plausibly offering robust protection against high power microwaves. The presence of low pressure gases may lead to a lower multipactor saturation level, however. The combined effects of tangential dc electric field and external gases on multipactor susceptibility are presented.

  11. 2.5D transient electromagnetic inversion with OCCAM method

    NASA Astrophysics Data System (ADS)

    Li, R.; Hu, X.

    2016-12-01

    In applications of the time-domain electromagnetic method (TEM), multidimensional inversion schemes have been applied over the past few decades to overcome the large errors produced by 1D model inversion when the subsurface structure is complex. The current mainstream multidimensional inversions for EM data, using the finite-difference time-domain (FDTD) forward method, are mainly implemented with the Nonlinear Conjugate Gradient (NLCG) method. But the convergence rate of NLCG depends heavily on the Lagrange multiplier, and NLCG may fail to converge. We use the OCCAM inversion method to avoid this weakness; OCCAM inversion is a more stable and reliable method for imaging the 2.5D electrical conductivity of the subsurface. Firstly, we simulate the 3D transient EM fields governed by Maxwell's equations with the FDTD method. Secondly, we use the OCCAM inversion scheme, with an appropriately constructed objective error functional, to image the 2.5D structure; a data-space OCCAM inversion (DASOCC) strategy based on the OCCAM scheme is also given in this paper. The sensitivity matrix is calculated with the method of time-integrated back-propagated fields. Imaging results for the example model shown in Fig. 1 demonstrate that the OCCAM scheme is an efficient inversion method for TEM with the FDTD method, converging in few iterations. Summarizing the imaging process, we draw the following conclusions. Firstly, 2.5D imaging in the FDTD system with OCCAM inversion demonstrates that the desired images of the resistivity structure in a homogeneous half-space can be obtained. Secondly, the imaging results usually do not depend strongly on the initial model, but the number of iterations can be reduced markedly if the background resistivity of the initial model is close to that of the true model; it is therefore better to set the initial model using other available geologic information. When the background resistivity fits the true model well, imaging the anomalous body requires only a few iteration steps. Finally, vertical boundaries are imaged more slowly than horizontal boundaries.
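
    As a rough illustration of the OCCAM idea (among regularized models, prefer the smoothest one that still fits the data to a target misfit), here is a minimal linearized sketch for a generic linear problem. The operator `J`, the first-difference roughness matrix, and the scan over the regularization parameter `mu` are simplifying assumptions; the paper's FDTD sensitivities and data-space (DASOCC) variant are not reproduced.

```python
import numpy as np

def occam_step(J, d, target_misfit, mus=np.logspace(-6, 2, 50)):
    # One linearized OCCAM step: among models m(mu) minimizing
    #   ||d - J m||^2 + mu * ||R m||^2,
    # return the smoothest (largest mu) whose misfit meets the target.
    n = J.shape[1]
    R = np.diff(np.eye(n), axis=0)        # first-difference roughness operator
    for mu in sorted(mus, reverse=True):  # try the smoothest model first
        m = np.linalg.solve(J.T @ J + mu * R.T @ R, J.T @ d)
        if np.linalg.norm(d - J @ m) <= target_misfit:
            return mu, m
    # fall back to the least-regularized model if no mu meets the target
    return mus.min(), m

# toy demo: identity "forward operator" and a smooth target model
J = np.eye(20)
d = np.linspace(0.0, 1.0, 20)
mu, m = occam_step(J, d, target_misfit=0.05)
```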

  12. mHealth Series: mHealth project in Zhao County, rural China – Description of objectives, field site and methods

    PubMed Central

    van Velthoven, Michelle Helena; Li, Ye; Wang, Wei; Du, Xiaozhen; Wu, Qiong; Chen, Li; Majeed, Azeem; Rudan, Igor; Zhang, Yanfeng; Car, Josip

    2013-01-01

    Background We set up a collaboration between researchers in China and the UK that aimed to explore the use of mHealth in China. This is the first paper in a series on a large mHealth project that is part of this collaboration. It presents the aims and objectives of the mHealth project, our field site, and the detailed methods of two studies. Field site The field site for this mHealth project was Zhao County, which lies 280 km south of Beijing in Hebei Province, China. Methods We described the methodology of two studies: (i) a mixed methods study exploring factors influencing sample size calculations for mHealth–based health surveys and (ii) a cross–over study determining the validity of an mHealth text messaging data collection tool. The first study used mixed methods, both quantitative and qualitative, including: (i) two surveys with caregivers of young children, (ii) interviews with caregivers, village doctors and participants of the cross–over study, and (iii) researchers' views. We combined data from caregivers, village doctors and researchers to provide an in–depth understanding of factors influencing sample size calculations for mHealth–based health surveys. The second study used a randomised cross–over design to compare the traditional face–to–face survey method with the new text messaging survey method. We assessed data equivalence (intrarater agreement), the amount of information in responses, reasons for giving different responses, the response rate, characteristics of non–responders, and the error rate. Conclusions This paper described the objectives, field site and methods of a large mHealth project that is part of a collaboration between researchers in China and the UK. The mixed methods study evaluating factors that influence sample size calculations could help future studies estimate reliable sample sizes. The cross–over study comparing face–to–face and text message survey data collection could help future studies develop their mHealth tools. PMID:24363919
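
    The cross-over study assesses intrarater agreement between the face-to-face and text-message responses. The abstract does not name the statistic used; one common chance-corrected agreement measure for paired categorical responses is Cohen's kappa, sketched here purely as an illustration.

```python
import numpy as np

def cohens_kappa(a, b):
    # chance-corrected agreement between two paired categorical ratings:
    # kappa = (p_observed - p_chance) / (1 - p_chance)
    a, b = np.asarray(a), np.asarray(b)
    cats = np.union1d(a, b)
    po = np.mean(a == b)                                       # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c) for c in cats)  # chance agreement
    return (po - pe) / (1 - pe)

k_same = cohens_kappa([0, 1, 0, 1], [0, 1, 0, 1])  # identical ratings
k_half = cohens_kappa([0, 0, 1, 1], [0, 1, 1, 1])  # partial agreement
```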

  13. PRELIMINARY DATA REPORT: HUMATE INJECTION AS AN ENHANCED ATTENUATION METHOD AT THE F-AREA SEEPAGE BASINS, SAVANNAH RIVER SITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millings, M.

    2013-09-16

    A field test of a humate technology for uranium and I-129 remediation was conducted at the F-Area Field Research Site as part of the Attenuation-Based Remedies for the Subsurface Applied Field Research Initiative (ABRS AFRI) funded by the DOE Office of Soil and Groundwater Remediation. Previous studies have shown that humic acid sorbed to sediments strongly binds uranium at mildly acidic pH and potentially binds iodine-129 (I-129). Use of humate could be applicable for contaminant stabilization at a wide variety of DOE sites; however, pilot field-scale tests and optimization of this technology are required to move this technical approach from basic science to actual field deployment and regulatory acceptance. The groundwater plume at the F-Area Field Research Site contains a large number of contaminants, the most important from a risk perspective being strontium-90 (Sr-90), uranium isotopes, I-129, tritium, and nitrate. Groundwater remains acidic, with pH as low as 3.2 near the basins, increasing to the background pH of approximately 5 at the plume fringes. The field test was conducted in monitoring well FOB 16D, which historically has shown low pH and elevated concentrations of Sr-90, uranium, I-129 and tritium. The field test included three months of baseline monitoring followed by injection of a potassium humate solution and approximately four and a half months of post-injection monitoring. Samples were collected and analyzed for numerous constituents, but the focus was on attenuation of uranium, Sr-90, and I-129. This report provides background information, methodology, and preliminary field results for the humate field test. Results from the field monitoring show that most of the excess humate (i.e., humate that did not sorb to the sediments) has flushed through the surrounding formation. Furthermore, the data indicate that the test was successful in loading a band of sediment surrounding the injection point to the point where pH could return to near normal during the study timeframe. Future work will involve a final report, which will include data trends, correlations, and interpretations of laboratory data.

  14. Spacelike matching to null infinity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zenginoglu, Anil; Tiglio, Manuel

    2009-07-15

    We present two methods to include the asymptotic domain of a background spacetime in null directions for numerical solutions of evolution equations, so that both the radiation extraction problem and the outer boundary problem are solved. The first method is based on the geometric conformal approach; the second is a coordinate-based approach. We apply these methods to the case of a massless scalar wave equation on a Kerr spacetime. Our methods are designed to allow existing codes to reach the radiative zone by including future null infinity in the computational domain with relatively minor modifications. We demonstrate the flexibility of the methods by considering both Boyer-Lindquist and ingoing Kerr coordinates near the black hole. We also numerically confirm, for the first time, predictions due to Hod concerning tail decay rates for scalar fields at null infinity in Kerr spacetime.

  15. A Rigorous Geometric Derivation of the Chiral Anomaly in Curved Backgrounds

    NASA Astrophysics Data System (ADS)

    Bär, Christian; Strohmaier, Alexander

    2016-11-01

    We discuss the chiral anomaly for a Weyl field in a curved background and show that a novel index theorem for the Lorentzian Dirac operator can be applied to describe the gravitational chiral anomaly. A formula for the total charge generated by the gravitational and gauge field background is derived directly in Lorentzian signature and in a mathematically rigorous manner. It contains a term identical to the integrand in the Atiyah-Singer index theorem and another term involving the η-invariant of the Cauchy hypersurfaces.

  16. Field monitoring of plant-growth-promoting rhizobacteria by colony immunoblotting.

    PubMed

    Krishnen, Ganisan; Kecskés, Mihály L; Rose, Michael T; Geelan-Small, Peter; Amprayn, Khanok-on; Pereg, Lily; Kennedy, Ivan R

    2011-11-01

    Inoculant plant-growth-promoting bacteria are emerging as an important component of sustainable agriculture. There is a need to develop inexpensive methods for enumerating these organisms after their application in the field, to better understand their survival and impacts on yields. Immunoblotting is one potential method to measure viable cells, but the high cost of the conventionally used nylon membranes makes this method prohibitive. In this study, less expensive alternative materials such as filter papers, glossy photo papers, and transparencies were evaluated for colony immunoblotting, and the best material was chosen for further studies. Whatman filter paper No. 541 combined with a 0.01 mol·L⁻¹ H₂SO₄ rinsing step gave similar results to nylon membranes at <20% of the overall cost of the original colony immunoblotting assay. The application of the modified immunoblot method was tested on nonsterile clay soil samples that were spiked with high numbers (>10⁷ CFU·g⁻¹) of the plant-growth-promoting bacteria Pseudomonas fluorescens, Azospirillum brasilense, or Rhizobium leguminosarum. The modified protocol allowed the identification and recovery of over 50% of the inoculated cells of all three strains, amidst a background of the native soil microflora. Subsequently, the survival of P. fluorescens was successfully monitored for several months after application to field-grown rice at Jerilderie, New South Wales, Australia, thus validating the procedure.

  17. Unsupervised background-constrained tank segmentation of infrared images in complex background based on the Otsu method.

    PubMed

    Zhou, Yulong; Gao, Min; Fang, Dan; Zhang, Baoquan

    2016-01-01

    In an effort to implement fast and effective tank segmentation from infrared images in complex background, the threshold of the maximum between-class variance method (i.e., the Otsu method) is analyzed and its working mechanism is discussed. A fast and effective method for tank segmentation from infrared images in complex background is then proposed, based on the Otsu method, via constraining the complex background of the image. Considering the complexity of the background, the original image is first divided into three classes (target region, middle background, and lower background) by maximizing the sum of their between-class variances. Then, an unsupervised background constraint is applied based on the within-class variance of the target region, simplifying the original image. Finally, the Otsu method is applied to the simplified image for threshold selection. Experimental results on a variety of tank infrared images (880 × 480 pixels) in complex background demonstrate that the proposed method achieves better segmentation performance, with results even comparable to manual segmentation. In addition, its average running time is only 9.22 ms, indicating good real-time performance.
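
    For reference, the classic Otsu step that the proposed method applies to the simplified image selects the threshold maximizing the between-class variance. Below is a minimal single-threshold sketch on a synthetic bimodal image; the paper's three-class extension and background constraint are not shown, and the test image is an assumption.

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    # threshold maximizing the between-class variance (Otsu's method)
    hist, _ = np.histogram(image, bins=nbins, range=(0, 256))
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # class-0 probability up to each bin
    mu = np.cumsum(p * np.arange(nbins))  # cumulative mean
    mu_t = mu[-1]                         # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)      # undefined thresholds contribute 0
    return int(np.argmax(sigma_b))

# bimodal test image: dark background (intensity 40) + bright target (200)
img = np.concatenate([np.full(900, 40), np.full(100, 200)])
t = otsu_threshold(img)   # pixels > t are classified as target
```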

  18. A Method for Harmonic Sources Detection based on Harmonic Distortion Power Rate

    NASA Astrophysics Data System (ADS)

    Lin, Ruixing; Xu, Lin; Zheng, Xian

    2018-03-01

    Harmonic source detection at the point of common coupling is an essential step for harmonic contribution determination and harmonic mitigation. In this paper, the harmonic distortion power rate index is proposed for harmonic source location based on IEEE Std 1459-2010. A method based only on harmonic distortion power is not suitable when the background harmonic is large. To solve this problem, a threshold is determined from prior information: when the harmonic distortion power is larger than the threshold, the customer side is considered the main harmonic source; otherwise, the utility side is. A simple model of a public power system was built in MATLAB/Simulink, and field test results for typical harmonic loads verified the effectiveness of the proposed method.
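
    The decision rule compares a harmonic power quantity against a prior-information threshold. As an illustration of the kind of quantity involved, the sketch below estimates the harmonic active power from sampled voltage and current waveforms via FFT and applies a threshold. The exact IEEE Std 1459-2010 definition of the proposed harmonic distortion power rate index is not reproduced here, and all signal parameters and function names are assumptions.

```python
import numpy as np

def harmonic_active_power(v, i, fs, f1=50.0, hmax=13):
    # sum of V_h * I_h * cos(theta_h) over harmonics h >= 2,
    # with amplitudes and phases taken from the FFT bins
    n = len(v)
    V = np.fft.rfft(v) * 2 / n   # single-sided amplitude spectrum
    I = np.fft.rfft(i) * 2 / n
    p_h = 0.0
    for h in range(2, hmax + 1):
        k = round(h * f1 * n / fs)   # FFT bin of harmonic h
        p_h += 0.5 * abs(V[k]) * abs(I[k]) * np.cos(np.angle(V[k]) - np.angle(I[k]))
    return p_h

def main_harmonic_source(p_h, threshold):
    # decision rule from the abstract: large harmonic power -> customer side
    return "customer" if p_h > threshold else "utility"

fs, n = 5000.0, 1000   # 10 full cycles of a 50 Hz fundamental
t = np.arange(n) / fs
v = 100 * np.sin(2 * np.pi * 50 * t) + 10 * np.sin(2 * np.pi * 250 * t)
i = 5 * np.sin(2 * np.pi * 50 * t) + 2 * np.sin(2 * np.pi * 250 * t)
p_h = harmonic_active_power(v, i, fs)   # 0.5 * 10 * 2 = 10 W from the 5th harmonic
```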

  19. Optical Molecular Imaging for Diagnosing Intestinal Diseases

    PubMed Central

    Kim, Sang-Yeob

    2013-01-01

    Real-time visualization of the molecular signature of cells can be achieved with advanced targeted imaging techniques using molecular probes and fluorescence endoscopy. This molecular optical imaging in gastrointestinal endoscopy is promising for improving the detection of neoplastic lesions, their characterization for patient stratification, and the assessment of their response to molecular targeted therapy and radiotherapy. In inflammatory bowel disease, this method can be used to detect dysplasia in the presence of background inflammation and to visualize inflammatory molecular targets for assessing disease severity and prognosis. Several preclinical and clinical trials have applied this method in endoscopy; however, this field has just started to evolve. Hence, many problems have yet to be solved to enable the clinical application of this novel method. PMID:24340254

  20. Offline Arabic handwriting recognition: a survey.

    PubMed

    Lorigo, Liana M; Govindaraju, Venu

    2006-05-01

    The automatic recognition of text on scanned images has enabled many applications such as searching for words in large volumes of documents, automatic sorting of postal mail, and convenient editing of previously printed documents. The domain of handwriting in the Arabic script presents unique technical challenges and has been addressed more recently than other domains. Many different methods have been proposed and applied to various types of images. This paper provides a comprehensive review of these methods. It is the first survey to focus on Arabic handwriting recognition and the first Arabic character recognition survey to provide recognition rates and descriptions of test data for the approaches discussed. It includes background on the field, discussion of the methods, and future research directions.

  1. Duality linking standard and tachyon scalar field cosmologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avelino, P. P.; Bazeia, D.; Losano, L.

    2010-09-15

    In this work we investigate the duality linking standard and tachyon scalar field homogeneous and isotropic cosmologies in N+1 dimensions. We determine the transformation between standard and tachyon scalar fields and between their associated potentials, corresponding to the same background evolution. We show that, in general, the duality is broken at a perturbative level, when deviations from a homogeneous and isotropic background are taken into account. However, we find that for slow-rolling fields the duality is still preserved at a linear level. We illustrate our results with specific examples of cosmological relevance, where the correspondence between scalar and tachyon scalar field models can be calculated explicitly.

  2. Development of a Crosstalk Suppression Algorithm for KID Readout

    NASA Astrophysics Data System (ADS)

    Lee, Kyungmin; Ishitsuka, H.; Oguri, S.; Suzuki, J.; Tajima, O.; Tomita, N.; Won, Eunil; Yoshida, M.

    2018-06-01

    The GroundBIRD telescope aims to detect the B-mode polarization of the cosmic microwave background radiation using a kinetic inductance detector array as a polarimeter. For readout of the signal from the detector array, we have developed a frequency division multiplexing readout system based on a digital down converter method. Such systems generally suffer from leakage problems caused by crosstalk. A window function was applied in the field programmable gate arrays to mitigate these effects, and the approach was tested at the algorithm level.
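
    The leakage mechanism and its mitigation by windowing can be demonstrated in a few lines: a strong tone that does not fall exactly on an FFT bin leaks into distant frequency channels under a rectangular (no) window, and a window function suppresses that leakage by orders of magnitude. The tone frequency and the choice of a Hann window below are illustrative; the actual window used in the GroundBIRD firmware is not specified here.

```python
import numpy as np

n = 1024
t = np.arange(n)
# a strong readout tone that is *not* bin-centred (frequency 100.5 bins)
x = np.sin(2 * np.pi * 100.5 * t / n)

rect = np.abs(np.fft.rfft(x))                   # rectangular window (none)
hann = np.abs(np.fft.rfft(x * np.hanning(n)))   # Hann window

# leakage into a distant channel (bin 300), far from the tone at ~bin 100
leak_rect = rect[300]
leak_hann = hann[300]   # orders of magnitude smaller than leak_rect
```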

  3. Research on information security in big data era

    NASA Astrophysics Data System (ADS)

    Zhou, Linqi; Gu, Weihong; Huang, Cheng; Huang, Aijun; Bai, Yongbin

    2018-05-01

    Big data is becoming another hotspot in the field of information technology, after cloud computing and the Internet of Things. However, existing information security methods can no longer meet the information security requirements of the big data era. This paper analyzes the challenges and causes of data security problems brought by big data, discusses the development trend of network attacks in the big data context, and puts forward the author's views on the development of security defenses in technology, strategy, and products.

  4. Processing method of images obtained during the TESIS/CORONAS-PHOTON experiment

    NASA Astrophysics Data System (ADS)

    Kuzin, S. V.; Shestov, S. V.; Bogachev, S. A.; Pertsov, A. A.; Ulyanov, A. S.; Reva, A. A.

    2011-04-01

    In January 2009, the CORONAS-PHOTON spacecraft was successfully launched. It includes TESIS, a set of telescopes and spectroheliometers designed to image the solar corona in the soft X-ray and EUV spectral ranges. Due to features of the readout system, obtaining physical information from these images requires preprocessing: removing the background, correcting the white (flat) field, leveling, and cleaning. The paper discusses the algorithms and software developed and used for this preprocessing.
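
    The preprocessing steps named in the abstract (background removal and white-field correction) follow the standard dark/flat correction pattern used across imaging instruments. The sketch below assumes dark and flat calibration frames are available; the TESIS pipeline's actual algorithms are more involved and are not reproduced here.

```python
import numpy as np

def preprocess(raw, dark, flat):
    # subtract the background (dark) frame, then divide by the
    # normalized flat ("white") field to remove pixel-gain variations
    flat_n = flat / flat.mean()
    return (raw - dark) / flat_n

# synthetic check: a uniform scene observed through gain variations + background
rng = np.random.default_rng(0)
flat = 1.0 + 0.2 * rng.random((16, 16))   # pixel-to-pixel sensitivity
dark = np.full((16, 16), 2.0)             # background level
truth = np.full((16, 16), 5.0)            # uniform scene
raw = dark + truth * flat / flat.mean()   # what the detector records
img = preprocess(raw, dark, flat)         # recovers the uniform scene
```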

  5. The applicability of remote sensing to Earth biological problems. Part 2: The potential of remote sensing in pest management

    NASA Technical Reports Server (NTRS)

    Polhemus, J. T.

    1980-01-01

    Five troublesome insect pest groups were chosen for study. These represent a broad spectrum of life cycles, ecological indicators, pest management strategies, and remote sensing requirements. Background data and field study results are discussed for each insect group. Specific groups studied include tsetse flies, locusts, western rangeland grasshoppers, range caterpillars, and mosquitoes. It is concluded that remote sensing methods are applicable to the pest management of the insect groups studied.

  6. Sodium MRI: Methods and applications

    PubMed Central

    Madelin, Guillaume; Lee, Jae-Seung; Regatte, Ravinder R.; Jerschow, Alexej

    2014-01-01

    Sodium NMR spectroscopy and MRI have become popular in recent years through the increased availability of high-field MRI scanners, advanced scanner hardware and improved methodology. Sodium MRI is being evaluated for stroke and tumor detection, for breast cancer studies, and for the assessment of osteoarthritis and muscle and kidney functions, to name just a few. In this article, we aim to present an up-to-date review of the theoretical background, the methodology, the challenges and limitations, and current and potential new applications of sodium MRI. PMID:24815363

  7. Identifying Flow Networks in a Karstified Aquifer by Application of the Cellular Automata-Based Deterministic Inversion Method (Lez Aquifer, France)

    NASA Astrophysics Data System (ADS)

    Fischer, P.; Jardani, A.; Wang, X.; Jourde, H.; Lecoq, N.

    2017-12-01

    The distributed modeling of flow paths within karstic and fractured fields remains a complex task because of the high dependence of the hydraulic responses on the relative locations between observational boreholes and the interconnected fractures and karstic conduits that control the main flow of the hydrosystem. The inverse problem in a distributed model is one alternative approach to interpreting hydraulic test data by mapping the karstic networks and fractured areas. In this work, we developed a Bayesian inversion approach, the Cellular Automata-based Deterministic Inversion (CADI) algorithm, to infer the spatial distribution of hydraulic properties in a structurally constrained model. This method distributes hydraulic properties along linear structures (i.e., flow conduits) and iteratively modifies the structural geometry of this conduit network to progressively match the modeled hydraulic data to the observations. As a result, this method produces a conductivity model composed of a discrete conduit network embedded in the background matrix, capable of producing the same flow behavior as the investigated hydrologic system. The method is applied to invert a set of multiborehole hydraulic tests collected from a hydraulic tomography experiment conducted at the Terrieu field site in the Lez aquifer, Southern France. The emergent model shows high consistency with field observations of hydraulic connections between boreholes. Furthermore, it provides a geologically realistic pattern of flow conduits. This method is therefore of considerable value for enhanced distributed modeling of fractured and karstified aquifers.

  8. ON THE ROLE OF THE BACKGROUND OVERLYING MAGNETIC FIELD IN SOLAR ERUPTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nindos, A.; Patsourakos, S.; Wiegelmann, T., E-mail: anindos@cc.uoi.gr

    2012-03-20

    The primary constraining force that inhibits global solar eruptions is provided by the overlying background magnetic field. Using magnetic field data from both the Helioseismic and Magnetic Imager aboard the Solar Dynamics Observatory and the spectropolarimeter of the Solar Optical Telescope aboard Hinode, we study the long-term evolution of the background field in active region AR11158, which produced three major coronal mass ejections (CMEs). The CME formation heights were determined using EUV data. We calculated the decay index n = -(z/B)(∂B/∂z) of the magnetic field B (i.e., how fast the field decreases with height z) related to each event, from the time of the active region emergence until well after the CMEs. At the heights of CME formation, the decay indices were 1.1-2.1. Prior to two of the events, there were extended periods (of more than 23 hr) during which the related decay indices at heights above the CME formation heights either decreased (by up to 15%) or exhibited small changes. The decay index related to the third event increased (by up to 118%) at heights above 20 Mm within an interval that started 64 hr prior to the CME. The magnetic free energy and the helicity accumulated in the corona contributed the most to the eruptions through their increase throughout the flux emergence phase (by factors of more than five and more than two orders of magnitude, respectively). Our results indicate that the initiation of eruptions does not depend critically on the temporal evolution of the variation of the background field with height.
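
    The decay index n(z) = -(z/B)(∂B/∂z) used in this study can be estimated numerically from a sampled field profile. For a dipole-like field B ∝ z⁻³ the index is exactly 3, which makes a convenient check; the grid and profile below are illustrative, not the paper's data.

```python
import numpy as np

def decay_index(z, B):
    # n(z) = -(z / B) * dB/dz, with dB/dz from centered differences
    return -(z / B) * np.gradient(B, z)

z = np.linspace(10.0, 100.0, 2000)   # heights (arbitrary units, e.g. Mm)
B = z ** -3.0                        # dipole-like background field
n = decay_index(z, B)                # ~3 at every height for this profile
```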

  9. Holographic anisotropic background with confinement-deconfinement phase transition

    NASA Astrophysics Data System (ADS)

    Aref'eva, Irina; Rannu, Kristina

    2018-05-01

    We present new anisotropic black brane solutions in a 5D Einstein-dilaton-two-Maxwell system. The anisotropic background is specified by an arbitrary dynamical exponent ν, a nontrivial warp factor, a non-zero dilaton field, a non-zero time component of the first Maxwell field, and a non-zero longitudinal magnetic component of the second Maxwell field. The blackening function supports a Van der Waals-like phase transition between small and large black holes for a suitable first Maxwell field charge. The isotropic case, corresponding to ν = 1 and zero magnetic field, reproduces previously known solutions. We investigate the influence of the anisotropy on the thermodynamic properties of our background, in particular on the small/large black hole phase transition diagram. We discuss applications of the model to bottom-up holographic QCD. The RG flow interpolates between the UV section with two suppressed transversal coordinates and the IR section with suppressed time and longitudinal coordinates, due to the anisotropic character of our solution. We study temporal Wilson loops, extended in longitudinal and transversal directions, by calculating the minimal surfaces of the corresponding probing open string world-sheet in anisotropic backgrounds with various temperatures and chemical potentials. We find that the dynamical wall locations depend on the orientation of the quark pairs, which gives a crossover transition line between the confinement and deconfinement phases in the dual gauge theory. Instability of the background leads to the appearance of critical points (μ_{ϑ,b}, T_{ϑ,b}) depending on the orientation ϑ of the quark-antiquark pairs with respect to the heavy-ion collision line.

  10. A singular-value method for reconstruction of nonradial and lossy objects.

    PubMed

    Jiang, Wei; Astheimer, Jeffrey; Waag, Robert

    2012-03-01

    Efficient inverse scattering algorithms for nonradial lossy objects are presented using singular-value decomposition to form reduced-rank representations of the scattering operator. These algorithms extend eigenfunction methods that are not applicable to nonradial lossy scattering objects because the scattering operators for these objects do not have orthonormal eigenfunction decompositions. A method of local reconstruction by segregation of scattering contributions from different local regions is also presented. Scattering from each region is isolated by forming a reduced-rank representation of the scattering operator that has domain and range spaces comprised of far-field patterns with retransmitted fields that focus on the local region. Methods for the estimation of the boundary, average sound speed, and average attenuation slope of the scattering object are also given. These methods yielded approximations of scattering objects that were sufficiently accurate to allow residual variations to be reconstructed in a single iteration. Calculated scattering from a lossy elliptical object with a random background, internal features, and white noise is used to evaluate the proposed methods. Local reconstruction yielded images with spatial resolution that is finer than a half wavelength of the center frequency and reproduces sound speed and attenuation slope with relative root-mean-square errors of 1.09% and 11.45%, respectively.
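
    The reduced-rank representation at the heart of these algorithms is a truncated singular-value decomposition. Below is a generic sketch on a synthetic low-rank matrix standing in for the scattering operator; the matrix is random, not a physical operator, and the rank and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# synthetic "scattering operator": rank-5 structure plus small noise
A = rng.normal(size=(60, 5)) @ rng.normal(size=(5, 60)) \
    + 0.01 * rng.normal(size=(60, 60))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 5                                # retained rank
A_k = (U[:, :k] * s[:k]) @ Vt[:k]    # reduced-rank representation of A

# the truncation discards only the noise subspace, so the relative
# Frobenius-norm error stays at the noise level
rel_err = np.linalg.norm(A - A_k) / np.linalg.norm(A)
```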

  11. Fictitious domain method for fully resolved reacting gas-solid flow simulation

    NASA Astrophysics Data System (ADS)

    Zhang, Longhui; Liu, Kai; You, Changfu

    2015-10-01

    Fully resolved simulation (FRS) of gas-solid multiphase flow treats solid objects as finite-sized regions in the flow field whose behaviour is predicted by solving the governing equations directly in both the fluid and solid regions. Fixed-mesh numerical methods, such as the fictitious domain method, are preferred for FRS problems and have been widely researched. However, no suitable fictitious domain method has previously been developed for reacting gas-solid flows. This work presents a new fictitious domain finite element method for FRS of reacting particulate flows. The low-Mach-number reacting flow governing equations are solved sequentially on a regular background mesh. Particles are immersed in the mesh and driven by the surface forces and torques integrated over the immersed interfaces. Additional treatments of the energy equation and surface reactions are developed. Several numerical test cases validated the method, and a simulation of a falling array of burning carbon particles demonstrated its capability for moving, reacting particle-cluster problems.

  12. Relation between residential magnetic fields, light-at-night, and nocturnal urine melatonin levels in women: Volume 1 -- Background and purpose, methods, results, discussion. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaune, W.; Davis, S.; Stevens, R.

    Scientists have postulated a link between exposure to magnetic fields and reduced blood melatonin levels. This EPRI study was designed to supplement a National Cancer Institute study (NCI-BC) of magnetic fields, light-at-night, and the risk of breast cancer. By expanding the exposure assessment of the NCI-BC and collecting data on urine melatonin levels, this project provides new insight into a possible magnetic field-melatonin link. It has been proposed that exposure to 60-Hz (power frequency) magnetic fields may increase the risk of breast cancer by suppressing the normal nocturnal rise in melatonin production in the pineal gland. It remains unknown whether the human pineal gland is reproducibly responsive or sensitive to magnetic field exposure, and whether such exposures could alter elements of the endogenous hormonal environment in women that might be important in the etiology of breast cancer. The objective of this research was to investigate whether exposure to power-frequency magnetic fields and/or light-at-night is associated with levels of the primary urinary melatonin metabolite in women without a history of breast cancer.

  13. Open Field Release of Genetically Engineered Sterile Male Aedes aegypti in Malaysia

    PubMed Central

    Raduan, Norzahira; Kwee Wee, Lim; Hong Ming, Wong; Guat Ney, Teoh; Rahidah A.A., Siti; Salman, Sawaluddin; Subramaniam, Selvi; Nordin, Oreenaiza; Hanum A.T., Norhaida; Angamuthu, Chandru; Marlina Mansor, Suria; Lees, Rosemary S.; Naish, Neil; Scaife, Sarah; Gray, Pam; Labbé, Geneviève; Beech, Camilla; Nimmo, Derric; Alphey, Luke; Vasan, Seshadri S.; Han Lim, Lee; Wasi A., Nazni; Murad, Shahnaz

    2012-01-01

    Background Dengue is the most important mosquito-borne viral disease. In the absence of specific drugs or vaccines, control focuses on suppressing the principal mosquito vector, Aedes aegypti, yet current methods have not proven adequate to control the disease. New methods are therefore urgently needed, for example genetics-based sterile-male-release methods. However, this requires that lab-reared, modified mosquitoes be able to survive and disperse adequately in the field. Methodology/Principal Findings Adult male mosquitoes were released into an uninhabited forested area of Pahang, Malaysia. Their survival and dispersal were assessed by use of a network of traps. Two strains were used, an engineered ‘genetically sterile’ strain (OX513A) and a wild-type laboratory strain, to give both absolute and relative data about the performance of the modified mosquitoes. The two strains had similar maximum dispersal distances (220 m), but the mean distance travelled by the OX513A strain was lower (52 vs. 100 m). Life expectancy was similar (2.0 vs. 2.2 days). Recapture rates were high for both strains, possibly because of the uninhabited nature of the site. Conclusions/Significance After extensive contained studies and regulatory scrutiny, a field release of engineered mosquitoes was safely and successfully conducted in Malaysia. The engineered strain showed similar field longevity to the unmodified counterpart, though in this setting dispersal was reduced relative to the unmodified strain. These data are encouraging for the future testing and implementation of genetic control strategies and will help guide future field use of this and other engineered strains. PMID:22970102

  14. Force-Free Magnetic Fields on an Extreme Reissner-Nordström Spacetime and the Meissner Effect

    NASA Astrophysics Data System (ADS)

    Takamori, Yousuke; Nakao, Ken-Ichi; Ishihara, Hideki; Kimura, Masashi; Yoo, Chul-Moon

    It is known that the Meissner effect of black holes is seen in vacuum solutions of the black-hole magnetosphere: no non-monopole component of magnetic flux penetrates the event horizon if the black hole is extreme. In this article, in order to see the effects of charge currents, we study the force-free magnetic field on the extreme Reissner-Nordström background. In this case, one must solve an elliptic differential equation, the Grad-Shafranov equation, which has singular points called light surfaces. In order to see the Meissner effect, we consider the region near the event horizon and solve the equation by Taylor expansion about the horizon. Assuming a small rotational velocity of the magnetic field, we construct a perturbative method to solve the Grad-Shafranov equation that takes into account the effect of the inner light surface, and we study the behavior of the magnetic field near the event horizon.

  15. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. The modern equipment that makes this possible, including digital cameras, LED light sources, and computer software, is also discussed.
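    The conversion chain described above can be sketched for air; the Gladstone-Dale coefficient and static pressure below are typical assumed constants for illustration, not values from the paper:

```python
# Illustrative sketch of the quantitative-schlieren conversion chain, with
# assumed constants for air in visible light: the Gladstone-Dale relation
# n - 1 = K * rho converts a measured refractive index to density, and the
# ideal-gas equation of state converts density to temperature at a known
# static pressure.

K_GD = 2.27e-4      # Gladstone-Dale coefficient of air, m^3/kg (assumed)
R_AIR = 287.05      # specific gas constant of air, J/(kg K)
P_ATM = 101325.0    # static pressure, Pa (assumed uniform)

def density_from_index(n):
    """Gladstone-Dale relation: n - 1 = K_GD * rho."""
    return (n - 1.0) / K_GD

def temperature_from_density(rho, p=P_ATM):
    """Ideal-gas equation of state: p = rho * R * T."""
    return p / (rho * R_AIR)

n_air = 1.000271                     # refractive index of room-temperature air
rho = density_from_index(n_air)      # about 1.19 kg/m^3
T = temperature_from_density(rho)    # about 296 K
```

    In a real BOS measurement, n would first be recovered pixel-by-pixel by integrating the measured refractive-index gradients across the image.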

  16. Background radiation measurements at high power research reactors

    DOE PAGES

    Ashenfelter, J.; Yeh, M.; Balantekin, B.; ...

    2015-10-23

    Research reactors host a wide range of activities that make use of the intense neutron fluxes generated at these facilities. Recent interest in performing measurements with relatively low event rates, e.g. reactor antineutrino detection, at these facilities necessitates a detailed understanding of background radiation fields. Both reactor-correlated and naturally occurring background sources are potentially important, even at levels well below those of importance for typical activities. Here we describe a comprehensive series of background assessments at three high-power research reactors, including γ-ray, neutron, and muon measurements. For each facility we describe the characteristics and identify the sources of the background fields encountered. Furthermore, the general understanding gained of background production mechanisms and their relationship to facility features will prove valuable for the planning of any sensitive measurement conducted therein.

  17. Contrast source inversion (CSI) method to cross-hole radio-imaging (RIM) data - Part 2: A complex synthetic example and a case study

    NASA Astrophysics Data System (ADS)

    Li, Yongxing; Smith, Richard S.

    2018-03-01

    We present two examples of using the contrast source inversion (CSI) method to invert synthetic radio-imaging (RIM) data and field data. The synthetic model has two isolated conductors (one perfect conductor and one moderate conductor) embedded in a layered background. After inversion, we can identify the two conductors on the inverted image. The shape of the perfect conductor is better resolved than the shape of the moderate conductor. The inverted conductivity values of the two conductors are approximately the same, which demonstrates that the conductivity values cannot be correctly interpreted from the CSI results. The boundaries and the tilts of the upper and lower conductive layers in the background can also be inferred from the results, but the centre parts of the conductive layers in the inversion results are more conductive than the parts close to the boreholes. We used the straight-ray tomographic imaging method and the CSI method to invert the RIM field data collected using the FARA system between two boreholes in a mining area in Sudbury, Canada. The RIM data include the amplitude and the phase data collected at three frequencies: 312.5 kHz, 625 kHz and 1250 kHz. The data close to the ground surface have high amplitude values and complicated phase fluctuations, which are inferred to be contaminated by reflected or refracted electromagnetic (EM) fields from the ground surface, and are removed for all frequencies. Higher-frequency EM waves attenuate more quickly in the subsurface environment, and data at locations where the measurements are dominated by noise are also removed. When the data are interpreted with the straight-ray method, the images differ substantially between frequencies. In addition, there are some unexpected features in the images, which are difficult to interpret. Compared with the straight-ray imaging results, the inversion results with the CSI method are more consistent across frequencies. On the basis of what we learnt from the synthetic study, we interpret that there is one resistive layer across the middle of the borehole plane and two more conductive areas above and below this layer. Although the study has some limitations, such as large transmitter step sizes and unknown absolute amplitudes and dipole moments, we conclude that the CSI method provides more interpretable images than the straight-ray method.

  18. A Novel Interhemispheric Interaction: Modulation of Neuronal Cooperativity in the Visual Areas

    PubMed Central

    Carmeli, Cristian; Lopez-Aguado, Laura; Schmidt, Kerstin E.; De Feo, Oscar; Innocenti, Giorgio M.

    2007-01-01

    Background The cortical representation of the visual field is split along the vertical midline, with the left and the right hemi-fields projecting to separate hemispheres. Connections between the visual areas of the two hemispheres are abundant near the representation of the visual midline. It was suggested that they re-establish the functional continuity of the visual field by controlling the dynamics of the responses in the two hemispheres. Methods/Principal Findings To understand if and how the interactions between the two hemispheres participate in processing visual stimuli, the synchronization of responses to identical or different moving gratings in the two hemi-fields was studied in anesthetized ferrets. The responses were recorded by multiple electrodes in the primary visual areas, and the synchronization of local field potentials across the electrodes was analyzed with a recent method derived from dynamical systems theory. Inactivating the visual areas of one hemisphere modulated the synchronization of the stimulus-driven activity in the other hemisphere. The modulation was stimulus-specific and was consistent with the fine morphology of callosal axons, in particular with the spatio-temporal pattern of activity that axonal geometry can generate. Conclusions/Significance These findings describe a new kind of interaction between the cerebral hemispheres and highlight the role of axonal geometry in modulating aspects of cortical dynamics responsible for stimulus detection and/or categorization. PMID:18074012

  19. Propagation peculiarities of mean field massive gravity

    DOE PAGES

    Deser, S.; Waldron, A.; Zahariade, G.

    2015-07-28

    Massive gravity (mGR) describes a dynamical “metric” on a fiducial, background one. We investigate fluctuations of the dynamics about mGR solutions, that is about its “mean field theory”. Analyzing mean field massive gravity (m¯GR) propagation characteristics is not only equivalent to studying those of the full non-linear theory, but also in direct correspondence with earlier analyses of charged higher spin systems, the oldest example being the charged, massive spin 3/2 Rarita–Schwinger (RS) theory. The fiducial and mGR mean field background metrics in the m¯GR model correspond to the RS Minkowski metric and external EM field. The common implications in both systems are that hyperbolicity holds only in a weak background-mean-field limit, immediately ruling both theories out as fundamental theories; a situation in stark contrast with general relativity (GR) which is at least a consistent classical theory. Moreover, even though both m¯GR and RS theories can still in principle be considered as predictive effective models in the weak regime, their lower helicities then exhibit superluminal behavior: lower helicity gravitons are superluminal as compared to photons propagating on either the fiducial or background metric. Thus our approach has uncovered a novel, dispersive, “crystal-like” phenomenon of differing helicities having differing propagation speeds. As a result, this applies both to m¯GR and mGR, and is a peculiar feature that is also problematic for consistent coupling to matter.

  20. A Two-Dimensional Variational Analysis Method for NSCAT Ambiguity Removal: Methodology, Sensitivity, and Tuning

    NASA Technical Reports Server (NTRS)

    Hoffman, R. N.; Leidner, S. M.; Henderson, J. M.; Atlas, R.; Ardizzone, J. V.; Bloom, S. C.; Atlas, Robert (Technical Monitor)

    2001-01-01

    In this study, we apply a two-dimensional variational analysis method (2d-VAR) to select a wind solution from NASA Scatterometer (NSCAT) ambiguous winds. 2d-VAR determines a "best" gridded surface wind analysis by minimizing a cost function that measures the misfit to the observations and the background, together with filtering and dynamical constraints. The ambiguity closest in direction to the minimizing analysis is selected. The 2d-VAR method, its sensitivity, and its numerical behavior are described. 2d-VAR is compared to statistical interpolation (OI) by examining the response of both systems to a single ship observation and to a swath of unique scatterometer winds. 2d-VAR is used with both NSCAT ambiguities and NSCAT backscatter values; results are roughly comparable. When the background field is poor, 2d-VAR ambiguity removal often selects low-probability ambiguities. To avoid this behavior, an initial 2d-VAR analysis, using only the two most likely ambiguities, provides the first guess for an analysis using all the ambiguities or the backscatter data. 2d-VAR- and median-filter-selected ambiguities usually agree. Both methods require horizontal consistency, so disagreements occur in clumps or as linear features. In these cases, 2d-VAR ambiguities are often more meteorologically reasonable and more consistent with satellite imagery.
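    The final selection step, choosing the ambiguity closest in direction to the minimizing analysis, can be sketched as follows; the candidate winds and analysis direction are made-up illustrative values:

```python
# Hypothetical sketch of the final 2d-VAR selection step described above:
# among the ambiguous NSCAT wind solutions at a cell, pick the one whose
# direction is closest to the minimizing analysis wind. Candidate values
# below are made up for illustration.

def angle_diff(a, b):
    """Smallest absolute difference between two directions, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def select_ambiguity(ambiguities, analysis_dir):
    """ambiguities: list of (speed_m_s, direction_deg) candidate winds."""
    return min(ambiguities, key=lambda amb: angle_diff(amb[1], analysis_dir))

# Four typical ambiguities, roughly 90/180 degrees apart:
cands = [(8.1, 40.0), (7.9, 220.0), (6.5, 130.0), (6.7, 310.0)]
best = select_ambiguity(cands, analysis_dir=215.0)
# best is the (7.9, 220.0) solution, only 5 degrees from the analysis.
```

    The hard part of 2d-VAR is of course producing the analysis direction itself by minimizing the cost function; the selection step above is deliberately simple.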

  1. Critical solutions of topologically gauged 𝒩 = 8 CFTs in three dimensions

    NASA Astrophysics Data System (ADS)

    Nilsson, Bengt E. W.

    2014-04-01

    In this paper we discuss some special (critical) background solutions that arise in topologically gauged 𝒩 = 8 three-dimensional CFTs with SO(N) gauge group. Depending on how many scalar fields are given a VEV, the theory has background solutions for certain values of μl, where μ and l are parameters in the TMG Lagrangian. Apart from Minkowski, chiral round AdS3 and null-warped AdS3 (or Schrödinger(z = 2)), we also identify a more exotic solution recently found in TMG by Ertl, Grumiller and Johansson. We also discuss the spectrum, symmetry-breaking pattern and the supermultiplet structure in the various backgrounds and argue that some properties are due to their common origin in a conformal phase. Some of the scalar fields, including all higgsed ones, turn out to satisfy three-dimensional field equations similar to those of the singleton. Finally, we note that topologically gauged 𝒩 = 6 ABJ(M) theories have a similar, but more restricted, set of background solutions.

  2. Effective field theory search for high-energy nuclear recoils using the XENON100 dark matter detector

    NASA Astrophysics Data System (ADS)

    Aprile, E.; Aalbers, J.; Agostini, F.; Alfonsi, M.; Amaro, F. D.; Anthony, M.; Arneodo, F.; Barrow, P.; Baudis, L.; Bauermeister, B.; Benabderrahmane, M. L.; Berger, T.; Breur, P. A.; Brown, A.; Brown, E.; Bruenner, S.; Bruno, G.; Budnik, R.; Bütikofer, L.; Calvén, J.; Cardoso, J. M. R.; Cervantes, M.; Cichon, D.; Coderre, D.; Colijn, A. P.; Conrad, J.; Cussonneau, J. P.; Decowski, M. P.; de Perio, P.; di Gangi, P.; di Giovanni, A.; Diglio, S.; Eurin, G.; Fei, J.; Ferella, A. D.; Fieguth, A.; Fulgione, W.; Gallo Rosso, A.; Galloway, M.; Gao, F.; Garbini, M.; Geis, C.; Goetzke, L. W.; Greene, Z.; Grignon, C.; Hasterok, C.; Hogenbirk, E.; Itay, R.; Kaminsky, B.; Kazama, S.; Kessler, G.; Kish, A.; Landsman, H.; Lang, R. F.; Lellouch, D.; Levinson, L.; Lin, Q.; Lindemann, S.; Lindner, M.; Lombardi, F.; Lopes, J. A. M.; Manfredini, A.; Maris, I.; Marrodán Undagoitia, T.; Masbou, J.; Massoli, F. V.; Masson, D.; Mayani, D.; Messina, M.; Micheneau, K.; Molinario, A.; Morå, K.; Murra, M.; Naganoma, J.; Ni, K.; Oberlack, U.; Pakarha, P.; Pelssers, B.; Persiani, R.; Piastra, F.; Pienaar, J.; Pizzella, V.; Piro, M.-C.; Plante, G.; Priel, N.; Rauch, L.; Reichard, S.; Reuter, C.; Rizzo, A.; Rosendahl, S.; Rupp, N.; Dos Santos, J. M. F.; Sartorelli, G.; Scheibelhut, M.; Schindler, S.; Schreiner, J.; Schumann, M.; Scotto Lavina, L.; Selvi, M.; Shagin, P.; Silva, M.; Simgen, H.; Sivers, M. V.; Stein, A.; Thers, D.; Tiseni, A.; Trinchero, G.; Tunnell, C.; Vargas, M.; Wang, H.; Wang, Z.; Wei, Y.; Weinheimer, C.; Wulf, J.; Ye, J.; Zhang, Y.; Farmer, B.; Xenon Collaboration

    2017-08-01

    We report on weakly interacting massive particle (WIMP) search results in the XENON100 detector using a nonrelativistic effective field theory approach. The data from science run II (34 kg × 224.6 live days) were reanalyzed with an increased recoil energy interval compared to previous analyses, ranging from 6.6 to 240 keVnr. The data are found to be compatible with the background-only hypothesis. We present 90% confidence level exclusion limits on the coupling constants of WIMP-nucleon effective operators using a binned profile likelihood method. We also consider the case of inelastic WIMP scattering, where incident WIMPs may up-scatter to a higher mass state, and set exclusion limits on this model as well.

  3. A New Probe of Line-of-sight Magnetic Field Tangling

    NASA Astrophysics Data System (ADS)

    Clark, S. E.

    2018-04-01

    The Galactic neutral hydrogen (H I) sky at high Galactic latitudes is suffused with linear structure. Particularly prominent in narrow spectral intervals, these linear H I features are well aligned with the plane-of-sky magnetic field orientation as measured with optical starlight polarization and polarized thermal dust emission. We analyze the coherence of the orientation of these features with respect to line-of-sight velocity, and propose a new metric to quantify this H I coherence. We show that H I coherence is linearly correlated with the polarization fraction of 353 GHz dust emission. H I coherence constitutes a novel method for measuring the degree of magnetic field tangling along the line of sight in the diffuse interstellar medium. We propose applications of this property for H I-based models of the polarized dust emission in diffuse regions, and for studies of frequency decorrelation in the polarized dust foreground to the cosmic microwave background (CMB).

  4. Controllable rotating behavior of individual dielectric microrod in a rotating electric field.

    PubMed

    Liu, Weiyu; Ren, Yukun; Tao, Ye; Li, Yanbo; Chen, Xiaoming

    2017-06-01

    We report herein the controllable rotating behavior of an individual dielectric microrod driven by a background rotating electric field. By disposing or removing a structured floating microelectrode, the rigid rod suspended in electrolyte solution exhibits cofield or antifield rotating motion, respectively. In the absence of the ideally polarizable metal surface, the dielectric rod rotates opposite to the propagation of the electric field, with the measured rotation rate much larger than predicted by Maxwell-Wagner interfacial polarization theory incorporating surface conduction of fixed bound charge. Surprisingly, with the floating electrode embedded, a novel cofield rotation mode occurs in the presence of induced double-layer polarization, due to the hydrodynamic torque from rotating induced-charge electroosmosis. This method of achieving switchable spin modes of dielectric particles would have direct implications for constructing a flexible electrokinetic framework for analyzing the 3D profile of on-chip biomicrofluidic samples.

  5. Evidence for dwarf stars at D of about 100 kiloparsecs near the Sextans dwarf spheroidal galaxy

    NASA Technical Reports Server (NTRS)

    Gould, Andrew; Guhathakurta, Puragra; Richstone, Douglas; Flynn, Chris

    1992-01-01

    A method is presented for detecting individual, metal-poor, dwarf stars at distances less than about 150 kpc - a method specifically designed to filter out stars from among the much more numerous faint background field galaxies on the basis of broad-band colors. This technique is applied to two fields at high Galactic latitude, for which there are deep CCD data in four bands ranging from 3600 to 9000 A. The field in Sextans probably contains more than about five dwarf stars with BJ not greater than 25.5. These are consistent with being at a common distance of about 100 kpc and lie about 1.7 deg from the newly discovered dwarf galaxy in Sextans, whose distance is about 85 +/- 10 kpc. The stars lie near the major axis of the galaxy and are near or beyond the tidal radius. The second field, toward the south Galactic pole, may contain up to about five extra-Galactic stars, but these show no evidence for being at a common distance. Possible applications of this type of technique are discussed, and it is shown that even very low surface brightness star clusters or dwarf galaxies may be detected at distances less than about 1 Mpc.

  6. Teaching hydrogeology: a review of current practice

    NASA Astrophysics Data System (ADS)

    Gleeson, T.; Allen, D. M.; Ferguson, G.

    2012-07-01

    Hydrogeology is now taught in a broad spectrum of departments and institutions to students with diverse backgrounds. Successful instruction in hydrogeology thus requires a variety of pedagogical approaches depending on desired learning outcomes and the background of students. We review the pedagogical literature in hydrogeology to highlight recent advances and analyze a 2005 survey among 68 hydrogeology instructors. The literature and survey results suggest there are only ~ 15 topics that are considered crucial by most hydrogeologists and > 100 other topics that are considered crucial by some hydrogeologists. The crucial topics focus on properties of aquifers and fundamentals of groundwater flow, and should likely be part of all undergraduate hydrogeology courses. Other topics can supplement and support these crucial topics, depending on desired learning outcomes. Classroom settings continue to provide a venue for emphasizing fundamental knowledge. However, recent pedagogical advances are biased towards field and laboratory instruction with a goal of bolstering experiential learning. Field methods build on the fundamentals taught in the classroom and emphasize the collection of data, data uncertainty, and the development of vocational skills. Laboratory and computer-based exercises similarly build on theory, and offer an opportunity for data analysis and integration. The literature suggests curricula at all levels should ideally balance field, laboratory, and classroom pedagogy into an iterative and integrative whole. An integrated, iterative and balanced approach leads to greater student motivation and advancement of theoretical and vocational knowledge.

  7. Flowing with the changing needs of hydrogeology instruction

    NASA Astrophysics Data System (ADS)

    Gleeson, T.; Allen, D. M.; Ferguson, G.

    2012-01-01

    Hydrogeology is now taught in a broad spectrum of departments and institutions to students with diverse backgrounds. Successful instruction in hydrogeology thus requires a variety of pedagogical approaches depending on desired learning outcomes and the diverse background of students. We review the pedagogical literature in hydrogeology to highlight recent advances and analyze a 2005 survey of 68 hydrogeology instructors. The literature and survey results suggest there are ~15 topics that are considered crucial by most hydrogeologists and >100 other topics that are considered crucial by some hydrogeologists. The crucial topics focus on properties of aquifers and fundamentals of groundwater flow, and should likely be part of all undergraduate hydrogeology courses. Other topics can supplement and support these crucial topics, depending on desired learning outcomes. Classroom settings continue to provide a venue for emphasizing fundamental knowledge. However, recent pedagogical advances are biased towards field and laboratory instruction with a goal of bolstering experiential learning. Field methods build on the fundamentals taught in the classroom and emphasize the collection of data, data uncertainty, and the development of vocational skills. Laboratory and computer-based exercises similarly build on theory, and offer an opportunity for data analysis and integration. The literature suggests curricula at all levels should ideally balance field, laboratory, and classroom pedagogy into an iterative and integrative whole. An integrated approach leads to greater student motivation and advancement of theoretical and vocational knowledge.

  8. [Psychotherapy: Quo vadis?]

    PubMed

    Meinlschmidt, Gunther; Tegethoff, Marion

    2017-08-01

    Background: The science and practice of psychotherapy are continuously developing. The goal of this article is to describe new impulses guiding current advancements in the field. Methods: This paper provides a selective narrative review, synthesizing and condensing relevant literature identified through various sources, including MEDLINE, EMBASE, PsycINFO, and Web of Science, as well as citation tracking, to elaborate key developments in the field of psychotherapy. Results: We describe several dynamics: 1) following up the so-called "third wave" of cognitive behavioral therapy, new interventions arise whose core is fostering interpersonal virtues, such as compassion, forgiveness, and gratitude; 2) based on technological quantum leaps, new interventions arise that exploit current developments in new media, information, and communication technologies, as well as brain imaging, such as digital interventions for mental disorders and new forms of neurofeedback; 3) inspired by the field of positive psychology, there is a revival of the promotion of strength and resilience in therapeutic contexts; 4) in light of the new paradigm of "precision medicine", the issue of differential and adaptive indication of psychotherapy, addressed with new methods, regains relevance and drives a new field of "precision psychotherapy"; 5) last but not least, the "embodied turn" opens the door for body psychotherapy to gain relevance in academic psychotherapy. Conclusion: These and further developments, such as the use of systemic and network approaches as well as machine learning techniques, outline the vivid activity in the field of psychotherapy.

  9. Photoacoustic infrared spectroscopy for conducting gas tracer tests and measuring water saturations in landfills

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jung, Yoojin; Han, Byunghyun; Mostafid, M. Erfan

    2012-02-15

    Highlights: • Photoacoustic infrared spectroscopy tested for measuring tracer gas in landfills. • Measurement errors for tracer gases were 1-3% in landfill gas. • Background signals from landfill gas result in elevated limits of detection. • Technique is much less expensive and easier to use than GC. - Abstract: Gas tracer tests can be used to determine gas flow patterns within landfills, quantify volatile contaminant residence time, and measure water within refuse. While gas chromatography (GC) has traditionally been used to analyze gas tracers in refuse, photoacoustic spectroscopy (PAS) might allow real-time measurements with reduced personnel costs and greater mobility and ease of use. Laboratory and field experiments were conducted to evaluate the efficacy of PAS for conducting gas tracer tests in landfills. Two tracer gases, difluoromethane (DFM) and sulfur hexafluoride (SF6), were measured with a commercial PAS instrument. Relative measurement errors were invariant with tracer concentration but influenced by background gas: errors were 1-3% in landfill gas but 4-5% in air. Two partitioning gas tracer tests were conducted in an aerobic landfill, and limits of detection (LODs) were 3-4 times larger for DFM with PAS versus GC due to temporal changes in background signals. While higher LODs can be compensated for by injecting a larger tracer mass, changes in background signals increased the uncertainty in measured water saturations by up to 25% over comparable GC methods. PAS has distinct advantages over GC with respect to personnel costs and ease of use, although for field applications GC analyses of select samples are recommended to quantify instrument interferences.
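    As a rough illustration of why a temporally varying background raises detection limits, the common mean-plus-three-sigma convention for a limit of detection can be sketched (an assumed convention here, not necessarily the study's exact procedure):

```python
import statistics

# Illustrative sketch (an assumed convention, not necessarily the study's
# exact procedure): take the limit of detection as the background mean
# plus three standard deviations, so a drifting landfill-gas background
# directly inflates the LOD relative to a quiet one.

def limit_of_detection(background, k=3.0):
    """LOD = mean(background) + k * stdev(background)."""
    return statistics.mean(background) + k * statistics.stdev(background)

stable_bg = [0.50, 0.52, 0.49, 0.51, 0.50, 0.50]    # ppm, quiet background
drifting_bg = [0.40, 0.65, 0.48, 0.72, 0.55, 0.30]  # ppm, temporal drift

lod_stable = limit_of_detection(stable_bg)      # about 0.53 ppm
lod_drifting = limit_of_detection(drifting_bg)  # about 0.99 ppm
```

    The higher LOD under a drifting background is what the larger injected tracer mass must compensate for.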

  10. Measurements of SWIR backgrounds using the swux unit of measure

    NASA Astrophysics Data System (ADS)

    Richards, A.; Hübner, M.; Vollmer, M.

    2018-04-01

    The SWIR waveband between 0.8 μm and 1.8 μm is increasingly being exploited by imaging systems in a variety of applications, including persistent imaging for security and surveillance of high-value assets, handheld tactical imagers, range-gated imaging systems, and imaging LADAR for driverless vehicles. The vast majority of these applications use lattice-matched InGaAs detectors in their imaging sensors, and these sensors are rapidly falling in price, leading to their widening adoption. As these sensors are used in novel applications and locations, it is important that ambient SWIR backgrounds be understood and characterized for a variety of field conditions, primarily for the purposes of system performance modeling of SNR and range metrics. SWIR irradiance backgrounds do not consistently track visible-light illumination. There is currently little such information in the open literature, particularly measurements of SWIR backgrounds in urban areas, natural areas, or indoors. This paper presents field measurements made with an InGaAs detector calibrated in the swux unit of InGaAs-band-specific irradiance proposed by two of the authors in 2017. Simultaneous measurements of illuminance levels (in lux) at these sites are presented, as well as visible and InGaAs camera images of the scenery at some of the measurement sites. The swux and lux measurement hardware is described, along with the methods used to calibrate it. Finally, the swux levels during the partial and total phases of the total solar eclipse of 2017 are presented, along with curves fitted to the data from a theoretical model based on obscuration of the sun by the moon. The apparent differences between photometric and swux measurements are discussed.

  11. Exposure assessment of extremely low frequency electric fields in Tehran, Iran, 2010.

    PubMed

    Nassiri, Parvin; Esmaeilpour, Mohammad Reza Monazzam; Gharachahi, Ehsan; Haghighat, Gholamali; Yunesian, Masoud; Zaredar, Narges

    2013-01-01

    Extremely Low-Frequency (ELF) electric and magnetic fields belonging to the nonionizing electromagnetic radiation spectrum have a frequency of 50-60 Hz. All people are exposed to a complex set of electric and magnetic fields that spread throughout the environment. The current study was carried out to assess people's exposure to an ELF electric field in the Tehran metropolitan area in 2010. The measurement of the electric fields was performed using an HI-3604 power frequency field strength measurement device. A total of 2,753 measurements were performed. Afterward, the data obtained were transferred to the base map using ArcView Version 3.2 and ArcMap Version 9.3. Finally, an interpolation method was applied to extend the intensity of the electric field to the entire city. Based on the results obtained, the electric field was divided into three bands of intensity: 0-5 V/m, 5-15 V/m, and >15 V/m. It should be noted that the locations of high voltage transmission lines, electric substations, and specific points including schools and hospitals were also marked on the map. Minimum and maximum electric field intensities were measured as 0.31 V/m and 19.80 V/m, respectively. In all measurements, the electric field was much less than the limits provided in the ICNIRP guidelines. The results revealed that 141 hospitals and 6,905 schools are situated in areas with electric field intensity of 0-5 V/m, while 15 hospitals and 95 schools are located in zones of 5-15 V/m and more than 15 V/m. Examining high voltage transmission lines and electric substations in Tehran and its suburbs suggested that the impact of the lines on the background electric field of the city was low. Accordingly, 0.97 km of Tehran located on the city border adjacent to the high voltage transmission lines has an electric field in the range of 5 to 15 V/m. This range is much lower than the available standards.
    In summary, it can be concluded that the public is not exposed to a risky background electric field in metropolitan Tehran. A comparison of sensitive receptors showed that schools are in a more favorable situation than hospitals. Nonetheless, epidemiologic studies could provide further understanding of the impact on public health.
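    The three-band zoning described above can be expressed as a simple classifier. A minimal sketch follows: the thresholds come from the abstract, while the sample measurement values are illustrative, not the study's data.

```python
# Zone an ELF electric-field measurement (V/m) into the three bands
# used in the Tehran study. Sample values below are illustrative only.
def field_zone(e_vm: float) -> str:
    if e_vm <= 5.0:
        return "0-5 V/m"
    if e_vm <= 15.0:
        return "5-15 V/m"
    return ">15 V/m"

# The reported minimum (0.31 V/m) and maximum (19.80 V/m) fall in the
# lowest and highest bands, respectively.
measurements = [0.31, 3.2, 7.8, 19.80]
print([field_zone(e) for e in measurements])
# → ['0-5 V/m', '0-5 V/m', '5-15 V/m', '>15 V/m']
```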

  12. Susceptibility of the QCD vacuum to CP-odd electromagnetic background fields.

    PubMed

    D'Elia, Massimo; Mariti, Marco; Negro, Francesco

    2013-02-22

    We investigate two flavor quantum chromodynamics (QCD) in the presence of CP-odd electromagnetic background fields and determine, by means of lattice QCD simulations, the induced effective θ term to first order in E⃗ · B⃗. We employ a rooted staggered discretization and study lattice spacings down to 0.1 fm and Goldstone pion masses around 480 MeV. In order to deal with a positive measure, we consider purely imaginary electric fields and real magnetic fields, and then exploit the analytic continuation. Our results are relevant to a description of the effective pseudoscalar quantum electrodynamics-QCD interactions.

  13. Chameleon scalar fields in relativistic gravitational backgrounds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsujikawa, Shinji; Tamaki, Takashi; Tavakol, Reza, E-mail: shinji@rs.kagu.tus.ac.jp, E-mail: tamaki@gravity.phys.waseda.ac.jp, E-mail: r.tavakol@qmul.ac.uk

    2009-05-15

    We study the field profile of a scalar field φ that couples to a matter fluid (dubbed a chameleon field) in the relativistic gravitational background of a spherically symmetric spacetime. Employing a linear expansion in terms of the gravitational potential Φ_c at the surface of a compact object with a constant density, we derive the thin-shell field profile both inside and outside the object, as well as the resulting effective coupling with matter, analytically. We also carry out numerical simulations for the class of inverse power-law potentials V(φ) = M^(4+n) φ^(-n) by employing the information provided by our analytical solutions to set the boundary conditions around the centre of the object and show that thin-shell solutions in fact exist if the gravitational potential Φ_c is smaller than 0.3, which marginally covers the case of neutron stars. Thus the chameleon mechanism is present in relativistic gravitational backgrounds, capable of reducing the effective coupling. Since thin-shell solutions are sensitive to the choice of boundary conditions, our analytic field profile is very helpful to provide appropriate boundary conditions for Φ_c ≈
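    For reference, the standard non-relativistic thin-shell result of Khoury and Weltman, which this work generalizes to relativistic backgrounds, can be written as follows (the formula is quoted from the chameleon literature, not from this abstract):

```latex
\frac{\Delta R_c}{R_c} \simeq \frac{\phi_\infty - \phi_c}{6\,\beta\,M_{\rm pl}\,\Phi_c},
\qquad
\beta_{\rm eff} \simeq 3\beta\,\frac{\Delta R_c}{R_c},
```

so a deep gravitational potential Φ_c, i.e. a thin shell ΔR_c ≪ R_c, strongly suppresses the effective matter coupling β_eff.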

  14. Internal wave energy flux from density perturbations in nonlinear stratifications

    NASA Astrophysics Data System (ADS)

    Lee, Frank M.; Allshouse, Michael R.; Swinney, Harry L.; Morrison, P. J.

    2017-11-01

    Tidal flow over the topography at the bottom of the ocean, whose density varies with depth, generates internal gravity waves that have a significant impact on the energy budget of the ocean. Thus, understanding the energy flux (J = pv) is important, but it is difficult to measure simultaneously the pressure and velocity perturbation fields, p and v. In a previous work, a Green's-function-based method was developed to calculate the instantaneous p, v, and thus J, given a density perturbation field for a constant buoyancy frequency N. Here we extend the previous analytic Green's function work to include nonuniform N profiles, namely the tanh-shaped and linear cases, because background density stratifications that occur in the ocean and some experiments are nonlinear. In addition, we present a finite-difference method for the general case where N has an arbitrary profile. Each method is validated against numerical simulations. The methods we present can be applied to measured density perturbation data by using our MATLAB graphical user interface EnergyFlux. PJM was supported by the U.S. Department of Energy Contract DE-FG05-80ET-53088. HLS and MRA were supported by ONR Grant No. N000141110701.
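    The background stratification enters through the buoyancy frequency N(z). A minimal sketch of extracting N(z) from a density profile by centered finite differences (the tanh-shaped profile below is illustrative, not the paper's data):

```python
import numpy as np

# Buoyancy (Brunt-Vaisala) frequency from a background density profile:
#   N(z)^2 = -(g / rho0) * d(rho)/dz,  with z positive upward.
g, rho0 = 9.81, 1000.0            # m/s^2; reference density in kg/m^3

# Hypothetical tanh-shaped stratification centred at z = -0.5 m
z = np.linspace(-1.0, 0.0, 201)   # vertical coordinate, m
rho = rho0 + 5.0 * (1.0 - np.tanh((z + 0.5) / 0.1))

N2 = -(g / rho0) * np.gradient(rho, z)   # centered finite differences
N = np.sqrt(np.clip(N2, 0.0, None))      # buoyancy frequency, rad/s

# N(z) peaks at the pycnocline and decays toward the uniform edges
print(f"max N = {N.max():.3f} rad/s at z = {z[N.argmax()]:.2f} m")
```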

  15. Liver Cirrhosis: Evaluation, Nutritional Status, and Prognosis

    PubMed Central

    Nishikawa, Hiroki; Osaki, Yukio

    2015-01-01

    The liver is the major organ for the metabolism of three major nutrients: protein, fat, and carbohydrate. Chronic hepatitis C virus infection is the major cause of chronic liver disease. Liver cirrhosis (LC) results from different mechanisms of liver injury that lead to necroinflammation and fibrosis. LC is now seen not as a single disease entity but as one that can be graded into distinct clinical stages related to clinical outcome. Several noninvasive methods have been developed for assessing liver fibrosis, and these methods have been used for predicting prognosis in patients with LC. On the other hand, subjects with LC often have protein-energy malnutrition (PEM) and poor physical activity. These conditions often result in sarcopenia, the loss of skeletal muscle volume accompanied by increased muscle weakness. Recent studies have demonstrated that PEM and sarcopenia are predictive factors for poorer survival in patients with LC. Against this background, several methods for evaluating nutritional status in patients with chronic liver disease have been developed and are now widely used in clinical practice. In this review, we summarize the current knowledge in the field of LC from the viewpoints of diagnostic methods, nutritional status, and clinical outcomes. PMID:26494949

  16. Combined target factor analysis and Bayesian soft-classification of interference-contaminated samples: forensic fire debris analysis.

    PubMed

    Williams, Mary R; Sigman, Michael E; Lewis, Jennifer; Pitan, Kelly McHugh

    2012-10-10

    A Bayesian soft-classification method combined with target factor analysis (TFA) is described and tested for the analysis of fire debris data. The method relies on analysis of the average mass spectrum across the chromatographic profile (i.e., the total ion spectrum, TIS) from multiple samples taken from a single fire scene. A library of TIS from reference ignitable liquids with assigned ASTM classification is used as the target factors in TFA. The class-conditional distributions of correlations between the target and predicted factors for each ASTM class are represented by kernel functions and analyzed by Bayesian decision theory. The soft-classification approach assists in assessing the probability that ignitable liquid residue from a specific ASTM E1618 class is present in a set of samples from a single fire scene, even in the presence of unspecified background contributions from pyrolysis products. The method is demonstrated with sample data sets and then tested on laboratory-scale burn data and large-scale field test burns. The overall performance achieved in laboratory and field tests of the method is approximately 80% correct classification of fire debris samples.
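    The soft-classification step can be sketched as follows. This is only a minimal illustration of the approach (Gaussian kernel densities and Bayes' rule over TFA correlations); the correlation values, class names, bandwidth, and priors are all invented for the example:

```python
import numpy as np

def kde_pdf(x, samples, h=0.05):
    """Gaussian-kernel estimate of a class-conditional density."""
    u = (x - samples[:, None]) / h
    return np.exp(-0.5 * u**2).sum(axis=0) / (len(samples) * h * np.sqrt(2 * np.pi))

# Illustrative TFA correlations between target and predicted factors
corr_gasoline = np.array([0.95, 0.92, 0.97, 0.90, 0.94])   # positive class
corr_substrate = np.array([0.60, 0.55, 0.65, 0.58, 0.62])  # background-only

x_new = np.array([0.93])        # correlation for a new fire-debris sample
prior_g = prior_s = 0.5         # uninformative priors

like_g = kde_pdf(x_new, corr_gasoline)[0]
like_s = kde_pdf(x_new, corr_substrate)[0]

# Bayes' rule yields a soft (probabilistic) class assignment
post_g = like_g * prior_g / (like_g * prior_g + like_s * prior_s)
print(f"P(gasoline-class residue | data) = {post_g:.3f}")
```

The soft output keeps the evidence probabilistic rather than forcing a hard class label, which is the point of the approach when pyrolysis interferences blur class boundaries.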

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rotti, Aditya; Huffenberger, Kevin, E-mail: adityarotti@gmail.com, E-mail: khuffenberger@fsu.edu

    Isotropy-violation statistics can highlight polarized galactic foregrounds that contaminate primordial B-modes in the Cosmic Microwave Background (CMB). We propose a particular isotropy-violation test and apply it to polarized Planck 353 GHz data, constructing a map that indicates B-mode foreground dust power over the sky. We build our main isotropy test in harmonic space via the bipolar spherical harmonic basis, and our method helps us to identify the least-contaminated directions. By this measure, there are regions of low foreground in and around the BICEP field, near the South Galactic Pole, and in the Northern Galactic Hemisphere. There is also a possible foreground feature in the BICEP field. We compare our results to those based on the local power spectrum, which is computed on discs using a version of the method of Planck Int. XXX (2016). The discs method is closely related to our isotropy-violation diagnostic. We pay special care to the treatment of noise, including chance correlations with the foregrounds. Currently we use our isotropy tool to assess the cleanest portions of the sky, but in the future such methods will allow isotropy-based null tests for foreground contamination in maps purported to measure primordial B-modes, particularly in cases of limited frequency coverage.

  18. Assessment of oxygen and carbogen therapy effect in Meniere's disease according to clinical and electroencephalographic data

    NASA Technical Reports Server (NTRS)

    Boronoyev, A. B.

    1980-01-01

    The method of constructing fields on the basis of EEG data gives a quantitative characterization of bioelectrical activity. Fields of average rates of the change of potentials in healthy people have a well defined configuration, where the greatest rates are found in the occipital zones, lower in the frontal and parietal, and least in the temporal zones. In response to functional loads the form of the field remains the same because of a synchronous change in the average rates in both hemispheres of the cerebrum to the same extent. The configuration of the fields of background bioelectric activity of the cerebrum in people with Meniere's disease is not uniform. On the basis of this investigation, a clear correlation was found between the subjective sensations of patients during oxygen and carbogen therapy and the changes in the spatial characteristics of the field of potentials of the cerebrum. This correlation makes it possible to objectively identify the nature of the vascular disturbances in Meniere's disease, develop a pathogenetic treatment plan, and evaluate its effectiveness.

  19. A novel automated method for doing registration and 3D reconstruction from multi-modal RGB/IR image sequences

    NASA Astrophysics Data System (ADS)

    Kirby, Richard; Whitaker, Ross

    2016-09-01

    In recent years, the use of multi-modal camera rigs consisting of an RGB sensor and an infrared (IR) sensor have become increasingly popular for use in surveillance and robotics applications. The advantages of using multi-modal camera rigs include improved foreground/background segmentation, wider range of lighting conditions under which the system works, and richer information (e.g. visible light and heat signature) for target identification. However, the traditional computer vision method of mapping pairs of images using pixel intensities or image features is often not possible with an RGB/IR image pair. We introduce a novel method to overcome the lack of common features in RGB/IR image pairs by using a variational methods optimization algorithm to map the optical flow fields computed from different wavelength images. This results in the alignment of the flow fields, which in turn produce correspondences similar to those found in a stereo RGB/RGB camera rig using pixel intensities or image features. In addition to aligning the different wavelength images, these correspondences are used to generate dense disparity and depth maps. We obtain accuracies similar to other multi-modal image alignment methodologies as long as the scene contains sufficient depth variations, although a direct comparison is not possible because of the lack of standard image sets from moving multi-modal camera rigs. We test our method on synthetic optical flow fields and on real image sequences that we created with a multi-modal binocular stereo RGB/IR camera rig. We determine our method's accuracy by comparing against a ground truth.

  20. Effects of in-plane magnetic field on the transport of 2D electron vortices in non-uniform plasmas

    NASA Astrophysics Data System (ADS)

    Angus, Justin; Richardson, Andrew; Schumer, Joseph; Pulsed Power Team

    2015-11-01

    The formation of electron vortices in current-carrying plasmas is observed in 2D particle-in-cell (PIC) simulations of the plasma-opening switch. In the presence of a background density gradient in Cartesian systems, vortices drift in the direction found by crossing the magnetic field with the background density gradient as a result of the Hall effect. However, most of the 2D simulations where electron vortices are seen and studied only allow for in-plane currents and thus only an out-of-plane magnetic field. Here we present results of numerical simulations of 2D, seeded electron vortices in an inhomogeneous background using the generalized 2D electron-magneto-hydrodynamic model that additionally allows for in-plane components of the magnetic field. By seeding vortices with a varying axial component of the velocity field, so that the vortex becomes a corkscrew, it is found that a pitch angle of around 20 degrees is sufficient to completely prevent the vortex from propagating due to the Hall effect for typical plasma parameters. This work is supported by the NRL Base Program.

  1. Quantitative trace analysis of polyfluorinated alkyl substances (PFAS) in ambient air samples from Mace Head (Ireland): A method intercomparison

    NASA Astrophysics Data System (ADS)

    Jahnke, Annika; Barber, Jonathan L.; Jones, Kevin C.; Temme, Christian

    A method intercomparison study of analytical methods for the determination of neutral, volatile polyfluorinated alkyl substances (PFAS) was carried out in March 2006. Environmental air samples were collected in triplicate at the European background site Mace Head on the west coast of Ireland, a site dominated by 'clean' westerly winds coming across the Atlantic. Extraction and analysis were performed at two laboratories active in PFAS research using their in-house methods. Airborne polyfluorinated telomer alcohols (FTOHs), fluorooctane sulfonamides and sulfonamidoethanols (FOSAs/FOSEs), as well as additional polyfluorinated compounds, were investigated. Different native and isotope-labelled internal standards (IS) were applied at various steps in the analytical procedure to evaluate the different quantification strategies. Field blanks revealed no major blank problems. European background concentrations observed at Mace Head were found to be in a similar range to Arctic data reported in the literature. Due to trace levels at the remote site, only the FTOH data sets were complete and could therefore be compared between the laboratories. Additionally, FOSEs could partly be included. Data comparison revealed that, despite the challenges inherent in the analysis of airborne PFAS and the low concentrations, all methods applied in this study obtained similar results. However, application of isotope-labelled IS early in the analytical procedure leads to more precise results and is therefore recommended.
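    The internal-standard (IS) quantification strategy discussed above can be sketched generically. This is the textbook IS formula, not either laboratory's specific in-house method, and the peak areas, IS amount, and response factor below are invented for illustration:

```python
def quantify_with_is(area_analyte: float, area_is: float,
                     amount_is_ng: float, rrf: float) -> float:
    """Internal-standard quantification:
    amount_analyte = (A_analyte / A_IS) * amount_IS / RRF,
    where RRF is the relative response factor of analyte vs. IS."""
    return (area_analyte / area_is) * amount_is_ng / rrf

# Illustrative numbers (not from the study): an FTOH peak quantified
# against an isotope-labelled IS spiked before extraction.
amount_ng = quantify_with_is(area_analyte=1.2e6, area_is=8.0e5,
                             amount_is_ng=10.0, rrf=1.05)
print(f"analyte amount = {amount_ng:.2f} ng")
```

Spiking the labelled IS early (before extraction) lets losses and matrix effects cancel in the area ratio, which is consistent with the study's recommendation.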

  2. The Storage Ring Proton EDM Experiment

    NASA Astrophysics Data System (ADS)

    Semertzidis, Yannis; Storage Ring Proton EDM Collaboration

    2014-09-01

    The storage ring pEDM experiment utilizes an all-electric storage ring to store ~10^11 longitudinally polarized protons simultaneously in clockwise and counter-clockwise directions for 10^3 seconds. The radial E-field acts on the proton EDM for the duration of the storage time to precess its spin in the vertical plane. The ring lattice is optimized to reduce intra-beam scattering, increase the statistical sensitivity, and reduce the systematic errors of the method. The main systematic error is a net radial B-field integrated around the ring, causing an EDM-like vertical spin precession. The counter-rotating beams sense this integrated field and are vertically shifted by an amount that depends on the strength of the vertical focusing in the ring, thus creating a radial B-field. Modulating the vertical focusing at 10 kHz makes possible the detection of this radial B-field by a SQUID magnetometer (SQUID-based BPM). For a total number of n SQUID-based BPMs distributed around the ring, the effectiveness of the method is limited to the N = n/2 harmonic of the background radial B-field due to the Nyquist sampling theorem. This limitation establishes the requirement to reduce the maximum radial B-field to 0.1-1 nT everywhere around the ring by layers of mu-metal and an aluminum vacuum tube. The method's sensitivity is 10^-29 e·cm, more than three orders of magnitude better than the present neutron EDM experimental limit, making it sensitive to SUSY-like new-physics mass scales up to 300 TeV.
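    The Nyquist limitation on the measurable field harmonics can be checked with a short aliasing demonstration (the BPM count here is hypothetical, not the experiment's actual layout):

```python
import numpy as np

n_bpm = 16                                    # hypothetical number of SQUID-based BPMs
theta = 2 * np.pi * np.arange(n_bpm) / n_bpm  # BPM azimuthal positions around the ring

def sampled_harmonic(N):
    """Radial-field harmonic B_r ~ cos(N*theta) sampled at the BPMs."""
    return np.cos(N * theta)

# A harmonic N and its alias n_bpm - N are indistinguishable on the samples,
# so n BPMs only constrain the background field up to harmonic N = n/2.
assert np.allclose(sampled_harmonic(5), sampled_harmonic(n_bpm - 5))
```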

  3. A calibration method for patient specific IMRT QA using a single therapy verification film

    PubMed Central

    Shukla, Arvind Kumar; Oinam, Arun S.; Kumar, Sanjeev; Sandhu, I.S.; Sharma, S.C.

    2013-01-01

    Aim The aim of the present study is to develop and verify a single-film calibration procedure for use in intensity-modulated radiation therapy (IMRT) quality assurance. Background Radiographic films have been regularly used in routine commissioning of treatment modalities and verification of treatment planning systems (TPS). Radiation dosimetry based on radiographic films can give absolute two-dimensional dose distributions and is preferred for IMRT quality assurance. A single therapy verification film provides a quick and reliable method for IMRT verification. Materials and methods A single extended dose range (EDR 2) film was used to generate the sensitometric curve of film optical density versus radiation dose. The EDR 2 film was exposed with nine 6 cm × 6 cm fields of a 6 MV photon beam obtained from a medical linear accelerator at 5-cm depth in a solid water phantom. The nine regions of the single film were exposed with radiation doses ranging from 10 to 362 cGy. The actual dose measurements inside the field regions were performed using a 0.6 cm3 ionization chamber. The exposed film was processed after irradiation and digitized using a VIDAR film scanner, and the value of optical density was noted for each region. Ten IMRT plans of head and neck carcinoma were verified using a dynamic IMRT technique and evaluated against the TPS-calculated dose distribution using the gamma index method. Results A sensitometric curve was generated using a single film exposed at nine field regions to check quantitative dose verification of IMRT treatments. The radiation scatter factor was observed to decrease exponentially with increasing distance from the centre of each field region. The IMRT plans were verified against the calibration curve using the gamma index method and found to be within the acceptance criteria.
    Conclusion The single-film method proved to be superior to the traditional calibration method and produces fast daily film calibration for highly accurate IMRT verification. PMID:24416558
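    The sensitometric calibration and its use can be sketched as below. The dose levels span the abstract's 10-362 cGy range, but the optical-density values and the polynomial form are illustrative assumptions, not the paper's measured data:

```python
import numpy as np

# Hypothetical sensitometric data for the nine film regions
dose = np.array([10, 40, 80, 120, 170, 220, 270, 320, 362], dtype=float)  # cGy
od = np.array([0.18, 0.52, 0.95, 1.30, 1.72, 2.05, 2.33, 2.57, 2.72])    # optical density

# Fit dose as a smooth function of optical density (3rd-order polynomial)
od_to_dose = np.poly1d(np.polyfit(od, dose, deg=3))

# Convert a measured OD on a verification film to absolute dose
measured_od = 1.50
print(f"OD {measured_od:.2f} -> {od_to_dose(measured_od):.0f} cGy")
```

Once the curve is fitted, every pixel OD of a verification film maps to absolute dose, which is what the gamma-index comparison against the TPS dose grid consumes.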

  4. GOODS-Herschel: identification of the individual galaxies responsible for the 80-290 μm cosmic infrared background

    NASA Astrophysics Data System (ADS)

    Leiton, R.; Elbaz, D.; Okumura, K.; Hwang, H. S.; Magdis, G.; Magnelli, B.; Valtchanov, I.; Dickinson, M.; Béthermin, M.; Schreiber, C.; Charmandaris, V.; Dole, H.; Juneau, S.; Le Borgne, D.; Pannella, M.; Pope, A.; Popesso, P.

    2015-07-01

    Aims: We propose a new method of pushing Herschel to its faintest detection limits using universal trends in the redshift evolution of the far-infrared over 24 μm colours in the well-sampled GOODS-North field. An extension to other fields with less multi-wavelength information is presented. This method is applied here to raise the contribution of individually detected Herschel sources to the cosmic infrared background (CIRB) by a factor of 5 close to its peak at 250 μm and by more than 3 in the 350 and 500 μm bands. Methods: We produce realistic mock Herschel images of the deep PACS and SPIRE images of the GOODS-North field from the GOODS-Herschel key program and use them to quantify the confusion noise at the position of individual sources, i.e., to estimate a "local confusion noise". Two methods are used to identify sources with reliable photometric accuracy extracted using 24 μm prior positions: the clean index (CI), previously defined but validated here with simulations, which measures the presence of bright 24 μm neighbours; and the photometric accuracy index (PAI), directly extracted from the mock Herschel images. Results: Both methods converge to comparable depths and fractions of the CIRB resolved into sources individually detected with Herschel. After correction for completeness, thanks to our mock Herschel images, individually detected sources make up as much as 54% and 60% of the CIRB in the PACS bands down to 1.1 mJy at 100 μm and 2.2 mJy at 160 μm, and 55, 33, and 13% of the CIRB in the SPIRE bands down to 2.5, 5, and 9 mJy at 250 μm, 350 μm, and 500 μm, respectively. The latter depths improve the detection limits of Herschel by factors of 5 at 250 μm and 3 at 350 μm and 500 μm as compared to the standard confusion limit. Interestingly, the dominant contributors to the CIRB in all Herschel bands appear to be distant siblings of the Milky Way (z ~ 0.96 for λ < 300 μm) with a stellar mass of M⋆ ~ 9 × 10^10 M⊙.

  5. Physical retrieval of precipitation water contents from Special Sensor Microwave/Imager (SSM/I) data. Part 2: Retrieval method and applications (report version)

    NASA Technical Reports Server (NTRS)

    Olson, William S.

    1990-01-01

    A physical retrieval method for estimating precipitating water distributions and other geophysical parameters based upon measurements from the DMSP-F8 SSM/I is developed. Three unique features of the retrieval method are (1) sensor antenna patterns are explicitly included to accommodate varying channel resolution; (2) precipitation-brightness temperature relationships are quantified using the cloud ensemble/radiative parameterization; and (3) spatial constraints are imposed for certain background parameters, such as humidity, which vary more slowly in the horizontal than the cloud and precipitation water contents. The general framework of the method will facilitate the incorporation of measurements from the SSM/T and SSM/T-2 and of geostationary infrared measurements, as well as information from conventional sources (e.g., radiosondes) or numerical forecast model fields.

  6. Mathematical design of a novel input/instruction device using a moving acoustic emitter

    NASA Astrophysics Data System (ADS)

    Wang, Xianchao; Guo, Yukun; Li, Jingzhi; Liu, Hongyu

    2017-10-01

    This paper is concerned with the mathematical design of a novel input/instruction device using a moving emitter. The emitter acts as a point source and can be installed on a digital pen or worn on the finger of a user who wishes to interact or communicate with the computer. The input/instruction is recognized by identifying, from the collected wave field data, the moving trajectory of the emitter traced by the user. The identification process is modelled as an inverse source problem in which one intends to identify the trajectory of a moving point source. There are several salient features of our study which distinguish our result from the existing ones in the literature. First, the point source is moving in an inhomogeneous background medium, which models the human body. Second, the dynamical wave field data are collected in a limited aperture. Third, the reconstruction method is independent of the background medium, and it is totally direct, without any matrix inversion. Hence, it is efficient and robust with respect to measurement noise. Both theoretical justifications and computational experiments are presented to verify our findings.

  7. Promoting medical competencies through international exchange programs: benefits on communication and effective doctor-patient relationships

    PubMed Central

    2014-01-01

    Background Universities are increasingly organizing international exchange programs to meet the requirements of growing globalisation in the field of health care. Analyses based on the programs’ fundamental theoretical background are needed to confirm the learning value for participants. This study investigated the extent of sociocultural learning in an exchange program and how sociocultural learning affects the acquisition of domain-specific competencies. Methods Sociocultural learning theories were applied to study the learning effect for German medical students from the LMU Munich, Munich, Germany, of participation in the medical exchange program with Jimma University, Jimma, Ethiopia. First, we performed a qualitative study consisting of interviews with five of the first program participants. The results were used to develop a questionnaire for the subsequent, quantitative study, in which 29 program participants and 23 matched controls performed self-assessments of competencies as defined in the Tuning Project for Health Professionals. The two interrelated studies were combined to answer three different research questions. Results The participants rated their competence significantly higher than the control group in the fields of doctor-patient relationships and communication in a medical context. Participant responses in the two interrelated studies supported the link between the findings and the suggested theoretical background. Conclusion Overall, we found that the exchange program affected the areas of doctor-patient relationships and effective communication in a medical context. Vygotsky’s sociocultural learning theory contributed to explaining the learning mechanisms of the exchange program. PMID:24589133

  8. A scene model of exosolar systems for use in planetary detection and characterisation simulations

    NASA Astrophysics Data System (ADS)

    Belu, A.; Thiébaut, E.; Ollivier, M.; Lagache, G.; Selsis, F.; Vakili, F.

    2007-12-01

    Context: Instrumental projects that will improve the direct optical detection and characterisation of exoplanets have advanced sufficiently to trigger organized investigation and development of corresponding signal processing algorithms. The first step is the availability of field-of-view (FOV) models. These can then be submitted to various instrumental models, which in turn produce simulated data, enabling the testing of processing algorithms. Aims: We aim to set the specifications of a physical model for typical FOVs of these instruments. Methods: The dynamic range in resolution and flux between the various sources present in such a FOV imposes a multiscale, independent-layer approach. From a review of the current literature and through extrapolations from currently available data and models, we derive the features of each source type in the field of view likely to pass the instrumental filter at exo-Earth level. Results: Stellar limb darkening is shown to cause bias in leakage calibration if unaccounted for. Occurrence of perturbing background stars or galaxies in the typical FOV is unlikely. We extract galactic interstellar medium background emissions for current target lists. The galactic background can be considered uniform over the FOV, and it should show no significant drift with parallax. Our model specifications have been embedded into a Java simulator, soon to be made open-source. We have also designed an associated FITS input/output format standard that we present here. Work supported in part by the ESA/ESTEC contract 18701/04/NL/HB, led by Thales Alenia Space.

  9. Evolving Waves and Turbulence in the Outer Corona and Inner Heliosphere: The Accelerating Expanding Box

    NASA Astrophysics Data System (ADS)

    Tenerani, Anna; Velli, Marco

    2017-07-01

    Alfvénic fluctuations in the solar wind display many properties reflecting an ongoing nonlinear cascade, e.g., a well-defined spectrum in frequency, together with some characteristics more commonly associated with the linear propagation of waves from the Sun, such as the variation of fluctuation amplitude with distance, dominated by solar wind expansion effects. Therefore, both nonlinearities and expansion must be included simultaneously in any successful model of solar wind turbulence evolution. Because of the disparate spatial scales involved, direct numerical simulations of turbulence in the solar wind represent an arduous task, especially if one wants to go beyond the incompressible approximation. Indeed, most simulations neglect solar wind expansion effects entirely. Here we develop a numerical model to simulate turbulent fluctuations from the outer corona to 1 au and beyond, including the sub-Alfvénic corona. The accelerating expanding box (AEB) extends the validity of previous expanding box models by taking into account both the acceleration of the solar wind and the inhomogeneity of background density and magnetic field. Our method incorporates a background accelerating wind within a magnetic field that naturally follows the Parker spiral evolution using a two-scale analysis in which the macroscopic spatial effect coupling fluctuations with background gradients becomes a time-dependent coupling term in a homogeneous box. In this paper we describe the AEB model in detail and discuss its main properties, illustrating its validity by studying Alfvén wave propagation across the Alfvén critical point.

  10. Noctilucent cloud polarimetry: Twilight measurements in a wide range of scattering angles

    NASA Astrophysics Data System (ADS)

    Ugolnikov, Oleg S.; Maslov, Igor A.; Kozelov, Boris V.; Dlugach, Janna M.

    2016-06-01

    Wide-field polarization measurements of the twilight sky background during several nights with bright and extended noctilucent clouds in central and northern Russia in 2014 and 2015 are used to build the phase dependence of the degree of polarization of sunlight scattered by cloud particles over a wide range of scattering angles (from 40° to 130°). This range covers the linear polarization maximum near 90° and the large-angle slope of the curve. The polarization in this angle range is most sensitive to the particle size. A method of separating the scattering by cloud particles from the twilight background is presented. The results are compared with T-matrix simulations for different sizes and shapes of ice particles; the best-fit model radius of the particles (0.06 μm) and the maximum radius (about 0.1 μm) are estimated.
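    For orientation, the small-particle (Rayleigh) limit already reproduces the qualitative shape of the phase curve, with a polarization maximum at 90°. The sketch below shows only that limiting form; the paper's T-matrix results for finite-size ice particles deviate from it, which is what makes the curve size-sensitive:

```python
import numpy as np

# Rayleigh-limit degree of linear polarization for unpolarized incident light:
#   P(theta) = sin^2(theta) / (1 + cos^2(theta))
theta_deg = np.arange(40, 131)    # the measured 40-130 degree range
theta = np.radians(theta_deg)
P = np.sin(theta)**2 / (1.0 + np.cos(theta)**2)

peak = theta_deg[P.argmax()]
print(f"polarization maximum at {peak} deg, P = {P.max():.2f}")
# → polarization maximum at 90 deg, P = 1.00
```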

  11. Standoff determination of the particle size and concentration of small optical depth clouds based on double scattering measurements: concept and experimental validation with bioaerosols.

    PubMed

    Roy, Gilles; Roy, Nathalie

    2008-03-20

    A multiple-field-of-view (MFOV) lidar is used to characterize the size and optical depth of low-concentration bioaerosol clouds. The concept relies on measuring the forward-scattered light using the background aerosols at various distances behind a subvisible cloud. It also relies on subtracting the background-aerosol forward-scattering contribution and on the partial attenuation of the first-order backscattering. The validity of the concept developed to retrieve the effective diameter and the optical depth of low-concentration bioaerosol clouds with good precision is demonstrated using simulation results and experimental MFOV lidar measurements. Calculations also show that the method presented can be extended to the retrieval of clouds of small optical depth.
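    For context, the simplest single-scattering relation underlying such optical-depth retrievals is Beer-Lambert two-way attenuation of the backscatter measured behind the cloud. This hedged sketch is not the MFOV double-scattering retrieval itself, and the signal values are invented for illustration:

```python
import math

def two_way_optical_depth(signal_behind, signal_expected):
    """Cloud optical depth from the two-way attenuation of lidar
    backscatter measured behind the cloud: the return is reduced by
    exp(-2*tau), so tau = -0.5 * ln(measured / expected)."""
    return -0.5 * math.log(signal_behind / signal_expected)

# a return reduced to ~82% of its cloud-free value implies tau ~ 0.1
tau = two_way_optical_depth(0.8187, 1.0)
```

    The MFOV approach refines this by exploiting the angular spread of doubly scattered light, which carries the particle-size information that simple attenuation cannot provide.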

  12. Tackling higher derivative ghosts with the Euclidean path integral

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fontanini, Michele; Department of Physics, Syracuse University, Syracuse, New York 13244; Trodden, Mark

    2011-05-15

    An alternative to the effective field theory approach to treating ghosts in higher derivative theories is to attempt to integrate them out via the Euclidean path integral formalism. It has been suggested that this method could provide a consistent framework within which we might tolerate the ghost degrees of freedom that plague, among other theories, the higher derivative gravity models that have been proposed to explain cosmic acceleration. We consider the extension of this idea to treating a class of terms with order six derivatives, and find that for a general term the Euclidean path integral approach works in the most trivial background, Minkowski. Moreover we see that even in a de Sitter background, despite some difficulties, it is possible to define a probability distribution for tensorial perturbations of the metric.

  13. Electromagnetic radiation due to naked singularity formation in self-similar gravitational collapse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitsuda, Eiji; Yoshino, Hirotaka; Tomimatsu, Akira

    Dynamical evolution of test fields in a background geometry with a naked singularity is an important problem relevant to the Cauchy horizon instability and to observational signatures different from black hole formation. In this paper we study electromagnetic perturbations generated by a given current distribution in collapsing matter in a spherically symmetric self-similar background. Using the Green's function method, we construct a formula to evaluate the outgoing energy flux observed at future null infinity. The contributions from 'quasinormal' modes of the self-similar system as well as 'high-frequency' waves are clarified. We find a characteristic power-law time evolution of the outgoing energy flux which appears just before naked singularity formation, and give criteria as to whether or not the outgoing energy flux diverges at the future Cauchy horizon.

  14. Background-Oriented Schlieren (BOS) for Scramjet Inlet-isolator Investigation

    NASA Astrophysics Data System (ADS)

    Che Idris, Azam; Rashdan Saad, Mohd; Hing Lo, Kin; Kontis, Konstantinos

    2018-05-01

    The background-oriented schlieren (BOS) technique is a recently developed non-intrusive flow diagnostic method whose capabilities have yet to be fully explored. In this paper, the BOS technique is applied to investigate the general flow-field characteristics inside a generic scramjet inlet-isolator in a Mach 5 flow. The difficulty of finding the delicate balance between measurement sensitivity and image focusing over the measurement area is demonstrated, as are the differences between direct cross-correlation (DCC) and fast Fourier transform (FFT) raw-data processing algorithms. As an exploratory study of BOS capability, this paper finds BOS simple yet robust enough to visualize the complex flow in a scramjet inlet in hypersonic flow. In this case, however, the quantitative data can be strongly affected by three-dimensionality, which obscures the density values with significant errors.
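    The FFT-based processing of BOS image pairs amounts to cross-correlating a reference and a distorted interrogation window; the correlation peak gives the apparent background displacement. This is a minimal, hedged illustration (integer-pixel only; practical BOS processing adds windowing and sub-pixel peak fitting):

```python
import numpy as np

def fft_cross_correlation_shift(ref, img):
    """Estimate the integer-pixel displacement between two interrogation
    windows via FFT-based cross-correlation; the location of the
    correlation peak is the apparent shift of the background pattern."""
    f = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    corr = np.fft.ifft2(f).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peak coordinates into signed shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# synthetic background pattern circularly shifted by (3, -2) pixels
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
img = np.roll(ref, (3, -2), axis=(0, 1))
shift = fft_cross_correlation_shift(ref, img)
```

    A direct cross-correlation (DCC) evaluates the same correlation surface by explicit summation over shifts, which is slower but avoids the periodic wrap-around assumption of the FFT route.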

  15. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder.

    PubMed

    Chen, Shuoyang; Xu, Tingfa; Li, Daqun; Zhang, Jizhou; Jiang, Shenwang

    2016-10-21

    During moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods such as frame differencing and optical flow may not be able to deal with the problem very well. In such scenarios, we use a modified algorithm for the background modeling work. In this paper, we use edge detection to obtain an edge-difference image, which enhances robustness to illumination variation. We then use a multi-block temporal-analyzing LBP (local binary pattern) algorithm for the segmentation. Finally, connected-component analysis is used to locate the object. We also produce a hardware platform, the core of which consists of DSP (digital signal processor) and FPGA (field-programmable gate array) platforms and the high-precision intelligent holder.
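    For orientation, the classic frame-difference baseline that this pipeline improves upon can be sketched in a few lines; the threshold value is an arbitrary illustrative choice:

```python
import numpy as np

def frame_difference_mask(prev, curr, thresh=25):
    """Classic frame-difference detector: pixels whose absolute
    intensity change between consecutive frames exceeds a threshold
    are flagged as moving. (The paper's method layers edge
    differencing and a multi-block temporal LBP model on top of
    ideas like this to cope with complex backgrounds.)"""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff > thresh

prev = np.zeros((8, 8), dtype=np.uint8)
curr = prev.copy()
curr[2:5, 2:5] = 200  # a bright "object" appears between frames
mask = frame_difference_mask(prev, curr)
# mask is True exactly on the 3x3 changed region
```

    The weakness this exposes, sensitivity to global illumination change, is precisely what differencing edge images rather than raw intensities mitigates.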

  16. Single objective light-sheet microscopy for high-speed whole-cell 3D super-resolution

    PubMed Central

    Meddens, Marjolein B. M.; Liu, Sheng; Finnegan, Patrick S.; Edwards, Thayne L.; James, Conrad D.; Lidke, Keith A.

    2016-01-01

    We have developed a method for performing light-sheet microscopy with a single high numerical aperture lens by integrating reflective side walls into a microfluidic chip. These 45° side walls generate light-sheet illumination by reflecting a vertical light sheet into the focal plane of the objective. Light-sheet illumination of cells loaded in the channels increases image quality in diffraction-limited imaging by reducing out-of-focus background light. Single-molecule super-resolution is also improved by the decreased background, which yields better localization precision and decreased photo-bleaching, leading to more accepted localizations overall and higher-quality images. Moreover, 2D and 3D single-molecule super-resolution data can be acquired faster by taking advantage of the illumination intensity of the focused light sheet, which is higher than in wide-field illumination. PMID:27375939
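    The link between background and localization precision can be illustrated with one widely used estimate, the Thompson et al. (2002) formula; the parameter values below are illustrative, not taken from the paper:

```python
import math

def thompson_precision(s, a, N, b):
    """Thompson et al. (2002) estimate of 2D localization precision:
    var = s^2/N + a^2/(12*N) + 8*pi*s^4*b^2/(a^2*N^2),
    where s is the PSF standard deviation, a the pixel size, N the
    collected photon count, and b the background noise per pixel.
    The last term shows directly how lowering background b (as
    light-sheet illumination does) tightens the localization."""
    var = s**2 / N + a**2 / (12 * N) + 8 * math.pi * s**4 * b**2 / (a**2 * N**2)
    return math.sqrt(var)

# illustrative values: s = 130 nm PSF, 100 nm pixels, 1000 photons
hi_bg = thompson_precision(s=130.0, a=100.0, N=1000, b=10.0)
lo_bg = thompson_precision(s=130.0, a=100.0, N=1000, b=2.0)
# lo_bg < hi_bg: reducing background improves precision
```

    Because the background term scales as b², even a modest reduction in out-of-focus light pays off quadratically in localization variance.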

  17. An Interactive Image Segmentation Method in Hand Gesture Recognition

    PubMed Central

    Chen, Disi; Li, Gongfa; Sun, Ying; Kong, Jianyi; Jiang, Guozhang; Tang, Heng; Ju, Zhaojie; Yu, Hui; Liu, Honghai

    2017-01-01

    In order to improve the recognition rate of hand gestures, a new interactive image segmentation method for hand gesture recognition is presented, and popular methods, e.g., graph cut, random walker, and interactive image segmentation using geodesic star convexity, are studied in this article. A Gaussian mixture model is employed for image modelling, and iterations of the expectation-maximization algorithm learn its parameters. We apply a Gibbs random field to the image segmentation and minimize the Gibbs energy using the min-cut theorem to find the optimal segmentation. The segmentation result of our method is tested on an image dataset and compared with other methods by estimating the region accuracy and boundary accuracy. Finally, five kinds of hand gestures in different backgrounds are tested on our experimental platform, and the sparse representation algorithm is used, showing that the segmentation of hand gesture images helps to improve the recognition accuracy. PMID:28134818
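    The Gaussian-mixture modelling step can be sketched with a minimal EM fit. This toy 1D version (real segmentations model colour vectors, and the min-cut step is omitted) is an illustration, not the authors' implementation:

```python
import numpy as np

def em_two_gaussians(x, iters=50):
    """Minimal EM fit of a two-component 1D Gaussian mixture to pixel
    intensities x -- the modelling step performed (in colour space)
    before Gibbs-energy minimisation via min-cut."""
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel
        pdf = w / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu)**2 / (2 * var))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: update weights, means, variances
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk + 1e-6
    return w, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.2, 0.05, 500), rng.normal(0.8, 0.05, 500)])
w, mu, var = em_two_gaussians(x)
# the fitted means converge near the true cluster centres 0.2 and 0.8
```

    The fitted component likelihoods then supply the unary (data) terms of the Gibbs energy, while neighbouring-pixel smoothness supplies the pairwise terms that min-cut optimises.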

  18. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis that handles attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles presenting recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to how well the findings provided by the method satisfied the researchers. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587
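    For readers unfamiliar with the method, its core idea, scoring dummy-coded categories so that a linear combination of the scores discriminates the outcome classes, can be sketched as follows; the predictors, data, and least-squares scoring used here are invented for illustration and are only an analogue of the actual Quantification II eigenvalue procedure:

```python
import numpy as np

def one_hot(columns):
    """Dummy-code categorical predictor columns -- the first step of a
    Quantification-II-style analysis, which then seeks numerical
    category scores that best separate the outcome classes."""
    blocks = []
    for col in columns:
        cats = sorted(set(col))
        blocks.append(np.array([[c == cat for cat in cats] for c in col], float))
    return np.hstack(blocks)

# toy data: two categorical predictors and a binary outcome
smoking = ["yes", "no", "yes", "no", "yes", "no"]
exercise = ["low", "high", "low", "high", "low", "high"]
X = one_hot([smoking, exercise])
y = np.array([1, 0, 1, 0, 1, 0])

# least-squares category scores acting as a simple linear discriminant
scores, *_ = np.linalg.lstsq(X, y, rcond=None)
pred = (X @ scores > 0.5).astype(int)
```

    In the genuine method the category scores are chosen to maximise the between-class to total variance ratio, but the dummy-coding shown above is the shared starting point.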

  19. Analyzing the Structure and Content of Public Health Messages

    PubMed Central

    Morrison, Frances P.; Kukafka, Rita; Johnson, Stephen B.

    2005-01-01

    Background Health messages are crucial to the field of public health in effecting behavior change, but little research is available to assist writers in composing the overall structure of a message. In order to develop software to assist individuals in constructing effective messages, the structure of existing health messages must be understood, and an appropriate method for analyzing health message structure developed. Methods 72 messages from expert sources were used for development of the method, which was then tested for reproducibility using ten randomly selected health messages. Four raters analyzed the messages and inter-coder agreement was calculated. Results A method for analyzing the structure of the messages was developed using sublanguage analysis and discourse analysis. Overall kappa between four coders was 0.69. Conclusion A novel framework for characterizing health message structure and a method for analyzing messages appears to be reproducible and potentially useful for creating an authoring tool. PMID:16779098
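    The pairwise building block of such inter-coder agreement statistics, Cohen's kappa, can be computed directly. The toy category labels below are invented for illustration; the paper's overall kappa across four coders would use a multi-rater generalisation such as Fleiss' kappa:

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa for two coders: chance-corrected agreement
    (po - pe) / (1 - pe), where po is observed agreement and pe the
    agreement expected by chance from each coder's label frequencies."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n**2
    return (po - pe) / (1 - pe)

# hypothetical message-segment codes from two coders
coder1 = ["claim", "evidence", "claim", "action", "claim", "evidence"]
coder2 = ["claim", "evidence", "action", "action", "claim", "claim"]
kappa = cohens_kappa(coder1, coder2)
```

    Values around 0.7, like the 0.69 reported above, are conventionally read as substantial agreement.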

  20. Steady-State Ion Beam Modeling with MICHELLE

    NASA Astrophysics Data System (ADS)

    Petillo, John

    2003-10-01

    There is a need to efficiently model ion beam physics for ion implantation, chemical vapor deposition, and ion thrusters. Common to all is the need for three-dimensional (3D) simulation of volumetric ion sources, ion acceleration, and optics, with the ability to model charge exchange of the ion beam with a background neutral gas. The two pieces of physics that stand out as significant are the modeling of the volumetric source and charge exchange. In the MICHELLE code, the method for modeling the plasma sheath in ion sources assumes that the electron distribution function is a Maxwellian function of electrostatic potential over electron temperature. Charge exchange is the process by which a neutral background gas exchanges its electron with a "fast" charged particle streaming through it. An efficient method for capturing this is essential, and the model presented is based on semi-empirical collision cross-section functions. This appears to be the first steady-state 3D algorithm of its type to contain multiple generations of charge exchange, work with multiple species and multiple charge-state beam/source particles simultaneously, take into account self-consistent space-charge effects, and track the subsequent fast neutral particles. The solution used by MICHELLE is to combine finite element analysis with particle-in-cell (PIC) methods. The basic physics model is the equilibrium steady-state application of the electrostatic PIC approximation on a conformal computational mesh. The foundation stems from the same basic model introduced in codes such as EGUN. Here, Poisson's equation is used to self-consistently include the effects of space charge on the fields, and the relativistic Lorentz equation is used to integrate the particle trajectories through those fields. The presentation will consider the complexity of modeling ion thrusters.
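    The per-step charge-exchange probability implied by a collision rate n·σ·v can be sketched as follows. The constant cross section and the numerical values are illustrative simplifications; MICHELLE itself uses semi-empirical, energy-dependent cross-section functions:

```python
import math

def charge_exchange_probability(n_gas, sigma, v, dt):
    """Probability that a fast ion undergoes charge exchange with the
    background neutral gas during one time step, from the collision
    rate n*sigma*v: P = 1 - exp(-n*sigma*v*dt)."""
    return 1.0 - math.exp(-n_gas * sigma * v * dt)

# illustrative numbers only: 1e18 m^-3 neutral density, 3e-19 m^2
# cross section, 3e4 m/s ion speed, 1e-7 s time step
p = charge_exchange_probability(1e18, 3e-19, 3e4, 1e-7)
```

    Sampling such a probability per trajectory step is one common way a steady-state PIC code can spawn the slow ion and fast neutral produced by each exchange event.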
