Sample records for background methods results

  1. Detection and correction of laser induced breakdown spectroscopy spectral background based on spline interpolation method

    NASA Astrophysics Data System (ADS)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and the experimental environment. The continuous background significantly influences analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting, and model-free methods. However, few of these methods have been applied in the field of LIBS technology, particularly in qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum according to its characteristic smoothness. A background correction simulation indicated that the spline interpolation method achieved the largest signal-to-background ratio (SBR) after background correction, exceeding polynomial fitting, Lorentz fitting, and the model-free method. All of these background correction methods yield larger SBR values than before background correction (the SBR value before background correction is 10.0992, whereas the SBR values after background correction by spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method are 26.9576, 24.6828, 18.9770, and 25.6273, respectively). After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method still acquires a large SBR value, whereas polynomial fitting and the model-free method obtain low SBR values. All of the background correction methods improve the quantitative results for Cu relative to those acquired before background correction (the linear correlation coefficient before background correction is 0.9776, whereas the values after background correction using spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method are 0.9998, 0.9915, 0.9895, and 0.9940, respectively). The proposed spline interpolation method exhibits better linear correlation and smaller error in the quantitative analysis of Cu than polynomial fitting, Lorentz fitting, and the model-free method. The simulation and quantitative experimental results show that the spline interpolation method can effectively detect and correct the continuous background.
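    The spline approach above can be sketched numerically. The following is a minimal illustration on a synthetic spectrum with a smooth decaying background and two emission lines; the node-selection rule (window minima) is a hypothetical stand-in for the paper's smoothing and discriminant formulas.

```python
import numpy as np
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
x = np.linspace(0, 100, 1000)
background = 50 * np.exp(-x / 60)                      # smooth continuous background
peaks = (40 * np.exp(-((x - 30) ** 2) / 0.5)
         + 25 * np.exp(-((x - 70) ** 2) / 0.5))        # two narrow emission lines
spectrum = background + peaks + rng.normal(0, 0.3, x.size)

# Hypothetical node rule: take the minimum of each 50-sample window as a
# background node (the paper instead uses smoothing plus discriminant formulas).
win = 50
idx = np.array([j + np.argmin(spectrum[j:j + win]) for j in range(0, x.size, win)])
baseline = CubicSpline(x[idx], spectrum[idx])(x)       # spline-estimated background
corrected = spectrum - baseline
```

    On this toy spectrum the corrected signal retains the emission lines while the estimated baseline tracks the exponential background to within the noise level.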

  2. Background estimation and player detection in badminton video clips using histogram of pixel values along temporal dimension

    NASA Astrophysics Data System (ADS)

    Peng, Yahui; Ma, Xiao; Gao, Xinyu; Zhou, Fangxu

    2015-12-01

    Computer vision is an important tool for sports video processing, but its application to badminton match analysis is very limited. In this study, we proposed straightforward but robust histogram-based background estimation and player detection methods for badminton video clips, and compared the results with the naive averaging method and the mixture-of-Gaussians method, respectively. The proposed method yielded better background estimation results than the naive averaging method and more accurate player detection results than the mixture-of-Gaussians player detection method. These preliminary results indicate that the proposed histogram-based method can estimate the background and extract the players accurately. We conclude that the proposed method can be used for badminton player tracking, and further studies are warranted for automated match analysis.
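    The per-pixel temporal-histogram idea can be sketched as follows (a minimal toy with assumed bin count and intensity range; the paper's exact binning is not given here). The mode of each pixel's temporal histogram resists transient player occlusions that bias a plain temporal mean:

```python
import numpy as np

rng = np.random.default_rng(1)
T, H, W = 60, 8, 8
frames = np.full((T, H, W), 120.0) + rng.normal(0, 2, (T, H, W))  # static court
frames[:15, 2, 2] = 30.0   # a "player" occludes one pixel in 15 of 60 frames

def histogram_background(frames, bins=32, lo=0.0, hi=256.0):
    # For each pixel, histogram its values over time; the most frequent
    # bin's centre is taken as the background estimate.
    edges = np.linspace(lo, hi, bins + 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    T, H, W = frames.shape
    bg = np.empty((H, W))
    for r in range(H):
        for c in range(W):
            counts, _ = np.histogram(frames[:, r, c], bins=edges)
            bg[r, c] = centers[np.argmax(counts)]
    return bg

bg_hist = histogram_background(frames)
bg_mean = frames.mean(axis=0)   # naive averaging, biased by the player
```

    At the occluded pixel, the histogram mode stays near the true court intensity while the naive average is pulled far off by the player's pixels.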

  3. Unsupervised background-constrained tank segmentation of infrared images in complex background based on the Otsu method.

    PubMed

    Zhou, Yulong; Gao, Min; Fang, Dan; Zhang, Baoquan

    2016-01-01

    In an effort to implement fast and effective tank segmentation from infrared images in complex backgrounds, the threshold of the maximum between-class variance method (i.e., the Otsu method) is analyzed and its working mechanism is discussed. A fast and effective method for tank segmentation from infrared images in complex backgrounds is then proposed, based on the Otsu method, by constraining the complex background of the image. Considering the complexity of the background, the original image is first divided into three classes, the target region, middle background, and lower background, by maximizing the sum of their between-class variances. Then, an unsupervised background constraint is applied based on the within-class variance of the target region, simplifying the original image. Finally, the Otsu method is applied to the simplified image for threshold selection. Experimental results on a variety of tank infrared images (880 × 480 pixels) in complex backgrounds demonstrate that the proposed method achieves better segmentation performance, with results comparable even to manual segmentation. In addition, its average running time is only 9.22 ms, indicating good real-time performance.
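    The core Otsu criterion referenced above can be written compactly. This sketch implements only the plain maximum between-class variance threshold, not the paper's three-class split or background constraint:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    # Histogram of intensities and per-bin probabilities
    counts, edges = np.histogram(img, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = counts / counts.sum()
    w0 = np.cumsum(p)                       # class-0 probability at each cut
    w1 = 1.0 - w0
    m = np.cumsum(p * centers)              # cumulative first moment
    mt = m[-1]                              # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mt * w0 - m) ** 2 / (w0 * w1)   # between-class variance
    sigma_b[~np.isfinite(sigma_b)] = 0.0
    return centers[np.argmax(sigma_b)]      # cut maximising sigma_b

rng = np.random.default_rng(2)
img = np.concatenate([rng.normal(60, 5, 5000),      # background pixels
                      rng.normal(180, 5, 1000)])    # bright target pixels
t = otsu_threshold(img)
```

    On this bimodal sample the selected threshold lands between the two modes, cleanly separating the bright class.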

  4. Infrared images target detection based on background modeling in the discrete cosine domain

    NASA Astrophysics Data System (ADS)

    Ye, Han; Pei, Jihong

    2018-02-01

    Background modeling is a critical technology for detecting moving targets in video surveillance. Most background modeling techniques are aimed at land monitoring and operate in the spatial domain, and background establishment becomes difficult when the scene is a complex, fluctuating sea surface. In this paper, the stability of the background and its separability from the target are analyzed in depth in the discrete cosine transform (DCT) domain, and on this basis we propose a background modeling method. The proposed method models each frequency point as a single Gaussian to represent the background, and the target is extracted by suppressing the background coefficients. Experimental results show that our approach can establish an accurate background model for seawater, and the detection results outperform those of other background modeling methods in the spatial domain.
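    A minimal sketch of the per-frequency Gaussian idea (the z-score threshold of 3 is an assumption; the paper's exact suppression rule may differ): model each DCT coefficient over a stack of background frames as a Gaussian, then zero the coefficients consistent with background before inverting.

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(3)
T, H, W = 30, 16, 16
sea = rng.normal(100, 3, (T, H, W))               # fluctuating background frames

# Train: per-frequency single Gaussian over the frame stack
coeffs = np.stack([dctn(f, norm="ortho") for f in sea])
mu = coeffs.mean(axis=0)
sigma = coeffs.std(axis=0) + 1e-6

# Test frame: background plus a small bright target
test = sea[-1].copy()
test[8:11, 8:11] += 40.0
c = dctn(test, norm="ortho")
z = np.abs(c - mu) / sigma                        # per-coefficient z-score
c_fg = np.where(z > 3.0, c - mu, 0.0)             # suppress background coefficients
residual = idctn(c_fg, norm="ortho")              # target-dominated residual image
peak = np.unravel_index(int(np.argmax(residual)), residual.shape)
```

    The residual image concentrates its energy at the injected target while the modelled sea fluctuations are suppressed.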

  5. Characterization of background concentrations of contaminants using a mixture of normal distributions.

    PubMed

    Qian, Song S; Lyons, Regan E

    2006-10-01

    We present a Bayesian approach for characterizing background contaminant concentration distributions using data from sites that may have been contaminated. Our method, focused on estimation, resolves several technical problems of the existing methods sanctioned by the U.S. Environmental Protection Agency (USEPA) (a hypothesis-testing-based method), resulting in a simple and quick procedure for estimating background contaminant concentrations. The proposed Bayesian method is applied to two data sets from a federal facility regulated under the Resource Conservation and Recovery Act. The results are compared to background distributions identified using existing methods recommended by the USEPA. The two data sets represent low and moderate levels of censoring in the data. Although an unbiased estimator is elusive, we show that the proposed Bayesian estimation method has a smaller bias than the EPA-recommended method.
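    A rough frequentist stand-in for the idea (plain EM on a two-component normal mixture; this is not the paper's Bayesian formulation and ignores its censored-data handling): the lower-mean component is read as the background concentration distribution.

```python
import numpy as np

rng = np.random.default_rng(4)
data = np.concatenate([rng.normal(2.0, 0.5, 800),    # background concentrations
                       rng.normal(8.0, 1.0, 200)])   # impacted-site concentrations

# Initialise two components at the data extremes
mu = np.array([data.min(), data.max()])
sd = np.array([1.0, 1.0])
w = np.array([0.5, 0.5])
for _ in range(100):
    # E-step: responsibility of each component for each observation
    d = (data[:, None] - mu) / sd
    like = w * np.exp(-0.5 * d ** 2) / (sd * np.sqrt(2 * np.pi))
    r = like / like.sum(axis=1, keepdims=True)
    # M-step: weighted component means, spreads, and mixing weights
    n = r.sum(axis=0)
    mu = (r * data[:, None]).sum(axis=0) / n
    sd = np.sqrt((r * (data[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-9
    w = n / data.size

background_mean = float(mu.min())   # lower-mean component = background
```

    The fitted lower component recovers the background mean even though the pooled sample mixes background and contaminated observations.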

  6. Background feature descriptor for offline handwritten numeral recognition

    NASA Astrophysics Data System (ADS)

    Ming, Delie; Wang, Hao; Tian, Tian; Jie, Feiran; Lei, Bo

    2011-11-01

    This paper puts forward an offline handwritten numeral recognition method based on a background structural descriptor (a sixteen-value numerical background expression). By encoding the background pixels in the image according to a certain rule, 16 different feature values are generated that reflect the background condition of every digit and hence the structural features of the digits. Through pattern-language description of images by these features, automatic segmentation of overlapping digits and numeral recognition can be realized. The method is characterized by strong resistance to deformation, high recognition speed, and easy implementation. Finally, the experimental results and conclusions are presented. Recognition results on datasets from various practical application fields show that the method achieves a good recognition effect.

  7. A new background subtraction method for energy dispersive X-ray fluorescence spectra using a cubic spline interpolation

    NASA Astrophysics Data System (ADS)

    Yi, Longtao; Liu, Zhiguo; Wang, Kai; Chen, Man; Peng, Shiqi; Zhao, Weigang; He, Jialin; Zhao, Guangcui

    2015-03-01

    A new method is presented to subtract the background from the energy dispersive X-ray fluorescence (EDXRF) spectrum using a cubic spline interpolation. To accurately obtain interpolation nodes, a smooth fitting and a set of discriminant formulations were adopted. From these interpolation nodes, the background is estimated by a calculated cubic spline function. The method has been tested on spectra measured from a coin and an oil painting using a confocal MXRF setup. In addition, the method has been tested on an existing sample spectrum. The result confirms that the method can properly subtract the background.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Xi; Mou, Xuanqin; Nishikawa, Robert M.

    Purpose: Small calcifications are often the earliest and main indicator of breast cancer. Dual-energy digital mammography (DEDM) has been considered a promising technique for improving the detectability of calcifications, since it can be used to suppress the contrast between adipose and glandular tissues of the breast. X-ray scatter leads to erroneous calculations of the DEDM image. Although the pinhole-array interpolation method can estimate scattered radiation, it requires extra exposures to measure the scatter and apply the correction. The purpose of this work is to design an algorithmic method for scatter correction in DEDM without extra exposures. Methods: In this paper, a scatter correction method for DEDM was developed based on the knowledge that scattered radiation has small spatial variation and that the majority of pixels in a mammogram are noncalcification pixels. The scatter fraction was estimated in the DEDM calculation, and the estimated scatter fraction was used to remove scatter from the image. The scatter correction method was implemented on a commercial full-field digital mammography system with a breast-tissue-equivalent phantom and a calcification phantom. The authors also implemented the pinhole-array interpolation scatter correction method on the system. Phantom results for both methods are presented and discussed. The authors compared the background DE calcification signals and the contrast-to-noise ratio (CNR) of calcifications in three DE calcification images: the image without scatter correction, the image with scatter correction using the pinhole-array interpolation method, and the image with scatter correction using the authors' algorithmic method. Results: The results show that the resultant background DE calcification signal can be reduced. The root-mean-square background DE calcification signal of 1962 μm with scatter-uncorrected data was reduced to 194 μm after scatter correction using the authors' algorithmic method. The range of background DE calcification signals with scatter-uncorrected data was reduced by 58% with scatter-corrected data from the algorithmic method. With the scatter-correction algorithm and denoising, the minimum visible calcification size was reduced from 380 to 280 μm. Conclusions: When the proposed algorithmic scatter correction is applied to images, the resultant background DE calcification signals can be reduced and the CNR of calcifications can be improved. The method has similar or even better performance than the pinhole-array interpolation method for scatter correction in DEDM; moreover, it is convenient and requires no extra exposure to the patient. Although the proposed scatter correction method is effective, it was validated with a 5-cm-thick phantom with calcifications and a homogeneous background. The method should be tested on structured backgrounds to more accurately gauge its effectiveness.

  9. Advanced Background Subtraction Applied to Aeroacoustic Wind Tunnel Testing

    NASA Technical Reports Server (NTRS)

    Bahr, Christopher J.; Horne, William C.

    2015-01-01

    An advanced form of background subtraction is presented and applied to aeroacoustic wind tunnel data. A variant of this method has seen use in other fields such as climatology and medical imaging. The technique, based on an eigenvalue decomposition of the background noise cross-spectral matrix, is robust in situations where isolated background auto-spectral levels are measured to be higher than the levels of combined source and background signals. It also provides an alternate estimate of the cross-spectrum, which previously might have had poor definition for low signal-to-noise ratio measurements. Simulated results indicate performance similar to conventional background subtraction when the subtracted spectra are weaker than the true contaminating background levels, and superior performance when the subtracted spectra are stronger. Experimental results show limited success in recovering signal behavior for data where conventional background subtraction fails, and demonstrate the new technique's ability to maintain a proper coherence relationship in the modified cross-spectral matrix. Beamforming and deconvolution results indicate that the method can successfully separate sources. Results also show a reduced need for diagonal removal in phased array processing, at least for the limited data sets considered.
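    The eigenvalue-decomposition idea can be sketched as follows. This is a simplified stand-in, assuming the procedure amounts to subtracting the background cross-spectral matrix (CSM) in an eigenbasis and clipping negative eigen-levels so the cleaned CSM stays positive semidefinite; the paper's actual algorithm may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(5)
M, N = 4, 2000                                 # microphones, snapshots

def csm(x):
    # cross-spectral (here: covariance) matrix estimate from snapshots
    return x @ x.conj().T / x.shape[1]

steer = np.exp(1j * np.linspace(0, 1.5, M))[:, None]
source = steer * rng.normal(0, 1.0, (1, N))    # coherent source across the array
G_meas = csm(source + rng.normal(0, 1.0, (M, N)))   # source + background
G_bg = csm(rng.normal(0, 1.0, (M, N)))              # background-only measurement

# Subtract the background in its eigenbasis, clipping negative eigen-levels
lam, V = np.linalg.eigh(G_bg)
D = V.conj().T @ G_meas @ V - np.diag(lam)
d, U = np.linalg.eigh(D)
G_clean = V @ (U @ np.diag(np.clip(d, 0.0, None)) @ U.conj().T) @ V.conj().T
```

    The clipping guarantees a positive-semidefinite result even when sampling error makes a naive entrywise subtraction indefinite, while the recovered trace matches the true source power.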

  10. Latent variable method for automatic adaptation to background states in motor imagery BCI

    NASA Astrophysics Data System (ADS)

    Dagaev, Nikolay; Volkova, Ksenia; Ossadtchi, Alexei

    2018-02-01

    Objective. Brain-computer interface (BCI) systems are known to be vulnerable to variabilities in background states of a user. Usually, no detailed information on these states is available even during the training stage. Thus there is a need in a method which is capable of taking background states into account in an unsupervised way. Approach. We propose a latent variable method that is based on a probabilistic model with a discrete latent variable. In order to estimate the model’s parameters, we suggest to use the expectation maximization algorithm. The proposed method is aimed at assessing characteristics of background states without any corresponding data labeling. In the context of asynchronous motor imagery paradigm, we applied this method to the real data from twelve able-bodied subjects with open/closed eyes serving as background states. Main results. We found that the latent variable method improved classification of target states compared to the baseline method (in seven of twelve subjects). In addition, we found that our method was also capable of background states recognition (in six of twelve subjects). Significance. Without any supervised information on background states, the latent variable method provides a way to improve classification in BCI by taking background states into account at the training stage and then by making decisions on target states weighted by posterior probabilities of background states at the prediction stage.

  11. A novel method to remove GPR background noise based on the similarity of non-neighboring regions

    NASA Astrophysics Data System (ADS)

    Montiel-Zafra, V.; Canadas-Quesada, F. J.; Vera-Candeas, P.; Ruiz-Reyes, N.; Rey, J.; Martinez, J.

    2017-09-01

    Ground penetrating radar (GPR) is a non-destructive technique widely used in many areas of research, such as the detection of landmines or subsurface anomalies, where targets embedded within a background medium must be located. A major challenge with GPR data remains improving image quality for stone materials through the detection of true anisotropies, since most errors are caused by incorrect interpretation by users. This is complicated by the interference of horizontal background noise, e.g., from the air-ground interface, which reduces the high-resolution quality of radargrams; weak or deep anisotropies are often masked by this type of noise. In order to remove the background noise in GPR data, this work proposes a novel background removal method that assumes the horizontal noise appears as repetitive two-dimensional regions along the movement of the GPR antenna. Specifically, the proposed method, based on the non-local similarity of regions over distance, computes similarities between different regions at the same depth to identify the most repetitive regions, using a criterion that avoids closer regions. Evaluations are performed using a set of synthetic and real GPR data. Experimental results show that the proposed method obtains promising results compared with classic background removal techniques and the most recently published background removal methods.
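    The region-similarity idea can be sketched on a toy B-scan. The guard distance and neighbour count below are assumptions, and whole traces stand in for the paper's two-dimensional regions: the background for each trace is the mean of its most similar non-neighbouring traces, so a localized target is not subtracted away.

```python
import numpy as np

rng = np.random.default_rng(6)
depth, traces = 64, 80
band = 5.0 * np.exp(-0.5 * ((np.arange(depth) - 10) / 2.0) ** 2)  # air-ground band
scan = band[:, None] + rng.normal(0, 0.2, (depth, traces))        # B-scan
scan[30:34, 38:42] += 4.0                                         # buried target

guard, k = 8, 10     # exclude neighbouring traces; number of similar traces
clean = np.empty_like(scan)
for j in range(traces):
    cand = np.array([i for i in range(traces) if abs(i - j) > guard])
    dist = np.array([np.linalg.norm(scan[:, j] - scan[:, i]) for i in cand])
    best = cand[np.argsort(dist)[:k]]
    # Background for trace j: mean of its most similar non-neighbouring traces
    clean[:, j] = scan[:, j] - scan[:, best].mean(axis=1)
```

    The horizontal band is cancelled everywhere, while the target survives because its traces are excluded from their own background estimate by the guard distance.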

  12. Noise covariance incorporated MEG-MUSIC algorithm: a method for multiple-dipole estimation tolerant of the influence of background brain activity.

    PubMed

    Sekihara, K; Poeppel, D; Marantz, A; Koizumi, H; Miyashita, Y

    1997-09-01

    This paper proposes a method of localizing multiple current dipoles from spatio-temporal biomagnetic data. The method is based on the multiple signal classification (MUSIC) algorithm and is tolerant of the influence of background brain activity. In this method, the noise covariance matrix is estimated using a portion of the data that contains noise, but does not contain any signal information. Then, a modified noise subspace projector is formed using the generalized eigenvectors of the noise and measured-data covariance matrices. The MUSIC localizer is calculated using this noise subspace projector and the noise covariance matrix. The results from a computer simulation have verified the effectiveness of the method. The method was then applied to source estimation for auditory-evoked fields elicited by syllable speech sounds. The results strongly suggest the method's effectiveness in removing the influence of background activity.
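    A toy illustration of the noise-covariance idea, with a linear sensor array standing in for MEG (an assumption of this sketch): whitening by the noise-only covariance before the subspace decomposition is equivalent to using the generalized eigenvectors of the noise and measured-data covariance matrices.

```python
import numpy as np
from scipy.linalg import sqrtm, inv

rng = np.random.default_rng(7)
M, N = 8, 4000
pos = np.arange(M)

def steering(theta):
    # plane-wave steering vector for a half-wavelength-spaced linear array
    return np.exp(1j * np.pi * pos * np.sin(theta))

A = 0.3 * rng.normal(0, 1, (M, M)) + np.eye(M)     # mixing -> coloured background

def coloured(n):
    w = rng.normal(0, 1, (M, n)) + 1j * rng.normal(0, 1, (M, n))
    return A @ w

theta0 = 0.3                                       # true source direction (rad)
s = rng.normal(0, 2, (1, N)) + 1j * rng.normal(0, 2, (1, N))
data = steering(theta0)[:, None] * s + coloured(N)

R_data = data @ data.conj().T / N
R_noise = coloured(N) @ coloured(N).conj().T / N   # signal-free noise portion

# Whiten by the noise covariance, then split signal/noise subspaces
W = inv(sqrtm(R_noise))
lam, U = np.linalg.eigh(W @ R_data @ W.conj().T)
En = U[:, :-1]                                     # noise subspace (one source)

thetas = np.linspace(-1.2, 1.2, 241)
spec = np.array([1.0 / np.linalg.norm(En.conj().T @ (W @ steering(t))) ** 2
                 for t in thetas])
theta_hat = thetas[np.argmax(spec)]                # MUSIC peak
```

    Because the localizer is computed after noise whitening, the spatially correlated background does not bias the peak away from the true direction.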

  13. Autonomous rock detection on Mars through region contrast

    NASA Astrophysics Data System (ADS)

    Xiao, Xueming; Cui, Hutao; Yao, Meibao; Tian, Yang

    2017-08-01

    In this paper, we present a new autonomous rock detection approach based on region contrast. Unlike current state-of-the-art pixel-level rock segmentation methods, the new method addresses the problem at the region level, which significantly reduces the computational cost. The image is first split into homogeneous regions based on intensity information and spatial layout. Considering the memory constraints of the onboard flight processor, only low-level features, the average intensity and variation of each superpixel, are measured. Region contrast is derived as the integration of intensity contrast and a smoothness measurement. Rocks are then segmented from the resulting contrast map by an adaptive threshold. Since a purely intensity-based method may cause false detections in background areas whose illumination differs from their surroundings, a more reliable method is further proposed by introducing a spatial factor and background similarity into the region contrast. The spatial factor captures the locality of contrast, while the background similarity gives the probability that each subregion belongs to the background. Our method is efficient on large images and needs only a few parameters. Preliminary experimental results show that our algorithm outperforms edge-based methods on various grayscale rover images.

  14. Discoveries far from the lamppost with matrix elements and ranking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Debnath, Dipsikha; Gainer, James S.; Matchev, Konstantin T.

    2015-04-01

    The prevalence of null results in searches for new physics at the LHC motivates the effort to make these searches as model-independent as possible. We describe procedures for adapting the Matrix Element Method for situations where the signal hypothesis is not known a priori. We also present general and intuitive approaches for performing analyses and presenting results, which involve the flattening of background distributions using likelihood information. The first flattening method involves ranking events by background matrix element, the second involves quantile binning with respect to likelihood (and other) variables, and the third involves reweighting histograms by the inverse of the background distribution.
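    The flattening idea can be illustrated with a one-dimensional stand-in: transform each event through the empirical background CDF so that background becomes uniform and any signal appears as an excess in particular quantile bins. The distributions below are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)
background = rng.exponential(1.0, 100000)   # background-only sample of the variable
signal = rng.normal(4.0, 0.5, 1000)         # hypothetical signal clusters high

grid = np.sort(background)
def flatten(x):
    # empirical background CDF: maps background values to ~Uniform(0, 1)
    return np.searchsorted(grid, x) / grid.size

obs = np.concatenate([rng.exponential(1.0, 20000), signal])  # "data": bg + signal
u = flatten(obs)
counts, _ = np.histogram(u, bins=10, range=(0.0, 1.0))
```

    After the transform, the background populates all quantile bins evenly (about 2000 events each here), while the signal piles up in the highest bin.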

  15. Background-Modeling-Based Adaptive Prediction for Surveillance Video Coding.

    PubMed

    Zhang, Xianguo; Huang, Tiejun; Tian, Yonghong; Gao, Wen

    2014-02-01

    The exponential growth of surveillance video presents an unprecedented challenge for high-efficiency surveillance video coding technology. Compared with the existing coding standards, which were basically developed for generic videos, surveillance video coding should be designed to make the best use of the special characteristics of surveillance videos (e.g., the relatively static background). To do so, this paper first conducts two analyses on how to improve the background and foreground prediction efficiencies in surveillance video coding. Following the analysis results, we propose a background-modeling-based adaptive prediction (BMAP) method. In this method, all blocks to be encoded are first classified into three categories. Then, according to the category of each block, two novel inter predictions are selectively utilized, namely the background reference prediction (BRP), which uses the background modeled from the original input frames as the long-term reference, and the background difference prediction (BDP), which predicts the current data in the background difference domain. For background blocks, the BRP can effectively improve prediction efficiency by using the higher-quality background as the reference, whereas for foreground-background-hybrid blocks, the BDP can provide a better reference after subtracting the background pixels. Experimental results show that the BMAP can achieve at least twice the compression ratio of AVC (MPEG-4 Advanced Video Coding) high profile on surveillance videos, with only a slight additional encoding complexity. Moreover, for foreground coding performance, which is crucial to the subjective quality of moving objects in surveillance videos, BMAP also obtains remarkable gains over several state-of-the-art methods.

  16. Structured background grids for generation of unstructured grids by advancing front method

    NASA Technical Reports Server (NTRS)

    Pirzadeh, Shahyar

    1991-01-01

    A new method of background grid construction is introduced for generation of unstructured tetrahedral grids using the advancing-front technique. Unlike the conventional triangular/tetrahedral background grids which are difficult to construct and usually inadequate in performance, the new method exploits the simplicity of uniform Cartesian meshes and provides grids of better quality. The approach is analogous to solving a steady-state heat conduction problem with discrete heat sources. The spacing parameters of grid points are distributed over the nodes of a Cartesian background grid by interpolating from a few prescribed sources and solving a Poisson equation. To increase the control over the grid point distribution, a directional clustering approach is used. The new method is convenient to use and provides better grid quality and flexibility. Sample results are presented to demonstrate the power of the method.
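    The heat-conduction analogy above can be sketched with a Jacobi iteration on a uniform Cartesian mesh. The grid size, source strengths, and boundary value are all illustrative; prescribed "sources" fix the spacing locally, and the Laplace/Poisson solve diffuses it smoothly outward.

```python
import numpy as np

n = 41
spacing = np.full((n, n), 1.0)           # boundary/far-field spacing value
src = {(10, 10): 0.05, (30, 28): 0.2}    # point sources prescribing fine spacing

for _ in range(5000):
    new = spacing.copy()
    # Jacobi update of the interior: discrete Laplace equation
    new[1:-1, 1:-1] = 0.25 * (spacing[:-2, 1:-1] + spacing[2:, 1:-1] +
                              spacing[1:-1, :-2] + spacing[1:-1, 2:])
    for (i, j), s in src.items():        # re-impose the Dirichlet point sources
        new[i, j] = s
    spacing = new
```

    The converged field holds the prescribed fine spacing at each source and grows smoothly toward the far-field value, which is exactly the behaviour wanted when interrogating a background grid for local element size.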

  17. Target detection in GPR data using joint low-rank and sparsity constraints

    NASA Astrophysics Data System (ADS)

    Bouzerdoum, Abdesselam; Tivive, Fok Hing Chi; Abeynayake, Canicious

    2016-05-01

    In ground penetrating radars, background clutter, which comprises the signals backscattered from the rough, uneven ground surface and the background noise, impairs the visualization of buried objects and subsurface inspections. In this paper, a clutter mitigation method is proposed for target detection. The removal of background clutter is formulated as a constrained optimization problem to obtain a low-rank matrix and a sparse matrix. The low-rank matrix captures the ground surface reflections and the background noise, whereas the sparse matrix contains the target reflections. An optimization method based on split-Bregman algorithm is developed to estimate these two matrices from the input GPR data. Evaluated on real radar data, the proposed method achieves promising results in removing the background clutter and enhancing the target signature.
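    The joint low-rank and sparse split can be sketched with a simple alternation (a GoDec-style stand-in for the paper's split-Bregman solver; the clutter rank and threshold below are assumptions): the low-rank factor absorbs the ground-surface clutter while large residuals are kept as the sparse target matrix.

```python
import numpy as np

rng = np.random.default_rng(9)
depth, traces = 40, 60
clutter = 10.0 * np.outer(np.exp(-np.arange(depth) / 5.0), np.ones(traces))
target = np.zeros((depth, traces))
target[25, 20:24] = 5.0                       # small buried-target signature
D = clutter + target + rng.normal(0, 0.1, (depth, traces))

rank, thresh = 1, 1.0                         # assumed clutter rank and threshold
S = np.zeros_like(D)
for _ in range(20):
    # Low-rank update: best rank-`rank` fit to the clutter component
    U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
    L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
    # Sparse update: keep only large residuals as target reflections
    R = D - L
    S = np.where(np.abs(R) > thresh, R, 0.0)
```

    After a few alternations the sparse matrix contains essentially only the four target pixels, with the horizontally repetitive clutter captured by the low-rank factor.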

  18. Background Noise Reduction Using Adaptive Noise Cancellation Determined by the Cross-Correlation

    NASA Technical Reports Server (NTRS)

    Spalt, Taylor B.; Brooks, Thomas F.; Fuller, Christopher R.

    2012-01-01

    Background noise due to flow in wind tunnels contaminates desired data by decreasing the signal-to-noise ratio. The use of adaptive noise cancellation to remove background noise at measurement microphones is compromised when the reference sensor measures both background and desired noise. The proposed technique modifies the classical processing configuration based on the cross-correlation between the reference and primary microphones. Background noise attenuation is achieved using a cross-correlation sample width that encompasses only the background noise and a matched delay for the adaptive processing. A present limitation of the method is that a minimum time delay between the background noise and the desired signal must exist in order for the correlated parts of the desired signal to be separated from the background noise in the cross-correlation. A simulation yields primary signal recovery that can be predicted from the coherence of the background noise between the channels. Results are compared with two existing methods.
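    The delay-matched configuration can be sketched with a one-tap LMS canceller. The tap count, step size, and delay search range are assumptions of this toy; the actual method sets a cross-correlation sample width around the background-noise peak rather than a single lag.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 20000
noise = rng.normal(0, 1, n + 50)                   # background noise source
delay = 12                                         # propagation delay to primary
primary = noise[50 - delay:n + 50 - delay].copy()  # background at primary mic
desired = np.sin(2 * np.pi * 0.05 * np.arange(n))  # signal of interest
primary += desired
reference = noise[50:n + 50]                       # reference mic: background only

# Estimate the background delay from the cross-correlation peak
lags = np.arange(-40, 41)
xc = [np.dot(reference[40:n - 40], primary[40 + l:n - 40 + l]) for l in lags]
d_hat = int(lags[int(np.argmax(xc))])

# One-tap LMS canceller driven by the delay-matched reference
mu, w = 5e-4, 0.0
out = np.empty(n)
for t in range(n):
    r = reference[t - d_hat] if t >= d_hat else 0.0
    out[t] = primary[t] - w * r                    # cancel correlated background
    w += mu * out[t] * r                           # LMS weight update
```

    Because the tonal signal is uncorrelated with the delayed reference, the canceller removes only the background and the output converges to the desired signal.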

  19. The beam stop array method to measure object scatter in digital breast tomosynthesis

    NASA Astrophysics Data System (ADS)

    Lee, Haeng-hwa; Kim, Ye-seul; Park, Hye-Suk; Kim, Hee-Joung; Choi, Jae-Gu; Choi, Young-Wook

    2014-03-01

    Scattered radiation is inevitably generated in the object. The distribution of the scattered radiation is influenced by object thickness, field size, object-to-detector distance, and primary energy. One way to measure scatter intensities involves measuring the signal detected under the shadow of the lead discs of a beam-stop array (BSA). The scatter measured by the BSA includes not only the scattered radiation within the object (object scatter) but also scatter from external sources, including the X-ray tube, detector, collimator, X-ray filter, and the BSA itself. Excluding this background scattered radiation can be applied to different scanner geometries by simple parameter adjustments, without prior knowledge of the scanned object. In this study, a method using the BSA to differentiate scatter in the phantom (object scatter) from the external background was used, and this method was applied to a BSA algorithm to correct the object scatter. In order to confirm the background scattered radiation, we obtained the scatter profiles and scatter fraction (SF) profiles in the direction perpendicular to the chest wall edge (CWE) with and without scattering material. The scatter profiles with and without the scattering material were similar in the region between 127 mm and 228 mm from the chest wall, indicating that the scatter measured by the BSA included background scatter. Moreover, the BSA algorithm with the proposed method could correct the object scatter, because the total radiation profiles after object scatter correction corresponded to the original image in the region between 127 mm and 228 mm from the chest wall. As a result, the BSA method for measuring object scatter can be used to remove background scatter, and can be applied to different scanner geometries after background scatter correction. In conclusion, the BSA algorithm with the proposed method is effective for correcting object scatter.

  20. A novel infrared small moving target detection method based on tracking interest points under complicated background

    NASA Astrophysics Data System (ADS)

    Dong, Xiabin; Huang, Xinsheng; Zheng, Yongbin; Bai, Shengjian; Xu, Wanying

    2014-07-01

    Infrared moving target detection is an important part of infrared technology. We introduce a novel infrared small moving target detection method based on tracking interest points under complicated backgrounds. First, Difference of Gaussians (DoG) filters are used to detect a group of interest points (including the moving targets). Second, a small-target tracking method inspired by the Human Visual System (HVS) is used to track these interest points over several frames, yielding the correlations between interest points in the first frame and the last frame. Finally, a new clustering method, named R-means, is proposed to divide these interest points into two groups according to the correlations: target points and background points. In the experiments, the target-to-clutter ratio (TCR) and receiver operating characteristic (ROC) curves are computed to compare the performance of the proposed method with five sophisticated methods. The results show that the proposed method discriminates targets from clutter better and has a lower false alarm rate than existing moving target detection methods.
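    The DoG interest-point stage can be sketched as below; the sigmas and response threshold are assumptions, and the tracking and R-means clustering stages are not reproduced. Point-like targets give strong DoG responses at their centres, which survive a local-maximum test.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

rng = np.random.default_rng(11)
img = rng.normal(20, 1, (64, 64))      # cluttered infrared background
img[40, 12] += 15.0                    # small, point-like moving target
img[10, 50] += 15.0                    # another bright interest point

# DoG response: difference of two Gaussian-blurred copies of the image
dog = gaussian_filter(img, 1.0) - gaussian_filter(img, 2.0)
# Interest points: local maxima of the DoG response above a threshold
is_max = (dog == maximum_filter(dog, size=5)) & (dog > 0.8)
points = np.argwhere(is_max)
```

    On this toy frame, the detector returns the two injected bright points and essentially nothing from the clutter; in the full method these candidates would then be tracked and clustered into targets versus background.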

  1. Beam-induced and cosmic-ray backgrounds observed in the ATLAS detector during the LHC 2012 proton-proton running period

    NASA Astrophysics Data System (ADS)

    Aad, G.; Abbott, B.; Abdallah, J.; Abdinov, O.; Abeloos, B.; et al. (ATLAS Collaboration)
W.; Castaneda-Miranda, E.; Castelli, A.; Castillo Gimenez, V.; Castro, N. F.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Caudron, J.; Cavaliere, V.; Cavallaro, E.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerda Alberich, L.; Cerio, B. C.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cerv, M.; Cervelli, A.; Cetin, S. A.; Chafaq, A.; Chakraborty, D.; Chalupkova, I.; Chan, S. K.; Chan, Y. L.; Chang, P.; Chapman, J. D.; Charlton, D. G.; Chatterjee, A.; Chau, C. C.; Chavez Barajas, C. A.; Che, S.; Cheatham, S.; Chegwidden, A.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, K.; Chen, S.; Chen, S.; Chen, X.; Chen, Y.; Cheng, H. C.; Cheng, H. J.; Cheng, Y.; Cheplakov, A.; Cheremushkina, E.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Chevalier, L.; Chiarella, V.; Chiarelli, G.; Chiodini, G.; Chisholm, A. S.; Chitan, A.; Chizhov, M. V.; Choi, K.; Chomont, A. R.; Chouridou, S.; Chow, B. K. B.; Christodoulou, V.; Chromek-Burckhart, D.; Chudoba, J.; Chuinard, A. J.; Chwastowski, J. J.; Chytka, L.; Ciapetti, G.; Ciftci, A. K.; Cinca, D.; Cindro, V.; Cioara, I. A.; Ciocio, A.; Cirotto, F.; Citron, Z. H.; Ciubancan, M.; Clark, A.; Clark, B. L.; Clark, M. R.; Clark, P. J.; Clarke, R. N.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coffey, L.; Colasurdo, L.; Cole, B.; Cole, S.; Colijn, A. P.; Collot, J.; Colombo, T.; Compostella, G.; Conde Muiño, P.; Coniavitis, E.; Connell, S. H.; Connelly, I. A.; Consorti, V.; Constantinescu, S.; Conta, C.; Conti, G.; Conventi, F.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Cottin, G.; Cowan, G.; Cox, B. E.; Cranmer, K.; Crawley, S. J.; Cree, G.; Crépé-Renaudin, S.; Crescioli, F.; Cribbs, W. 
A.; Crispin Ortuzar, M.; Cristinziani, M.; Croft, V.; Crosetti, G.; Cuhadar Donszelmann, T.; Cummings, J.; Curatolo, M.; Cúth, J.; Cuthbert, C.; Czirr, H.; Czodrowski, P.; D'Auria, S.; D'Onofrio, M.; Da Cunha Sargedas De Sousa, M. J.; Da Via, C.; Dabrowski, W.; Dai, T.; Dale, O.; Dallaire, F.; Dallapiccola, C.; Dam, M.; Dandoy, J. R.; Dang, N. P.; Daniells, A. C.; Dann, N. S.; Danninger, M.; Dano Hoffmann, M.; Dao, V.; Darbo, G.; Darmora, S.; Dassoulas, J.; Dattagupta, A.; Davey, W.; David, C.; Davidek, T.; Davies, M.; Davison, P.; Davygora, Y.; Dawe, E.; Dawson, I.; Daya-Ishmukhametova, R. K.; De, K.; de Asmundis, R.; De Benedetti, A.; De Castro, S.; De Cecco, S.; De Groot, N.; de Jong, P.; De la Torre, H.; De Lorenzi, F.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Vivie De Regie, J. B.; Dearnaley, W. J.; Debbe, R.; Debenedetti, C.; Dedovich, D. V.; Deigaard, I.; Del Peso, J.; Del Prete, T.; Delgove, D.; Deliot, F.; Delitzsch, C. M.; Deliyergiyev, M.; Dell'Acqua, A.; Dell'Asta, L.; Dell'Orso, M.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delsart, P. A.; Deluca, C.; DeMarco, D. A.; Demers, S.; Demichev, M.; Demilly, A.; Denisov, S. P.; Denysiuk, D.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Deterre, C.; Dette, K.; Deviveiros, P. O.; Dewhurst, A.; Dhaliwal, S.; Di Ciaccio, A.; Di Ciaccio, L.; Di Clemente, W. K.; Di Donato, C.; Di Girolamo, A.; Di Girolamo, B.; Di Micco, B.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Di Valentino, D.; Diaconu, C.; Diamond, M.; Dias, F. A.; Diaz, M. A.; Diehl, E. B.; Dietrich, J.; Diglio, S.; Dimitrievska, A.; Dingfelder, J.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; Djuvsland, J. I.; do Vale, M. A. B.; Dobos, D.; Dobre, M.; Doglioni, C.; Dohmae, T.; Dolejsi, J.; Dolezal, Z.; Dolgoshein, B. A.; Donadelli, M.; Donati, S.; Dondero, P.; Donini, J.; Dopke, J.; Doria, A.; Dova, M. T.; Doyle, A. 
T.; Drechsler, E.; Dris, M.; Du, Y.; Duarte-Campderros, J.; Duchovni, E.; Duckeck, G.; Ducu, O. A.; Duda, D.; Dudarev, A.; Duflot, L.; Duguid, L.; Dührssen, M.; Dunford, M.; Duran Yildiz, H.; Düren, M.; Durglishvili, A.; Duschinger, D.; Dutta, B.; Dyndal, M.; Eckardt, C.; Ecker, K. M.; Edgar, R. C.; Edson, W.; Edwards, N. C.; Eifert, T.; Eigen, G.; Einsweiler, K.; Ekelof, T.; El Kacimi, M.; Ellajosyula, V.; Ellert, M.; Elles, S.; Ellinghaus, F.; Elliot, A. A.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Enari, Y.; Endner, O. C.; Endo, M.; Ennis, J. S.; Erdmann, J.; Ereditato, A.; Ernis, G.; Ernst, J.; Ernst, M.; Errede, S.; Ertel, E.; Escalier, M.; Esch, H.; Escobar, C.; Esposito, B.; Etienvre, A. I.; Etzion, E.; Evans, H.; Ezhilov, A.; Fabbri, F.; Fabbri, L.; Facini, G.; Fakhrutdinov, R. M.; Falciano, S.; Falla, R. J.; Faltova, J.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farina, C.; Farooque, T.; Farrell, S.; Farrington, S. M.; Farthouat, P.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Faucci Giannelli, M.; Favareto, A.; Fawcett, W. J.; Fayard, L.; Fedin, O. L.; Fedorko, W.; Feigl, S.; Feligioni, L.; Feng, C.; Feng, E. J.; Feng, H.; Fenyuk, A. B.; Feremenga, L.; Fernandez Martinez, P.; Fernandez Perez, S.; Ferrando, J.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferreira de Lima, D. E.; Ferrer, A.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiedler, F.; Filipčič, A.; Filipuzzi, M.; Filthaut, F.; Fincke-Keeler, M.; Finelli, K. D.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, A.; Fischer, C.; Fischer, J.; Fisher, W. C.; Flaschel, N.; Fleck, I.; Fleischmann, P.; Fletcher, G. T.; Fletcher, G.; Fletcher, R. R. M.; Flick, T.; Floderus, A.; Flores Castillo, L. R.; Flowerdew, M. J.; Forcolin, G. T.; Formica, A.; Forti, A.; Foster, A. G.; Fournier, D.; Fox, H.; Fracchia, S.; Francavilla, P.; Franchini, M.; Francis, D.; Franconi, L.; Franklin, M.; Frate, M.; Fraternali, M.; Freeborn, D.; Fressard-Batraneanu, S. 
M.; Friedrich, F.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fusayasu, T.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gabrielli, A.; Gabrielli, A.; Gach, G. P.; Gadatsch, S.; Gadomski, S.; Gagliardi, G.; Gagnon, L. G.; Gagnon, P.; Galea, C.; Galhardo, B.; Gallas, E. J.; Gallop, B. J.; Gallus, P.; Galster, G.; Gan, K. K.; Gao, J.; Gao, Y.; Gao, Y. S.; Garay Walls, F. M.; García, C.; García Navarro, J. E.; Garcia-Sciveres, M.; Gardner, R. W.; Garelli, N.; Garonne, V.; Gascon Bravo, A.; Gatti, C.; Gaudiello, A.; Gaudio, G.; Gaur, B.; Gauthier, L.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gazis, E. N.; Gecse, Z.; Gee, C. N. P.; Geich-Gimbel, Ch.; Geisler, M. P.; Gemme, C.; Genest, M. H.; Geng, C.; Gentile, S.; George, S.; Gerbaudo, D.; Gershon, A.; Ghasemi, S.; Ghazlane, H.; Ghneimat, M.; Giacobbe, B.; Giagu, S.; Giannetti, P.; Gibbard, B.; Gibson, S. M.; Gignac, M.; Gilchriese, M.; Gillam, T. P. S.; Gillberg, D.; Gilles, G.; Gingrich, D. M.; Giokaris, N.; Giordani, M. P.; Giorgi, F. M.; Giorgi, F. M.; Giraud, P. F.; Giromini, P.; Giugni, D.; Giuli, F.; Giuliani, C.; Giulini, M.; Gjelsten, B. K.; Gkaitatzis, S.; Gkialas, I.; Gkougkousis, E. L.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glaysher, P. C. F.; Glazov, A.; Goblirsch-Kolb, M.; Godlewski, J.; Goldfarb, S.; Golling, T.; Golubkov, D.; Gomes, A.; Gonçalo, R.; Goncalves Pinto Firmino Da Costa, J.; Gonella, L.; Gongadze, A.; González de la Hoz, S.; Gonzalez Parra, G.; Gonzalez-Sevilla, S.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Goshaw, A. T.; Gössling, C.; Gostkin, M. I.; Goudet, C. R.; Goujdami, D.; Goussiou, A. G.; Govender, N.; Gozani, E.; Graber, L.; Grabowska-Bold, I.; Gradin, P. O. J.; Grafström, P.; Gramling, J.; Gramstad, E.; Grancagnolo, S.; Gratchev, V.; Gray, H. M.; Graziani, E.; Greenwood, Z. D.; Grefe, C.; Gregersen, K.; Gregor, I. M.; Grenier, P.; Grevtsov, K.; Griffiths, J.; Grillo, A. 
A.; Grimm, K.; Grinstein, S.; Gris, Ph.; Grivaz, J.-F.; Groh, S.; Grohs, J. P.; Gross, E.; Grosse-Knetter, J.; Grossi, G. C.; Grout, Z. J.; Guan, L.; Guan, W.; Guenther, J.; Guescini, F.; Guest, D.; Gueta, O.; Guido, E.; Guillemin, T.; Guindon, S.; Gul, U.; Gumpert, C.; Guo, J.; Guo, Y.; Gupta, S.; Gustavino, G.; Gutierrez, P.; Gutierrez Ortiz, N. G.; Gutschow, C.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haber, C.; Hadavand, H. K.; Haddad, N.; Hadef, A.; Haefner, P.; Hageböck, S.; Hajduk, Z.; Hakobyan, H.; Haleem, M.; Haley, J.; Hall, D.; Halladjian, G.; Hallewell, G. D.; Hamacher, K.; Hamal, P.; Hamano, K.; Hamilton, A.; Hamity, G. N.; Hamnett, P. G.; Han, L.; Hanagaki, K.; Hanawa, K.; Hance, M.; Haney, B.; Hanke, P.; Hanna, R.; Hansen, J. B.; Hansen, J. D.; Hansen, M. C.; Hansen, P. H.; Hara, K.; Hard, A. S.; Harenberg, T.; Hariri, F.; Harkusha, S.; Harrington, R. D.; Harrison, P. F.; Hartjes, F.; Hasegawa, M.; Hasegawa, Y.; Hasib, A.; Hassani, S.; Haug, S.; Hauser, R.; Hauswald, L.; Havranek, M.; Hawkes, C. M.; Hawkings, R. J.; Hawkins, A. D.; Hayden, D.; Hays, C. P.; Hays, J. M.; Hayward, H. S.; Haywood, S. J.; Head, S. J.; Heck, T.; Hedberg, V.; Heelan, L.; Heim, S.; Heim, T.; Heinemann, B.; Heinrich, J. J.; Heinrich, L.; Heinz, C.; Hejbal, J.; Helary, L.; Hellman, S.; Helsens, C.; Henderson, J.; Henderson, R. C. W.; Heng, Y.; Henkelmann, S.; Henriques Correia, A. M.; Henrot-Versille, S.; Herbert, G. H.; Hernández Jiménez, Y.; Herten, G.; Hertenberger, R.; Hervas, L.; Hesketh, G. G.; Hessey, N. P.; Hetherly, J. W.; Hickling, R.; Higón-Rodriguez, E.; Hill, E.; Hill, J. C.; Hiller, K. H.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hinman, R. R.; Hirose, M.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoenig, F.; Hohlfeld, M.; Hohn, D.; Holmes, T. R.; Homann, M.; Hong, T. M.; Hooberman, B. H.; Hopkins, W. H.; Horii, Y.; Horton, A. 
J.; Hostachy, J.-Y.; Hou, S.; Hoummada, A.; Howard, J.; Howarth, J.; Hrabovsky, M.; Hristova, I.; Hrivnac, J.; Hryn'ova, T.; Hrynevich, A.; Hsu, C.; Hsu, P. J.; Hsu, S.-C.; Hu, D.; Hu, Q.; Huang, Y.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Huhtinen, M.; Hülsing, T. A.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Ideal, E.; Idrissi, Z.; Iengo, P.; Igonkina, O.; Iizawa, T.; Ikegami, Y.; Ikeno, M.; Ilchenko, Y.; Iliadis, D.; Ilic, N.; Ince, T.; Introzzi, G.; Ioannou, P.; Iodice, M.; Iordanidou, K.; Ippolito, V.; Irles Quiles, A.; Isaksson, C.; Ishino, M.; Ishitsuka, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ito, F.; Ponce, J. M. Iturbe; Iuppa, R.; Ivarsson, J.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jabbar, S.; Jackson, B.; Jackson, M.; Jackson, P.; Jain, V.; Jakobi, K. B.; Jakobs, K.; Jakobsen, S.; Jakoubek, T.; Jamin, D. O.; Jana, D. K.; Jansen, E.; Jansky, R.; Janssen, J.; Janus, M.; Jarlskog, G.; Javadov, N.; Javůrek, T.; Jeanneau, F.; Jeanty, L.; Jejelava, J.; Jeng, G.-Y.; Jennens, D.; Jenni, P.; Jentzsch, J.; Jeske, C.; Jézéquel, S.; Ji, H.; Jia, J.; Jiang, H.; Jiang, Y.; Jiggins, S.; Jimenez Pena, J.; Jin, S.; Jinaru, A.; Jinnouchi, O.; Johansson, P.; Johns, K. A.; Johnson, W. J.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, S.; Jones, T. J.; Jongmanns, J.; Jorge, P. M.; Jovicevic, J.; Ju, X.; Juste Rozas, A.; Köhler, M. K.; Kaczmarska, A.; Kado, M.; Kagan, H.; Kagan, M.; Kahn, S. J.; Kajomovitz, E.; Kalderon, C. W.; Kaluza, A.; Kama, S.; Kamenshchikov, A.; Kanaya, N.; Kaneti, S.; Kantserov, V. A.; Kanzaki, J.; Kaplan, B.; Kaplan, L. S.; Kapliy, A.; Kar, D.; Karakostas, K.; Karamaoun, A.; Karastathis, N.; Kareem, M. J.; Karentzos, E.; Karnevskiy, M.; Karpov, S. N.; Karpova, Z. M.; Karthik, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kasahara, K.; Kashif, L.; Kass, R. 
D.; Kastanas, A.; Kataoka, Y.; Kato, C.; Katre, A.; Katzy, J.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazama, S.; Kazanin, V. F.; Keeler, R.; Kehoe, R.; Keller, J. S.; Kempster, J. J.; Kentaro, K.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Keyes, R. A.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Kharlamov, A. G.; Khoo, T. J.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kido, S.; Kim, H. Y.; Kim, S. H.; Kim, Y. K.; Kimura, N.; Kind, O. M.; King, B. T.; King, M.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kiss, F.; Kiuchi, K.; Kivernyk, O.; Kladiva, E.; Klein, M. H.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klioutchnikova, T.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Knapik, J.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, A.; Kobayashi, D.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koffas, T.; Koffeman, E.; Kogan, L. A.; Koi, T.; Kolanoski, H.; Kolb, M.; Koletsou, I.; Komar, A. A.; Komori, Y.; Kondo, T.; Kondrashova, N.; Köneke, K.; König, A. C.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Kortner, O.; Kortner, S.; Kosek, T.; Kostyukhin, V. V.; Kotwal, A.; Kourkoumeli-Charalampidi, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewska, A. B.; Kowalewski, R.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kramarenko, V. A.; Kramberger, G.; Krasnopevtsev, D.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J. K.; Kravchenko, A.; Kretz, M.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, P.; Krizka, K.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Krumnack, N.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kucuk, H.; Kuday, S.; Kuechler, J. 
T.; Kuehn, S.; Kugel, A.; Kuger, F.; Kuhl, A.; Kuhl, T.; Kukhtin, V.; Kukla, R.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunigo, T.; Kupco, A.; Kurashige, H.; Kurochkin, Y. A.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; Kwan, T.; Kyriazopoulos, D.; La Rosa, A.; La Rosa Navarro, J. L.; La Rotonda, L.; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Lammers, S.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lang, V. S.; Lange, J. C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Lasagni Manghi, F.; Lassnig, M.; Laurelli, P.; Lavrijsen, W.; Law, A. T.; Laycock, P.; Lazovich, T.; Lazzaroni, M.; Le Dortz, O.; Le Guirriec, E.; Le Menedeu, E.; Le Quilleuc, E. P.; LeBlanc, M.; LeCompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, S. C.; Lee, L.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Lehmann Miotto, G.; Lei, X.; Leight, W. A.; Leisos, A.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Leney, K. J. C.; Lenz, T.; Lenzi, B.; Leone, R.; Leone, S.; Leonidopoulos, C.; Leontsinis, S.; Lerner, G.; Leroy, C.; Lesage, A. A. J.; Lester, C. G.; Levchenko, M.; Levêque, J.; Levin, D.; Levinson, L. J.; Levy, M.; Leyko, A. M.; Leyton, M.; Li, B.; Li, H.; Li, H. L.; Li, L.; Li, L.; Li, Q.; Li, S.; Li, X.; Li, Y.; Liang, Z.; Liao, H.; Liberti, B.; Liblong, A.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limbach, C.; Limosani, A.; Lin, S. C.; Lin, T. H.; Lindquist, B. E.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, B.; Liu, D.; Liu, H.; Liu, H.; Liu, J.; Liu, J. B.; Liu, K.; Liu, L.; Liu, M.; Liu, M.; Liu, Y. L.; Liu, Y.; Livan, M.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lo Sterzo, F.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loew, K. 
M.; Loginov, A.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Long, B. A.; Long, J. D.; Long, R. E.; Longo, L.; Looper, K. A.; Lopes, L.; Lopez Mateos, D.; Lopez Paredes, B.; Lopez Paz, I.; Lopez Solis, A.; Lorenz, J.; Martinez, N. Lorenzo; Losada, M.; Lösel, P. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lu, H.; Lu, N.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Luedtke, C.; Luehring, F.; Lukas, W.; Luminari, L.; Lundberg, O.; Lund-Jensen, B.; Lynn, D.; Lysak, R.; Lytken, E.; Lyubushkin, V.; Ma, H.; Ma, L. L.; Ma, Y.; Maccarrone, G.; Macchiolo, A.; Macdonald, C. M.; Maček, B.; Machado Miguens, J.; Madaffari, D.; Madar, R.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeda, J.; Maeland, S.; Maeno, T.; Maevskiy, A.; Magradze, E.; Mahlstedt, J.; Maiani, C.; Maidantchik, C.; Maier, A. A.; Maier, T.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyukov, S.; Mamuzic, J.; Mancini, G.; Mandelli, B.; Mandelli, L.; Mandić, I.; Maneira, J.; Filho, L. Manhaes de Andrade; Manjarres Ramos, J.; Mann, A.; Mansoulie, B.; Mantifel, R.; Mantoani, M.; Manzoni, S.; Mapelli, L.; Marceca, G.; March, L.; Marchiori, G.; Marcisovsky, M.; Marjanovic, M.; Marley, D. E.; Marroquim, F.; Marsden, S. P.; Marshall, Z.; Marti, L. F.; Marti-Garcia, S.; Martin, B.; Martin, T. A.; Martin, V. J.; dit Latour, B. Martin; Martinez, M.; Martin-Haugh, S.; Martoiu, V. S.; Martyniuk, A. C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. L.; Massa, I.; Massa, L.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Mättig, P.; Mattmann, J.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazza, S. M.; McFadden, N. C.; McGoldrick, G.; McKee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McClymont, L. I.; McFarlane, K. W.; Mcfayden, J. A.; Mchedlidze, G.; McMahon, S. J.; McPherson, R. 
A.; Medinnis, M.; Meehan, S.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Mellado Garcia, B. R.; Meloni, F.; Mengarelli, A.; Menke, S.; Meoni, E.; Mercurio, K. M.; Mergelmeyer, S.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Theenhausen, H. Meyer Zu; Middleton, R. P.; Miglioranzi, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Milesi, M.; Milic, A.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Minaenko, A. A.; Minami, Y.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mistry, K. P.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Miucci, A.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Mochizuki, K.; Mohapatra, S.; Mohr, W.; Molander, S.; Moles-Valls, R.; Monden, R.; Mondragon, M. C.; Mönig, K.; Monk, J.; Monnier, E.; Montalbano, A.; Montejo Berlingen, J.; Monticelli, F.; Monzani, S.; Moore, R. W.; Morange, N.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Mori, D.; Mori, T.; Morii, M.; Morinaga, M.; Morisbak, V.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Mortensen, S. S.; Morvaj, L.; Mosidze, M.; Moss, J.; Motohashi, K.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Muanza, S.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, R. S. P.; Mueller, T.; Muenstermann, D.; Mullen, P.; Mullier, G. A.; Munoz Sanchez, F. J.; Murillo Quijada, J. A.; Murray, W. J.; Musheghyan, H.; Muskinja, M.; Myagkov, A. G.; Myska, M.; Nachman, B. P.; Nackenhorst, O.; Nadal, J.; Nagai, K.; Nagai, R.; Nagano, K.; Nagasaka, Y.; Nagata, K.; Nagel, M.; Nagy, E.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Naranjo Garcia, R. F.; Narayan, R.; Narrias Villar, D. I.; Naryshkin, I.; Naumann, T.; Navarro, G.; Nayyar, R.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Nef, P. 
D.; Negri, A.; Negrini, M.; Nektarijevic, S.; Nellist, C.; Nelson, A.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neves, R. M.; Nevski, P.; Newman, P. R.; Nguyen, D. H.; Nickerson, R. B.; Nicolaidou, R.; Nicquevert, B.; Nielsen, J.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolopoulos, K.; Nilsen, J. K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Nooney, T.; Norberg, S.; Nordberg, M.; Norjoharuddeen, N.; Novgorodova, O.; Nowak, S.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nurse, E.; Nuti, F.; O'grady, F.; O'Neil, D. C.; O'Rourke, A. A.; O'Shea, V.; Oakham, F. G.; Oberlack, H.; Obermann, T.; Ocariz, J.; Ochi, A.; Ochoa, I.; Ochoa-Ricoux, J. P.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohman, H.; Oide, H.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Oleiro Seabra, L. F.; Olivares Pino, S. A.; Oliveira Damazio, D.; Olszewski, A.; Olszowska, J.; Onofre, A.; Onogi, K.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Orr, R. S.; Osculati, B.; Ospanov, R.; Garzon, G. Otero y.; Otono, H.; Ouchrif, M.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Ovcharova, A.; Owen, M.; Owen, R. E.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pacheco Pages, A.; Padilla Aranda, C.; Pagáčová, M.; Pagan Griso, S.; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palestini, S.; Palka, M.; Pallin, D.; Palm, M.; Palma, A.; Panagiotopoulou, E. St.; Pandini, C. E.; Panduro Vazquez, J. G.; Pani, P.; Panitkin, S.; Pantea, D.; Paolozzi, L.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Paredes Hernandez, D.; Parker, A. J.; Parker, M. A.; Parker, K. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pascuzzi, V. R.; Pasqualucci, E.; Passaggio, S.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N. D.; Pater, J. R.; Pauly, T.; Pearce, J.; Pearson, B.; Pedersen, L. 
E.; Pedersen, M.; Pedraza Lopez, S.; Pedro, R.; Peleganchuk, S. V.; Pelikan, D.; Penc, O.; Peng, C.; Peng, H.; Penwell, J.; Peralva, B. S.; Perego, M. M.; Perepelitsa, D. V.; Perez Codina, E.; Perini, L.; Pernegger, H.; Perrella, S.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petroff, P.; Petrolo, E.; Petrov, M.; Petrucci, F.; Pettersson, N. E.; Peyaud, A.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Pickering, M. A.; Piegaia, R.; Pilcher, J. E.; Pilkington, A. D.; Pin, A. W. J.; Pina, J.; Pinamonti, M.; Pinfold, J. L.; Pingel, A.; Pires, S.; Pirumov, H.; Pitt, M.; Plazak, L.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Pluth, D.; Poettgen, R.; Poggioli, L.; Pohl, D.; Polesello, G.; Poley, A.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. S.; Polychronakos, V.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Pozo Astigarraga, M. E.; Pralavorio, P.; Pranko, A.; Prell, S.; Price, D.; Price, L. E.; Primavera, M.; Prince, S.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopopescu, S.; Proudfoot, J.; Przybycien, M.; Puddu, D.; Puldon, D.; Purohit, M.; Puzo, P.; Qian, J.; Qin, G.; Qin, Y.; Quadt, A.; Quayle, W. B.; Queitsch-Maitland, M.; Quilty, D.; Raddum, S.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Rados, P.; Ragusa, F.; Rahal, G.; Raine, J. A.; Rajagopalan, S.; Rammensee, M.; Rangel-Smith, C.; Ratti, M. G.; Rauscher, F.; Rave, S.; Ravenscroft, T.; Raymond, M.; Read, A. L.; Readioff, N. P.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Rehnisch, L.; Reichert, J.; Reisin, H.; Rembser, C.; Ren, H.; Rescigno, M.; Resconi, S.; Rezanova, O. 
L.; Reznicek, P.; Rezvani, R.; Richter, R.; Richter, S.; Richter-Was, E.; Ricken, O.; Ridel, M.; Rieck, P.; Riegel, C. J.; Rieger, J.; Rifki, O.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Ristić, B.; Ritsch, E.; Riu, I.; Rizatdinova, F.; Rizvi, E.; Rizzi, C.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Roda, C.; Rodina, Y.; Rodriguez Perez, A.; Rodriguez Rodriguez, D.; Roe, S.; Rogan, C. S.; RØhne, O.; Romaniouk, A.; Romano, M.; Romano Saez, S. M.; Romero Adam, E.; Rompotis, N.; Ronzani, M.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, P.; Rosenthal, O.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, J. H. N.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rubinskiy, I.; Rud, V. I.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Russell, H. L.; Rutherfoord, J. P.; Ruthmann, N.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryu, S.; Ryzhov, A.; Saavedra, A. F.; Sabato, G.; Sacerdoti, S.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Saha, P.; Sahinsoy, M.; Saimpert, M.; Saito, T.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Salazar Loyola, J. E.; Salek, D.; Sales De Bruin, P. H.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sammel, D.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Sanchez Martinez, V.; Sandaker, H.; Sandbach, R. L.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sannino, M.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Santoyo Castillo, I.; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarrazin, B.; Sasaki, O.; Sasaki, Y.; Sato, K.; Sauvage, G.; Sauvan, E.; Savage, G.; Savard, P.; Sawyer, C.; Sawyer, L.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. 
A.; Scarcella, M.; Scarfone, V.; Schaarschmidt, J.; Schacht, P.; Schaefer, D.; Schaefer, R.; Schaeffer, J.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Schiavi, C.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmieden, K.; Schmitt, C.; Schmitt, S.; Schmitz, S.; Schneider, B.; Schnellbach, Y. J.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schopf, E.; Schorlemmer, A. L. S.; Schott, M.; Schovancova, J.; Schramm, S.; Schreyer, M.; Schuh, N.; Schultens, M. J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwanenberger, C.; Schwartzman, A.; Schwarz, T. A.; Schwegler, Ph.; Schweiger, H.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Sciolla, G.; Scuri, F.; Scutti, F.; Searcy, J.; Seema, P.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekhon, K.; Sekula, S. J.; Seliverstov, D. M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Sessa, M.; Seuster, R.; Severini, H.; Sfiligoj, T.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shaikh, N. W.; Shan, L. Y.; Shang, R.; Shank, J. T.; Shapiro, M.; Shatalov, P. B.; Shaw, K.; Shaw, S. M.; Shcherbakova, A.; Shehu, C. Y.; Sherwood, P.; Shi, L.; Shimizu, S.; Shimmin, C. O.; Shimojima, M.; Shiyakova, M.; Shmeleva, A.; Shoaleh Saadi, D.; Shochet, M. J.; Shojaii, S.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Sicho, P.; Sidebo, P. E.; Sidiropoulou, O.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silva, J.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simon, D.; Simon, M.; Sinervo, P.; Sinev, N. B.; Sioli, M.; Siragusa, G.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinner, M. B.; Skottowe, H. P.; Skubic, P.; Slater, M.; Slavicek, T.; Slawinska, M.; Sliwa, K.; Slovak, R.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. 
N.; Smirnova, O.; Smith, M. N. K.; Smith, R. W.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snidero, G.; Snyder, S.; Sobie, R.; Socher, F.; Soffer, A.; Soh, D. A.; Sokhrannyi, G.; Solans Sanchez, C. A.; Solar, M.; Soldatov, E. Yu.; Soldevila, U.; Solodkov, A. A.; Soloshenko, A.; Solovyanov, O. V.; Solovyev, V.; Sommer, P.; Son, H.; Song, H. Y.; Sood, A.; Sopczak, A.; Sopko, V.; Sorin, V.; Sosa, D.; Sotiropoulou, C. L.; Soualah, R.; Soukharev, A. M.; South, D.; Sowden, B. C.; Spagnolo, S.; Spalla, M.; Spangenberg, M.; Spanò, F.; Sperlich, D.; Spettel, F.; Spighi, R.; Spigo, G.; Spiller, L. A.; Spousta, M.; St. Denis, R. D.; Stabile, A.; Stahlman, J.; Stamen, R.; Stamm, S.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, G. H.; Stark, J.; Staroba, P.; Starovoitov, P.; Stärz, S.; Staszewski, R.; Steinberg, P.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoicea, G.; Stolte, P.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Stramaglia, M. E.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Strubig, A.; Stucci, S. A.; Stugu, B.; Styles, N. A.; Su, D.; Su, J.; Subramaniam, R.; Suchek, S.; Sugaya, Y.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, S.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, S.; Svatos, M.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Taccini, C.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tam, J. Y. C.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tannenwald, B. B.; Tapia Araya, S.; Tapprogge, S.; Tarem, S.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Tavares Delgado, A.; Tayalati, Y.; Taylor, A. C.; Taylor, G. N.; Taylor, P. T. 
E.; Taylor, W.; Teischinger, F. A.; Teixeira-Dias, P.; Temming, K. K.; Temple, D.; Ten Kate, H.; Teng, P. K.; Teoh, J. J.; Tepel, F.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Theveneaux-Pelzer, T.; Thomas, J. P.; Thomas-Wilsker, J.; Thompson, E. N.; Thompson, P. D.; Thompson, R. J.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Tibbetts, M. J.; Ticse Torres, R. E.; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tipton, P.; Tisserant, S.; Todome, K.; Todorov, T.; Todorova-Nova, S.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tolley, E.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Tong, B.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Trefzger, T.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Trischuk, W.; Trocmé, B.; Trofymov, A.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; Truong, L.; Trzebinski, M.; Trzupek, A.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsui, K. M.; Tsukerman, I. I.; Tsulaia, V.; Tsuno, S.; Tsybychev, D.; Tudorache, A.; Tudorache, V.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turgeman, D.; Turra, R.; Turvey, A. J.; Tuts, P. M.; Tyndel, M.; Ucchielli, G.; Ueda, I.; Ueno, R.; Ughetto, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Unverdorben, C.; Urban, J.; Urquijo, P.; Urrejola, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valderanis, C.; Valdes Santurio, E.; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Vallecorsa, S.; Valls Ferrer, J. A.; Van Den Wollenberg, W.; Van Der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; van Eldik, N.; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vanguri, R.; Vaniachine, A.; Vankov, P.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. 
E.; Vasquez, J. G.; Vazeille, F.; Vazquez Schroeder, T.; Veatch, J.; Veloce, L. M.; Veloso, F.; Veneziano, S.; Ventura, A.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Boeriu, O. E. Vickey; Viehhauser, G. H. A.; Viel, S.; Vigani, L.; Vigne, R.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Vittori, C.; Vivarelli, I.; Vlachos, S.; Vlasak, M.; Vogel, M.; Vokac, P.; Volpi, G.; Volpi, M.; von der Schmitt, H.; von Toerne, E.; Vorobel, V.; Vorobev, K.; Vos, M.; Voss, R.; Vossebeld, J. H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, P.; Wagner, W.; Wahlberg, H.; Wahrmund, S.; Wakabayashi, J.; Walder, J.; Walker, R.; Walkowiak, W.; Wallangen, V.; Wang, C.; Wang, C.; Wang, F.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, T.; Wang, X.; Wanotayaroj, C.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Washbrook, A.; Watkins, P. M.; Watson, A. T.; Watson, I. J.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weinert, B.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; Whallon, N. L.; Wharton, A. M.; White, A.; White, M. J.; White, R.; White, S.; Whiteson, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wildauer, A.; Wilk, F.; Wilkens, H. G.; Williams, H. H.; Williams, S.; Willis, C.; Willocq, S.; Wilson, J. A.; Wingerter-Seez, I.; Winklmeier, F.; Winston, O. J.; Winter, B. T.; Wittgen, M.; Wittkowski, J.; Wollstadt, S. J.; Wolter, M. W.; Wolters, H.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wu, M.; Wu, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wyatt, T. 
R.; Wynne, B. M.; Xella, S.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yakabe, R.; Yamaguchi, D.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, S.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, Y.; Yang, Z.; Yao, W.-M.; Yap, Y. C.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yeletskikh, I.; Yen, A. L.; Yildirim, E.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. R.; Yu, J.; Yu, J. M.; Yu, J.; Yuan, L.; Yuen, S. P. Y.; Yusuff, I.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zakharchuk, N.; Zalieckas, J.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zeng, J. C.; Zeng, Q.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zhang, D.; Zhang, F.; Zhang, G.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, R.; Zhang, R.; Zhang, X.; Zhang, Z.; Zhao, X.; Zhao, Y.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, C.; Zhou, L.; Zhou, L.; Zhou, M.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zhukov, K.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, S.; Zinonos, Z.; Zinser, M.; Ziolkowski, M.; Živković, L.; Zobernig, G.; Zoccoli, A.; zur Nedden, M.; Zurzolo, G.; Zwalinski, L.

    2016-05-01

    This paper discusses various observations on beam-induced and cosmic-ray backgrounds in the ATLAS detector during the LHC 2012 proton-proton run. Building on published results based on 2011 data, the correlations between background and residual pressure of the beam vacuum are revisited. Ghost charge evolution over 2012 and its role for backgrounds are evaluated. New methods to monitor ghost charge with beam-gas rates are presented and observations of LHC abort gap population by ghost charge are discussed in detail. Fake jets from colliding bunches and from ghost charge are analysed with improved methods, showing that ghost charge in individual radio-frequency buckets of the LHC can be resolved. Some results of two short periods of dedicated cosmic-ray background data-taking are shown; in particular cosmic-ray muon induced fake jet rates are compared to Monte Carlo simulations and to the fake jet rates from beam background. A thorough analysis of a particular LHC fill, where abnormally high background was observed, is presented. Correlations between backgrounds and beam intensity losses in special fills with very high β* are studied.

  2. Beam-induced and cosmic-ray backgrounds observed in the ATLAS detector during the LHC 2012 proton-proton running period

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2016-05-20

This paper discusses various observations on beam-induced and cosmic-ray backgrounds in the ATLAS detector during the LHC 2012 proton-proton run. Building on published results based on 2011 data, the correlations between background and residual pressure of the beam vacuum are revisited. Ghost charge evolution over 2012 and its role for backgrounds are evaluated. New methods to monitor ghost charge with beam-gas rates are presented and observations of LHC abort gap population by ghost charge are discussed in detail. Fake jets from colliding bunches and from ghost charge are analysed with improved methods, showing that ghost charge in individual radio-frequency buckets of the LHC can be resolved. Some results of two short periods of dedicated cosmic-ray background data-taking are shown; in particular cosmic-ray muon induced fake jet rates are compared to Monte Carlo simulations and to the fake jet rates from beam background. A thorough analysis of a particular LHC fill, where abnormally high background was observed, is presented. Correlations between backgrounds and beam intensity losses in special fills with very high β* are studied.

  3. The migration background in multicultural care settings – results of a qualitative and quantitative survey of elderly migrants from Turkey

    PubMed

    Krobisch, Verena; Sonntag, Pia-Theresa; Gül, Kübra; Aronson, Polina; Schenk, Liane

    2016-11-01

Background: Migration is associated with an increase in multicultural care settings. The acceptance of such care relationships from the users' point of view has rarely been explored. Aim: We examine whether and how elderly migrants from Turkey consider a shared migration background, or a shared socio-cultural background, of caregivers to be relevant. Method: In terms of data triangulation, the results of a qualitative study and a quantitative study on the care expectations of elderly migrants from Turkey were merged. Data were collected by means of guideline-based and standardised interviews. Analysis included the documentary method according to Bohnsack as well as descriptive and multivariate methods. Results: Cultural and migration-related aspects are considered relevant by the vast majority of respondents. Turkish language skills of caregivers are important to more than three-quarters. According to the qualitative results, the possibility of objective as well as culturally shaped intuitive communication in the mother tongue is crucial. Correspondingly, a low level of German language skills and a Turkish ethnic identity are associated with a need for migration-sensitive care. Conclusions: A shared socio-cultural background with caregivers and a common mother tongue appear to be prerequisites of good care for elderly migrants from Turkey. Further research should examine the conditions under which multicultural care settings are accepted in this group.

  4. Effect of background dielectric on TE-polarized photonic bandgap of metallodielectric photonic crystals using Dirichlet-to-Neumann map method.

    PubMed

    Sedghi, Aliasghar; Rezaei, Behrooz

    2016-11-20

    Using the Dirichlet-to-Neumann map method, we have calculated the photonic band structure of two-dimensional metallodielectric photonic crystals having the square and triangular lattices of circular metal rods in a dielectric background. We have selected the transverse electric mode of electromagnetic waves, and the resulting band structures showed the existence of photonic bandgap in these structures. We theoretically study the effect of background dielectric on the photonic bandgap.

  5. The location and recognition of anti-counterfeiting code image with complex background

    NASA Astrophysics Data System (ADS)

    Ni, Jing; Liu, Quan; Lou, Ping; Han, Ping

    2017-07-01

The order of the cigarette market is a key issue in the tobacco business system. The anti-counterfeiting code, as an effective anti-counterfeiting technology, can identify counterfeit goods and help maintain the normal order of the market and consumers' rights and interests. Anti-counterfeiting code images obtained by the tobacco recognizer suffer from complex backgrounds, light interference and other problems. To solve these problems, this paper proposes a locating method based on the Susan operator, combined with a sliding window and line scanning. To reduce the interference of background and noise, we extract the red component of the image and convert the color image into a gray image. For easily confused characters, recognition-result correction based on template matching is adopted to improve the recognition rate. With this method, the anti-counterfeiting code can be located and recognized correctly in images with complex backgrounds. The experimental results show the effectiveness and feasibility of the approach.

  6. Recursive least squares background prediction of univariate syndromic surveillance data

    PubMed Central

    2009-01-01

Background Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve the performance of outbreak detection by using a background forecasting algorithm based on the adaptive recursive least squares (RLS) method combined with a novel treatment of the day-of-the-week effect. Methods Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which the prediction and detection components of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. Results We present detection results in the form of receiver operating characteristic (ROC) curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. Conclusion The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies. This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm applied to the residuals of the RLS forecasts. We compare the detection performance of this algorithm to the W2 method recently implemented at CDC. The modified RLS method gives consistently better sensitivity at multiple background alert rates, and we recommend that it be considered for routine application in biosurveillance systems. PMID:19149886
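The adaptive RLS background estimator described above can be sketched in a few lines. The following is an illustrative, generic exponentially weighted RLS one-step predictor, not the authors' exact filter (the day-of-week transformation and the W2 comparison are omitted), and the `order`, `lam` and `delta` parameters are assumed illustrative choices:

```python
import numpy as np

def rls_predict(series, order=7, lam=0.98, delta=100.0):
    """One-step-ahead background prediction with recursive least squares.

    series : 1-D array of daily counts
    order  : number of past days used as regressors
    lam    : forgetting factor (lam < 1 adapts to non-stationary data)
    delta  : initial scale of the inverse correlation matrix
    Returns predictions aligned with `series` (NaN where there is
    insufficient history).
    """
    series = np.asarray(series, dtype=float)
    w = np.zeros(order)                   # filter weights
    P = np.eye(order) * delta             # inverse correlation matrix
    preds = np.full(len(series), np.nan)
    for t in range(order, len(series)):
        x = series[t - order:t][::-1]     # most recent sample first
        preds[t] = w @ x                  # background estimate for day t
        e = series[t] - preds[t]          # innovation
        k = P @ x / (lam + x @ P @ x)     # gain vector
        w = w + k * e                     # weight update
        P = (P - np.outer(k, x @ P)) / lam
    return preds
```

On a stationary series the predictions converge to the series mean; in a biosurveillance setting the residuals `series - preds` would feed the threshold detector.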

  7. Fluorescence background removal method for biological Raman spectroscopy based on empirical mode decomposition.

    PubMed

    Leon-Bejarano, Maritza; Dorantes-Mendez, Guadalupe; Ramirez-Elias, Miguel; Mendez, Martin O; Alba, Alfonso; Rodriguez-Leyva, Ildefonso; Jimenez, M

    2016-08-01

Raman spectroscopy of biological tissue presents a fluorescence background, an undesirable effect that generates false Raman intensities. This paper proposes the application of the Empirical Mode Decomposition (EMD) method to baseline correction. EMD is a suitable approach since it is an adaptive signal processing method for nonlinear and non-stationary signal analysis that does not require parameter selection, unlike polynomial methods. EMD performance was assessed on synthetic Raman spectra with different signal-to-noise ratios (SNRs). The correlation coefficient between the synthetic Raman spectra and those recovered after EMD denoising was higher than 0.92. Additionally, twenty Raman spectra from skin were used to evaluate EMD performance, and the results were compared with the Vancouver Raman algorithm (VRA). The comparison resulted in a mean square error (MSE) of 0.001554. The high correlation coefficient on synthetic spectra and the low MSE in the comparison between EMD and VRA suggest that EMD could be an effective method to remove the fluorescence background in biological Raman spectra.

  8. Urban Background Study Webinar

    EPA Pesticide Factsheets

    This webinar presented the methodology developed for collecting a city-wide or urban area background data set, general results of southeastern cities data collected to date, and a case study that used this sampling method.

  9. Limitations of the background field method applied to Rayleigh-Bénard convection

    NASA Astrophysics Data System (ADS)

    Nobili, Camilla; Otto, Felix

    2017-09-01

We consider Rayleigh-Bénard convection as modeled by the Boussinesq equations, in the case of infinite Prandtl number and with no-slip boundary conditions. There is a broad interest in bounds on the upward heat flux, as given by the Nusselt number Nu, in terms of the forcing via the imposed temperature difference, as given by the Rayleigh number, in the turbulent regime Ra ≫ 1. In several studies, the background field method applied to the temperature field has been used to provide upper bounds on Nu in terms of Ra. In these applications, the background field method comes in the form of a variational problem where one optimizes a stratified temperature profile subject to a certain stability condition; the method is believed to capture the marginal stability of the boundary layer. The best available upper bound via this method is Nu ≲ Ra^(1/3) (ln Ra)^(1/15); it proceeds via the construction of a stable temperature background profile that increases logarithmically in the bulk. In this paper, we show that the background temperature field method cannot provide a tighter upper bound in terms of the power of the logarithm. However, by another method one does obtain the tighter upper bound Nu ≲ Ra^(1/3) (ln ln Ra)^(1/3), so the result of this paper implies that the background temperature field method is unphysical in the sense that it cannot provide the optimal bound.

  10. Salient object detection based on discriminative boundary and multiple cues integration

    NASA Astrophysics Data System (ADS)

    Jiang, Qingzhu; Wu, Zemin; Tian, Chang; Liu, Tao; Zeng, Mingyong; Hu, Lei

    2016-01-01

In recent years, many saliency models have achieved good performance by taking the image boundary as the background prior. However, if all boundaries of an image are equally and artificially selected as background, misjudgment may happen when the object touches the boundary. We propose an algorithm called weighted contrast optimization based on discriminative boundary (wCODB). First, a background estimation model is reliably constructed by discriminating each boundary via the Hausdorff distance. Second, the background-only weighted contrast is improved by fore-background weighted contrast, which is optimized through a weight-adjustable optimization framework. Then, to objectively estimate the quality of a saliency map, a simple but effective metric called spatial distribution of saliency map and mean saliency in covered window ratio (MSR) is designed. Finally, in order to further promote the detection result using MSR as the weight, we propose a saliency fusion framework to integrate three other cues (uniqueness, distribution, and coherence) from three representative methods into our wCODB model. Extensive experiments on six public datasets demonstrate that our wCODB performs favorably against most boundary-based methods, and the integrated result outperforms all state-of-the-art methods.

  11. A method for Removing Surface Contamination on Ultra-pure Copper Spectrometer Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoppe, Eric W.; Seifert, Allen; Aalseth, Craig E.

Spectrometers for the lowest-level radiometric measurements require materials of extreme radiopurity. Measurements of rare nuclear decays, e.g. neutrinoless double-beta decay, can require construction and shielding materials with bulk radiopurity reaching one micro-becquerel per kilogram or less. When such extreme material purity is achieved, surface contamination, particularly solid daughters in the natural radon decay chains, can become the limiting background. High-purity copper is an important material for ultra-low-background spectrometers and thus is the focus of this work. A method for removing surface contamination at very low levels without attacking the bulk material is described. An assay method using a low-background proportional counter made of the material under examination is employed, and the resulting preliminary result of achievable surface contamination levels is presented.

  12. VNIR hyperspectral background characterization methods in adverse weather conditions

    NASA Astrophysics Data System (ADS)

    Romano, João M.; Rosario, Dalton; Roth, Luz

    2009-05-01

Hyperspectral technology is currently being used by the military to detect regions of interest where potential targets may be located. Weather variability, however, may affect the ability of an algorithm to discriminate possible targets from background clutter. Nonetheless, different background characterization approaches may facilitate the ability of an algorithm to discriminate potential targets over a variety of weather conditions. In a previous paper, we introduced a new autonomous, target-size-invariant background characterization process, the Autonomous Background Characterization (ABC), also known as the Parallel Random Sampling (PRS) method, which features a random sampling stage, a parallel process to mitigate the inclusion by chance of target samples into clutter background classes during random sampling, and a fusion of results at the end. In this paper, we demonstrate how different background characterization approaches are able to improve the performance of algorithms over a variety of challenging weather conditions. Using the Mahalanobis distance as the standard algorithm for this study, we compare the performance of different characterization methods, namely global information, two-stage global information, and our proposed method, ABC, using data collected under a variety of adverse weather conditions. For this study, we used ARDEC's Hyperspectral VNIR Adverse Weather data collection, comprising heavy, light, and transitional fog; light and heavy rain; and low-light conditions.

  13. Effects of placement point of background music on shopping website.

    PubMed

    Lai, Chien-Jung; Chiang, Chia-Chi

    2012-01-01

Consumer on-line behaviors are more important than ever due to the rapid growth of on-line shopping. The purposes of this study were to design placement methods for background music on a shopping website and examine the effects on browsers' emotional and cognitive responses. Three placement points of background music during browsing, i.e. 2 min, 4 min, and 6 min from the start of browsing, were considered as entry points. Browsing without music (no music) and browsing with constant music volume (full music) were treated as control conditions. Participants' emotional state, approach-avoidance behavior intention, and actions to adjust the music volume were collected. Results showed that participants had higher levels of pleasure, arousal, and approach behavior intention for the three placement points than for no music and full music. Most of the participants in the full-music condition (5/6) adjusted the background music, while only 16.7% (3/18) of the participants in the other conditions turned off the background music. The results indicate that playing background music after the start of browsing benefits the on-line shopping atmosphere, and that it is inappropriate to place background music at the very start of browsing a shopping website. Marketers must therefore manipulate the placement of background music in a web store carefully.

  14. Assessment of ambient background concentrations of elements in soil using combined survey and open-source data.

    PubMed

    Mikkonen, Hannah G; Clarke, Bradley O; Dasika, Raghava; Wallis, Christian J; Reichman, Suzie M

    2017-02-15

    Understanding ambient background concentrations in soil, at a local scale, is an essential part of environmental risk assessment. Where high resolution geochemical soil surveys have not been undertaken, soil data from alternative sources, such as environmental site assessment reports, can be used to support an understanding of ambient background conditions. Concentrations of metals/metalloids (As, Mn, Ni, Pb and Zn) were extracted from open-source environmental site assessment reports, for soils derived from the Newer Volcanics basalt, of Melbourne, Victoria, Australia. A manual screening method was applied to remove samples that were indicated to be contaminated by point sources and hence not representative of ambient background conditions. The manual screening approach was validated by comparison to data from a targeted background soil survey. Statistical methods for exclusion of contaminated samples from background soil datasets were compared to the manual screening method. The statistical methods tested included the Median plus Two Median Absolute Deviations, the upper whisker of a normal and log transformed Tukey boxplot, the point of inflection on a cumulative frequency plot and the 95th percentile. We have demonstrated that where anomalous sample results cannot be screened using site information, the Median plus Two Median Absolute Deviations is a conservative method for derivation of ambient background upper concentration limits (i.e. expected maximums). The upper whisker of a boxplot and the point of inflection on a cumulative frequency plot, were also considered adequate methods for deriving ambient background upper concentration limits, where the percentage of contaminated samples is <25%. 
Median ambient background concentrations of metals/metalloids in the Newer Volcanic soils of Melbourne were comparable to ambient background concentrations in Europe and the United States, except for Ni, which was naturally enriched in the basalt-derived soils of Melbourne. Copyright © 2016 Elsevier B.V. All rights reserved.
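The screening statistics compared above are easy to reproduce. The sketch below implements the Median plus Two Median Absolute Deviations upper limit, together with a Tukey upper whisker for comparison; it assumes only a one-dimensional array of soil concentrations and is not the authors' exact workflow:

```python
import numpy as np

def mad_upper_limit(conc, n_mads=2.0):
    """Ambient-background upper concentration limit: median + k * MAD."""
    conc = np.asarray(conc, dtype=float)
    med = np.median(conc)
    mad = np.median(np.abs(conc - med))   # median absolute deviation
    return med + n_mads * mad

def tukey_upper_whisker(conc):
    """Largest observation inside Q3 + 1.5 * IQR (boxplot upper whisker)."""
    conc = np.asarray(conc, dtype=float)
    q1, q3 = np.percentile(conc, [25, 75])
    fence = q3 + 1.5 * (q3 - q1)
    return conc[conc <= fence].max()
```

Because both statistics are driven by the central mass of the data, a small fraction of point-source-contaminated samples barely moves the derived limit, which is the property the paper exploits.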

  15. Neural networks for Higgs physics

    NASA Astrophysics Data System (ADS)

    Tentindo-Repond, Silvia; Bhat, Pushpalatha C.; Prosper, Harrison B.

    2001-08-01

    The main application of neural networks (NN) in Higgs physics so far has been to optimize the signal over background ratio. The positive result obtained imply that the use of NN will lead to a big reduction in the integrated luminosity required for the discovery of the Higgs in RunII. Neural Networks have also been recently used in Higgs physics to set up tagging algorithms to identify the heavy flavor content of jets. Whereas in the previous studies the NN b-tagging methods used are channel-independent, a channel-dependent method has been used in the present work. The signal pp¯→WH→lνbb¯ has been studied against the dominant background pp¯→Wbb¯, in an attempt to improve the signal over background ratio by trying to push the invariant mass of the background events further away from the signal. This result would get the equivalent effect of an improved mass resolution.

  16. Simple automatic strategy for background drift correction in chromatographic data analysis.

    PubMed

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-03

    Chromatographic background drift correction, which influences peak detection and time shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers, which belong to the chromatographic peaks, in this vector, and update the outliers in the baseline until convergence. The optimized baseline vector was finally expanded into the original chromatogram, and linear interpolation was employed to estimate background drift in the chromatogram. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight samples used in the metabolic study of Escherichia coli samples. The proposed method was comparable with three classical techniques: morphological weighted penalized least squares, moving window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost automatic implementation of background drift correction, which is convenient for practical use. Copyright © 2016 Elsevier B.V. All rights reserved.
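The pipeline described above (local minima collected into a baseline vector, iterative outlier rejection, then linear interpolation back to the full chromatogram) can be sketched as follows. The sliding-median outlier test is an assumed stand-in for the paper's unspecified iterative optimization, and all thresholds are illustrative:

```python
import numpy as np

def correct_background_drift(y, n_iter=50, k=3.0):
    """Sketch of baseline drift correction for a 1-D chromatogram."""
    y = np.asarray(y, dtype=float)
    # 1) local minima (plus the endpoints) as candidate baseline anchors
    interior = np.where((y[1:-1] < y[:-2]) & (y[1:-1] <= y[2:]))[0] + 1
    idx = np.unique(np.concatenate(([0], interior, [len(y) - 1])))
    base = y[idx].copy()
    # 2) iteratively pull down anchors that still sit on peak flanks
    for _ in range(n_iter):
        padded = np.pad(base, 2, mode="edge")
        windows = np.lib.stride_tricks.sliding_window_view(padded, 5)
        local_med = np.median(windows, axis=1)
        sigma = np.std(base - local_med)
        high = base > local_med + k * sigma
        if sigma == 0 or not high.any():
            break                         # converged: no outlier anchors left
        base[high] = local_med[high]
    # 3) expand the optimized baseline vector to the full chromatogram
    baseline = np.interp(np.arange(len(y)), idx, base)
    return y - baseline, baseline
```

On a synthetic chromatogram with a linear drift and one Gaussian peak, the recovered baseline tracks the drift while leaving the peak intact.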

  17. Evaluation of methods to reduce background using the Python-based ELISA_QC program.

    PubMed

    Webster, Rose P; Cohen, Cinder F; Saeed, Fatima O; Wetzel, Hanna N; Ball, William J; Kirley, Terence L; Norman, Andrew B

    2018-05-01

Almost all immunological approaches [immunohistochemistry, enzyme-linked immunosorbent assay (ELISA), Western blot] that are used to quantitate specific proteins have had to address high backgrounds due to non-specific reactivity. We report here for the first time a quantitative comparison of methods for reducing the background of commercial biotinylated antibodies using the Python-based ELISA_QC program. This is demonstrated using a recombinant humanized anti-cocaine monoclonal antibody. Several approaches, such as adjustment of the incubation time and the concentration of blocking agent, as well as the dilution of secondary antibodies, have been explored to address this issue. In this report, systematic comparisons of two different methods, contrasted with other more traditional methods to address this problem, are provided. Addition of heparin (HP) at 1 μg/ml to the wash buffer prior to addition of the secondary biotinylated antibody reduced the elevated background absorbance values (from a mean of 0.313 ± 0.015 to 0.137 ± 0.002). A novel immunodepletion (ID) method also reduced the background (from a mean of 0.331 ± 0.010 to 0.146 ± 0.013). Overall, the ID method generated results at each concentration of the ELISA standard curve more similar to those obtained with the standard lot 1 than the HP method did, as analyzed by the Python-based ELISA_QC program. We conclude that the ID method, while more laborious, provides the best solution to resolve the high background seen with specific lots of biotinylated secondary antibody. Copyright © 2018. Published by Elsevier B.V.

  18. An Effective Method for Modeling Two-dimensional Sky Background of LAMOST

    NASA Astrophysics Data System (ADS)

    Haerken, Hasitieer; Duan, Fuqing; Zhang, Jiannan; Guo, Ping

    2017-06-01

Each CCD of LAMOST accommodates 250 spectra, of which about 40 are used to observe the sky background during real observations. The problem we solve is how to estimate the unknown sky background hidden in the 210 observed celestial spectra by using the 40 known sky spectra. To model the sky background, a pre-observation is usually performed with all fibers observing the sky. We use the observed 250 skylight spectra as training data, where those observed by the 40 fibers are considered as a base vector set. The Locality-constrained Linear Coding (LLC) technique is utilized to represent the skylight spectra observed by the 210 fibers with the base vector set. We also segment each spectrum into small parts and establish a local sky background model for each part. Experimental results validate the proposed method and show that the local model is better than the global model.
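The LLC coding step at the heart of the method above can be sketched as a k-nearest-neighbor selection followed by a sum-to-one regularized least-squares solve, which is the standard analytic form of LLC. The sketch below assumes each spectrum is a 1-D flux vector and omits the per-segment modeling and all LAMOST specifics:

```python
import numpy as np

def llc_sky_estimate(base_spectra, obs_spectrum, k=5, reg=1e-6):
    """Represent a spectrum on its k nearest sky-fiber base spectra (LLC-style)."""
    B = np.asarray(base_spectra, dtype=float)   # (n_base, n_pix) sky spectra
    x = np.asarray(obs_spectrum, dtype=float)   # (n_pix,) target spectrum
    # locality: keep only the k base spectra closest to the observation
    d = np.linalg.norm(B - x, axis=1)
    nn = np.argsort(d)[:k]
    Bk = B[nn]
    # solve min ||x - c @ Bk||^2 subject to sum(c) = 1 (shift-invariant coding)
    z = Bk - x                                  # basis shifted to the observation
    C = z @ z.T + reg * np.eye(k)               # regularized local covariance
    c = np.linalg.solve(C, np.ones(k))
    c /= c.sum()                                # enforce the sum-to-one constraint
    return c @ Bk, nn, c
```

A spectrum that is an exact mixture of two base spectra is recovered almost perfectly, since the constrained solution can place all weight on those two neighbors.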

  19. Use of an OSSE to Evaluate Background Error Covariances Estimated by the 'NMC Method'

    NASA Technical Reports Server (NTRS)

    Errico, Ronald M.; Prive, Nikki C.; Gu, Wei

    2014-01-01

The NMC method has proven utility for prescribing approximate background-error covariances required by variational data assimilation systems. Here, untuned NMC-method estimates are compared with explicitly determined error covariances produced within an OSSE context by exploiting availability of the true simulated states. Such a comparison provides insights into what kind of rescaling is required to render the NMC-method estimates usable. It is shown that rescaling of variances and directional correlation lengths depends greatly on both pressure and latitude. In particular, some scaling coefficients appropriate in the Tropics are the reciprocal of those in the Extratropics. Also, the degree of dynamic balance is grossly overestimated by the NMC method. These results agree with previous examinations of the NMC method which used ensembles as an alternative for estimating background-error statistics.
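The core of the NMC method, treating differences between forecasts of different lead times valid at the same time as a proxy for background error, reduces to a sample covariance. A minimal sketch (hypothetical array shapes, with none of the variance or length-scale rescaling that the study shows is necessary) is:

```python
import numpy as np

def nmc_covariance(f48, f24):
    """NMC-method proxy for the background-error covariance matrix.

    f48, f24 : arrays of shape (n_times, n_state) holding 48-h and 24-h
               forecasts valid at the same verification times.
    Returns the (n_state, n_state) sample covariance of their differences.
    """
    d = np.asarray(f48, dtype=float) - np.asarray(f24, dtype=float)
    d = d - d.mean(axis=0)                 # remove the mean forecast difference
    return d.T @ d / (d.shape[0] - 1)      # unbiased sample covariance
```

In an OSSE, this estimate can be compared component by component against the covariance of true background errors, which is exactly the comparison the abstract describes.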

  20. Comparison of the results of refractometric measurements in the process of diffusion, obtained by means of the background-oriented schlieren method and the holographic interferometry method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraiskii, A V; Mironova, T V

    2015-08-31

    The results of the study of interdiffusion of two liquids, obtained using a holographic recording scheme with a nonstationary reference wave whose frequency varies linearly in space and time, are compared with the results of correlation processing of digital photographs made with a random background screen. The spatio-temporal behaviour of the signal in four basic representations ('space – temporal frequency', 'space – time', 'spatial frequency – temporal frequency' and 'spatial frequency – time') is found in the holographic experiment and calculated (in the appropriate coordinates) based on the background-oriented schlieren method. Practical coincidence of the results of the correlation analysis and the holographic double-exposure interferometry is demonstrated.

  1. A New Moving Object Detection Method Based on Frame-difference and Background Subtraction

    NASA Astrophysics Data System (ADS)

    Guo, Jiajia; Wang, Junping; Bai, Ruixue; Zhang, Yao; Li, Yong

    2017-09-01

    Although many moving object detection methods have been proposed, moving object extraction remains the core task in video surveillance. In complex real-world scenes, false detections, missed detections, and cavities inside the detected body still occur. To address incomplete detection of moving objects, this paper proposes a new moving object detection method that combines an improved frame difference with Gaussian mixture background subtraction. To make detection more complete and accurate, image repair and morphological processing techniques, which provide spatial compensation, are applied in the proposed method. Experimental results show that our method effectively eliminates ghosts and noise and fills the cavities of the moving object. Compared with four other moving object detection methods (GMM, ViBe, frame difference, and a method from the literature), the proposed method improves both the efficiency and the accuracy of detection.
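
    The fusion of frame differencing with a background model can be sketched as follows; a running-average background stands in for the paper's Gaussian mixture, and the OR-fusion and update rule are a generic rendering of the idea, not the authors' exact algorithm:

```python
import numpy as np

def detect_moving(frames, alpha=0.05, tau=25):
    """Fuse frame differencing with a running-average background model.
    (A Gaussian-mixture background, as in the paper, would replace bg.)"""
    bg = frames[0].astype(float)
    masks = []
    for prev, cur in zip(frames, frames[1:]):
        fd = np.abs(cur.astype(float) - prev) > tau  # frame difference
        bs = np.abs(cur.astype(float) - bg) > tau    # background subtraction
        masks.append(fd | bs)                        # fusion compensates each method's holes
        bg = (1 - alpha) * bg + alpha * cur          # slowly update the background
    return masks

# toy demo: a bright 4x4 block moving across a static noisy background
rng = np.random.default_rng(1)
frames = []
for t in range(5):
    f = rng.integers(0, 10, (32, 32))
    f[10:14, 4 + 5 * t: 8 + 5 * t] = 200
    frames.append(f)
masks = detect_moving(frames)
print(int(masks[-1].sum()))  # pixels flagged as moving in the last frame
```

    In practice the fused mask would then be cleaned by the morphological processing and image repair steps the abstract mentions.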

  2. Enhanced identification and biological validation of differential gene expression via Illumina whole-genome expression arrays through the use of the model-based background correction methodology

    PubMed Central

    Ding, Liang-Hao; Xie, Yang; Park, Seongmi; Xiao, Guanghua; Story, Michael D.

    2008-01-01

    Despite the tremendous growth of microarray usage in scientific studies, there is a lack of standards for background correction methodologies, especially in single-color microarray platforms. Traditional background subtraction methods often generate negative signals and thus cause large amounts of data loss. Hence, some researchers prefer to avoid background corrections, which typically result in the underestimation of differential expression. Here, by utilizing nonspecific negative control features integrated into Illumina whole genome expression arrays, we have developed a method of model-based background correction for BeadArrays (MBCB). We compared the MBCB with a method adapted from the Affymetrix robust multi-array analysis algorithm and with no background subtraction, using a mouse acute myeloid leukemia (AML) dataset. We demonstrated that differential expression ratios obtained by using the MBCB had the best correlation with quantitative RT–PCR. MBCB also achieved better sensitivity in detecting differentially expressed genes with biological significance. For example, we demonstrated that the differential regulation of Tnfr2, Ikk and NF-kappaB, the death receptor pathway, in the AML samples, could only be detected by using data after MBCB implementation. We conclude that MBCB is a robust background correction method that will lead to more precise determination of gene expression and better biological interpretation of Illumina BeadArray data. PMID:18450815
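
    The convolution model behind this kind of correction (exponential signal plus normal background, with the background characterized by negative-control beads) can be illustrated with a normexp-style conditional expectation; the parameter estimates below use simple method-of-moments rather than the paper's model-based fitting, and all numbers are synthetic:

```python
import math
import random

def normexp_correct(x, mu, sigma, alpha):
    """E[signal | observed] under observed = Exp(alpha) signal + N(mu, sigma) background.
    The result is always positive, avoiding the data loss of plain subtraction."""
    mu_sf = x - mu - sigma * sigma / alpha
    z = mu_sf / sigma
    pdf0 = math.exp(-0.5 * z * z) / (sigma * math.sqrt(2 * math.pi))
    cdf = 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return mu_sf + sigma * sigma * pdf0 / max(cdf, 1e-300)

random.seed(2)
# negative-control beads measure background only
controls = [random.gauss(100, 10) for _ in range(5000)]
mu = sum(controls) / len(controls)
sigma = (sum((c - mu) ** 2 for c in controls) / len(controls)) ** 0.5
# regular probes: true exponential signal plus the same background
signals = [random.expovariate(1 / 400) for _ in range(5000)]
observed = [s + random.gauss(100, 10) for s in signals]
alpha = sum(observed) / len(observed) - mu  # method-of-moments signal mean
corrected = [normexp_correct(x, mu, sigma, alpha) for x in observed]
print(min(corrected), sum(corrected) / len(corrected))
```

    Unlike naive subtraction, no corrected intensity goes negative, which is the property the abstract highlights.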

  3. Complex background suppression using global-local registration strategy for the detection of small-moving target on moving platform

    NASA Astrophysics Data System (ADS)

    Zou, Tianhao; Zuo, Zhengrong

    2018-02-01

    Target detection is a fundamental problem in computer vision and image processing. The case most often met in the real world is detecting a small moving target from a moving platform. Commonly used methods, such as registration-based suppression, can hardly achieve the desired result. To address this, we introduce a global-local registration based suppression method. Unlike traditional approaches, the proposed global-local registration strategy considers both the global consistency and the local diversity of the background, achieving better performance than standard background suppression methods. In this paper, we first discuss the characteristics of small moving target detection on an unstable platform. We then introduce the new strategy and conduct an experiment to confirm its robustness to noise. Finally, we confirm that the background suppression method based on the global-local registration strategy performs better for moving target detection on a moving platform.

  4. Alcohol and cannabis use among adolescents in Flemish secondary school in Brussels: effects of type of education

    PubMed Central

    2012-01-01

    Background Research regarding socio-economic differences in alcohol and drug use in adolescence yields mixed results. This study hypothesizes that (1) when using education type as a proxy of one's social status, clear differences will exist between students from different types of education, regardless of students' familial socio-economic background; (2) and that the effects of education type differ according to their cultural background. Methods Data from the Brussels youth monitor were used, a school survey administered among 1,488 adolescents from the 3rd to 6th year of Flemish secondary education. Data were analyzed using multilevel logistic regression models. Results Controlling for their familial background, the results show that native students in lower educational tracks use alcohol and cannabis more often than students in upper educational tracks. Such a relationship was not found for students from another ethnic background. Conclusion Results from this study indicate that research into health risks should take into account both adolescents' familial background and individual social position as different components of youngsters' socio-economic background. PMID:22433291

  5. Spectral anomaly methods for aerial detection using KUT nuisance rejection

    NASA Astrophysics Data System (ADS)

    Detwiler, R. S.; Pfund, D. M.; Myjak, M. J.; Kulisek, J. A.; Seifert, C. E.

    2015-06-01

    This work discusses the application and optimization of a spectral anomaly method for the real-time detection of gamma radiation sources from an aerial helicopter platform. Aerial detection presents several key challenges over ground-based detection. For one, larger and more rapid background fluctuations are typical due to higher speeds, larger field of view, and geographically induced background changes. As well, the possible large altitude or stand-off distance variations cause significant steps in background count rate as well as spectral changes due to increased gamma-ray scatter with detection at higher altitudes. The work here details the adaptation and optimization of the PNNL-developed algorithm Nuisance-Rejecting Spectral Comparison Ratios for Anomaly Detection (NSCRAD), a spectral anomaly method previously developed for ground-based applications, for an aerial platform. The algorithm has been optimized for two multi-detector systems; a NaI(Tl)-detector-based system and a CsI detector array. The optimization here details the adaptation of the spectral windows for a particular set of target sources to aerial detection and the tailoring for the specific detectors. As well, the methodology and results for background rejection methods optimized for the aerial gamma-ray detection using Potassium, Uranium and Thorium (KUT) nuisance rejection are shown. Results indicate that use of a realistic KUT nuisance rejection may eliminate metric rises due to background magnitude and spectral steps encountered in aerial detection due to altitude changes and geographically induced steps such as at land-water interfaces.
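
    The spectral comparison ratios at the heart of NSCRAD can be sketched as follows; the window fractions and counts are illustrative, noise-free values (real measurements would add Poisson fluctuations around them):

```python
import numpy as np

def spectral_comparison_ratios(counts, bg_shape):
    """Pairwise spectral comparison ratios: window i minus window j scaled
    by the expected background ratio; all near zero when the measured
    spectrum matches the background shape, nonzero when a source distorts it."""
    n = len(counts)
    return np.array([counts[i] - bg_shape[i] / bg_shape[j] * counts[j]
                     for i in range(n) for j in range(i + 1, n)])

bg_shape = np.array([0.50, 0.30, 0.15, 0.05])      # nominal background fractions per window
bg_counts = 1000 * bg_shape                        # noise-free background measurement
src_counts = bg_counts + np.array([0, 120, 0, 0])  # a source adds counts to window 1
quiet = spectral_comparison_ratios(bg_counts, bg_shape)
loud = spectral_comparison_ratios(src_counts, bg_shape)
print(quiet, loud)
```

    Because the ratios cancel any overall background magnitude, altitude-driven count-rate steps shift all windows together and leave the ratios unchanged; the KUT nuisance rejection extends the same idea to changes in background shape.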

  6. A salient region detection model combining background distribution measure for indoor robots.

    PubMed

    Li, Na; Xu, Hui; Wang, Zhenhua; Sun, Lining; Chen, Guodong

    2017-01-01

    Vision systems play an important role in indoor robotics. Saliency detection methods, which capture regions perceived as important, are used to improve the performance of visual perception systems. Most state-of-the-art saliency detection methods, while performing outstandingly on natural images, fail in complicated indoor environments. We therefore propose a new method comprising graph-based RGB-D segmentation, a primary saliency measure, a background distribution measure, and their combination. In addition, region roundness is proposed to describe the compactness of a region so that background distribution can be measured more robustly. To validate the proposed approach, eleven influential methods are compared on the DSD and ECSSD datasets. Moreover, we build a mobile robot platform for testing in an actual environment and design three experimental conditions: different viewpoints, illumination variations, and partial occlusions. Experimental results demonstrate that our model outperforms existing methods and is useful for indoor mobile robots.

  7. Minimizing effects of methodological decisions on interpretation and prediction in species distribution studies: An example with background selection

    USGS Publications Warehouse

    Jarnevich, Catherine S.; Talbert, Marian; Morisette, Jeffrey T.; Aldridge, Cameron L.; Brown, Cynthia; Kumar, Sunil; Manier, Daniel; Talbert, Colin; Holcombe, Tracy R.

    2017-01-01

    Evaluating the conditions where a species can persist is an important question in ecology both to understand tolerances of organisms and to predict distributions across landscapes. Presence data combined with background or pseudo-absence locations are commonly used with species distribution modeling to develop these relationships. However, there is not a standard method to generate background or pseudo-absence locations, and method choice affects model outcomes. We evaluated combinations of both model algorithms (simple and complex generalized linear models, multivariate adaptive regression splines, Maxent, boosted regression trees, and random forest) and background methods (random, minimum convex polygon, and continuous and binary kernel density estimator (KDE)) to assess the sensitivity of model outcomes to choices made. We evaluated six questions related to model results, including five beyond the common comparison of model accuracy assessment metrics (biological interpretability of response curves, cross-validation robustness, independent data accuracy and robustness, and prediction consistency). For our case study with cheatgrass in the western US, random forest was least sensitive to background choice and the binary KDE method was least sensitive to model algorithm choice. While this outcome may not hold for other locations or species, the methods we used can be implemented to help determine appropriate methodologies for particular research questions.

  8. Red lesion detection using background estimation and lesions characteristics in diabetic retinal image

    NASA Astrophysics Data System (ADS)

    Zhang, Dongbo; Peng, Yinghui; Yi, Yao; Shang, Xingyu

    2013-10-01

    Detection of red lesions [hemorrhages (HRs) and microaneurysms (MAs)] is crucial for the diagnosis of early diabetic retinopathy. A method based on background estimation and adapted to the specific characteristics of HRs and MAs is proposed. Candidate red lesions are located by background estimation and a Mahalanobis distance measure, and then adaptive postprocessing techniques, including vessel detection, non-vessel exclusion based on shape analysis, and noise-point exclusion by a double-ring filter (used only for MA detection), are applied to remove non-lesion pixels. The method is evaluated on our collected image dataset, and experimental results show that it is better than or comparable to previous approaches. It effectively reduces the false positives and false negatives that arise from incomplete and inaccurate vessel structure.
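
    The candidate-location step can be sketched as a distance test against the estimated background; here the background is handed to the function directly for brevity (the paper estimates it from the image), and in this single-channel toy the Mahalanobis distance reduces to a z-score:

```python
import numpy as np

def candidate_lesions(img, bg, thresh=3.0):
    """Flag pixels whose deviation from the estimated background exceeds a
    Mahalanobis-style distance threshold (one channel, so just a z-score)."""
    resid = img.astype(float) - bg
    sigma = resid.std()
    return np.abs(resid) / sigma > thresh

# toy fundus-like image: smooth background plus two small dark "lesions"
y, x = np.mgrid[0:64, 0:64]
bg = 120 + 20 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 800)
rng = np.random.default_rng(4)
img = bg + rng.normal(0, 2, bg.shape)
img[10:13, 10:13] -= 40  # lesion 1 (3x3)
img[40:42, 50:52] -= 40  # lesion 2 (2x2)
mask = candidate_lesions(img, bg)
print(int(mask.sum()))   # candidate pixels before postprocessing
```

    The resulting candidates would then pass through the vessel-exclusion and double-ring-filter stages described above.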

  9. Research on the generation of the background with sea and sky in infrared scene

    NASA Astrophysics Data System (ADS)

    Dong, Yan-zhi; Han, Yan-li; Lou, Shu-li

    2008-03-01

    In the simulation of anti-ship infrared imaging guidance, it is important for scene generation to preserve the texture of infrared images. We studied the fractal method and applied it to infrared scene generation, adopting horizontal-vertical (HV) partitioning to encode the original image. Based on the properties of infrared images with sea-sky backgrounds, we used a Local Iterated Function System (LIFS) to reduce computational complexity and increase the processing rate. Results show that the fractal method preserves the texture of the infrared image well and can be widely used for infrared scene generation in the future.

  10. Seasonal changes in background levels of deuterium and oxygen-18 prove water drinking by harp seals, which affects the use of the doubly labelled water method.

    PubMed

    Nordøy, Erling S; Lager, Anne R; Schots, Pauke C

    2017-12-01

    The aim of this study was to monitor seasonal changes in stable isotopes of pool freshwater and harp seal ( Phoca groenlandica ) body water, and to study whether these potential seasonal changes might bias results obtained using the doubly labelled water (DLW) method when measuring energy expenditure in animals with access to freshwater. Seasonal changes in the background levels of deuterium and oxygen-18 in the body water of four captive harp seals and in the freshwater pool in which they were kept were measured over a time period of 1 year. The seals were offered daily amounts of capelin and kept under a seasonal photoperiod of 69°N. Large seasonal variations of deuterium and oxygen-18 in the pool water were measured, and the isotope abundance in the body water showed similar seasonal changes to the pool water. This shows that the seals were continuously equilibrating with the surrounding water as a result of significant daily water drinking. Variations in background levels of deuterium and oxygen-18 in freshwater sources may be due to seasonal changes in physical processes such as precipitation and evaporation that cause fractionation of isotopes. Rapid and abrupt changes in the background levels of deuterium and oxygen-18 may complicate calculation of energy expenditure by use of the DLW method. It is therefore strongly recommended that analysis of seasonal changes in background levels of isotopes is performed before the DLW method is applied on (free-ranging) animals, and to use a control group in order to correct for changes in background levels. © 2017. Published by The Company of Biologists Ltd.

  11. Using ontologies to model human navigation behavior in information networks: A study based on Wikipedia.

    PubMed

    Lamprecht, Daniel; Strohmaier, Markus; Helic, Denis; Nyulas, Csongor; Tudorache, Tania; Noy, Natalya F; Musen, Mark A

    The need to examine the behavior of different user groups is a fundamental requirement when building information systems. In this paper, we present Ontology-based Decentralized Search (OBDS), a novel method to model the navigation behavior of users equipped with different types of background knowledge. Ontology-based Decentralized Search combines decentralized search, an established method for navigation in social networks, and ontologies to model navigation behavior in information networks. The method uses ontologies as an explicit representation of background knowledge to inform the navigation process and guide it towards navigation targets. By using different ontologies, users equipped with different types of background knowledge can be represented. We demonstrate our method using four biomedical ontologies and their associated Wikipedia articles. We compare our simulation results with baseline approaches and with results obtained from a user study. We find that our method produces click paths that have properties similar to those originating from human navigators. The results suggest that our method can be used to model human navigation behavior in systems that are based on information networks, such as Wikipedia. This paper makes the following contributions: (i) To the best of our knowledge, this is the first work to demonstrate the utility of ontologies in modeling human navigation and (ii) it yields new insights and understanding about the mechanisms of human navigation in information networks.
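
    Decentralized search guided by background knowledge can be sketched on a toy network; the concept hierarchy and link graph below are invented stand-ins for an ontology and Wikipedia's link structure:

```python
# Greedy decentralized search: at each hop, follow the neighbour that the
# background knowledge (a toy concept hierarchy standing in for an ontology)
# judges closest to the target.

ontology_depth = {  # distance of each article's concept from the target concept
    "Medicine": 3, "Cardiology": 2, "Heart": 1, "Myocarditis": 0,
    "History": 4, "Sports": 4,
}
links = {           # toy information network (e.g., article hyperlinks)
    "Medicine": ["History", "Cardiology", "Sports"],
    "Cardiology": ["Medicine", "Heart"],
    "Heart": ["Cardiology", "Myocarditis", "Sports"],
    "History": ["Medicine"],
    "Sports": ["Heart"],
    "Myocarditis": ["Heart"],
}

def decentralized_search(start, target, max_hops=10):
    path, node = [start], start
    for _ in range(max_hops):
        if node == target:
            return path
        # pick the neighbour the ontology considers closest to the target
        node = min(links[node], key=lambda n: ontology_depth.get(n, 99))
        path.append(node)
    return path

print(decentralized_search("Medicine", "Myocarditis"))
```

    Swapping in a different `ontology_depth` table models a navigator with different background knowledge, which is the mechanism OBDS exploits.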

  12. Using ontologies to model human navigation behavior in information networks: A study based on Wikipedia

    PubMed Central

    Lamprecht, Daniel; Strohmaier, Markus; Helic, Denis; Nyulas, Csongor; Tudorache, Tania; Noy, Natalya F.; Musen, Mark A.

    2015-01-01

    The need to examine the behavior of different user groups is a fundamental requirement when building information systems. In this paper, we present Ontology-based Decentralized Search (OBDS), a novel method to model the navigation behavior of users equipped with different types of background knowledge. Ontology-based Decentralized Search combines decentralized search, an established method for navigation in social networks, and ontologies to model navigation behavior in information networks. The method uses ontologies as an explicit representation of background knowledge to inform the navigation process and guide it towards navigation targets. By using different ontologies, users equipped with different types of background knowledge can be represented. We demonstrate our method using four biomedical ontologies and their associated Wikipedia articles. We compare our simulation results with baseline approaches and with results obtained from a user study. We find that our method produces click paths that have properties similar to those originating from human navigators. The results suggest that our method can be used to model human navigation behavior in systems that are based on information networks, such as Wikipedia. This paper makes the following contributions: (i) To the best of our knowledge, this is the first work to demonstrate the utility of ontologies in modeling human navigation and (ii) it yields new insights and understanding about the mechanisms of human navigation in information networks. PMID:26568745

  13. An Improved Text Localization Method for Natural Scene Images

    NASA Astrophysics Data System (ADS)

    Jiang, Mengdi; Cheng, Jianghua; Chen, Minghui; Ku, Xishu

    2018-01-01

    To extract text information effectively from natural scene images with complex backgrounds, multi-orientation perspective, and multiple languages, we present a new method based on an improved Stroke Feature Transform (SWT). First, the Maximally Stable Extremal Region (MSER) method is used to detect candidate text regions. Second, the SWT algorithm is applied within the candidate regions, which improves edge detection compared with the traditional SWT method. Finally, Frequency-tuned (FT) visual saliency is introduced to remove non-text candidate regions. Experimental results show that the method achieves good robustness to complex backgrounds with multi-orientation perspective and varied characters and font sizes.

  14. Background recovery via motion-based robust principal component analysis with matrix factorization

    NASA Astrophysics Data System (ADS)

    Pan, Peng; Wang, Yongli; Zhou, Mingyuan; Sun, Zhipeng; He, Guoping

    2018-03-01

    Background recovery is a key technique in video analysis, but it still suffers from many challenges, such as camouflage, lighting changes, and diverse types of image noise. Robust principal component analysis (RPCA), which aims to recover a low-rank matrix and a sparse matrix, is a general framework for background recovery. The nuclear norm is widely used as a convex surrogate for the rank function in RPCA, which requires computing the singular value decomposition (SVD), a task that is increasingly costly as matrix sizes and ranks increase. However, matrix factorization greatly reduces the dimension of the matrix for which the SVD must be computed. Motion information has been shown to improve low-rank matrix recovery in RPCA, but this method still finds it difficult to handle original video data sets because of its batch-mode formulation and implementation. Hence, in this paper, we propose a motion-assisted RPCA model with matrix factorization (FM-RPCA) for background recovery. Moreover, an efficient linear alternating direction method of multipliers with a matrix factorization (FL-ADM) algorithm is designed for solving the proposed FM-RPCA model. Experimental results illustrate that the method provides stable results and is more efficient than the current state-of-the-art algorithms.
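
    The RPCA decomposition itself (without the paper's motion weighting or matrix factorization) can be sketched with a plain inexact-ALM scheme; the toy "video" below is a synthetic rank-1 background plus sparse foreground, not real data:

```python
import numpy as np

def svt(X, tau):
    """Singular-value soft thresholding (the costly SVD step the paper's
    matrix factorization is designed to shrink)."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0)) @ Vt

def shrink(X, tau):
    """Entrywise soft thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0)

def rpca(M, iters=100):
    """Decompose M into low-rank L (background) plus sparse S (foreground)
    with a plain inexact-ALM scheme, not the paper's FL-ADM."""
    lam = 1.0 / np.sqrt(max(M.shape))
    mu = 1.25 / np.linalg.norm(M, 2)
    mu_max, rho = mu * 1e7, 1.5
    Y = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(iters):
        L = svt(M - S + Y / mu, 1.0 / mu)
        S = shrink(M - L + Y / mu, lam / mu)
        Y += mu * (M - L - S)
        mu = min(rho * mu, mu_max)
    return L, S

# toy "video": 20 frames (columns) of a static rank-1 background,
# with 15 bright foreground pixels scattered across frames
rng = np.random.default_rng(5)
bg = np.outer(rng.uniform(50.0, 200.0, 60), np.ones(20))
fg = np.zeros_like(bg)
pos = rng.choice(bg.size, 15, replace=False)
fg.flat[pos] = 100.0
L, S = rpca(bg + fg)
print(np.abs(L - bg).max(), np.abs(S - fg).max())
```

    Each iteration here computes a full SVD of a 60x20 matrix; replacing that step with a low-rank factorization is what makes the paper's FM-RPCA cheaper on real video.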

  15. Recovery of intrinsic fluorescence from single-point interstitial measurements for quantification of doxorubicin concentration

    PubMed Central

    Baran, Timothy M.; Foster, Thomas H.

    2014-01-01

    Background and Objective We developed a method for the recovery of intrinsic fluorescence from single-point measurements in highly scattering and absorbing samples without a priori knowledge of the sample optical properties. The goal of the study was to demonstrate accurate recovery of fluorophore concentration in samples with widely varying background optical properties, while simultaneously recovering the optical properties. Materials and Methods Tissue-simulating phantoms containing doxorubicin, MnTPPS, and Intralipid-20% were created, and fluorescence measurements were performed using a single isotropic probe. The resulting spectra were analyzed using a forward-adjoint fluorescence model in order to recover the fluorophore concentration and background optical properties. Results We demonstrated recovery of doxorubicin concentration with a mean error of 11.8%. The concentration of the background absorber was recovered with an average error of 23.2% and the scattering spectrum was recovered with a mean error of 19.8%. Conclusion This method will allow for the determination of local concentrations of fluorescent drugs, such as doxorubicin, from minimally invasive fluorescence measurements. This is particularly interesting in the context of transarterial chemoembolization (TACE) treatment of liver cancer. PMID:24037853

  16. Feature Transformation Detection Method with Best Spectral Band Selection Process for Hyper-spectral Imaging

    NASA Astrophysics Data System (ADS)

    Chen, Hai-Wen; McGurr, Mike; Brickhouse, Mark

    2015-11-01

    We present a newly developed feature transformation (FT) detection method for hyperspectral imagery (HSI) sensors. In essence, the FT method, by transforming the original features (spectral bands) to a different feature domain, can considerably increase the statistical separation between the target and background probability density functions, and thus significantly improve target detection and identification performance, as evidenced by the test results in this paper. We show that by differentiating the original spectra, one can completely separate targets from the background using a single spectral band, leading to perfect detection results. In addition, we propose an automated best spectral band selection process with a double-threshold scheme that ranks the available spectral bands from best to worst for target detection. Finally, we propose an automated cross-spectrum fusion process to further improve detection performance in the lower spectral range (<1000 nm) by selecting the best spectral band pair with multivariate analysis. Promising detection performance was achieved using a small background material signature library as a proof of concept, and was then further evaluated and verified on a real background HSI scene collected by a HYDICE sensor.
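
    The core effect, class separation after differentiating the spectra, can be reproduced on synthetic spectra; the band positions and reflectance shapes below are invented for illustration:

```python
import numpy as np

# Feature transformation by spectral differentiation: two classes whose raw
# band values overlap become separable in the first-difference domain.
rng = np.random.default_rng(6)
bands = np.linspace(0.4, 2.5, 50)  # wavelengths in microns (synthetic)

def background(n):  # smooth background reflectance with random illumination scale
    return rng.uniform(0.5, 1.5, (n, 1)) * (0.5 + 0.2 * bands)

def target(n):      # same scale spread, plus a sharp absorption edge at band 25
    spec = rng.uniform(0.5, 1.5, (n, 1)) * (0.5 + 0.2 * bands)
    spec[:, 25:] *= 0.6
    return spec

bg, tg = background(200), target(200)
# raw band value at the edge: the two classes overlap heavily
raw_overlap = tg[:, 25].max() > bg[:, 25].min()
d_bg, d_tg = np.diff(bg, axis=1), np.diff(tg, axis=1)
# derivative at the edge: target has a large negative jump, background does not
print(raw_overlap, d_bg[:, 24].min(), d_tg[:, 24].max())
```

    Because the illumination scale cancels in sign, the single difference band separates the classes completely, mirroring the "perfect detection from one transformed band" claim above.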

  17. Ship Detection from Ocean SAR Image Based on Local Contrast Variance Weighted Information Entropy

    PubMed Central

    Huang, Yulin; Pei, Jifang; Zhang, Qian; Gu, Qin; Yang, Jianyu

    2018-01-01

    Ship detection from synthetic aperture radar (SAR) images is one of the crucial issues in maritime surveillance. However, due to varying ocean waves and the strong echo of the sea surface, it is very difficult to detect ships against heterogeneous and strong clutter backgrounds. In this paper, an innovative ship detection method is proposed to effectively distinguish vessels from complex backgrounds in a SAR image. First, the input SAR image is pre-screened by the maximally stable extremal region (MSER) method, which obtains ship candidate regions with low computational complexity. Then, the proposed local contrast variance weighted information entropy (LCVWIE) is adopted to evaluate the complexity of those candidate regions and their dissimilarity from their neighborhoods. Finally, the LCVWIE values of the candidate regions are compared with an adaptive threshold to obtain the final detection result. Experimental results based on measured ocean SAR images show that the proposed method obtains stable detection performance in both strong clutter and heterogeneous backgrounds, while having lower computational complexity than some existing detection methods. PMID:29652863
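
    Since the paper's exact LCVWIE formula is not reproduced here, the sketch below only illustrates its two ingredients, local contrast against the neighbourhood and information entropy, on synthetic clutter; the combination used is an assumption, not the authors' definition:

```python
import numpy as np

def lcvwie_like(window, neighborhood, bins=16):
    """Illustrative contrast-weighted entropy: Shannon entropy of the
    window's intensity histogram, weighted by the mean squared contrast
    of the window against its surrounding neighbourhood."""
    hist, _ = np.histogram(window, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    contrast = np.mean((window.astype(float) - neighborhood.mean()) ** 2)
    return contrast * entropy

rng = np.random.default_rng(7)
sea = rng.normal(60, 5, (64, 64)).clip(0, 255)  # homogeneous sea clutter
ship = sea.copy()
ship[28:36, 28:36] = 220.0                      # bright, structured target region
score_sea = lcvwie_like(sea[24:40, 24:40], sea)
score_ship = lcvwie_like(ship[24:40, 24:40], ship)
print(score_sea, score_ship)
```

    A candidate region containing a vessel scores far higher than pure clutter, which is what lets an adaptive threshold separate the two.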

  18. An Improved Method for Demonstrating Visual Selection by Wild Birds.

    ERIC Educational Resources Information Center

    Allen, J. A.; And Others

    1990-01-01

    An activity simulating natural selection in which wild birds are predators, green and brown pastry "baits" are prey, and trays containing colored stones as the backgrounds is presented. Two different methods of measuring selection are used to describe the results. The materials and methods, results, and discussion are included. (KR)

  19. PCA-based approach for subtracting thermal background emission in high-contrast imaging data

    NASA Astrophysics Data System (ADS)

    Hunziker, S.; Quanz, S. P.; Amara, A.; Meyer, M. R.

    2018-03-01

    Aims: Ground-based observations at thermal infrared wavelengths suffer from large background radiation due to the sky, the telescope, and warm surfaces in the instrument, which significantly limits the sensitivity of ground-based observations at wavelengths longer than 3 μm. The main purpose of this work is to analyse this background emission in infrared high-contrast imaging data, show how it can be modelled and subtracted, and demonstrate that doing so can improve the detection of faint sources, such as exoplanets. Methods: We used principal component analysis (PCA) to model and subtract the thermal background emission in three archival high-contrast angular differential imaging datasets in the M' and L' filters. We used an M' dataset of β Pic to describe in detail how the algorithm works and how it can be applied. The results of the background subtraction are compared to those from a conventional mean background subtraction scheme applied to the same dataset. Finally, both background subtraction methods are compared by performing complete data reductions. We analysed the results from the M' dataset of HD 100546 only qualitatively. For the M' dataset of β Pic and the L' dataset of HD 169142, which was obtained with an annular groove phase mask vortex vector coronagraph, we also calculated and analysed the achieved signal-to-noise ratio (S/N). Results: We show that applying PCA is an effective way to remove spatially and temporally varying thermal background emission down to close to the background limit. The procedure also proves very successful at reconstructing the background that is hidden behind the point spread function. In the complete data reductions, we find at least qualitative improvements for HD 100546 and HD 169142, but we do not find a significant increase in the S/N of β Pic b. We discuss these findings and argue that datasets with strongly varying observing conditions or infrequently sampled sky background will benefit most from the new approach.
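
    The background model can be sketched as PCA on background-only frames followed by projection and subtraction; the "frames" below are synthetic one-dimensional stand-ins for detector images, with invented mode shapes and amplitudes:

```python
import numpy as np

rng = np.random.default_rng(8)
npix = 400
t = np.linspace(0, 1, npix)
# three smooth systematic background modes with frame-to-frame varying amplitudes
modes = np.stack([np.ones(npix), np.sin(2 * np.pi * t), t])
amps = rng.normal(0.0, [5.0, 2.0, 1.0], (80, 3))         # per-frame mode amplitudes
frames = amps @ modes + rng.normal(0, 0.05, (80, npix))   # background-only frames
science = np.array([4.0, -1.5, 2.0]) @ modes + rng.normal(0, 0.05, npix)
science[200] += 1.0                                       # faint point-source signal

mean = frames.mean(axis=0)
_, _, Vt = np.linalg.svd(frames - mean, full_matrices=False)
pcs = Vt[:3]                                 # principal components of the background
coeffs = pcs @ (science - mean)              # fit the background model to the frame
residual = science - mean - coeffs @ pcs     # PCA background subtraction
naive = science - mean                       # conventional mean-subtraction baseline
print(naive.std(), residual[200], residual.std())
```

    Mean subtraction leaves the time-varying modes in the residual, while the PCA model removes them and the faint source at pixel 200 stands out, which is the qualitative behaviour reported above.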

  20. Detecting background changes in environments with dynamic foreground by separating probability distribution function mixtures using Pearson's method of moments

    NASA Astrophysics Data System (ADS)

    Jenkins, Colleen; Jordan, Jay; Carlson, Jeff

    2007-02-01

    This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is the automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensities acquired over time from a fixed-mounted camera will be a mixture of two Gaussian functions: the foreground probability distribution function and the background probability distribution function. We use Pearson's method of moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background detected: subsequent comparisons of background estimates are used to detect changes, which are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential of robust parameter estimation techniques as applied to video surveillance.
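
    The mixture-separation step can be illustrated on synthetic pixel intensities; note that plain EM is substituted here for Pearson's method of moments (whose closed-form moment solution is considerably more involved), so this only shows the foreground/background decomposition, not the paper's estimator:

```python
import numpy as np

def em_two_gaussians(x, iters=200):
    """Separate a two-component Gaussian mixture (background and foreground
    intensity distributions) by expectation-maximization."""
    mu = np.percentile(x, [25, 75]).astype(float)
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        pdf = w / (sd * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2)
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations
        w = r.mean(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
        sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / r.sum(axis=0))
    return w, mu, sd

# pixel-intensity samples: dim static background plus bright moving foreground
rng = np.random.default_rng(9)
x = np.concatenate([rng.normal(50, 8, 6000),     # background component
                    rng.normal(160, 20, 4000)])  # foreground component
w, mu, sd = em_two_gaussians(x)
print(w.round(2), mu.round(1), sd.round(1))
```

    Once the background component is isolated, its parameters can be "remembered" and compared across time windows to flag changes, as the abstract describes.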

  1. Scene Segmentation For Autonomous Robotic Navigation Using Sequential Laser Projected Structured Light

    NASA Astrophysics Data System (ADS)

    Brown, C. David; Ih, Charles S.; Arce, Gonzalo R.; Fertell, David A.

    1987-01-01

    Vision systems for mobile robots or autonomous vehicles navigating in an unknown terrain environment must provide a rapid and accurate method of segmenting the scene ahead into regions of pathway and background. A major distinguishing feature between the pathway and background is the three dimensional texture of these two regions. Typical methods of textural image segmentation are very computationally intensive, often lack the required robustness, and are incapable of sensing the three dimensional texture of various regions of the scene. A method is presented where scanned laser projected lines of structured light, viewed by a stereoscopically located single video camera, resulted in an image in which the three dimensional characteristics of the scene were represented by the discontinuity of the projected lines. This image was conducive to processing with simple regional operators to classify regions as pathway or background. Design of some operators and application methods, and demonstration on sample images are presented. This method provides rapid and robust scene segmentation capability that has been implemented on a microcomputer in near real time, and should result in higher speed and more reliable robotic or autonomous navigation in unstructured environments.

  2. Neyman Pearson detection of K-distributed random variables

    NASA Astrophysics Data System (ADS)

    Tucker, J. Derek; Azimi-Sadjadi, Mahmood R.

    2010-04-01

    In this paper a new detection method for sonar imagery is developed for K-distributed background clutter. The equation for the log-likelihood is derived and compared to the corresponding counterparts derived under the Gaussian and Rayleigh assumptions. Test results of the proposed method on a data set of synthetic underwater sonar images are also presented. This database contains images with targets of different shapes inserted into backgrounds generated using a correlated K-distributed model. Results illustrating the effectiveness of the K-distributed detector are presented in terms of probability of detection, false alarm, and correct classification rates for various bottom clutter scenarios.
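The Neyman-Pearson detector described above reduces to thresholding a log-likelihood ratio. A minimal sketch, using Rayleigh envelopes as a stand-in for the paper's K-distributed clutter (whose log-likelihood adds a modified Bessel function term); the scale values are illustrative:

```python
import math

def log_rayleigh(x, s):
    """Log-pdf of a Rayleigh envelope with scale s."""
    return math.log(x / s**2) - x**2 / (2 * s**2)

def np_detect(x, sigma_clutter, sigma_target, threshold):
    """Neyman-Pearson test: declare a target when the log-likelihood
    ratio ln f1(x) - ln f0(x) exceeds a threshold chosen to meet the
    desired false-alarm rate."""
    llr = log_rayleigh(x, sigma_target) - log_rayleigh(x, sigma_clutter)
    return llr > threshold
```

Because this particular LLR is monotone in x**2, the test collapses to an energy threshold; the K-distributed LLR derived in the paper plays the same role with a Bessel-K term.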

  3. Data-driven approach for creating synthetic electronic medical records.

    PubMed

    Buczak, Anna L; Babin, Steven; Moniz, Linda

    2010-10-14

    New algorithms for disease outbreak detection are being developed to take advantage of full electronic medical records (EMRs) that contain a wealth of patient information. However, due to privacy concerns, even anonymized EMRs cannot be shared among researchers, resulting in great difficulty in comparing the effectiveness of these algorithms. To bridge the gap between novel bio-surveillance algorithms operating on full EMRs and the lack of non-identifiable EMR data, a method for generating complete and synthetic EMRs was developed. This paper describes a novel methodology for generating complete synthetic EMRs both for an outbreak illness of interest (tularemia) and for background records. The method developed has three major steps: 1) synthetic patient identity and basic information generation; 2) identification of care patterns that the synthetic patients would receive based on the information present in real EMR data for similar health problems; 3) adaptation of these care patterns to the synthetic patient population. We generated EMRs, including visit records, clinical activity, laboratory orders/results and radiology orders/results for 203 synthetic tularemia outbreak patients. Validation of the records by a medical expert revealed problems in 19% of the records; these were subsequently corrected. We also generated background EMRs for over 3000 patients in the 4-11 yr age group. Validation of those records by a medical expert revealed problems in fewer than 3% of these background patient EMRs and the errors were subsequently rectified. A data-driven method was developed for generating fully synthetic EMRs. The method is general and can be applied to any data set that has similar data elements (such as laboratory and radiology orders and results, clinical activity, prescription orders). The pilot synthetic outbreak records were for tularemia but our approach may be adapted to other infectious diseases. 
The pilot synthetic background records were in the 4-11 year old age group. The adaptations that must be made to the algorithms to produce synthetic background EMRs for other age groups are indicated.

  4. Large scale study of multiple-molecule queries

    PubMed Central

    2009-01-01

    Background In ligand-based screening, as well as in other chemoinformatics applications, one seeks to effectively search large repositories of molecules in order to retrieve molecules that are similar to a query, typically a single lead molecule. However, in some cases, multiple molecules from the same family are available to seed the query and search for other members of the same family. Multiple-molecule query methods have been less studied than single-molecule query methods. Furthermore, previous studies have relied on proprietary data and sometimes have not used proper cross-validation methods to assess the results. In contrast, here we develop and compare multiple-molecule query methods using several large publicly available data sets and backgrounds. We also create a framework based on a strict cross-validation protocol to allow unbiased benchmarking for direct comparison in future studies across several performance metrics. Results Fourteen different multiple-molecule query methods were defined and benchmarked using: (1) 41 publicly available data sets of related molecules with similar biological activity; and (2) publicly available background data sets consisting of up to 175,000 molecules randomly extracted from the ChemDB database and other sources. Eight of the fourteen methods were parameter free, and six of them fit one or two free parameters to the data using a careful cross-validation protocol. All the methods were assessed and compared for their ability to retrieve members of the same family against the background data set using several performance metrics, including the Area Under the Accumulation Curve (AUAC), Area Under the Curve (AUC), F1-measure, and BEDROC metrics. Consistent with the previous literature, the best parameter-free methods are the MAX-SIM and MIN-RANK methods, which score a molecule to a family by the maximum similarity, or minimum ranking, obtained across the family. 
One new parameterized method introduced in this study and two previously defined methods, the Exponential Tanimoto Discriminant (ETD), the Tanimoto Power Discriminant (TPD), and the Binary Kernel Discriminant (BKD), outperform most other methods but are more complex, requiring one or two parameters to be fit to the data. Conclusion Fourteen methods for multiple-molecule querying of chemical databases, including novel methods, (ETD) and (TPD), are validated using publicly available data sets, standard cross-validation protocols, and established metrics. The best results are obtained with ETD, TPD, BKD, MAX-SIM, and MIN-RANK. These results can be replicated and compared with the results of future studies using data freely downloadable from http://cdb.ics.uci.edu/. PMID:20298525
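The parameter-free MAX-SIM and MIN-RANK scoring rules named above can be sketched directly; the fingerprints and the tiny two-molecule database below are illustrative:

```python
def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity of two binary fingerprints,
    represented here as Python sets of "on" bit positions."""
    return len(a & b) / len(a | b)

def max_sim(family, candidate):
    """MAX-SIM: score a candidate by its best similarity to any
    member of the query family."""
    return max(tanimoto(f, candidate) for f in family)

def min_rank(family, database, candidate):
    """MIN-RANK: score a candidate by the best (lowest) rank it obtains
    when the database is sorted by similarity to each family member."""
    ranks = []
    for f in family:
        ordered = sorted(database, key=lambda m: -tanimoto(f, m))
        ranks.append(ordered.index(candidate))
    return min(ranks)

family = [{1, 2, 3, 4}, {2, 3, 4, 5}]   # known actives seeding the query
hit = {2, 3, 4, 6}                      # shares 3 of 5 bits with each member
miss = {7, 8, 9}
db = [hit, miss]
```

Both rules need no fitting, which is why they serve as the parameter-free baselines against which ETD, TPD, and BKD are compared.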

  5. Stacked Multilayer Self-Organizing Map for Background Modeling.

    PubMed

    Zhao, Zhenjie; Zhang, Xuebo; Fang, Yongchun

    2015-09-01

    In this paper, a new background modeling method called the stacked multilayer self-organizing map background model (SMSOM-BM) is proposed, which offers several merits, such as a strong representative ability for complex scenarios and ease of use. In order to enhance the representative ability of the background model and allow the parameters to be learned automatically, the recently developed idea of representation learning (or deep learning) is employed to extend the existing single-layer self-organizing map background model to a multilayer one (namely, the proposed SMSOM-BM). As a consequence, the SMSOM-BM gains a strong representative ability to learn the background model of challenging scenarios, together with automatic determination of most network parameters. More specifically, every pixel is modeled by an SMSOM, and spatial consistency is considered at each layer. By introducing a novel over-layer filtering process, we can train the background model layer by layer in an efficient manner. Furthermore, for real-time performance, we have implemented the proposed method on the NVIDIA CUDA platform. Comparative experimental results show the superior performance of the proposed approach.
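A minimal sketch of the building block the paper stacks into its multilayer model: a single-layer, single-pixel SOM background model. The match threshold `epsilon` and learning rate `alpha` are illustrative choices, not the paper's values:

```python
import numpy as np

class PixelSOM:
    """Minimal single-layer self-organizing background model for one
    pixel: a small set of weight vectors competes to explain each new
    intensity, and the winner is nudged toward the sample."""
    def __init__(self, first_value, n_weights=3):
        self.w = np.full(n_weights, float(first_value))

    def update(self, value, epsilon=10.0, alpha=0.05):
        """Return True if `value` matches the model (background)."""
        i = int(np.argmin(np.abs(self.w - value)))
        if abs(self.w[i] - value) < epsilon:
            self.w[i] += alpha * (value - self.w[i])  # move winner toward sample
            return True
        return False                                  # foreground candidate

som = PixelSOM(100)
is_background = [som.update(v) for v in [101, 99, 102, 180, 100]]
```

The SMSOM-BM extends this per-pixel competition with stacked layers, spatial consistency terms, and over-layer filtering, none of which are modeled here.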

  6. Relative Contributions of Three Descriptive Methods: Implications for Behavioral Assessment

    ERIC Educational Resources Information Center

    Pence, Sacha T.; Roscoe, Eileen M.; Bourret, Jason C.; Ahearn, William H.

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods--the ABC method, the conditional probability method, and the conditional and background probability method--to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior…

  7. ENSO Bred Vectors in Coupled Ocean-Atmosphere General Circulation Models

    NASA Technical Reports Server (NTRS)

    Yang, S. C.; Cai, Ming; Kalnay, E.; Rienecker, M.; Yuan, G.; Toth, Z.

    2004-01-01

    The breeding method has been implemented in the NASA Seasonal-to-Interannual Prediction Project (NSIPP) Coupled General Circulation Model (CGCM) with the goal of improving operational seasonal-to-interannual climate predictions through ensemble forecasting and data assimilation. The coupled instability captured by the breeding method is the first attempt to isolate the evolving ENSO instability and its corresponding global atmospheric response in a fully coupled ocean-atmosphere GCM. Our results show that the growth rate of the coupled bred vectors (BV) peaks at about 3 months before a background ENSO event. The dominant growing BV modes are reminiscent of the background ENSO anomalies and show a strong tropical response with wind/SST/thermocline interrelated in a manner similar to the background ENSO mode. They exhibit larger amplitudes in the eastern tropical Pacific, reflecting the natural dynamical sensitivity associated with the presence of the shallow thermocline. Moreover, the extratropical perturbations associated with these coupled BV modes reveal variations related to the atmospheric teleconnection patterns associated with background ENSO variability, e.g., over the North Pacific and North America. A similar experiment was carried out with the NCEP/CFS03 CGCM. Comparisons between bred vectors from the NSIPP CGCM and the NCEP/CFS03 CGCM demonstrate the robustness of the results. Our results strongly suggest that the breeding method can serve as a natural filter to identify the slowly varying, coupled instabilities in a coupled GCM, which can be used to construct ensemble perturbations for ensemble forecasts and to estimate the coupled background error covariance for coupled data assimilation.

  8. Infrared small target detection based on directional zero-crossing measure

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangyue; Ding, Qinghai; Luo, Haibo; Hui, Bin; Chang, Zheng; Zhang, Junchao

    2017-12-01

    Infrared small target detection under complex background and low signal-to-clutter ratio (SCR) conditions is of great significance to the development of precision guidance and infrared surveillance. In order to detect targets precisely and extract them from intricate clutter effectively, a detection method based on a zero-crossing saliency (ZCS) map is proposed. The original map is first decomposed into different first-order directional derivative (FODD) maps by using FODD filters. Then the ZCS map is obtained by fusing all directional zero-crossing points. Finally, an adaptive threshold is adopted to segment targets from the ZCS map. Experimental results on a series of images show that our method is effective and robust for detection under complex backgrounds. Moreover, compared with five other state-of-the-art methods, our method achieves better performance in terms of detection rate, SCR gain, and background suppression factor.
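A minimal sketch of the zero-crossing idea, using only horizontal and vertical first-order derivatives in place of the paper's full FODD filter bank:

```python
import numpy as np

def zero_crossing_saliency(img):
    """Sketch of a zero-crossing saliency (ZCS) map: take first-order
    directional derivatives (only the horizontal and vertical directions
    here), mark sign changes of each derivative, and fuse the directions
    by summing."""
    zcs = np.zeros(img.shape, dtype=int)
    f = img.astype(float)
    for axis in (0, 1):
        d = np.diff(f, axis=axis)
        lo = [slice(None), slice(None)]
        hi = [slice(None), slice(None)]
        lo[axis] = slice(0, -1)
        hi[axis] = slice(1, None)
        # a sign change between consecutive derivative samples marks a
        # local intensity extremum along this direction
        change = (d[tuple(lo)] * d[tuple(hi)]) < 0
        pad = [(1, 1) if a == axis else (0, 0) for a in (0, 1)]
        zcs += np.pad(change.astype(int), pad)
    return zcs

img = np.zeros((7, 7))
img[3, 3] = 10.0                     # one bright point "target"
saliency = zero_crossing_saliency(img)
```

An adaptive threshold on the fused map then segments candidate targets, as in the paper's final step.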

  9. Techniques to improve the accuracy of noise power spectrum measurements in digital x-ray imaging based on background trends removal.

    PubMed

    Zhou, Zhongxing; Gao, Feng; Zhao, Huijuan; Zhang, Lixin

    2011-03-01

    Noise characterization through estimation of the noise power spectrum (NPS) is a central component of the evaluation of digital x-ray systems. Extensive work has been conducted to achieve accurate and precise measurement of the NPS. One approach to improving the accuracy of the NPS measurement is to reduce the statistical variance of the NPS results by involving more data samples. However, this method is based on the assumption that the noise in a radiographic image arises from stochastic processes. In practical data, artifacts superimpose on the stochastic noise as low-frequency background trends and prevent accurate NPS estimation. The purpose of this study was to investigate an appropriate background detrending technique to improve the accuracy of NPS estimation for digital x-ray systems. To identify the optimal background detrending technique for NPS estimation, four methods for artifact removal were quantitatively studied and compared: (1) subtraction of a low-pass-filtered version of the image; (2) subtraction of a 2-D first-order fit to the image; (3) subtraction of a 2-D second-order polynomial fit to the image; and (4) subtraction of two uniform exposure images. In addition, background trend removal was applied either within the original region of interest or within its partitioned sub-blocks, for all four methods. The performance of the background detrending techniques was compared according to the statistical variance of the NPS results and the suppression of the low-frequency systematic rise. Among the four methods, subtraction of a 2-D second-order polynomial fit to the image was most effective in suppressing the low-frequency systematic rise and reducing the variance of the NPS estimate for the authors' digital x-ray system. Subtraction of a low-pass-filtered version of the image increased the NPS variance at low-frequency components because of the side-lobe effects of the frequency response of the boxcar filtering function. 
    Subtracting two uniform exposure images yielded the least smooth NPS curve, although it was effective in suppressing the low-frequency systematic rise. Subtraction of a 2-D first-order fit to the image was also effective for background detrending, but less so than subtraction of a 2-D second-order polynomial fit for the authors' digital x-ray system. As a result of this study, the authors verified that appropriate background trend removal is both necessary and feasible for obtaining a better NPS estimate. When processing time is not a consideration, subtraction of a 2-D second-order polynomial fit to the image was the most appropriate technique for background detrending.
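The winning detrending method, subtraction of a 2-D second-order polynomial fit, can be sketched as a linear least-squares problem; the ROI size and trend coefficients below are illustrative:

```python
import numpy as np

def detrend_second_order(roi):
    """Subtract a 2-D second-order polynomial fit from a flat-field ROI,
    the background-detrending step applied before computing the NPS."""
    h, w = roi.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    # design matrix for z = a + b*x + c*y + d*x^2 + e*y^2 + f*x*y
    A = np.column_stack([np.ones(h * w), x.ravel(), y.ravel(),
                         x.ravel() ** 2, y.ravel() ** 2, (x * y).ravel()])
    coef, *_ = np.linalg.lstsq(A, roi.ravel(), rcond=None)
    return roi - (A @ coef).reshape(h, w)

# a purely quadratic background trend is removed almost exactly
yy, xx = np.mgrid[0:32, 0:32].astype(float)
trend = 5.0 + 0.1 * xx + 0.02 * yy ** 2
residual = detrend_second_order(trend)
```

On real data the residual would be the stochastic noise itself, which then feeds the usual periodogram-averaging NPS estimate.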

  10. Development of an inverse distance weighted active infrared stealth scheme using the repulsive particle swarm optimization algorithm.

    PubMed

    Han, Kuk-Il; Kim, Do-Hwi; Choi, Jun-Hyuk; Kim, Tae-Kuk

    2018-04-20

    The threat posed by detection from infrared (IR) sensors is greater than that from other sensors such as radar or sonar, because an object under IR observation cannot easily recognize its detection status. Recently, research on actively reducing the IR signature has been conducted, controlling the IR signal by adjusting the surface temperature of the object. In this paper, we propose an active IR stealth algorithm to synchronize the IR signals from the object and the background around it. The proposed method uses the repulsive particle swarm optimization (RPSO) algorithm to estimate the IR stealth surface temperature that synchronizes the IR signals from the object and the surrounding background by driving the inverse-distance-weighted contrast radiant intensity (CRI) to zero. We tested the IR stealth performance in the mid-wavelength infrared (MWIR) and long-wavelength infrared (LWIR) bands for a test plate located at three different positions in a forest scene. Our results show that the proposed inverse-distance-weighted active IR stealth technique reduces the contrast radiant intensity between the object and the background by up to 32% compared to the previous method, which uses a CRI determined as the simple signal difference between the object and the background.
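A sketch of the optimization step: a plain particle swarm minimizing a toy contrast objective over surface temperature. The paper's repulsive PSO variant adds a repulsion term between particles to avoid premature convergence, and its real objective is the inverse-distance-weighted CRI rather than this illustrative quadratic:

```python
import random

def pso_minimize(f, lo, hi, n=20, iters=100, seed=0):
    """Plain one-variable particle swarm optimization (a sketch; the
    paper's repulsive PSO adds inter-particle repulsion)."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(n)]
    vs = [0.0] * n
    pbest = xs[:]                       # each particle's best position
    gbest = min(xs, key=f)              # swarm's best position
    for _ in range(iters):
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (0.7 * vs[i] + 1.5 * r1 * (pbest[i] - xs[i])
                                 + 1.5 * r2 * (gbest - xs[i]))
            xs[i] = min(max(xs[i] + vs[i], lo), hi)
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest

# toy stealth objective: drive the object-background contrast to zero,
# with surface temperature as the single control variable
background = 300.0
contrast = lambda t: (t - background) ** 2
t_stealth = pso_minimize(contrast, 250.0, 350.0)
```

In the paper the objective would evaluate band-integrated MWIR/LWIR radiances instead of a simple temperature difference.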

  11. A robust background regression based score estimation algorithm for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Zhao, Rui; Du, Bo; Zhang, Liangpei; Zhang, Lefei

    2016-12-01

    Anomaly detection has become a hot topic in the hyperspectral image analysis and processing fields in recent years. The most important issue for hyperspectral anomaly detection is the background estimation and suppression. Unreasonable or non-robust background estimation usually leads to unsatisfactory anomaly detection results. Furthermore, the inherent nonlinearity of hyperspectral images may cover up the intrinsic data structure in the anomaly detection. In order to implement robust background estimation, as well as to explore the intrinsic data structure of the hyperspectral image, we propose a robust background regression based score estimation algorithm (RBRSE) for hyperspectral anomaly detection. The Robust Background Regression (RBR) is actually a label assignment procedure which segments the hyperspectral data into a robust background dataset and a potential anomaly dataset with an intersection boundary. In the RBR, a kernel expansion technique, which explores the nonlinear structure of the hyperspectral data in a reproducing kernel Hilbert space, is utilized to formulate the data as a density feature representation. A minimum squared loss relationship is constructed between the data density feature and the corresponding assigned labels of the hyperspectral data, to formulate the foundation of the regression. Furthermore, a manifold regularization term which explores the manifold smoothness of the hyperspectral data, and a maximization term of the robust background average density, which suppresses the bias caused by the potential anomalies, are jointly appended in the RBR procedure. After this, a paired-dataset based k-nn score estimation method is undertaken on the robust background and potential anomaly datasets, to implement the detection output. 
    The experimental results show that RBRSE achieves better ROC curves, AUC values, and background-anomaly separation than other state-of-the-art anomaly detection methods, and is easy to implement in practice.

  12. Laser-induced fluorescence imaging of bacteria

    NASA Astrophysics Data System (ADS)

    Hilton, Peter J.

    1998-12-01

    This paper outlines a method for optically detecting bacteria on various backgrounds, such as meat, by imaging their laser-induced auto-fluorescence response. The method can potentially operate in real time, many times faster than current bacterial detection methods that require culturing of bacterial samples. This paper describes the imaging technique employed, whereby a laser spot is scanned across an object while capturing, filtering, and digitizing the returned light. Preliminary results on bacterial auto-fluorescence are reported and plans for future research are discussed. The results to date are encouraging, with six of the eight bacterial strains investigated exhibiting auto-fluorescence when excited at 488 nm. Discrimination of these bacterial strains against red meat is shown, and techniques for reducing background fluorescence are discussed.

  13. Evaluation of sliding baseline methods for spatial estimation for cluster detection in the biosurveillance system

    PubMed Central

    Xing, Jian; Burkom, Howard; Moniz, Linda; Edgerton, James; Leuze, Michael; Tokars, Jerome

    2009-01-01

    Background The Centers for Disease Control and Prevention's (CDC's) BioSense system provides near-real time situational awareness for public health monitoring through analysis of electronic health data. Determination of anomalous spatial and temporal disease clusters is a crucial part of the daily disease monitoring task. Our study focused on finding useful anomalies at manageable alert rates according to available BioSense data history. Methods The study dataset included more than 3 years of daily counts of military outpatient clinic visits for respiratory and rash syndrome groupings. We applied four spatial estimation methods in implementations of space-time scan statistics cross-checked in Matlab and C. We compared the utility of these methods according to the resultant background cluster rate (a false alarm surrogate) and sensitivity to injected cluster signals. The comparison runs used a spatial resolution based on the facility zip code in the patient record and a finer resolution based on the residence zip code. Results Simple estimation methods that account for day-of-week (DOW) data patterns yielded a clear advantage both in background cluster rate and in signal sensitivity. A 28-day baseline gave the most robust results for this estimation; the preferred baseline is long enough to remove daily fluctuations but short enough to reflect recent disease trends and data representation. Background cluster rates were lower for the rash syndrome counts than for the respiratory counts, likely because of seasonality and the large scale of the respiratory counts. Conclusion The spatial estimation method should be chosen according to characteristics of the selected data streams. In this dataset with strong day-of-week effects, the overall best detection performance was achieved using subregion averages over a 28-day baseline stratified by weekday or weekend/holiday behavior. 
Changing the estimation method for particular scenarios involving different spatial resolution or other syndromes can yield further improvement. PMID:19615075
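The preferred estimator identified above, a 28-day sliding baseline stratified by weekday versus weekend behavior, can be sketched as follows (the synthetic counts are illustrative; holidays would join the weekend stratum):

```python
from datetime import date, timedelta

def dow_baseline_expectation(counts, day):
    """Expected count for `day` from a 28-day sliding baseline,
    averaged only over past days in the same stratum (weekday vs
    weekend) as `day`."""
    is_weekend = day.weekday() >= 5
    window = [day - timedelta(days=k) for k in range(1, 29)]
    vals = [counts[d] for d in window
            if d in counts and (d.weekday() >= 5) == is_weekend]
    return sum(vals) / len(vals)

# synthetic clinic-visit counts: 100 on weekdays, 20 on weekends
counts = {}
start = date(2009, 1, 1)
for k in range(60):
    day = start + timedelta(days=k)
    counts[day] = 20 if day.weekday() >= 5 else 100
```

An observed count is then compared against this stratified expectation to decide whether a spatial cluster is anomalous.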

  14. Monte-Carlo background simulations of present and future detectors in x-ray astronomy

    NASA Astrophysics Data System (ADS)

    Tenzer, C.; Kendziorra, E.; Santangelo, A.

    2008-07-01

    Reaching a low-level and well understood internal instrumental background is crucial for the scientific performance of an X-ray detector and, therefore, a main objective of the instrument designers. Monte-Carlo simulations of the physics processes and interactions taking place in a space-based X-ray detector as a result of its orbital environment can be applied to explain the measured background of existing missions. They are thus an excellent tool to predict and optimize the background of future observatories. Weak points of a design and the main sources of the background can be identified and methods to reduce them can be implemented and studied within the simulations. Using the Geant4 Monte-Carlo toolkit, we have created a simulation environment for space-based detectors and we present results of such background simulations for XMM-Newton's EPIC pn-CCD camera. The environment is also currently used to estimate and optimize the background of the future instruments Simbol-X and eRosita.

  15. Large-scale weakly supervised object localization via latent category learning.

    PubMed

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image conditions, objects usually exhibit large ambiguity with respect to backgrounds. Moreover, there is a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and backgrounds can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts, or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use under large-scale conditions. Evaluation on the challenging PASCAL Visual Object Class (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision that outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.

  16. Separation of foreground and background from light field using gradient information.

    PubMed

    Lee, Jae Young; Park, Rae-Hong

    2017-02-01

    Studies of computer vision and machine vision applications using a light field camera have been increasing in recent years. However, the capabilities of the light field camera are not fully exploited in these applications. In this paper, we propose a method for direct separation of foreground and background that uses gradient information and can serve in various applications such as pre-processing. From the optical phenomenon whereby the bundles of rays from the background are flipped, we derive that the disparity sign of the background in the captured three-dimensional scene is opposite to that of the foreground. Using a majority-weighted voting algorithm based on the gradient information, with the Lambertian assumption and the gradient constraint, the foreground and background can be separated at each pixel. As pre-processing, the proposed method can be used in applications such as occlusion and saliency detection, disparity estimation, and so on. Experimental results with the EPFL light field dataset and the Stanford Lytro light field dataset show that the proposed method achieves better performance in terms of occlusion detection, and thus can be used effectively in pre-processing for saliency detection and disparity estimation.

  17. A 3D image sensor with adaptable charge subtraction scheme for background light suppression

    NASA Astrophysics Data System (ADS)

    Shin, Jungsoon; Kang, Byongmin; Lee, Keechang; Kim, James D. K.

    2013-02-01

    We present a 3D ToF (Time-of-Flight) image sensor with an adaptive charge subtraction scheme for background light suppression. The proposed sensor can alternately capture a high-resolution color image and a high-quality depth map in each frame. In depth mode, the sensor requires enough integration time for accurate depth acquisition, but saturation will occur under strong background illumination. We propose to divide the integration time into N sub-integration times adaptively. In each sub-integration time, our sensor captures an image without saturation and subtracts the charge to keep the pixel from saturating. The subtraction results are accumulated over the N steps, yielding a final image free of background illumination at the full integration time. Experimental results with our own ToF sensor show high background suppression performance. We also propose an in-pixel storage and column-level subtraction circuit for chip-level implementation of the proposed method. We believe the proposed scheme will enable 3D sensors to be used in outdoor environments.
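The charge subtraction arithmetic can be sketched in arbitrary units; the rates, times, and full-well value below are illustrative:

```python
def accumulate_with_subtraction(signal_rate, bg_rate, total_time,
                                n_subs, full_well):
    """Sketch of adaptive charge subtraction: split the integration into
    n_subs pieces so no single piece saturates the pixel, subtract the
    (separately measured) background charge after each piece, and
    accumulate the differences.  Units are arbitrary."""
    t = total_time / n_subs
    acc = 0.0
    for _ in range(n_subs):
        charge = (signal_rate + bg_rate) * t   # one sub-integration
        if charge > full_well:
            raise OverflowError("sub-integration still saturates")
        acc += charge - bg_rate * t            # cancel background light
    return acc

# a single 10-unit integration would collect 1100 e- against a 1000 e-
# full well; four sub-integrations stay below it and return pure signal
net = accumulate_with_subtraction(signal_rate=10, bg_rate=100,
                                  total_time=10, n_subs=4, full_well=1000)
```

Choosing N adaptively from the measured background level is what keeps each sub-integration below the full-well capacity.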

  18. Ship detection using STFT sea background statistical modeling for large-scale oceansat remote sensing image

    NASA Astrophysics Data System (ADS)

    Wang, Lixia; Pei, Jihong; Xie, Weixin; Liu, Jinyuan

    2018-03-01

    Large-scale oceansat remote sensing images cover a large area of sea surface, whose fluctuation can be considered a non-stationary process. The Short-Time Fourier Transform (STFT) is a suitable analysis tool for time-varying non-stationary signals. In this paper, a novel ship detection method using 2-D STFT sea background statistical modeling for large-scale oceansat remote sensing images is proposed. First, the paper divides the large-scale oceansat remote sensing image into small sub-blocks, and 2-D STFT is applied to each sub-block individually. Second, the 2-D STFT spectra of the sub-blocks are studied, and an obvious characteristic difference between sea background and non-sea background is found. Finally, a statistical model for all valid frequency points in the STFT spectrum of the sea background is given, and a ship detection method based on 2-D STFT spectrum modeling is proposed. Experimental results show that the proposed algorithm detects ship targets with a high recall rate and a low miss rate.
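The 2-D STFT over sub-blocks amounts to a windowed FFT per block; a sketch with an illustrative block size and a synthetic sea patch:

```python
import numpy as np

def blockwise_spectra(img, block=16):
    """2-D short-time Fourier analysis sketched as a Hann-windowed FFT
    over non-overlapping sub-blocks; each block yields a magnitude
    spectrum whose statistics can separate sea clutter from ships."""
    win = np.hanning(block)[:, None] * np.hanning(block)[None, :]
    h, w = img.shape
    spectra = {}
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            tile = img[r:r + block, c:c + block] * win
            spectra[(r, c)] = np.abs(np.fft.fft2(tile))
    return spectra

rng = np.random.default_rng(0)
sea = rng.normal(0.0, 1.0, (32, 32))   # synthetic sea-surface patch
spec = blockwise_spectra(sea)
```

The paper then fits a statistical model to the valid frequency points of the sea-background spectra and flags blocks that deviate from it.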

  19. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    Many factors influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectrum follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement and regression analysis can compensate for the spectral line intensity changes caused by system parameters such as laser power and receiving spectral efficiency. Because the measurement data were limited and nonlinear, we used support vector machine (SVM) regression. The experimental results showed that the method could improve the stability and the accuracy of quantitative analysis with LIBS: the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. Data fitting based on the signal-to-background ratio (S/B) is less susceptible to matrix effects, background variation, and similar factors, and provides a data-processing reference for real-time online LIBS quantitative analysis.
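The S/B calibration idea can be sketched as follows; the paper fits the calibration with support vector regression, and ordinary least squares is used here as a simpler stand-in (the synthetic shot data are illustrative):

```python
def fit_sb_calibration(concentrations, line_intensity, background):
    """Calibrate concentration against signal-to-background ratio:
    S/B cancels shot-to-shot drifts in laser power and collection
    efficiency.  The paper fits a support-vector regression; an
    ordinary least-squares line is used here as a simpler stand-in."""
    sb = [s / b for s, b in zip(line_intensity, background)]
    n = len(sb)
    mx = sum(sb) / n
    my = sum(concentrations) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(sb, concentrations))
             / sum((x - mx) ** 2 for x in sb))
    return slope, my - slope * mx

# synthetic shots: line intensity and background both scale with laser
# power, so their ratio depends only on concentration
conc = [1.0, 2.0, 3.0, 4.0]
power = [0.9, 1.1, 1.0, 0.8]
sig = [c * p * 100 for c, p in zip(conc, power)]
bg = [p * 50 for p in power]
slope, intercept = fit_sb_calibration(conc, sig, bg)
```

Here raw intensities vary with laser power shot to shot, yet the S/B fit recovers the underlying concentration relationship exactly.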

  20. Gaussian Multiscale Aggregation Applied to Segmentation in Hand Biometrics

    PubMed Central

    de Santos Sierra, Alberto; Ávila, Carmen Sánchez; Casanova, Javier Guerra; del Pozo, Gonzalo Bailador

    2011-01-01

    This paper presents an image segmentation algorithm based on Gaussian multiscale aggregation oriented to hand biometric applications. The method is able to isolate the hand from a wide variety of background textures such as carpets, fabric, glass, grass, soil or stones. The evaluation was carried out by using a publicly available synthetic database with 408,000 hand images in different backgrounds, comparing the performance in terms of accuracy and computational cost to two competitive segmentation methods existing in literature, namely Lossy Data Compression (LDC) and Normalized Cuts (NCuts). The results highlight that the proposed method outperforms current competitive segmentation methods with regard to computational cost, time performance, accuracy and memory usage. PMID:22247658

  2. Introductory Guide to the Statistics of Molecular Genetics

    ERIC Educational Resources Information Center

    Eley, Thalia C.; Rijsdijk, Fruhling

    2005-01-01

    Background: This introductory guide presents the main two analytical approaches used by molecular geneticists: linkage and association. Methods: Traditional linkage and association methods are described, along with more recent advances in methodologies such as those using a variance components approach. Results: New methods are being developed all…

  3. Monte Carlo Bayesian Inference on a Statistical Model of Sub-gridcolumn Moisture Variability Using High-resolution Cloud Observations. Part II: Sensitivity Tests and Results

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo M.; Norris, Peter M.

    2013-01-01

    Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.

  4. Sample processing approach for detection of ricin in surface samples.

    PubMed

    Kane, Staci; Shah, Sanjiv; Erler, Anne Marie; Alfaro, Teneile

    2017-12-01

    With several ricin contamination incidents reported over the past decade, rapid and accurate methods are needed for environmental sample analysis, especially after decontamination. A sample processing method was developed for common surface sampling devices to improve the limit of detection and avoid false negative/positive results for ricin analysis. Potential assay interferents from the sample matrix (bleach residue, sample material, wetting buffer), including reference dust, were tested using a Time-Resolved Fluorescence (TRF) immunoassay. Test results suggested that the sample matrix did not cause the elevated background fluorescence sometimes observed when analyzing post-bleach decontamination samples from ricin incidents. Furthermore, sample particulates (80 mg/mL Arizona Test Dust) did not enhance background fluorescence or interfere with ricin detection by TRF. These results suggested that high background fluorescence in this immunoassay could be due to labeled antibody quality and/or quantity issues. Centrifugal ultrafiltration devices were evaluated for ricin concentration as part of sample processing. Up to 30-fold concentration of ricin was observed by the devices, which serve to remove soluble interferents and could function as the front-end sample processing step to other ricin analytical methods. The procedure has the potential to be used with a broader range of environmental sample types and with other potential interferences and to be followed by other ricin analytical methods, although additional verification studies would be required. Published by Elsevier B.V.

  5. A statistical, task-based evaluation method for three-dimensional x-ray breast imaging systems using variable-background phantoms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Subok; Jennings, Robert; Liu Haimo

    Purpose: For the last few years, development and optimization of three-dimensional (3D) x-ray breast imaging systems, such as digital breast tomosynthesis (DBT) and computed tomography, have drawn much attention from the medical imaging community, in both academia and industry. However, there is still much room for understanding how to best optimize and evaluate the devices over a large space of many different system parameters and geometries. Current evaluation methods, which work well for 2D systems, do not incorporate the depth information from the 3D imaging systems. Therefore, it is critical to develop a statistically sound evaluation method to investigate the usefulness of inclusion of depth and background-variability information into the assessment and optimization of the 3D systems. Methods: In this paper, we present a mathematical framework for a statistical assessment of planar and 3D x-ray breast imaging systems. Our method is based on statistical decision theory, in particular, making use of the ideal linear observer called the Hotelling observer. We also present a physical phantom that consists of spheres of different sizes and materials for producing an ensemble of randomly varying backgrounds to be imaged for a given patient class. Lastly, we demonstrate our evaluation method in comparing laboratory mammography and three-angle DBT systems for signal detection tasks using the phantom's projection data. We compare the variable phantom case to that of a phantom of the same dimensions filled with water, which we call the uniform phantom, based on the performance of the Hotelling observer as a function of signal size and intensity. Results: Detectability trends calculated using the variable and uniform phantom methods are different from each other for both mammography and DBT systems. Conclusions: Our results indicate that measuring the system's detection performance with consideration of background variability may lead to differences in system performance estimates and comparisons. For the assessment of 3D systems, to accurately determine trade-offs between image quality and radiation dose, it is critical to incorporate randomness arising from the imaging chain, including background variability, into system performance calculations.
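    The Hotelling observer used in this record has a standard linear-algebra form. The sketch below is a minimal illustration with synthetic ensembles standing in for phantom images (all sizes, names and noise models are assumptions, not the study's data): it computes the observer template and the detectability index.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic ensembles: 500 background-only and 500 signal-present images
    # (8x8, flattened). The "signal" is a small Gaussian bump; everything here
    # is illustrative stand-in data, not the phantom measurements.
    n, side = 500, 8
    yy, xx = np.mgrid[0:side, 0:side]
    signal = 2.0 * np.exp(-((xx - 4) ** 2 + (yy - 4) ** 2) / 4.0).ravel()
    bg = rng.normal(0.0, 1.0, size=(n, side * side))
    sp = rng.normal(0.0, 1.0, size=(n, side * side)) + signal

    # Hotelling template w = K^{-1} (mean difference), with K the average
    # class covariance; detectability is d^2 = delta^T K^{-1} delta.
    delta = sp.mean(axis=0) - bg.mean(axis=0)
    K = 0.5 * (np.cov(bg, rowvar=False) + np.cov(sp, rowvar=False))
    w = np.linalg.solve(K, delta)
    d2 = float(delta @ w)
    print(f"Hotelling detectability d^2 = {d2:.2f}")
    ```

    With a variable-background phantom, K would additionally contain the background covariance, which is the point of the comparison in the abstract.
    
    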

  6. Global Binary Continuity for Color Face Detection With Complex Background

    NASA Astrophysics Data System (ADS)

    Belavadi, Bhaskar; Mahendra Prashanth, K. V.; Joshi, Sujay S.; Suprathik, N.

    2017-08-01

    In this paper, we propose a method to detect human faces in color images with complex backgrounds. The proposed algorithm makes use of two color space models, specifically HSV and YCgCr. The color-segmented image is filled uniformly with a single color (binary), and all unwanted discontinuous lines are then removed to obtain the final image. Experimental results on the Caltech database show that the proposed model is able to accomplish far better segmentation for faces of varying orientations, skin color and background environment.
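    A rough sketch of the kind of two-color-space skin segmentation this abstract describes, assuming an illustrative Cg definition (mirroring the standard Cr offset form) and guessed threshold ranges, not the paper's actual values:

    ```python
    import numpy as np

    def skin_mask(rgb):
        """Rough skin-color mask combining an HSV saturation rule with a
        YCgCr chrominance rule. The Cg definition mirrors the standard Cr
        offset form, and all threshold ranges are illustrative guesses."""
        rgb = np.asarray(rgb, dtype=np.float64)
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b          # luma (BT.601)
        cr = 128.0 + 0.713 * (r - y)                   # red chrominance
        cg = 128.0 + 0.713 * (g - y)                   # green chrominance (assumed form)
        mx, mn = rgb.max(axis=-1), rgb.min(axis=-1)
        s = np.where(mx > 0, (mx - mn) / np.maximum(mx, 1e-9), 0.0)  # HSV saturation
        return (cg > 110) & (cg < 135) & (cr > 130) & (cr < 165) & (s > 0.1) & (s < 0.7)
    ```

    The binary mask would then be filled and cleaned of discontinuous lines, as the abstract describes.
    
    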

  7. Recovery of Background Structures in Nanoscale Helium Ion Microscope Imaging.

    PubMed

    Carasso, Alfred S; Vladár, András E

    2014-01-01

    This paper discusses a two step enhancement technique applicable to noisy Helium Ion Microscope images in which background structures are not easily discernible due to a weak signal. The method is based on a preliminary adaptive histogram equalization, followed by 'slow motion' low-exponent Lévy fractional diffusion smoothing. This combined approach is unexpectedly effective, resulting in a companion enhanced image in which background structures are rendered much more visible, and noise is significantly reduced, all with minimal loss of image sharpness. The method also provides useful enhancements of scanning charged-particle microscopy images obtained by composing multiple drift-corrected 'fast scan' frames. The paper includes software routines, written in Interactive Data Language (IDL), that can perform the above image processing tasks.
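    The low-exponent Lévy fractional diffusion step can be sketched in the Fourier domain, where the fractional diffusion semigroup is a simple frequency multiplier; the values of t and beta below are illustrative, not those of the paper:

    ```python
    import numpy as np

    def levy_smooth(img, t=0.5, beta=0.3):
        """'Slow motion' Levy fractional diffusion sketch: attenuate each
        spatial frequency by exp(-t * |w|^(2*beta)). A low exponent beta < 1
        gives the heavy-tailed Levy kernel; t and beta are assumed values."""
        fy = np.fft.fftfreq(img.shape[0])[:, None]
        fx = np.fft.fftfreq(img.shape[1])[None, :]
        attenuation = np.exp(-t * (fy ** 2 + fx ** 2) ** beta)
        return np.real(np.fft.ifft2(np.fft.fft2(img) * attenuation))
    ```

    The DC component is untouched (the mean is preserved), while high frequencies are damped gently, which is what makes the smoothing "slow motion".
    
    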

  8. Direct bonded HOPG - Analyzer support without background source

    NASA Astrophysics Data System (ADS)

    Groitl, Felix; Kitaura, Hidetoshi; Nishiki, Naomi; Rønnow, Henrik M.

    2018-04-01

    A new production process allows direct bonding of HOPG crystals on Si wafers. This new method facilitates the production of analyzer crystals with a support structure without the use of additional, background-inducing fixation material, e.g. glue, wax or screws. The method is especially interesting for the upcoming generation of CAMEA-type multiplexing spectrometers. These instruments allow for a drastic performance increase due to the increased angular coverage and multiple energy analysis. Exploiting the transparency of HOPG for cold neutrons, a consecutive arrangement of HOPG analyzer crystals per Q-channel can be achieved. This implies that neutrons travel through up to 10 arrays of analyzer crystals before reaching the analyzer corresponding to their energy. Hence, a careful choice of the fixation method for the analyzer crystals with regard to transparency and background is necessary. Here, we present first results on the diffraction and mechanical performance of direct-bonded analyzer crystals.

  9. Interactive QR code beautification with full background image embedding

    NASA Astrophysics Data System (ADS)

    Lin, Lijian; Wu, Song; Liu, Sijiang; Jiang, Bo

    2017-06-01

    QR (Quick Response) code is a kind of two-dimensional barcode that was first developed in the automotive industry. Nowadays, QR code has been widely used in commercial applications like product promotion, mobile payment, product information management, etc. Traditional QR codes in accordance with the international standard are reliable and fast to decode, but lack the aesthetic appearance needed to convey visual information to customers. In this work, we present a novel interactive method to generate aesthetic QR code. Given the information to be encoded and an image to be used as the full QR code background, our method accepts a user's interactive strokes as hints to remove undesired parts of QR code modules, relying on the QR code error correction mechanism and background color thresholds. Compared to previous approaches, our method follows the intention of the QR code designer and thus achieves more pleasing results, while keeping high machine readability.

  10. Application of point-to-point matching algorithms for background correction in on-line liquid chromatography-Fourier transform infrared spectrometry (LC-FTIR).

    PubMed

    Kuligowski, J; Quintás, G; Garrigues, S; de la Guardia, M

    2010-03-15

    A new background correction method for the on-line coupling of gradient liquid chromatography and Fourier transform infrared spectrometry has been developed. It is based on the use of a point-to-point matching algorithm that compares the absorption spectra of the sample data set with those of a previously recorded reference data set in order to select an appropriate reference spectrum. The spectral range used for the point-to-point comparison is selected with minimal user interaction, considerably facilitating application of the whole method. The background correction method has been successfully tested on a chromatographic separation of four nitrophenols running acetonitrile (0.08%, v/v TFA):water (0.08%, v/v TFA) gradients with compositions ranging from 35 to 85% (v/v) acetonitrile, giving accurate results for both baseline-resolved and overlapped peaks. Copyright (c) 2009 Elsevier B.V. All rights reserved.
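    The reference-spectrum selection step can be sketched as a point-to-point comparison over a chosen spectral range; the function and argument names below are illustrative, not the authors' implementation:

    ```python
    import numpy as np

    def pick_reference(sample_spectrum, reference_set, band):
        """Return the row index of the reference spectrum closest to the
        sample over the chosen band, using a point-to-point sum of squared
        differences. `band` is a slice of channel indices (an assumed
        interface); the selected spectrum would then be subtracted."""
        diffs = reference_set[:, band] - sample_spectrum[band]
        return int(np.argmin((diffs ** 2).sum(axis=1)))
    ```
    
    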

  11. The Potential Use of Polarized Reflected Light in the Remote Sensing of Soil Moisture

    DTIC Science & Technology

    to 89% for saturated soil, indicating that the polarization method may be viable as a remote sensing system for determining soil moisture. Background on the methods and implications of the results are presented.

  12. Robust recognition of degraded machine-printed characters using complementary similarity measure and error-correction learning

    NASA Astrophysics Data System (ADS)

    Hagita, Norihiro; Sawaki, Minako

    1995-03-01

    Most conventional methods in character recognition extract geometrical features such as stroke direction, connectivity of strokes, etc., and compare them with reference patterns in a stored dictionary. Unfortunately, geometrical features are easily degraded by blurs, stains and the graphical background designs used in Japanese newspaper headlines. This noise must be removed before recognition commences, but no preprocessing method is completely accurate. This paper proposes a method for recognizing degraded characters and characters printed on graphical background designs. This method is based on the binary image feature method and uses binary images as features. A new similarity measure, called the complementary similarity measure, is used as a discriminant function. It compares the similarity and dissimilarity of binary patterns with reference dictionary patterns. Experiments are conducted using the standard character database ETL-2, which consists of machine-printed Kanji, Hiragana, Katakana, alphanumeric, and special characters. The results show that this method is much more robust against noise than the conventional geometrical feature method. It also achieves high recognition rates of over 92% for characters with textured foregrounds, over 98% for characters with textured backgrounds, over 98% for outline fonts, and over 99% for reverse contrast characters.
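    One published form of the complementary similarity measure for binary patterns can be sketched as follows; treat the exact normalization as an assumption, since variants exist in the literature:

    ```python
    import numpy as np

    def csm(f, t):
        """Complementary similarity measure between binary vectors f (input)
        and t (template): (a*d - b*c) / sqrt((a + c) * (b + d)), where
        a, b, c, d count the four match/mismatch cases. One published form;
        the exact normalization should be treated as an assumption."""
        f, t = np.asarray(f, dtype=bool), np.asarray(t, dtype=bool)
        a = int(np.sum(f & t))      # foreground in both
        b = int(np.sum(f & ~t))     # foreground in input only
        c = int(np.sum(~f & t))     # foreground in template only
        d = int(np.sum(~f & ~t))    # background in both
        return (a * d - b * c) / np.sqrt(float((a + c) * (b + d)))
    ```

    Because the score uses background agreement (d) as well as foreground agreement (a), it degrades gracefully when textured backgrounds corrupt foreground pixels.
    
    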

  13. Novel algorithm by low complexity filter on retinal vessel segmentation

    NASA Astrophysics Data System (ADS)

    Rostampour, Samad

    2011-10-01

    This article shows a new method to detect blood vessels in the retina from digital images. Retinal vessel segmentation is important for detecting side effects of diabetic disease, because diabetes can form new capillaries which are very brittle. The research has been done in two phases: preprocessing and processing. The preprocessing phase applies a new filter that produces a suitable output: it shows vessels in dark color on a white background and makes a good distinction between vessels and background. Its complexity is very low and extraneous image content is eliminated. The second phase, processing, uses a Bayesian method, a supervised classification approach that uses the mean and variance of pixel intensities to calculate class probabilities. Finally, the pixels of the image are divided into two classes: vessels and background. The images used come from the DRIVE database. After performing this operation, the calculation gives an average efficiency of 95 percent. The method was also applied to a sample outside the DRIVE database that exhibits retinopathy, and good results were obtained.
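    The per-pixel Bayesian step this abstract describes (class means and variances feeding a probability comparison) can be sketched as a two-class Gaussian classifier; the prior and the statistics below are illustrative:

    ```python
    import numpy as np

    def classify(intensity, vessel_stats, bg_stats, p_vessel=0.1):
        """Two-class Bayes rule with Gaussian likelihoods built from class
        (mean, variance) pairs; returns True for 'vessel'. The prior and the
        Gaussian model are assumptions in the spirit of the abstract."""
        def loglike(x, stats):
            m, v = stats
            return -0.5 * np.log(2.0 * np.pi * v) - (x - m) ** 2 / (2.0 * v)
        lv = loglike(intensity, vessel_stats) + np.log(p_vessel)
        lb = loglike(intensity, bg_stats) + np.log(1.0 - p_vessel)
        return bool(lv > lb)
    ```

    In practice the (mean, variance) pairs would be estimated from labeled vessel and background pixels of training images.
    
    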

  14. Evaluation of background parenchymal enhancement on breast MRI: a systematic review

    PubMed Central

    Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto

    2017-01-01

    Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching are “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. The studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: A wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed. PMID:27925480

  15. Hybrid active contour model for inhomogeneous image segmentation with background estimation

    NASA Astrophysics Data System (ADS)

    Sun, Kaiqiong; Li, Yaqin; Zeng, Shan; Wang, Jun

    2018-03-01

    This paper proposes a hybrid active contour model for inhomogeneous image segmentation. The data term of the energy function in the active contour consists of a global region fitting term in a difference image and a local region fitting term in the original image. The difference image is obtained by subtracting the background from the original image. The background image is dynamically estimated from a linear filtered result of the original image on the basis of the varying curve locations during the active contour evolution process. As in existing local models, fitting the image to local region information makes the proposed model robust against an inhomogeneous background and maintains the accuracy of the segmentation result. Furthermore, fitting the difference image to the global region information makes the proposed model robust against the initial contour location, unlike existing local models. Experimental results show that the proposed model can obtain improved segmentation results compared with related methods in terms of both segmentation accuracy and initial contour sensitivity.
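    The background-estimation step described above (linear filtering of the original image, then subtraction to form the difference image) can be sketched as follows, with a simple box-mean filter standing in for the unspecified linear filter and an assumed window size:

    ```python
    import numpy as np

    def box_mean(img, w):
        """Simple linear (box-mean) filter with edge padding."""
        p = w // 2
        padded = np.pad(img, p, mode='edge')
        out = np.zeros(img.shape, dtype=np.float64)
        for dy in range(w):
            for dx in range(w):
                out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return out / (w * w)

    def difference_image(img, window=15):
        """Estimate a smooth background with a linear filter and subtract it,
        forming the difference image fed to the global region fitting term.
        The filter choice and window size are assumptions."""
        background = box_mean(np.asarray(img, dtype=np.float64), window)
        return img - background, background
    ```

    In the paper's scheme the background estimate is updated dynamically as the contour evolves; this static version shows only the filtering/subtraction idea.
    
    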

  16. Improvement of Accuracy for Background Noise Estimation Method Based on TPE-AE

    NASA Astrophysics Data System (ADS)

    Itai, Akitoshi; Yasukawa, Hiroshi

    This paper proposes a method of background noise estimation based on the tensor product expansion with a median and a Monte Carlo simulation. We have previously shown that a tensor product expansion with an absolute-error criterion is effective for estimating background noise; however, the conventional method does not always estimate the background properly. In this paper, it is shown that the estimation accuracy can be improved by using the proposed methods.

  17. Background derivation and image flattening: getimages

    NASA Astrophysics Data System (ADS)

    Men'shchikov, A.

    2017-11-01

    Modern high-resolution images obtained with space observatories display extremely strong intensity variations across images on all spatial scales. Source extraction in such images with methods based on global thresholding may bring unacceptably large numbers of spurious sources in bright areas while failing to detect sources in low-background or low-noise areas. It would be highly beneficial to subtract background and equalize the levels of small-scale fluctuations in the images before extracting sources or filaments. This paper describes getimages, a new method of background derivation and image flattening. It is based on median filtering with sliding windows that correspond to a range of spatial scales from the observational beam size up to a maximum structure width Xλ. The latter is the single free parameter of getimages and can be evaluated manually from the observed image Iλ. The median filtering algorithm provides a background image B̃λ for structures of all widths below Xλ. The same median filtering procedure applied to an image of standard deviations Dλ, derived from the background-subtracted image S̃λ, results in a flattening image F̃λ. Finally, a flattened detection image IλD = S̃λ/F̃λ is computed, whose standard deviations are uniform outside sources and filaments. Detecting sources in such greatly simplified images results in much cleaner extractions that are more complete and reliable. As a bonus, getimages reduces various observational and map-making artifacts and equalizes noise levels between independent tiles of mosaicked images.
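    The getimages scheme lends itself to a compact sketch: median-filter to get the background, subtract, estimate local fluctuation levels, and divide. The window size and the use of a median of absolute residuals as a stand-in for the local standard-deviation image are assumptions:

    ```python
    import numpy as np

    def window_median(img, w):
        """Sliding-window median via stacked shifted views (edge-padded)."""
        p = w // 2
        padded = np.pad(img, p, mode='edge')
        shifts = [padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                  for dy in range(w) for dx in range(w)]
        return np.median(np.stack(shifts), axis=0)

    def flatten_image(img, max_width=9):
        """Sketch of the getimages idea: B is the median-filtered background,
        S the background-subtracted image, F the flattening image (here a
        median of |S|, a crude proxy for the local standard deviations),
        and the return value the flattened detection image S/F."""
        B = window_median(img, max_width)                  # background image
        S = img - B                                        # background-subtracted
        F = window_median(np.abs(S), max_width) + 1e-9     # flattening image
        return S / F                                       # flattened detection image
    ```

    The real method sweeps windows over a range of scales up to Xλ rather than using a single width.
    
    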

  18. Research on the algorithm of infrared target detection based on the frame difference and background subtraction method

    NASA Astrophysics Data System (ADS)

    Liu, Yun; Zhao, Yuejin; Liu, Ming; Dong, Liquan; Hui, Mei; Liu, Xiaohua; Wu, Yijian

    2015-09-01

    As an important branch of infrared imaging technology, infrared target tracking and detection has a very important scientific value and a wide range of applications in both military and civilian areas. For infrared images, which are characterized by low SNR and serious disturbance from background noise, an innovative and effective target detection algorithm is proposed in this paper, exploiting the frame-to-frame correlation of the moving target and the irrelevance of noise in sequential images, based on OpenCV. Firstly, since temporal differencing and background subtraction are very complementary, we use a combined detection method of frame difference and background subtraction based on adaptive background updating. Results indicate that it is simple and can extract the foreground moving target from the video sequence stably. Because the background-updating mechanism continuously updates each pixel, we can detect the infrared moving target more accurately. It paves the way for eventually realizing real-time infrared target detection and tracking, when transplanting the OpenCV algorithms to the DSP platform. Afterwards, we use an optimal thresholding algorithm to segment the image. It transforms the gray images into binary images in order to provide a better condition for detection in the image sequences. Finally, using the relevance of moving objects between different frames and mathematical morphology processing, we can eliminate noise, reduce spurious areas, and smooth region boundaries. Experimental results prove that our algorithm achieves the purpose of rapid detection of small infrared targets.
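    The combined frame-difference and background-subtraction detection with adaptive background updating can be sketched as a running-average background plus two thresholded masks; alpha and the threshold are illustrative values:

    ```python
    import numpy as np

    def detect_moving(frames, alpha=0.05, thresh=25.0):
        """Fuse frame differencing with background subtraction over a
        running-average background (the adaptive update). A pixel is flagged
        only when both cues agree; alpha and thresh are assumed constants."""
        bg = np.asarray(frames[0], dtype=np.float64)
        prev = bg.copy()
        masks = []
        for f in frames[1:]:
            f = np.asarray(f, dtype=np.float64)
            diff_mask = np.abs(f - prev) > thresh      # frame difference cue
            bg_mask = np.abs(f - bg) > thresh          # background subtraction cue
            masks.append(diff_mask & bg_mask)          # complementary fusion
            bg = alpha * f + (1.0 - alpha) * bg        # adaptive background update
            prev = f
        return masks
    ```

    Thresholding and morphological cleanup, as in the abstract, would follow on each mask.
    
    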

  19. Site Amplification Characteristics of the Several Seismic Stations at Jeju Island, in Korea, using S-wave Energy, Background Noise, and Coda waves from the East Japan earthquake (Mar. 11th, 2011) Series.

    NASA Astrophysics Data System (ADS)

    Seong-hwa, Y.; Wee, S.; Kim, J.

    2016-12-01

    Observed ground motions are composed of 3 main factors: seismic source, seismic wave attenuation and site amplification. Among them, site amplification is also an important factor and should be considered to estimate soil-structure dynamic interaction with more reliability. Though various estimation methods have been suggested, this study used the method of Castro et al. (1997) for estimating site amplification. This method has recently been extended to background noise, coda waves and S waves for estimating site amplification. This study applied the Castro et al. (1997) method to 3 different seismic wave types, that is, S-wave energy, background noise, and coda waves. This study analysed more than 200 ground motions (acceleration type) from the East Japan earthquake (March 11th, 2011) series recorded at seismic stations on Jeju Island (JJU, SGP, HALB, SSP and GOS; Fig. 1), in Korea. The results showed that most of the seismic stations gave similar results among the three types of seismic energy. Each station showed its own site amplification characteristics in low, high and specific resonance frequency ranges. Comparison of this study to other studies can give us much information about the dynamic amplification characteristics of domestic sites and site classification.

  20. Reducing DRIFT backgrounds with a submicron aluminized-mylar cathode

    NASA Astrophysics Data System (ADS)

    Battat, J. B. R.; Daw, E.; Dorofeev, A.; Ezeribe, A. C.; Fox, J. R.; Gauvreau, J.-L.; Gold, M.; Harmon, L.; Harton, J.; Lafler, R.; Landers, J.; Lauer, R. J.; Lee, E. R.; Loomba, D.; Lumnah, A.; Matthews, J.; Miller, E. H.; Mouton, F.; Murphy, A. St. J.; Paling, S. M.; Phan, N.; Sadler, S. W.; Scarff, A.; Schuckman, F. G.; Snowden-Ifft, D.; Spooner, N. J. C.; Walker, D.

    2015-09-01

    Background events in the DRIFT-IId dark matter detector, mimicking potential WIMP signals, are predominantly caused by alpha decays on the central cathode in which the alpha particle is completely or partially absorbed by the cathode material. We installed a 0.9 μm thick aluminized-mylar cathode as a way to reduce the probability of producing these backgrounds. We study three generations of cathode (wire, thin-film, and radiologically clean thin-film) with a focus on the ratio of background events to alpha decays. Two independent methods of measuring the absolute alpha decay rate are used to ensure an accurate result, and agree to within 10%. Using alpha range spectroscopy, we measure the radiologically cleanest cathode version to have a contamination of 3.3±0.1 ppt 234U and 73±2 ppb 238U. This cathode reduces the probability of producing an RPR from an alpha decay by a factor of 70±20 compared to the original stainless steel wire cathode. First results are presented from a texturized version of the cathode, intended to be even more transparent to alpha particles. These efforts, along with other background reduction measures, have resulted in a drop in the observed background rate from 500/day to 1/day. With the recent implementation of full-volume fiducialization, these remaining background events are identified, allowing for background-free operation.

  1. Spectral feature characterization methods for blood stain detection in crime scene backgrounds

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Mathew, Jobin J.; Dube, Roger R.; Messinger, David W.

    2016-05-01

    Blood stains are one of the most important types of evidence for forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Blood spectral signatures containing unique reflectance or absorption features are important both for forensic on-site investigation and laboratory testing. They can be used for target detection and identification applied to crime scene hyperspectral imagery, and also be utilized to analyze the spectral variation of blood on various backgrounds. Non-blood stains often mislead the detection and can generate false alarms at a real crime scene, especially for dark and red backgrounds. This paper measured the reflectance of liquid blood and 9 kinds of non-blood samples in the range of 350 nm - 2500 nm in various crime scene backgrounds, such as pure samples contained in petri dishes with various thicknesses, mixed samples with fabrics of different colors and materials, and mixed samples with wood, all of which are examined to provide sub-visual evidence for detecting and recognizing blood from non-blood samples in a realistic crime scene. The spectral differences between blood and non-blood samples are examined, and spectral features such as "peaks" and "depths" of reflectance are selected. Two blood stain detection methods are proposed in this paper. The first method uses an index defined as the ratio ("depth" - "peak") / ("depth" + "peak") within a wavelength range of the reflectance spectrum. The second method uses the relative band depth of selected wavelength ranges of the reflectance spectrum. Results show that the index method is able to discriminate blood from non-blood samples in most tested crime scene backgrounds, but is not able to detect it on black felt, whereas the relative-band-depth method is able to discriminate blood from non-blood samples on all of the tested background material types and colors.
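    The index method reduces to a normalized difference between the selected "depth" and "peak" features; the band limits below are placeholders, not the paper's wavelength ranges:

    ```python
    import numpy as np

    def blood_index(reflectance, wavelengths, peak_band, depth_band):
        """Normalized-difference index (depth - peak) / (depth + peak)
        computed from a reflectance spectrum. The 'peak' is the maximum
        reflectance in peak_band and the 'depth' the minimum in depth_band;
        the band choices are illustrative placeholders."""
        peak = reflectance[(wavelengths >= peak_band[0]) & (wavelengths <= peak_band[1])].max()
        depth = reflectance[(wavelengths >= depth_band[0]) & (wavelengths <= depth_band[1])].min()
        return (depth - peak) / (depth + peak)
    ```

    A spectrum with a strong absorption dip yields a strongly negative index, which is what separates blood from most non-blood materials in the study.
    
    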

  2. Infrared dim small target segmentation method based on ALI-PCNN model

    NASA Astrophysics Data System (ADS)

    Zhao, Shangnan; Song, Yong; Zhao, Yufei; Li, Yun; Li, Xu; Jiang, Yurong; Li, Lin

    2017-10-01

    Pulse Coupled Neural Network (PCNN) is improved by Adaptive Lateral Inhibition (ALI), and a method of infrared (IR) dim small target segmentation based on the ALI-PCNN model is proposed in this paper. Firstly, the feeding input signal is modulated by a lateral inhibition network to suppress background. Then, the linking input is modulated by ALI, and the linking weight matrix is generated adaptively by calculating the ALI coefficient of each pixel. Finally, the binary image is generated through the nonlinear modulation and the pulse generator in the PCNN. The experimental results show that the segmentation effect, as well as the values of contrast across region and uniformity across region, of the proposed method are better than those of the Otsu method, the maximum entropy method, and methods based on conventional PCNN and visual attention, and the proposed method has excellent performance in extracting IR dim small targets from complex backgrounds.
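    A generic PCNN iteration (without the paper's adaptive-lateral-inhibition extension) can be sketched as follows; all constants and the linking kernel are illustrative:

    ```python
    import numpy as np

    def pcnn_segment(img, beta=0.2, v_theta=20.0, a_theta=0.3, iters=10):
        """Minimal pulse-coupled neural network sketch: each neuron's
        internal activity U = S * (1 + beta * L) is compared to a decaying
        dynamic threshold; firing raises the threshold sharply. Returns the
        accumulated firing map. Constants are assumed, not the paper's."""
        S = np.asarray(img, dtype=np.float64)
        Y = np.zeros_like(S)                  # pulses
        theta = np.full_like(S, S.max())      # dynamic threshold
        fired = np.zeros_like(S)
        kernel = np.array([[0.5, 1.0, 0.5], [1.0, 0.0, 1.0], [0.5, 1.0, 0.5]])
        for _ in range(iters):
            # linking input: weighted sum of neighboring pulses
            L = np.zeros_like(S)
            padded = np.pad(Y, 1)
            for dy in range(3):
                for dx in range(3):
                    L += kernel[dy, dx] * padded[dy:dy + S.shape[0], dx:dx + S.shape[1]]
            U = S * (1.0 + beta * L)          # nonlinear modulation
            Y = (U > theta).astype(np.float64)
            theta = np.exp(-a_theta) * theta + v_theta * Y
            fired += Y
        return fired
    ```

    Bright, compact targets fire earlier and more often than the background, which is the property the segmentation exploits; the paper's ALI step additionally shapes beta and the kernel per pixel.
    
    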

  3. Recursive least squares background prediction of univariate syndromic surveillance data.

    PubMed

    Najmi, Amir-Homayoon; Burkom, Howard

    2009-01-16

    Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve the performance of detecting outbreaks by using a background forecasting algorithm based on the adaptive recursive least squares (RLS) method combined with a novel treatment of the day-of-the-week effect. Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which a prediction and detection component of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. We present detection results in the form of Receiver Operating Characteristic curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies. This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm applied to the residuals of the RLS forecasts. We compare the detection performance of this algorithm to the W2 method recently implemented at CDC. The modified RLS method gives consistently better sensitivity at multiple background alert rates, and we recommend that it be considered for routine application in biosurveillance systems.
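    The adaptive RLS background forecast at the core of this approach can be sketched as a standard exponentially weighted RLS predictor over the last few daily counts; the order, forgetting factor and initialization below are generic choices, not the paper's configuration:

    ```python
    import numpy as np

    def rls_forecast(series, order=7, lam=0.98, delta=100.0):
        """One-step-ahead adaptive recursive least squares prediction of a
        daily count series. lam is the forgetting factor that handles
        non-stationarity; delta initializes the inverse correlation matrix.
        A generic RLS sketch, not the paper's exact filter."""
        w = np.zeros(order)
        P = np.eye(order) * delta
        preds = np.zeros(len(series))
        for n in range(order, len(series)):
            x = np.asarray(series[n - order:n], dtype=np.float64)[::-1]
            preds[n] = w @ x                              # background estimate
            e = series[n] - preds[n]                      # a priori error
            k = P @ x / (lam + x @ P @ x)                 # gain vector
            w = w + k * e                                 # weight update
            P = (P - np.outer(k, x @ P)) / lam            # inverse-correlation update
        return preds
    ```

    In the paper, the counts are first transformed to correct for the day-of-the-week effect, and detection thresholds are applied to the forecast residuals.
    
    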

  4. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications

    PubMed Central

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-01-01

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC. PMID:29280970

  5. Background Registration-Based Adaptive Noise Filtering of LWIR/MWIR Imaging Sensors for UAV Applications.

    PubMed

    Kim, Byeong Hak; Kim, Min Young; Chae, You Seong

    2017-12-27

    Unmanned aerial vehicles (UAVs) are equipped with optical systems including an infrared (IR) camera such as electro-optical IR (EO/IR), target acquisition and designation sights (TADS), or forward looking IR (FLIR). However, images obtained from IR cameras are subject to noise such as dead pixels, lines, and fixed pattern noise. Nonuniformity correction (NUC) is a widely employed method to reduce noise in IR images, but it has limitations in removing noise that occurs during operation. Methods have been proposed to overcome the limitations of the NUC method, such as two-point correction (TPC) and scene-based NUC (SBNUC). However, these methods still suffer from unfixed pattern noise. In this paper, a background registration-based adaptive noise filtering (BRANF) method is proposed to overcome the limitations of conventional methods. The proposed BRANF method utilizes background registration processing and robust principal component analysis (RPCA). In addition, image quality verification methods are proposed that can measure the noise filtering performance quantitatively without ground truth images. Experiments were performed for performance verification with middle wave infrared (MWIR) and long wave infrared (LWIR) images obtained from practical military optical systems. As a result, it is found that the image quality improvement rate of BRANF is 30% higher than that of conventional NUC.

  6. WDR-PK-AK-018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollister, R

    2009-08-26

    Method - CES SOP-HW-P556 'Field and Bulk Gamma Analysis'. Detector - High-purity germanium, 40% relative efficiency. Calibration - The detector was calibrated on February 8, 2006 using a NIST-traceable sealed source, and the calibration was verified using an independent sealed source. Count Time and Geometry - The sample was counted for 20 minutes at 72 inches from the detector. A lead collimator was used to limit the field-of-view to the region of the sample. The drum was rotated 180 degrees halfway through the count time. Date and Location of Scans - June 1, 2006 in Building 235 Room 1136. Spectral Analysis - Spectra were analyzed with ORTEC GammaVision software. Matrix and geometry corrections were calculated using ORTEC Isotopic software. A background spectrum was measured at the counting location. No man-made radioactivity was observed in the background. Results were determined from the sample spectra without background subtraction. Minimum detectable activities were calculated by the Nureg 4.16 method. Results - Detected Pu-238, Pu-239, Am-241 and Am-243.

  7. Writer identification on historical Glagolitic documents

    NASA Astrophysics Data System (ADS)

    Fiel, Stefan; Hollaus, Fabian; Gau, Melanie; Sablatnig, Robert

    2013-12-01

    This work aims at automatically identifying the scribes of historical Slavonic manuscripts. The quality of the ancient documents is partially degraded by faded-out ink or varying background. The writer identification method is based on local image features, described with the Scale Invariant Feature Transform (SIFT), and a visual vocabulary is used for the description of handwriting characteristics, whereby the features are clustered using a Gaussian mixture model and encoded with the Fisher kernel. The writer identification approach was originally designed for grayscale images of modern handwriting. Contrary to modern documents, however, the historical manuscripts are partially corrupted by background clutter and water stains; as a result, SIFT features are also found on the background. Since the method also shows good results on binarized images of modern handwriting, the approach was additionally applied to binarized images of the ancient writings. Experiments show that this preprocessing step leads to a significant performance increase: the identification rate on binarized images is 98.9%, compared to an identification rate of 87.6% on grayscale images.

  8. A baseline drift detrending technique for fast scan cyclic voltammetry.

    PubMed

    DeWaele, Mark; Oh, Yoonbae; Park, Cheonho; Kang, Yu Min; Shin, Hojin; Blaha, Charles D; Bennet, Kevin E; Kim, In Young; Lee, Kendall H; Jang, Dong Pyo

    2017-11-06

    Fast scan cyclic voltammetry (FSCV) has been commonly used to measure extracellular neurotransmitter concentrations in the brain. Due to the unstable nature of the background currents inherent in FSCV measurements, analysis of FSCV data is limited to very short periods of time when traditional background subtraction is used. In this paper, we propose the use of a zero-phase high pass filter (HPF) as a means to remove the background drift. Instead of the traditional method of low pass filtering across voltammograms to increase the signal-to-noise ratio, a HPF with a low cutoff frequency was applied to the temporal dataset at each voltage point to remove the background drift. HPFs with cutoff frequencies between 0.001 Hz and 0.01 Hz could be applied effectively to FSCV data to remove drifting patterns while preserving the temporal kinetics of the phasic dopamine response recorded in vivo. In addition, the HPF was found to be significantly more effective in reducing drift (unpaired t-test, p < 0.0001, t = 10.88) than a drift removal method based on principal component analysis when applied to data collected from Tris buffer over 24 hours, although the principal component method also reduced background drift effectively. The HPF was also applied to 5 hours of FSCV in vivo data: electrically evoked dopamine peaks, observed in the nucleus accumbens, were clearly visible even without background subtraction. This technique provides a new, simple, and yet robust approach to analyse FSCV data with an unstable background.
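    The drift removal idea can be sketched with SciPy: design a Butterworth high-pass filter with a very low cutoff and apply it with `filtfilt`, which filters forward and then backward so the result has zero phase distortion. The sampling rate and cutoff below are placeholder values within the 0.001-0.01 Hz range mentioned above, not the paper's exact settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def remove_drift(data, fs, cutoff=0.005, order=2):
    """Zero-phase high-pass filtering along the time axis: removes slow
    background drift while preserving faster signal kinetics.

    data   : signal sampled at fs Hz (time along the last axis)
    cutoff : high-pass cutoff in Hz (assumed illustrative value)
    """
    b, a = butter(order, cutoff, btype='highpass', fs=fs)
    # filtfilt runs the filter forward and backward -> zero phase shift,
    # so transient peak timing is not distorted
    return filtfilt(b, a, data, axis=-1)
```

In FSCV this filter would be applied to the temporal trace at each voltage point of the voltammogram independently; note that with such low cutoffs the filter settles slowly, so edge regions of a recording should be treated with care.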

  9. A Mixed-Methods Comparison of Classroom Context during Food, Health & Choices, a Childhood Obesity Prevention Intervention

    ERIC Educational Resources Information Center

    Burgermaster, Marissa; Koroly, Jenna; Contento, Isobel; Koch, Pamela; Gray, Heewon L.

    2017-01-01

    Background: Schools are frequent settings for childhood obesity prevention; however, intervention results are mixed. Classroom context may hold important clues to improving these interventions. Methods: We used mixed methods to examine classroom context during a curriculum intervention taught by trained instructors in fifth grade classrooms. We…

  10. Environmental Asbestos Assessment Manual Superfund Method for the Determination of Asbestos in Ambient Air Part 2: Technical Background Document

    EPA Science Inventory

    A sampling and analysis method for the determination of asbestos in air is presented in Part 1 of this report, under separate cover. This method is designed specifically to provide results suitable for supporting risk assessments at Superfund sites, although it is applicable t...

  11. Phosphorus Determination by Derivative Activation Analysis: A Multifaceted Radiochemical Application.

    ERIC Educational Resources Information Center

    Kleppinger, E. W.; And Others

    1984-01-01

    Although determination of phosphorus is important in biology, physiology, and environmental science, traditional gravimetric and colorimetric methods are cumbersome and lack the requisite sensitivity. Therefore, a derivative activation analysis method is suggested. Background information, procedures, and results are provided. (JN)

  12. Discrete Element Method (DEM) Simulations using PFC3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matt Evans

    Contains input scripts, background information, reduced data, and results associated with the discrete element method (DEM) simulations of interface shear tests, plate anchor pullout tests, and torpedo anchor installation and pullout tests, using the software PFC3D (v4.0).

  13. Recognized Leader in Electrochemical Purification

    ScienceCinema

    Hoppe, Eric

    2018-01-16

    PNNL scientists developed an electrochemical method for purifying copper, a key material that makes possible radiation detection systems of unprecedented sensitivity. The method begins with the purest copper materials available, and results in the lowest-background copper in the world. Chemist Eric Hoppe explains the process.

  14. Recovery of Background Structures in Nanoscale Helium Ion Microscope Imaging

    PubMed Central

    Carasso, Alfred S; Vladár, András E

    2014-01-01

    This paper discusses a two-step enhancement technique applicable to noisy Helium Ion Microscope images in which background structures are not easily discernible due to a weak signal. The method is based on a preliminary adaptive histogram equalization, followed by ‘slow motion’ low-exponent Lévy fractional diffusion smoothing. This combined approach is unexpectedly effective, resulting in a companion enhanced image in which background structures are rendered much more visible, and noise is significantly reduced, all with minimal loss of image sharpness. The method also provides useful enhancements of scanning charged-particle microscopy images obtained by composing multiple drift-corrected ‘fast scan’ frames. The paper includes software routines, written in Interactive Data Language (IDL), that can perform the above image processing tasks. PMID:26601050

  15. Evaluation of the impact of observations on blended sea surface winds in a two-dimensional variational scheme using degrees of freedom

    NASA Astrophysics Data System (ADS)

    Wang, Ting; Xiang, Jie; Fei, Jianfang; Wang, Yi; Liu, Chunxia; Li, Yuanxiang

    2017-12-01

    This paper presents an evaluation of the observational impacts on blended sea surface winds from a two-dimensional variational data assimilation (2D-Var) scheme. We begin by briefly introducing the analysis sensitivity with respect to observations in variational data assimilation systems and its relationship with the degrees of freedom for signal (DFS), and then the DFS concept is applied to the 2D-Var sea surface wind blending scheme. Two methods, a priori and a posteriori, are used to estimate the DFS of the zonal (u) and meridional (v) components of winds in the 2D-Var blending scheme. The a posteriori method obtains almost the same results as the a priori method; because only by-products of the blending scheme are used for the a posteriori method, the computation time is reduced significantly. The magnitude of the DFS is critically related to the observational and background error statistics: changing the observational and background error variances can affect the DFS value. Because the observation error variances are assumed to be uniform, the observational influence at each observational location is related to the background error variance, and observations located where the background error variances are larger have larger influence. The average observational influence of u and v with respect to the analysis is about 40%, implying that the background influence with respect to the analysis is about 60%.
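    For a linear-Gaussian analysis, the a priori DFS is tr(HK) with gain K = B H^T (H B H^T + R)^{-1}, where B and R are the background and observation error covariances and H the observation operator. The toy computation below (matrices chosen purely for illustration, not from the blending scheme) shows how inflating the background error variance raises the observational influence, as the abstract describes.

```python
import numpy as np

def dfs_a_priori(B, R, H):
    """Degrees of freedom for signal: DFS = tr(H K),
    with gain matrix K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return float(np.trace(H @ K))

# toy example: 4 state variables observed directly (H = I)
H = np.eye(4)
R = np.eye(4)                                        # obs error variance 1
dfs_small_b = dfs_a_priori(2.0 * np.eye(4), R, H)    # background variance 2
dfs_large_b = dfs_a_priori(4.0 * np.eye(4), R, H)    # background variance 4
# for B = b*I, R = r*I, H = I the closed form is DFS = n * b / (b + r)
```

DFS is bounded by the number of observations; the ratio DFS/n is the average observational influence on the analysis (about 40% in the paper's blending scheme).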

  16. The Impacts of Heating Strategy on Soil Moisture Estimation Using Actively Heated Fiber Optics.

    PubMed

    Dong, Jianzhi; Agliata, Rosa; Steele-Dunne, Susan; Hoes, Olivier; Bogaard, Thom; Greco, Roberto; van de Giesen, Nick

    2017-09-13

    Several recent studies have highlighted the potential of Actively Heated Fiber Optics (AHFO) for high resolution soil moisture mapping. In AHFO, the soil moisture can be calculated from the cumulative temperature (Tcum), the maximum temperature (Tmax), or the soil thermal conductivity determined from the cooling phase after heating (λ). This study investigates the performance of the Tcum, Tmax and λ methods for different heating strategies, i.e., differences in the duration and input power of the applied heat pulse. The aim is to compare the three approaches and to determine which is best suited to field applications where the power supply is limited. Results show that increasing the input power of the heat pulses makes it easier to differentiate between dry and wet soil conditions, which leads to an improved accuracy. Results suggest that if the power supply is limited, the heating strength is insufficient for the λ method to yield accurate estimates. Generally, the Tcum and Tmax methods have similar accuracy. If the input power is limited, increasing the heat pulse duration can improve the accuracy of the AHFO method for both of these techniques. In particular, extending the heating duration can significantly increase the sensitivity of Tcum to soil moisture. Hence, the Tcum method is recommended when the input power is limited. Finally, results also show that up to 50% of the cable temperature change during the heat pulse can be attributed to soil background temperature, i.e., soil temperature changed by the net solar radiation. A method is proposed to correct this background temperature change. Without correction, soil moisture information can be completely masked by the background temperature error.

  17. The Impacts of Heating Strategy on Soil Moisture Estimation Using Actively Heated Fiber Optics

    PubMed Central

    Dong, Jianzhi; Agliata, Rosa; Steele-Dunne, Susan; Hoes, Olivier; Bogaard, Thom; Greco, Roberto; van de Giesen, Nick

    2017-01-01

    Several recent studies have highlighted the potential of Actively Heated Fiber Optics (AHFO) for high resolution soil moisture mapping. In AHFO, the soil moisture can be calculated from the cumulative temperature (Tcum), the maximum temperature (Tmax), or the soil thermal conductivity determined from the cooling phase after heating (λ). This study investigates the performance of the Tcum, Tmax and λ methods for different heating strategies, i.e., differences in the duration and input power of the applied heat pulse. The aim is to compare the three approaches and to determine which is best suited to field applications where the power supply is limited. Results show that increasing the input power of the heat pulses makes it easier to differentiate between dry and wet soil conditions, which leads to an improved accuracy. Results suggest that if the power supply is limited, the heating strength is insufficient for the λ method to yield accurate estimates. Generally, the Tcum and Tmax methods have similar accuracy. If the input power is limited, increasing the heat pulse duration can improve the accuracy of the AHFO method for both of these techniques. In particular, extending the heating duration can significantly increase the sensitivity of Tcum to soil moisture. Hence, the Tcum method is recommended when the input power is limited. Finally, results also show that up to 50% of the cable temperature change during the heat pulse can be attributed to soil background temperature, i.e., soil temperature changed by the net solar radiation. A method is proposed to correct this background temperature change. Without correction, soil moisture information can be completely masked by the background temperature error. PMID:28902141

  18. Gas leak detection in infrared video with background modeling

    NASA Astrophysics Data System (ADS)

    Zeng, Xiaoxia; Huang, Likun

    2018-03-01

    Background modeling plays an important role in the task of gas detection based on infrared video. The VIBE algorithm has been a widely used background modeling algorithm in recent years; however, its processing speed sometimes cannot meet the requirements of real-time detection applications. Therefore, based on the traditional VIBE algorithm, we propose a fast foreground model and optimize the results by combining the connected domain algorithm and the nine-spaces algorithm in the subsequent processing steps. Experiments show the effectiveness of the proposed method.
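    For reference, a minimal ViBe-style sample-based background model (the classic algorithm this paper starts from, not the authors' accelerated variant) can be written as follows. Sample count, matching radius, and the random subsampling rate are the usual defaults from the ViBe literature; initialization from a single noisy frame is a simplification.

```python
import numpy as np

class ViBeLite:
    """Minimal sketch of a ViBe-style background model for grayscale frames:
    each pixel keeps n_samples past values; a pixel is background if at least
    n_min samples lie within `radius` of the current value."""

    def __init__(self, first_frame, n_samples=20, radius=20, n_min=2, subsample=16):
        h, w = first_frame.shape
        self.n_min, self.radius, self.subsample = n_min, radius, subsample
        self.rng = np.random.default_rng(0)
        # initialize samples as noisy copies of the first frame
        noise = self.rng.integers(-10, 11, size=(n_samples, h, w))
        self.samples = np.clip(first_frame[None].astype(int) + noise, 0, 255)

    def apply(self, frame):
        """Classify pixels of `frame`; returns a boolean foreground mask."""
        close = np.abs(self.samples - frame[None].astype(int)) < self.radius
        bg = close.sum(axis=0) >= self.n_min
        # conservative stochastic update: refresh one random sample at a
        # random subset of background pixels only
        upd = bg & (self.rng.random(bg.shape) < 1.0 / self.subsample)
        idx = self.rng.integers(0, self.samples.shape[0])
        self.samples[idx][upd] = frame[upd]
        return ~bg
```

The per-frame cost of the sample comparisons is what the paper's fast foreground model and post-processing (connected domains) aim to reduce.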

  19. Nonrelativistic trace and diffeomorphism anomalies in particle number background

    NASA Astrophysics Data System (ADS)

    Auzzi, Roberto; Baiguera, Stefano; Nardelli, Giuseppe

    2018-04-01

    Using the heat kernel method, we compute nonrelativistic trace anomalies for Schrödinger theories in flat spacetime, with a generic background gauge field for the particle number symmetry, both for a free scalar and a free fermion. The result is genuinely nonrelativistic, and it has no counterpart in the relativistic case. Contrary to naive expectations, the anomaly is not gauge invariant; this is similar to the nongauge covariance of the non-Abelian relativistic anomaly. We also show that, in the same background, the gravitational anomaly for a nonrelativistic scalar vanishes.

  20. Adaptive nonlinear control for autonomous ground vehicles

    NASA Astrophysics Data System (ADS)

    Black, William S.

    We present the background and motivation for ground vehicle autonomy, with a focus on uses for space exploration. Using a simple design example of an autonomous ground vehicle, we derive the equations of motion. After providing the mathematical background for nonlinear systems and control, we present two common methods for exactly linearizing nonlinear systems: feedback linearization and backstepping. We use these in combination with three adaptive control methods: model reference adaptive control, adaptive sliding mode control, and extremum-seeking model reference adaptive control. We show the performance of each combination through several simulation results. We then consider disturbances in the system, and design nonlinear disturbance observers for both single-input-single-output and multi-input-multi-output systems. Finally, we show the performance of these observers with simulation results.
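    The first of those adaptive methods, model reference adaptive control, can be illustrated on a scalar toy plant. The sketch below assumes a plant xdot = a x + u with unknown a, a first-order reference model, and the standard Lyapunov-based adaptation laws; it is a generic textbook example, not the dissertation's vehicle model.

```python
import numpy as np

def simulate_mrac(a_true=1.0, am=2.0, gamma=5.0, dt=0.001, T=30.0):
    """Scalar model reference adaptive control sketch.

    Plant:     xdot  = a_true * x + u          (a_true unknown to the controller)
    Reference: xmdot = -am * xm + am * r
    Control:   u = th_x * x + th_r * r, gains adapted so x tracks xm.
    Returns the tracking error e(t) = x - xm over the simulation.
    """
    n = int(T / dt)
    x = xm = 0.0
    th_x = th_r = 0.0            # adaptive gains (ideal: -(a_true+am), am)
    errs = np.zeros(n)
    for k in range(n):
        t = k * dt
        r = 1.0 if np.sin(0.5 * t) >= 0.0 else -1.0   # square-wave reference
        u = th_x * x + th_r * r
        e = x - xm
        errs[k] = e
        # Euler integration of plant, reference model, and adaptation laws
        x += dt * (a_true * x + u)
        xm += dt * (-am * xm + am * r)
        th_x -= dt * gamma * e * x   # Lyapunov adaptation: th_x_dot = -gamma*e*x
        th_r -= dt * gamma * e * r   #                      th_r_dot = -gamma*e*r
    return errs
```

With the Lyapunov function V = e²/2 + (θ̃x² + θ̃r²)/(2γ), these laws give V̇ = -am·e² ≤ 0, so the tracking error is bounded and decays; the persistently exciting square-wave reference also drives the gains toward their ideal values.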

  1. [The validation of the effect of correcting spectral background changes based on floating reference method by simulation].

    PubMed

    Wang, Zhu-lou; Zhang, Wan-jie; Li, Chen-xi; Chen, Wen-liang; Xu, Ke-xin

    2015-02-01

    There are several challenges in near-infrared non-invasive blood glucose measurement, such as the low signal-to-noise ratio of the instrument, unstable measurement conditions, and unpredictable, irregular changes in the measured object. It is therefore difficult to extract blood glucose concentration information accurately from the complicated signals. A reference measurement is usually considered as a means to eliminate the effect of background changes, but there is no reference substance that changes synchronously with the analyte. After many years of research, our group has proposed the floating reference method, which succeeds in eliminating the spectral effects induced by instrument drift and by background variations of the measured object. Our studies indicate, however, that the reference point changes with measurement location and wavelength; the effectiveness of the floating reference method should therefore be verified comprehensively. In this paper, for simplicity, a Monte Carlo simulation employing Intralipid solutions with concentrations of 5% and 10% is performed to verify the ability of the floating reference method to eliminate the effects of light source drift. The light source drift is introduced by varying the incident photon number. The effectiveness of the floating reference method, with corresponding reference points at different wavelengths, in eliminating the variations caused by light source drift is estimated. A comparison of the prediction abilities of calibration models with and without the method shows that the RMSEPs are decreased by about 98.57% (5% Intralipid) and 99.36% (10% Intralipid). The results indicate that the floating reference method is clearly effective in eliminating background changes.

  2. On consistent inter-view synthesis for autostereoscopic displays

    NASA Astrophysics Data System (ADS)

    Tran, Lam C.; Bal, Can; Pal, Christopher J.; Nguyen, Truong Q.

    2012-03-01

    In this paper we present a novel stereo view synthesis algorithm that is highly accurate with respect to inter-view consistency, thus enabling stereo content to be viewed on autostereoscopic displays. The algorithm finds identical occluded regions within each virtual view and aligns them together to extract a surrounding background layer. The background layer for each occluded region is then used with an exemplar-based inpainting method to synthesize all virtual views simultaneously. Our algorithm requires the alignment and extraction of background layers for each occluded region; however, these two steps are done efficiently, with lower computational complexity than previous approaches using exemplar-based inpainting algorithms. It is thus more efficient than existing algorithms that synthesize one virtual view at a time. This paper also describes a simplified GPU-accelerated version of the approach and its implementation in CUDA. Our CUDA method has sublinear complexity in the number of views that need to be generated, which makes it especially useful for generating content for autostereoscopic displays that require many views to operate. An objective of our work is to allow the user to change depth and viewing perspective on the fly. Therefore, to further accelerate the CUDA variant of our approach, we present a modified version of our method that warps the background pixels from the reference views to a middle view to recover background pixels. We then use an exemplar-based inpainting method to fill in the occluded regions, and warp the foreground from the reference images and the background from the filled regions to synthesize new virtual views on the fly. Our experimental results indicate that the simplified CUDA implementation decreases running time by orders of magnitude with negligible loss in quality.

  3. Dual-wavelength excitation to reduce background fluorescence for fluorescence spectroscopic quantitation of erythrocyte zinc protoporphyrin-IX and protoporphyrin-IX from whole blood and oral mucosa

    NASA Astrophysics Data System (ADS)

    Hennig, Georg; Vogeser, Michael; Holdt, Lesca M.; Homann, Christian; Großmann, Michael; Stepp, Herbert; Gruber, Christian; Erdogan, Ilknur; Hasmüller, Stephan; Hasbargen, Uwe; Brittenham, Gary M.

    2014-02-01

    Erythrocyte zinc protoporphyrin-IX (ZnPP) and protoporphyrin-IX (PPIX) accumulate in a variety of disorders that restrict or disrupt the biosynthesis of heme, including iron deficiency and various porphyrias. We describe a reagent-free spectroscopic method based on dual-wavelength excitation that can measure simultaneously both ZnPP and PPIX fluorescence from unwashed whole blood while virtually eliminating background fluorescence. We further aim to quantify ZnPP and PPIX non-invasively from the intact oral mucosa using dual-wavelength excitation to reduce the strong tissue background fluorescence while retaining the faint porphyrin fluorescence signal originating from erythrocytes. Fluorescence spectroscopic measurements were made on 35 diluted EDTA blood samples using a custom front-face fluorometer. The difference spectrum between fluorescence at 425 nm and 407 nm excitation effectively eliminated background autofluorescence while retaining the characteristic porphyrin peaks. These peaks were evaluated quantitatively and the results compared to a reference HPLC-kit method. A modified instrument using a single 1000 μm fiber for light delivery and detection was used to record fluorescence spectra from oral mucosa. For blood measurements, the ZnPP and PPIX fluorescence intensities from the difference spectra correlated well with the reference method (ZnPP: Spearman's rho rs = 0.943, p < 0.0001; PPIX: rs = 0.959, p < 0.0001). In difference spectra from oral mucosa, background fluorescence was reduced significantly, while porphyrin signals remained observable. The dual-wavelength excitation method evaluates quantitatively the ZnPP/heme and PPIX/heme ratios from unwashed whole blood, simplifying clinical laboratory measurements. The difference technique reduces the background fluorescence from measurements on oral mucosa, allowing for future non-invasive quantitation of erythrocyte ZnPP and PPIX.
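    The dual-wavelength idea reduces to a spectral subtraction: if the background autofluorescence is (approximately) the same at both excitation wavelengths while the porphyrin emission is not, the difference spectrum cancels the background and keeps the porphyrin peaks. The toy numerical illustration below uses entirely hypothetical Gaussian components and excitation coefficients; only the emission peak positions (~593 nm for ZnPP, ~626 nm for PPIX) follow the abstract's context.

```python
import numpy as np

wl = np.linspace(550, 650, 501)   # emission wavelength grid (nm)

def gauss(center, width, amp):
    return amp * np.exp(-0.5 * ((wl - center) / width) ** 2)

# hypothetical spectral components (shapes and amplitudes are illustrative)
background = gauss(600.0, 60.0, 1.0)   # broad autofluorescence
znpp = gauss(593.0, 5.0, 1.0)          # ZnPP emission peak near 593 nm
ppix = gauss(626.0, 6.0, 1.0)          # PPIX emission peak near 626 nm

# assumed: porphyrins respond more strongly to 425 nm excitation than 407 nm,
# while the background autofluorescence is nearly identical at both
s_425 = background + 0.8 * znpp + 0.6 * ppix
s_407 = background + 0.1 * znpp + 0.1 * ppix

diff = s_425 - s_407   # background cancels; porphyrin peaks remain
```

In practice the cancellation is only approximate, so the residual background sets the floor for how faint a porphyrin signal can still be quantified, which is the crux of the oral mucosa measurements.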

  4. A Series of Case Studies of Tinnitus Suppression With Mixed Background Stimuli in a Cochlear Implant

    PubMed Central

    Keiner, A. J.; Walker, Kurt; Deshpande, Aniruddha K.; Witt, Shelley; Killian, Matthijs; Ji, Helena; Patrick, Jim; Dillier, Norbert; van Dijk, Pim; Lai, Wai Kong; Hansen, Marlan R.; Gantz, Bruce

    2015-01-01

    Purpose Background sounds provided by a wearable sound playback device were mixed with the acoustical input picked up by a cochlear implant speech processor in an attempt to suppress tinnitus. Method First, patients were allowed to listen to several sounds and to select up to 4 sounds that they thought might be effective. These stimuli were programmed to loop continuously in the wearable playback device. Second, subjects were instructed to use 1 background sound each day on the wearable device, and they sequenced the selected background sounds during a 28-day trial. Patients were instructed to go to a website at the end of each day and rate the loudness and annoyance of the tinnitus as well as the acceptability of the background sound. Patients completed the Tinnitus Primary Function Questionnaire (Tyler, Stocking, Secor, & Slattery, 2014) at the beginning of the trial. Results Results indicated that background sounds were very effective at suppressing tinnitus. There was considerable variability in sounds preferred by the subjects. Conclusion The study shows that a background sound mixed with the microphone input can be effective for suppressing tinnitus during daily use of the sound processor in selected cochlear implant users. PMID:26001407

  5. CRISPR/Cas9 Editing of the Bacillus subtilis Genome

    PubMed Central

    Burby, Peter E.; Simmons, Lyle A.

    2017-01-01

    A fundamental procedure for most modern biologists is the genetic manipulation of the organism under study. Although many different methods for editing bacterial genomes have been used in laboratories for decades, the adaptation of CRISPR/Cas9 technology to bacterial genetics has allowed researchers to manipulate bacterial genomes with unparalleled facility. CRISPR/Cas9 has allowed for genome edits to be more precise, while also increasing the efficiency of transferring mutations into a variety of genetic backgrounds. As a result, the advantages are realized in tractable organisms and organisms that have been refractory to genetic manipulation. Here, we describe our method for editing the genome of the bacterium Bacillus subtilis. Our method is highly efficient, resulting in precise, markerless mutations. Further, after generating the editing plasmid, the mutation can be quickly introduced into several genetic backgrounds, greatly increasing the speed with which genetic analyses may be performed. PMID:28706963

  6. MEASUREMENT OF INDOOR AIR EMISSIONS FROM DRY-PROCESS PHOTOCOPY MACHINES

    EPA Science Inventory

    The article provides background information on indoor air emissions from office equipment, with emphasis on dry-process photocopy machines. The test method is described in detail along with results of a study to evaluate the test method using four dry-process photocopy machines. ...

  7. Joint detection and tracking of size-varying infrared targets based on block-wise sparse decomposition

    NASA Astrophysics Data System (ADS)

    Li, Miao; Lin, Zaiping; Long, Yunli; An, Wei; Zhou, Yiyu

    2016-05-01

    The high variability of target size makes small target detection in Infrared Search and Track (IRST) a challenging task. A joint detection and tracking method based on block-wise sparse decomposition is proposed to address this problem. For detection, the infrared image is divided into overlapping blocks, and each block is weighted by the local image complexity and target existence probabilities. Target-background decomposition is solved by block-wise inexact augmented Lagrange multipliers. For tracking, a labeled multi-Bernoulli (LMB) tracker tracks multiple targets, taking the result of single-frame detection as input and providing corresponding target existence probabilities back to the detection stage. Unlike fixed-size methods, the proposed method can accommodate size-varying targets, since no special assumption is made about the size and shape of small targets. Because of the exact decomposition, classical target measurements are extended and additional direction information is provided to improve tracking performance. The experimental results show that the proposed method can effectively suppress background clutter and detect and track size-varying targets in infrared images.

  8. Hawking radiation as tunneling from squashed Kaluza-Klein black hole

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsuno, Ken; Umetsu, Koichiro

    2011-03-15

    We discuss Hawking radiation from a five-dimensional squashed Kaluza-Klein black hole on the basis of the tunneling mechanism. A simple method, which was recently suggested by Umetsu, may be used to extend the original derivation by Parikh and Wilczek to various black holes. That is, we use the two-dimensional effective metric, which is obtained by the dimensional reduction near the horizon, as the background metric. Using the same method, we derive both the desired result of the Hawking temperature and the effect of the backreaction associated with the radiation in the squashed Kaluza-Klein black hole background.

  9. Radiative improvement of the lattice nonrelativistic QCD action using the background field method and application to the hyperfine splitting of quarkonium states.

    PubMed

    Hammant, T C; Hart, A G; von Hippel, G M; Horgan, R R; Monahan, C J

    2011-09-09

    We present the first application of the background field method to nonrelativistic QCD (NRQCD) on the lattice in order to determine the one-loop radiative corrections to the coefficients of the NRQCD action in a manifestly gauge-covariant manner. The coefficients of the σ·B term in the NRQCD action and the four-fermion spin-spin interaction are computed at the one-loop level; the resulting shift of the hyperfine splitting of bottomonium is found to bring the lattice predictions in line with experiment.

  10. Development of criteria used to establish a background environmental monitoring station

    DOE PAGES

    Fritz, Brad G.; Barnett, J. Matthew; Snyder, Sandra F.; ...

    2015-03-02

    It is generally considered necessary to measure concentrations of contaminants-of-concern at a background location when conducting atmospheric environmental surveillance, because measurements of background concentrations can enhance interpretation of environmental monitoring data. Despite the recognized need for background measurements, there is little published guidance describing how to identify an appropriate atmospheric background monitoring location. This paper develops generic criteria that can guide the decision making process for identifying suitable locations for a background atmospheric monitoring station. Detailed methods for evaluating some of these criteria are also provided, and a case study for establishment of an atmospheric background surveillance station as part of an environmental surveillance program is described. While the case study focuses on monitoring for radionuclides, the approach is equally valid for any airborne constituent being monitored. The case study shows that implementation of the developed criteria can result in a good, defensible choice for a background atmospheric monitoring location.

  11. Study on negative incident photon-to-electron conversion efficiency of quantum dot-sensitized solar cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Chunhui; Wu, Huijue; Zhu, Lifeng

    2014-02-15

    Recently, negative signals have frequently been observed when measuring the monochromatic incident photon-to-electron conversion efficiency (IPCE) of sensitized solar cells by the DC method. This phenomenon is confusing and hinders the reasonable evaluation of solar cells. Here, the cause of negative IPCE values is studied by taking a quantum dot-sensitized solar cell (QDSC) as an example, and an accurate measurement method that avoids the negative values is suggested. The negative background signals of the QDSC without illumination are found to be the direct cause of the negative IPCE values obtained by the DC method. Ambient noise, significant capacitance characteristics, and uncontrolled electrochemical reactions can all lead to negative background signals. When the photocurrent response of the device under monochromatic illumination is relatively weak, the actual photocurrent signals are covered by the negative background signals and the resulting IPCE values appear negative. To improve the signal-to-noise ratio, a quasi-AC method is proposed for IPCE measurement of solar cells with weak photocurrent response, based on the idea of replacing absolute values with relative values.

  12. People counting in classroom based on video surveillance

    NASA Astrophysics Data System (ADS)

    Zhang, Quanbin; Huang, Xiang; Su, Juan

    2014-11-01

    Currently, the lights and other electronic devices in a classroom rely mainly on manual control; as a result, many lights stay on while no one, or only a few people, are in the classroom. It is important to change this situation and control the electronic devices intelligently according to the number and distribution of students in the classroom, so as to reduce the considerable waste of electrical resources. This paper studies the problem of people counting in a classroom based on video surveillance. Because the camera in the classroom cannot capture full body contours or clear facial features, most classical algorithms, such as pedestrian detection based on HOG (histograms of oriented gradients) features and face detection based on machine learning, fail to obtain satisfactory results. A new dual background-updating model based on sparse and low-rank matrix decomposition is proposed in this paper, exploiting the fact that most students in a classroom are almost stationary, with only occasional body movement. First, frame differencing is combined with the sparse and low-rank matrix decomposition to predict the moving areas, and the background model is updated with different parameters according to the positional relationship between the pixels of the current video frame and the predicted motion regions. Second, the regions of moving objects are determined from the updated background using background subtraction. Finally, operations including binarization, median filtering, morphological processing, and connected-component detection are performed on the regions acquired by background subtraction, in order to reduce the effects of noise and obtain the number of people in the classroom. The experimental results show the validity of the people-counting algorithm.
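    The pipeline described above (background modeling, subtraction, binarization, and connected-component counting) can be sketched minimally as follows. This is a simplified illustration with a plain running-average background in place of the paper's sparse/low-rank dual model; the parameters `alpha`, `thresh`, and `min_area` are illustrative choices, not values from the paper.

```python
import numpy as np

def count_people(frames, alpha=0.05, thresh=30, min_area=50):
    """Count foreground blobs via running-average background subtraction.
    A simplified sketch of the pipeline described above (no sparse/low-rank
    step); alpha, thresh and min_area are illustrative parameters."""
    bg = frames[0].astype(float)
    for f in frames[1:]:
        bg = (1 - alpha) * bg + alpha * f            # slow background update
    fg = np.abs(frames[-1].astype(float) - bg) > thresh  # binarization

    # connected-component counting via flood fill (4-connectivity)
    seen = np.zeros_like(fg, dtype=bool)
    count = 0
    for i in range(fg.shape[0]):
        for j in range(fg.shape[1]):
            if fg[i, j] and not seen[i, j]:
                stack, area = [(i, j)], 0
                seen[i, j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < fg.shape[0] and 0 <= nx < fg.shape[1]
                                and fg[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if area >= min_area:                 # ignore small noise blobs
                    count += 1
    return count
```

    In the full method, the median filtering and morphological steps would be applied to `fg` before labeling; they are omitted here for brevity.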

  13. Randomized subspace-based robust principal component analysis for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Weiwei; Yang, Gang; Li, Jialin; Zhang, Dianfa

    2018-01-01

    A randomized subspace-based robust principal component analysis (RSRPCA) method for anomaly detection in hyperspectral imagery (HSI) is proposed. RSRPCA combines the advantages of randomized column subspaces and robust principal component analysis (RPCA). It assumes that the background has low-rank properties and that the anomalies are sparse and do not lie in the column subspace of the background. First, RSRPCA implements random sampling to sketch the original HSI dataset from columns and to construct a randomized column subspace of the background. Structured random projections are also adopted to sketch the HSI dataset from rows. Sketching from columns and rows greatly reduces the computational requirements of RSRPCA. Second, RSRPCA adopts columnwise RPCA (CWRPCA) to eliminate the negative effects of sampled anomaly pixels, purifying the randomized column subspace by removing sampled anomaly columns. The CWRPCA decomposes the submatrix of the HSI data into a low-rank matrix (the background component), a noise matrix (the noise component), and a sparse anomaly matrix (the anomaly component) with only a small proportion of nonzero columns. The inexact augmented Lagrange multiplier algorithm is used to optimize the CWRPCA problem and estimate the sparse matrix. Nonzero columns of the sparse anomaly matrix point to sampled anomaly columns in the submatrix. Third, all pixels are projected onto the complementary subspace of the purified randomized column subspace of the background, and the anomaly pixels in the original HSI data are finally located exactly. Several experiments on three real hyperspectral images are carefully designed to investigate the detection performance of RSRPCA, and the results are compared with four state-of-the-art methods. Experimental results show that the proposed RSRPCA outperforms the four comparison methods in both detection performance and computational time.
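    The RPCA decomposition via the inexact augmented Lagrange multiplier scheme referenced above can be sketched as follows. This is generic RPCA (low-rank plus sparse), not the columnwise variant of the paper; the default `lam` and `mu` follow common heuristics and the penalty growth factor is an illustrative choice.

```python
import numpy as np

def rpca_ialm(M, lam=None, mu=None, n_iter=100):
    """Decompose M into low-rank L plus sparse S by alternating
    singular-value thresholding (for L) and soft thresholding (for S),
    with a Lagrange multiplier Y enforcing M = L + S."""
    m, n = M.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = mu or 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    shrink = lambda X, t: np.sign(X) * np.maximum(np.abs(X) - t, 0.0)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    for _ in range(n_iter):
        # singular-value thresholding for the low-rank update
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = (U * shrink(sig, 1.0 / mu)) @ Vt
        # soft thresholding for the sparse update
        S = shrink(M - L + Y / mu, lam / mu)
        Y = Y + mu * (M - L - S)
        mu *= 1.2  # gradually tighten the penalty
    return L, S
```

    In the columnwise variant, the elementwise soft threshold on S would be replaced by a column-norm threshold so that whole anomaly columns are selected.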

  14. Wind profiling for a coherent wind Doppler lidar by an auto-adaptive background subtraction approach.

    PubMed

    Wu, Yanwei; Guo, Pan; Chen, Siying; Chen, He; Zhang, Yinchao

    2017-04-01

    Auto-adaptive background subtraction (AABS) is proposed as a denoising method for data processing of coherent Doppler lidar (CDL). The method targets the low-signal-to-noise-ratio regime, in which drifting of the power spectral density of CDL data occurs. Unlike the periodogram maximum (PM) and adaptive iteratively reweighted penalized least squares (airPLS) methods, the proposed method presents reliable peaks and is thus advantageous in identifying peak locations. According to the analysis of simulated and measured data, the proposed method outperforms the airPLS method and the PM algorithm in the furthest detectable range, improving the detection range by approximately 16.7% and 40%, respectively. It also yields smaller mean wind velocity and standard error values than the airPLS and PM methods. The AABS approach improves the quality of Doppler shift estimates and can be applied to obtain whole wind profiles with CDL.

  15. Introducing 3D U-statistic method for separating anomaly from background in exploration geochemical data with associated software development

    NASA Astrophysics Data System (ADS)

    Ghannadpour, Seyyed Saeed; Hezarkhani, Ardeshir

    2016-03-01

    The U-statistic method is one of the most important structural methods for separating an anomaly from the background. It considers the locations of samples and carries out a statistical analysis of the data without judging from a geochemical point of view, attempting to separate subpopulations and determine anomalous areas. In the present study, to use the U-statistic method in three dimensions (3D), the U-statistic is applied to the grades of two ideal test examples while also considering the sample Z values (elevation). This is the first time the method has been applied in a 3D setting. To evaluate the performance of the 3D U-statistic method and to compare it with a non-structural method, threshold assessment based on the median and standard deviation (the MSD method) is applied to the same two test examples. Results show that the samples indicated as anomalous by the U-statistic method are more regular and less dispersed than those indicated by the MSD method, so that, according to the locations of the anomalous samples, denser areas can be identified as promising zones. Moreover, at a threshold of U = 0, the total misclassification error of the U-statistic method is much smaller than that of the x̄ + n×s criterion. Finally, 3D models of the two test examples, separating anomaly from background using the 3D U-statistic method, are provided. The source code of a software program, developed in the MATLAB programming language to perform the calculations of the 3D U-spatial statistic method, is additionally provided. This software is compatible with all geochemical varieties and can be used in similar exploration projects.
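    The non-structural baseline mentioned above, thresholding on the median plus a multiple of the standard deviation, is simple to state in code. This is a minimal sketch of that MSD-style criterion only (the structural U-statistic itself also needs sample coordinates); the multiplier `n` is an illustrative choice, not a value from the paper.

```python
import numpy as np

def msd_anomalies(grades, n=2.0):
    """Flag samples whose grade exceeds median + n*std, a non-structural
    threshold in the spirit of the MSD method / x-bar + n*s criterion.
    Sample locations are ignored, unlike in the U-statistic method."""
    g = np.asarray(grades, dtype=float)
    threshold = np.median(g) + n * g.std()
    return g > threshold
```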

  16. GafChromic EBT film dosimetry with flatbed CCD scanner: a novel background correction method and full dose uncertainty analysis.

    PubMed

    Saur, Sigrun; Frengen, Jomar

    2008-07-01

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge-coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform; when not corrected for, this results in a systematic error in the measured dose that is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix built from scans of films irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film depends on its orientation with respect to the scanner, correction matrices for both landscape-oriented and portrait-oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response between film sheets, the nonuniformity remaining after background correction, and the noise in the scanned films. The film analysis was performed on film pieces of size 16 x 16 cm, all with the same lot number, and all irradiations were done perpendicular to the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner, and a dedicated background correction technique gives precise and accurate results.
For the purpose of dosimetric verification, the calculated dose distribution can be compared with the film-measured dose distribution using a dose constraint of 4% (relative to the measured dose) for doses between 1 and 3 Gy. At lower doses, the dose constraint must be relaxed.
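    A dose- and position-dependent correction of the kind described above can be sketched as follows. The correction stack, dose levels, and per-pixel linear interpolation are hypothetical illustrations of the idea (the paper measures nine levels between 0.08 and 2.93 Gy), not the paper's actual procedure.

```python
import numpy as np

def apply_correction(raw, corr_stack, dose_levels):
    """Subtract a position- and dose-dependent scanner nonuniformity.
    corr_stack[k] is a correction map measured at dose_levels[k]; the
    correction at each pixel is linearly interpolated to that pixel's
    apparent dose before subtraction."""
    raw = np.asarray(raw, dtype=float)
    out = np.empty_like(raw)
    for idx in np.ndindex(raw.shape):
        profile = corr_stack[(slice(None),) + idx]  # correction vs dose here
        out[idx] = raw[idx] - np.interp(raw[idx], dose_levels, profile)
    return out
```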

  17. Integrative relational machine-learning for understanding drug side-effect profiles

    PubMed Central

    2013-01-01

    Background Drug side effects are a common reason for stopping drug development during clinical trials. Improving our ability to understand drug side effects is necessary to reduce attrition rates during drug development, as well as the risk of discovering novel side effects of available drugs. Today, most investigations deal with isolated side effects and overlook possible redundancy and their frequent co-occurrence. Results In this work, drug annotations are collected from the SIDER and DrugBank databases. Terms describing individual side effects reported in SIDER are clustered with a semantic similarity measure into term clusters (TCs). Maximal frequent itemsets are extracted from the resulting drug × TC binary table, leading to the identification of what we call side-effect profiles (SEPs). A SEP is defined as the longest combination of TCs shared by a significant number of drugs. Frequent SEPs are explored on the basis of integrated drug and target descriptors using two machine-learning methods: decision trees and inductive logic programming. Although both methods yield explicit models, the inductive logic programming method performs relational learning and is able to exploit not only drug properties but also background knowledge. Learning efficiency is evaluated by cross-validation and by direct testing with new molecules. Comparison of the two machine-learning methods shows that the inductive logic programming method displays greater sensitivity than decision trees and successfully exploits background knowledge such as functional annotations and pathways of drug targets, thereby producing rich and expressive rules. All models and theories are available on a dedicated web site. Conclusions Side-effect profiles covering a significant number of drugs have been extracted from a drug × side-effect association table. Integration of background knowledge concerning both chemical and biological spaces has been combined with a relational learning method to discover rules that explicitly characterize drug-SEP associations. These rules are successfully used to predict SEPs associated with new drugs. PMID:23802887

  18. Optical Flow for Flight and Wind Tunnel Background Oriented Schlieren Imaging

    NASA Technical Reports Server (NTRS)

    Smith, Nathanial T.; Heineck, James T.; Schairer, Edward T.

    2017-01-01

    Background-oriented Schlieren images have historically been generated by calculating the observed pixel displacement between a wind-on and wind-off image pair using normalized cross-correlation. This work uses optical flow to solve for the displacement fields that generate the Schlieren images. A well-established method in the computer vision community, optical flow is the apparent motion in an image sequence due to brightness changes. The regularization method of Horn and Schunck is used to create Schlieren images from two data sets: a supersonic jet plume shock interaction from the NASA Ames Unitary Plan Wind Tunnel, and a transonic flight test of a T-38 aircraft using a naturally occurring background, performed in conjunction with NASA Ames and Armstrong Research Centers. Results are presented and contrasted with those using normalized cross-correlation; the optical flow Schlieren images are found to provide significantly more detail. We apply the method to historical data sets to demonstrate the broad applicability and limitations of the technique.
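    The Horn and Schunck regularization mentioned above iterates toward a displacement field that satisfies the brightness-constancy constraint while staying smooth. A minimal sketch is given below; the smoothness weight `alpha` and iteration count are illustrative, not values from the paper.

```python
import numpy as np

def horn_schunck(im1, im2, alpha=1.0, n_iter=200):
    """Minimal Horn-Schunck optical flow: iteratively refine (u, v)
    toward the solution of Ix*u + Iy*v + It = 0 with a smoothness
    penalty weighted by alpha."""
    im1, im2 = im1.astype(float), im2.astype(float)
    Ix = 0.5 * (np.gradient(im1, axis=1) + np.gradient(im2, axis=1))
    Iy = 0.5 * (np.gradient(im1, axis=0) + np.gradient(im2, axis=0))
    It = im2 - im1
    u = np.zeros_like(im1)
    v = np.zeros_like(im1)
    # 4-neighbor average used by the classic update
    nbr = lambda f: 0.25 * (np.roll(f, 1, 0) + np.roll(f, -1, 0)
                            + np.roll(f, 1, 1) + np.roll(f, -1, 1))
    for _ in range(n_iter):
        ubar, vbar = nbr(u), nbr(v)
        t = (Ix * ubar + Iy * vbar + It) / (alpha**2 + Ix**2 + Iy**2)
        u = ubar - Ix * t
        v = vbar - Iy * t
    return u, v
```

    For Schlieren imaging, `im1`/`im2` play the role of the wind-off/wind-on background images, and the magnitude of (u, v) visualizes the density gradients.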

  19. Closed strings and moduli in AdS3/CFT2

    NASA Astrophysics Data System (ADS)

    Sax, Olof Ohlsson; Stefański, Bogdan

    2018-05-01

    String theory on AdS3 × S3 × T4 has 20 moduli. We investigate how the perturbative closed string spectrum changes as we move around this moduli space in both the RR and NSNS flux backgrounds. We find that, at weak string coupling, only four of the moduli affect the energies. In the RR background the only effect of these moduli is to change the radius of curvature of the background. In the NSNS background, on the other hand, the moduli introduce worldsheet interactions that enable the use of integrability methods to solve the spectral problem. Our results show that the worldsheet theory is integrable across the 20-dimensional moduli space.

  20. The Enterococcus QPCR Method for Recreational Water Quality Testing: Testing Background, Performance and Issues

    EPA Science Inventory

    Currently accepted culture-based monitoring methods for fecal indicator bacteria in surface waters take at least 24 hr to determine if unacceptable levels of fecal pollution have reached our recreational beaches. During this waiting period changing water conditions may result eit...

  1. Kinetic Study of Adsorption Processes in Solution: An Undergraduate Physical Chemistry Experiment.

    ERIC Educational Resources Information Center

    Casado, Julio; And Others

    1985-01-01

    Background information, apparatus needed, procedures used, and results obtained are provided for a simple kinetic method for the monitoring of adsorption processes. The method, which involved adsorption of crystal violet onto activated carbon, is suitable for classroom and/or research purposes. (JN)

  2. Infrared small target detection technology based on OpenCV

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Huang, Zhijian

    2013-05-01

    Accurate and fast detection of dim infrared (IR) targets is very important for infrared precision guidance, early warning, video surveillance, etc. In this paper, the basic principles and implementation flow charts of a series of target detection algorithms are described: the traditional two-frame difference method, an improved three-frame difference method, a fusion of background estimation and frame differencing, and background construction with a neighborhood-mean method. Building on this work, an infrared target detection software platform developed with OpenCV and MFC is introduced; three tracking algorithms are integrated in this software, and its framework and functions are described. Finally, experiments are performed on real IR images. The implementation process and results of each algorithm are analyzed, and the detection algorithms are evaluated both subjectively and objectively. The results show that the proposed method has satisfactory detection effectiveness and robustness, as well as high detection efficiency, and can be used for real-time detection.
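    The improved three-frame difference method listed above can be sketched in a few lines: a pixel is declared moving only if it differs from both the previous and the next frame, which suppresses the "ghost" region left behind by simple two-frame differencing. This sketch uses NumPy rather than the OpenCV/MFC platform of the paper, and `thresh` is an illustrative value.

```python
import numpy as np

def three_frame_difference(f1, f2, f3, thresh=25):
    """Binary motion mask for the middle frame f2: moving pixels must
    differ from both the previous frame f1 and the next frame f3."""
    f1, f2, f3 = (f.astype(float) for f in (f1, f2, f3))
    d12 = np.abs(f2 - f1) > thresh
    d23 = np.abs(f3 - f2) > thresh
    return d12 & d23
```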

  3. Infrared small target detection technology based on OpenCV

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Huang, Zhijian

    2013-09-01

    Accurate and fast detection of dim infrared (IR) targets is very important for infrared precision guidance, early warning, video surveillance, etc. In this paper, the basic principles and implementation flow charts of a series of target detection algorithms are described: the traditional two-frame difference method, an improved three-frame difference method, a fusion of background estimation and frame differencing, and background construction with a neighborhood-mean method. Building on this work, an infrared target detection software platform developed with OpenCV and MFC is introduced; three tracking algorithms are integrated in this software, and its framework and functions are described. Finally, experiments are performed on real IR images. The implementation process and results of each algorithm are analyzed, and the detection algorithms are evaluated both subjectively and objectively. The results show that the proposed method has satisfactory detection effectiveness and robustness, as well as high detection efficiency, and can be used for real-time detection.

  4. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 2: Sensitivity tests and results

    PubMed Central

    Norris, Peter M.; da Silva, Arlindo M.

    2018-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational–Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. 
It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state. PMID:29618848

  5. Monte Carlo Bayesian Inference on a Statistical Model of Sub-Gridcolumn Moisture Variability Using High-Resolution Cloud Observations. Part 2: Sensitivity Tests and Results

    NASA Technical Reports Server (NTRS)

    Norris, Peter M.; da Silva, Arlindo M.

    2016-01-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. 
It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state.

  6. Monte Carlo Bayesian inference on a statistical model of sub-gridcolumn moisture variability using high-resolution cloud observations. Part 2: Sensitivity tests and results.

    PubMed

    Norris, Peter M; da Silva, Arlindo M

    2016-07-01

    Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. 
It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state.

  7. Quantitative analysis of trace levels of surface contamination by X-ray photoelectron spectroscopy Part I: statistical uncertainty near the detection limit.

    PubMed

    Hill, Shannon B; Faradzhev, Nadir S; Powell, Cedric J

    2017-12-01

    We discuss the problem of quantifying common sources of statistical uncertainty in analyses of trace levels of surface contamination using X-ray photoelectron spectroscopy. We examine the propagation of error for peak-area measurements using common forms of linear and polynomial background subtraction, including the correlation of points used to determine both the background and peak areas. This correlation has been neglected in previous analyses, but we show that it contributes significantly to the peak-area uncertainty near the detection limit. We introduce the concept of relative background subtraction variance (RBSV), which quantifies the uncertainty introduced by the method of background determination relative to the uncertainty of the background area itself. The uncertainties of the peak area, atomic concentration, and detection limit are expressed using the RBSV, which separates the contributions from the acquisition parameters, the background-determination method, and the properties of the measured spectrum. These results are then combined to find acquisition strategies that minimize the total measurement time needed to achieve a desired detection limit or atomic-percentage uncertainty for a particular trace element. Minimization of data-acquisition time is important for samples that are sensitive to X-ray dose and also for laboratories that need to optimize throughput.
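    The basic error-propagation problem for a peak area with a subtracted background can be sketched as follows, assuming Poisson counting statistics. This is a simplified flat-background estimate from side channels, not the paper's RBSV formalism: here the background channels feed the peak area only through the background mean, so the correlation discussed in the paper reduces to the variance of that mean scaled by the number of peak channels. Channel slices and counts in the example are hypothetical.

```python
import numpy as np

def peak_area_with_uncertainty(spectrum, peak_sl, bg_left_sl, bg_right_sl):
    """Peak area after subtracting the mean of side-channel background,
    with Poisson error propagation (var(count) = count)."""
    spectrum = np.asarray(spectrum, dtype=float)
    peak = spectrum[peak_sl]
    bg = np.concatenate([spectrum[bg_left_sl], spectrum[bg_right_sl]])
    n_p, n_b = peak.size, bg.size
    area = peak.sum() - n_p * bg.mean()
    # var(bg_mean) = sum(bg) / n_b**2 under Poisson statistics
    var_area = peak.sum() + n_p**2 * bg.sum() / n_b**2
    return area, np.sqrt(var_area)
```

    Near the detection limit the second term dominates, which is why the choice of background channels and fitting method matters so much there.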

  8. Optimization of Experimental Conditions of the Pulsed Current GTAW Parameters for Mechanical Properties of SDSS UNS S32760 Welds Based on the Taguchi Design Method

    NASA Astrophysics Data System (ADS)

    Yousefieh, M.; Shamanian, M.; Saatchi, A.

    2012-09-01

    The Taguchi design method with an L9 orthogonal array was implemented to optimize the pulsed-current gas tungsten arc welding parameters for the hardness and toughness of super duplex stainless steel (SDSS, UNS S32760) welds. The hardness and toughness were taken as the performance characteristics, and pulse current, background current, % on time, and pulse frequency were chosen as the main parameters, each varied at three levels. Pooled analysis of variance shows that the pulse current is the most significant factor for both the hardness and the toughness of SDSS welds, with percentage contributions of 71.81% for hardness and 78.18% for toughness. The % on time (21.99%) and the background current (17.81%) had the next most significant effects on the hardness and the toughness, respectively. The optimum conditions within the selected parameter values for hardness were found to be the first level of pulse current (100 A), the third level of background current (70 A), the first level of % on time (40%), and the first level of pulse frequency (1 Hz), while for toughness they were the second level of pulse current (120 A), the second level of background current (60 A), the second level of % on time (60%), and the third level of pulse frequency (5 Hz). The Taguchi method was found to be a promising tool for obtaining optimum conditions in such studies. Finally, to verify the experimental results, confirmation tests were carried out at the optimum working conditions; under these conditions, there was good agreement between the predicted and experimental results for both hardness and toughness.
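    The level-selection step of a Taguchi L9 study (four factors at three levels, nine runs) can be sketched as a main-effects analysis: average the response over the runs at each level of each factor and pick the best level. This is a generic sketch using the standard L9(3^4) array, not the paper's pooled ANOVA, and the example responses are synthetic.

```python
import numpy as np

# Standard L9(3^4) orthogonal array: 9 runs, 4 factors, levels 0/1/2.
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])

def best_levels(responses, larger_is_better=True):
    """Main-effects analysis: for each factor, average the 9 measured
    responses over the runs at each level and pick the best level."""
    responses = np.asarray(responses, dtype=float)
    picks = []
    for factor in range(L9.shape[1]):
        means = [responses[L9[:, factor] == lv].mean() for lv in range(3)]
        picks.append(int(np.argmax(means) if larger_is_better
                         else np.argmin(means)))
    return picks
```

    Because the array is orthogonal, each level of each factor is paired equally often with every level of the others, so the level means isolate each factor's main effect.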

  9. Solving a Higgs optimization problem with quantum annealing for machine learning.

    PubMed

    Mott, Alex; Job, Joshua; Vlimant, Jean-Roch; Lidar, Daniel; Spiropulu, Maria

    2017-10-18

    The discovery of Higgs-boson decays in a background of standard-model processes was assisted by machine learning methods. The classifiers used to separate signals such as these from background are trained using accurate but not completely perfect simulations of the physical processes involved, often resulting in incorrect labelling of background processes or signals (label noise) and systematic errors. Here we use quantum and classical annealing (probabilistic techniques for approximating the global maximum or minimum of a given function) to solve a Higgs-signal-versus-background machine learning optimization problem, mapped to a problem of finding the ground state of a corresponding Ising spin model. We build a set of weak classifiers based on the kinematic observables of the Higgs decay photons, which we then use to construct a strong classifier. This strong classifier is highly resilient against overtraining and against errors in the correlations of the physical observables in the training data. We show that the resulting quantum and classical annealing-based classifier systems perform comparably to the state-of-the-art machine learning methods that are currently used in particle physics. However, in contrast to these methods, the annealing-based classifiers are simple functions of directly interpretable experimental parameters with clear physical meaning. The annealer-trained classifiers use the excited states in the vicinity of the ground state and demonstrate some advantage over traditional machine learning methods for small training datasets. Given the relative simplicity of the algorithm and its robustness to error, this technique may find application in other areas of experimental particle physics, such as real-time decision making in event-selection problems and classification in neutrino physics.
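    The construction of a strong classifier from binary weak classifiers, cast as a ground-state search, can be sketched as a small QUBO solved here by brute force in place of an annealer. The cost function (squared voting error plus a sparsity penalty), the weak classifiers, and `lam` are illustrative, not the paper's exact Ising mapping.

```python
import itertools
import numpy as np

def train_strong_classifier(weak_outputs, labels, lam=0.1):
    """Select a subset of weak classifiers (binary variable s_i) whose
    summed vote best matches the labels, by exhaustively minimizing
    squared voting error + lam * (number of classifiers used).
    Exhaustive search stands in for the annealer and only scales to a
    handful of weak classifiers."""
    n = weak_outputs.shape[0]        # weak_outputs: (n_weak, n_events), +/-1
    best_s, best_cost = None, np.inf
    for bits in itertools.product([0, 1], repeat=n):
        s = np.array(bits)
        vote = s @ weak_outputs      # combined vote per event
        cost = np.sum((vote - labels) ** 2) + lam * s.sum()
        if cost < best_cost:
            best_s, best_cost = s, cost
    return best_s

def predict(s, weak_outputs):
    """Strong classifier: sign of the selected classifiers' vote."""
    return np.sign(s @ weak_outputs)
```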

  10. Solving a Higgs optimization problem with quantum annealing for machine learning

    NASA Astrophysics Data System (ADS)

    Mott, Alex; Job, Joshua; Vlimant, Jean-Roch; Lidar, Daniel; Spiropulu, Maria

    2017-10-01

The discovery of Higgs-boson decays in a background of standard-model processes was assisted by machine learning methods. The classifiers used to separate signals such as these from background are trained using detailed but not completely accurate simulations of the physical processes involved, often resulting in incorrect labelling of background processes or signals (label noise) and systematic errors. Here we use quantum and classical annealing (probabilistic techniques for approximating the global maximum or minimum of a given function) to solve a Higgs-signal-versus-background machine learning optimization problem, mapped to a problem of finding the ground state of a corresponding Ising spin model. We build a set of weak classifiers based on the kinematic observables of the Higgs decay photons, which we then use to construct a strong classifier. This strong classifier is highly resilient against overtraining and against errors in the correlations of the physical observables in the training data. We show that the resulting quantum and classical annealing-based classifier systems perform comparably to the state-of-the-art machine learning methods that are currently used in particle physics. However, in contrast to these methods, the annealing-based classifiers are simple functions of directly interpretable experimental parameters with clear physical meaning. The annealer-trained classifiers use the excited states in the vicinity of the ground state and demonstrate some advantage over traditional machine learning methods for small training datasets. Given the relative simplicity of the algorithm and its robustness to error, this technique may find application in other areas of experimental particle physics, such as real-time decision making in event-selection problems and classification in neutrino physics.

  11. Means and method of detection in chemical separation procedures

    DOEpatents

    Yeung, Edward S.; Koutny, Lance B.; Hogan, Barry L.; Cheung, Chan K.; Ma, Yinfa

    1993-03-09

A means and method for indirect detection of constituent components of a mixture separated in a chemical separation process. Fluorescing ions are distributed across the area in which separation of the mixture will occur to provide a generally uniform background fluorescence intensity. For example, the mixture may comprise one or more charged analytes that displace fluorescing ions at the locations to which its constituent components separate. Fluorescing ions of the same charge as the charged analyte components undergo this displacement. The displacement results in the locations of the separated components having a reduced fluorescence intensity relative to the remainder of the background. Detection of the lower fluorescence intensity areas can be performed visually, by photographic means, or by automated laser scanning.

  12. Means and method of detection in chemical separation procedures

    DOEpatents

    Yeung, E.S.; Koutny, L.B.; Hogan, B.L.; Cheung, C.K.; Yinfa Ma.

    1993-03-09

A means and method are described for indirect detection of constituent components of a mixture separated in a chemical separation process. Fluorescing ions are distributed across the area in which separation of the mixture will occur to provide a generally uniform background fluorescence intensity. For example, the mixture may comprise one or more charged analytes that displace fluorescing ions at the locations to which its constituent components separate. Fluorescing ions of the same charge as the charged analyte components undergo this displacement. The displacement results in the locations of the separated components having a reduced fluorescence intensity relative to the remainder of the background. Detection of the lower fluorescence intensity areas can be performed visually, by photographic means, or by automated laser scanning.

  13. A method for measuring coherent elastic neutrino-nucleus scattering at a far off-axis high-energy neutrino beam target

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brice, S. J.; Cooper, R. L.; DeJongh, F.

    2014-04-03

We present an experimental method for measuring the process of coherent elastic neutrino-nucleus scattering (CENNS). This method uses a detector situated transverse to a high-energy neutrino beam production target. This detector would be sensitive to the low-energy neutrinos arising from decay-at-rest pions in the target. We discuss the physics motivation for making this measurement and outline the predicted backgrounds and sensitivities using this approach. We report a measurement of neutron backgrounds as found in an off-axis surface location of the Fermilab Booster Neutrino Beam (BNB) target. The results indicate that the Fermilab BNB target is a favorable location for a CENNS experiment.

  14. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. The rapid progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would be beneficial to most of the calcium imaging research field. A background-subtracted fluorescence transient estimation method that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Adaptive removal of background and white space from document images using seam categorization

    NASA Astrophysics Data System (ADS)

    Fillion, Claude; Fan, Zhigang; Monga, Vishal

    2011-03-01

    Document images are obtained regularly by rasterization of document content and as scans of printed documents. Resizing via background and white space removal is often desired for better consumption of these images, whether on displays or in print. While white space and background are easy to identify in images, existing methods such as naïve removal and content aware resizing (seam carving) each have limitations that can lead to undesirable artifacts, such as uneven spacing between lines of text or poor arrangement of content. An adaptive method based on image content is hence needed. In this paper we propose an adaptive method to intelligently remove white space and background content from document images. Document images are different from pictorial images in structure. They typically contain objects (text letters, pictures and graphics) separated by uniform background, which include both white paper space and other uniform color background. Pixels in uniform background regions are excellent candidates for deletion if resizing is required, as they introduce less change in document content and style, compared with deletion of object pixels. We propose a background deletion method that exploits both local and global context. The method aims to retain the document structural information and image quality.
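A naive version of the background-row deletion described above can be sketched as follows: rows whose pixel variance is near zero are uniform background, and runs of such rows are shortened while keeping one row per run so spacing between content blocks survives. This ignores the paper's seam categorization and global context entirely:

```python
import numpy as np

def remove_uniform_rows(img, keep=1, tol=1e-6):
    """Delete runs of uniform-background rows, keeping `keep` rows per run
    so spacing between content blocks is preserved (a naive sketch of
    white-space removal; not the paper's adaptive seam-based method)."""
    uniform = img.var(axis=1) <= tol          # rows with ~no content
    keep_mask = np.ones(len(img), dtype=bool)
    run = 0
    for i, u in enumerate(uniform):
        run = run + 1 if u else 0
        if run > keep:
            keep_mask[i] = False
    return img[keep_mask]

# Toy document: two "text" rows separated by four blank rows.
doc = np.zeros((7, 5))
doc[1] = [0, 1, 0, 1, 0]
doc[6] = [1, 0, 1, 0, 1]
out = remove_uniform_rows(doc)
```

The four-row gap collapses to one blank row while both text rows survive, which is exactly the failure-free case; the paper's method exists for the harder cases where naive removal distorts layout.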

  16. 24 CFR Appendix II to Subpart C of... - Development of Standards; Calculation Methods

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...; Calculation Methods I. Background Information Concerning the Standards (a) Thermal Radiation: (1) Introduction... and structures in the event of fire. The resulting fireball emits thermal radiation which is absorbed... radiation being emitted. The radiation can cause severe burn, injuries and even death to exposed persons...

  17. A novel spatial-temporal detection method of dim infrared moving small target

    NASA Astrophysics Data System (ADS)

    Chen, Zhong; Deng, Tao; Gao, Lei; Zhou, Heng; Luo, Song

    2014-09-01

Moving small target detection under complex background in infrared image sequences is one of the major challenges for modern military Early Warning Systems (EWS) and Long-Range Strike (LRS) applications. Because of the low SNR and undulating background, infrared moving small target detection has long been a difficult problem. To solve this problem, a novel spatial-temporal detection method based on bi-dimensional empirical mode decomposition (EMD) and time-domain differencing is proposed in this paper. The method is entirely data-driven and does not rely on any transition kernel function, so it has strong adaptive capacity. Firstly, we generalized the 1D EMD algorithm to the 2D case. In this process, the project solved several issues in 2D EMD, such as the large volume of data operations, defining and identifying extrema in the 2D case, and boundary effects in two-dimensional signals. The EMD algorithm studied in this project is well suited to the automatic detection of small targets under low SNR and complex background. Secondly, considering the characteristics of moving targets, we proposed an improved filtering method based on three-frame differencing, building on the original time-domain difference filtering, which greatly improves the anti-jamming ability of the algorithm. Finally, we proposed a new time-space fusion method that combines 2D EMD with the improved time-domain differential filtering. Experimental results show that this method works well for infrared small moving target detection under low SNR and complex background.
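The classic three-frame differencing step mentioned above can be sketched directly (the 2D EMD stage is omitted): a pixel counts as moving only if it differs from both the previous and the next frame, which suppresses the "ghost" left at a target's old position by simple two-frame differencing:

```python
import numpy as np

def three_frame_diff(f_prev, f_cur, f_next, thresh):
    """Three-frame differencing: flag pixels that differ from BOTH the
    previous and the next frame, suppressing two-frame-difference ghosts."""
    d1 = np.abs(f_cur.astype(float) - f_prev) > thresh
    d2 = np.abs(f_next.astype(float) - f_cur) > thresh
    return d1 & d2

# Toy sequence: a bright one-pixel target moves one column right per frame.
frames = np.zeros((3, 1, 5))
for k in range(3):
    frames[k, 0, k + 1] = 100.0
mask = three_frame_diff(*frames, thresh=50)
```

Only the target's current position (column 2 of the middle frame) survives; its old and new positions, which a two-frame difference would also flag, are rejected.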

  18. A novel star extraction method based on modified water flow model

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Niu, Yanxiong; Lu, Jiazhen; Ouyang, Zibiao; Yang, Yanqiang

    2017-11-01

Star extraction is an essential procedure for attitude measurement with a star sensor. The great challenge for star extraction is to segment the star area exactly from various noise and background. In this paper, a novel star extraction method based on a Modified Water Flow Model (MWFM) is proposed. The star image is regarded as a 3D terrain. Morphology is adopted for noise elimination and Tentative Star Area (TSA) selection. The star area can then be extracted through adaptive water flowing within TSAs. This method achieves accurate star extraction with improved efficiency under complex conditions such as heavy noise and uneven backgrounds. Several groups of different types of star images were processed using the proposed method, and comparisons with existing methods were conducted. Experimental results show that MWFM performs excellently under different imaging conditions: the star extraction rate is better than 95%, the star centroid accuracy is better than 0.075 pixels, and processing time is also significantly reduced.
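The sub-pixel centroid accuracy quoted above refers to the step that follows extraction: an intensity-weighted centroid over the segmented star area. A generic sketch (not the MWFM itself):

```python
import numpy as np

def star_centroid(patch):
    """Intensity-weighted centroid of a segmented star area, giving the
    sub-pixel position the 0.075-pixel accuracy figure refers to."""
    ys, xs = np.mgrid[:patch.shape[0], :patch.shape[1]]
    w = patch.sum()
    return (ys * patch).sum() / w, (xs * patch).sum() / w

# Synthetic Gaussian star centred at (3.2, 2.8) on a 7x7 patch:
ys, xs = np.mgrid[:7, :7]
star = np.exp(-((ys - 3.2) ** 2 + (xs - 2.8) ** 2) / 2.0)
cy, cx = star_centroid(star)
```

For a well-segmented, approximately symmetric star the centroid recovers the true sub-pixel position; background and noise leaking into the patch are exactly what degrades this, hence the emphasis on accurate extraction.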

  19. Moving Object Detection on a Vehicle Mounted Back-Up Camera

    PubMed Central

    Kim, Dong-Sun; Kwon, Jinsan

    2015-01-01

In the detection of moving objects from vision sources one usually assumes that the scene has been captured by stationary cameras. In case of backing up a vehicle, however, the camera mounted on the vehicle moves according to the vehicle's movement, resulting in ego-motion on the background. This produces mixed motion in the scene, and makes it difficult to distinguish between the target objects and background motions. Without further treatment of the mixed motion, traditional fixed-viewpoint object detection methods will lead to many false-positive detection results. In this paper, we suggest a procedure to be used with the traditional moving object detection methods that relaxes the stationary-camera restriction, by introducing additional steps before and after the detection. We also describe an implementation on an FPGA platform along with the algorithm. The target application of this work is a road vehicle's rear-view camera system. PMID:26712761

  20. Radioactive contamination of scintillators

    NASA Astrophysics Data System (ADS)

    Danevich, F. A.; Tretyak, V. I.

    2018-03-01

Low counting experiments (searches for double β decay and dark matter particles, measurements of neutrino fluxes from different sources, searches for hypothetical nuclear and subnuclear processes, low background α, β, γ spectrometry) require detectors with extremely low background. Scintillators are widely used to search for rare events, both as conventional scintillation detectors and as cryogenic scintillating bolometers. The radioactive contamination of a scintillation material plays a key role in reaching a low background level. The origin and nature of radioactive contamination of scintillators, together with experimental methods and results, are reviewed. A programme to develop radiopure crystal scintillators for low counting experiments is discussed briefly.

  1. Compensatable muon collider calorimeter with manageable backgrounds

    DOEpatents

    Raja, Rajendran

    2015-02-17

A method and system for reducing background noise in a particle collider comprises identifying an interaction point among a plurality of particles within a particle collider associated with a detector element, defining a trigger start time for each pixel as the time taken for light to travel from the interaction point to the pixel and a trigger stop time as a selected time after the trigger start time, and collecting only detections that occur between the trigger start time and the trigger stop time, so that the result from the particle collider can thereafter be compensated to reduce unwanted background detections.
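The timing cut described in the claim can be sketched numerically: the per-pixel start time is just the light travel time from the interaction point, and any hit outside [start, start + gate) is discarded as background. Units and function names here are hypothetical:

```python
import numpy as np

C_MM_PER_NS = 299.792458  # speed of light in mm per ns

def trigger_window(pixel_xyz, interaction_xyz, gate_ns):
    """Start time = light travel time from the interaction point to the
    pixel; stop time = start + a chosen gate width."""
    d = np.linalg.norm(np.asarray(pixel_xyz, float) - np.asarray(interaction_xyz, float))
    t_start = d / C_MM_PER_NS
    return t_start, t_start + gate_ns

def in_window(hit_time_ns, pixel_xyz, interaction_xyz, gate_ns=1.0):
    """Keep a hit only if it falls inside the pixel's trigger window."""
    t0, t1 = trigger_window(pixel_xyz, interaction_xyz, gate_ns)
    return t0 <= hit_time_ns < t1

# A pixel 3 m (3000 mm) from the vertex: prompt light arrives at ~10 ns,
# so a 10.3 ns hit is kept while a 25 ns (out-of-time background) hit is cut.
ok = in_window(10.3, (3000, 0, 0), (0, 0, 0), gate_ns=1.0)
late = in_window(25.0, (3000, 0, 0), (0, 0, 0), gate_ns=1.0)
```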

  2. An improved correlation method for determining the period of a torsion pendulum

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo Jie; Wang Dianhong

Considering variations in environmental temperature and the inhomogeneity of the background gravitational field, an improved correlation method is proposed to determine the varying period of a torsion pendulum with high precision. Processing of experimental data shows that the uncertainty in determining the period with this method is improved about twofold over the traditional correlation method, which is significant for the determination of the gravitational constant with the time-of-swing method.
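The basic correlation idea behind such period determinations can be sketched as follows: the period is read off the dominant non-zero-lag peak of the autocorrelation, which averages down uncorrelated noise. This is only the baseline technique; the paper's improvement (handling temperature drift and background-field inhomogeneity) is not reproduced here:

```python
import numpy as np

def period_by_correlation(x, dt):
    """Estimate the oscillation period from the dominant non-zero-lag
    autocorrelation peak (baseline correlation method sketch)."""
    x = x - x.mean()
    ac = np.correlate(x, x, mode='full')[x.size - 1:]
    first_neg = np.argmax(ac < 0)              # skip the zero-lag peak
    k = first_neg + np.argmax(ac[first_neg:first_neg + x.size // 2])
    return k * dt

dt = 0.01
t = np.arange(0, 20, dt)
rng = np.random.default_rng(1)
swing = np.sin(2 * np.pi * t / 2.5) + 0.2 * rng.normal(size=t.size)  # noisy pendulum angle
T = period_by_correlation(swing, dt)
```

Despite 20% noise, the recovered period is within a sample step or two of the true 2.5 s.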

  3. Developing and Evaluating a Target-Background Similarity Metric for Camouflage Detection

    PubMed Central

    Lin, Chiuhsiang Joe; Chang, Chi-Chan; Liu, Bor-Shong

    2014-01-01

Background Measurement of camouflage performance is of fundamental importance for military stealth applications. The goal of camouflage assessment algorithms is to automatically assess the effect of camouflage in agreement with human detection responses. In a previous study, we found that the Universal Image Quality Index (UIQI) correlated well with psychophysical measures, suggesting it could be a potential camouflage assessment tool. Methodology In this study, we quantify the relationship between camouflage similarity indexes and psychophysical results. We compare several image quality indexes for computational evaluation of camouflage effectiveness, and present the results of an extensive human visual experiment conducted to evaluate the performance of several camouflage assessment algorithms and analyze the strengths and weaknesses of these algorithms. Significance The experimental data demonstrate the effectiveness of the approach, and the correlation coefficient of the UIQI was higher than those of the other methods. This approach was highly correlated with the human target-searching results. It also showed that this method is an objective and effective camouflage performance evaluation method because it considers the human visual system and image structure, which makes it consistent with the subjective evaluation results. PMID:24498310
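The UIQI itself is compact enough to state in full: it multiplies a correlation term, a luminance term, and a contrast term, collapsing to 4·cov·μx·μy / ((σx²+σy²)(μx²+μy²)). The sketch below computes it globally over a patch for brevity, whereas the standard definition averages over sliding windows:

```python
import numpy as np

def uiqi(x, y):
    """Universal Image Quality Index: loss of correlation x luminance
    distortion x contrast distortion. Identical inputs score 1; computed
    globally here rather than over sliding windows."""
    x, y = x.astype(float).ravel(), y.astype(float).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

rng = np.random.default_rng(2)
target = rng.uniform(0, 255, (16, 16))
q_same = uiqi(target, target)                                  # perfect similarity
q_noisy = uiqi(target, target + rng.normal(0, 30, target.shape))
```

For camouflage assessment, a target patch scoring close to 1 against its background patch means high target-background similarity, i.e. effective camouflage.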

  4. Motion artifact and background noise suppression on optical microangiography frames using a naïve Bayes mask.

    PubMed

    Reif, Roberto; Baran, Utku; Wang, Ruikang K

    2014-07-01

Optical coherence tomography (OCT) is a technique that allows for the three-dimensional (3D) imaging of small volumes of tissue (a few millimeters) with high resolution (∼10 μm). Optical microangiography (OMAG) is a method of processing OCT data which allows for the extraction of the tissue vasculature with capillary resolution from the OCT images. Cross-sectional B-frame OMAG images present the location of the patent blood vessels; however, the signal-to-noise ratio of these images can be affected by several factors, such as the quality of the OCT system and tissue motion artifacts. This background noise can appear in the en face projection view image. In this work we develop a binary mask that can be applied to the cross-sectional B-frame OMAG images, which reduces the background noise while leaving the signal from the blood vessels intact. The mask is created by using a naïve Bayes (NB) classification algorithm trained with a gold-standard image that is manually segmented by an expert. The masked OMAG images present better contrast for binarizing the image and quantifying the result without the influence of noise. The results are compared with a previously developed frequency rejection filter (FRF) method, which is applied to the en face projection view image. It is demonstrated that both the NB and FRF methods provide similar vessel length fractions. The advantage of the NB method is that its results are applicable in 3D and that its use is not limited to periodic motion artifacts.
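A minimal per-pixel version of the naïve Bayes mask can be sketched with intensity as the only feature (the actual classifier is trained on expert-segmented OMAG frames and richer features): fit class-wise Gaussians on a labelled frame, then classify pixels of new frames as vessel or background and zero out the background:

```python
import numpy as np

class PixelNB:
    """Minimal per-pixel Gaussian naive Bayes: fit class-wise mean/variance
    of intensity on a labelled frame, then classify pixels of new frames
    as vessel (1) or background noise (0)."""
    def fit(self, x, y):
        self.stats = {c: (x[y == c].mean(), x[y == c].var(), (y == c).mean())
                      for c in (0, 1)}
        return self

    def predict(self, x):
        def log_post(c):
            m, v, p = self.stats[c]
            return -0.5 * np.log(2 * np.pi * v) - (x - m) ** 2 / (2 * v) + np.log(p)
        return (log_post(1) > log_post(0)).astype(int)

rng = np.random.default_rng(3)
truth = np.zeros(400, dtype=int)
truth[150:200] = 1                              # vessel pixels in the training frame
train = rng.normal(10, 3, 400) + 40 * truth     # background noise + vessel signal
nb = PixelNB().fit(train, truth)

new = rng.normal(10, 3, 400)
new[50:90] += 40                                # vessel in a new frame
mask = nb.predict(new)                          # binary background-rejection mask
cleaned = new * mask                            # background zeroed, vessels intact
```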

  5. Infrared maritime target detection using the high order statistic filtering in fractional Fourier domain

    NASA Astrophysics Data System (ADS)

    Zhou, Anran; Xie, Weixin; Pei, Jihong

    2018-06-01

Accurate detection of maritime targets in infrared imagery under various sea clutter conditions is always a challenging task. The fractional Fourier transform (FRFT) is the extension of the Fourier transform to fractional orders, and carries richer spatial-frequency information. By combining it with high order statistic filtering, a new ship detection method is proposed. First, the proper range of the angle parameter is determined to make it easier to separate the ship components from the background. Second, a new high order statistic curve (HOSC) at each fractional frequency point is designed. It is proved that the maximal peak interval in the HOSC reflects the target information, while the points outside the interval reflect the background, and the HOSC value for the ship is much larger than that for the sea clutter. Then the curve's maximal target peak interval is located and extracted by bandpass filtering in the fractional Fourier domain. The value outside the peak interval of the HOSC decreases rapidly to 0, so the background is effectively suppressed. Finally, the detection result is obtained by double-threshold segmentation and a target-region selection method. The results show the proposed method excels at maritime target detection under heavy clutter.
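Why a high-order statistic separates target from clutter can be shown without the FRFT machinery: a compact bright return makes the sample distribution heavy-tailed, so its excess kurtosis is large, while Gaussian-like clutter sits near zero. The sketch below illustrates just this statistic; the paper evaluates it per fractional-frequency bin, which is omitted here:

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3: ~0 for Gaussian clutter, large
    and positive for samples containing a compact bright target."""
    z = (x - x.mean()) / x.std()
    return (z ** 4).mean() - 3.0

rng = np.random.default_rng(6)
clutter = rng.normal(0, 1, 512)      # sea-clutter-like samples
target = clutter.copy()
target[250:256] += 12.0              # compact ship return added to the clutter

k_clutter = excess_kurtosis(clutter)
k_target = excess_kurtosis(target)
```

The gap between the two values is what a threshold on the HOSC exploits.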

  6. Perceptual grouping and attention: not all groupings are equal.

    PubMed

    Kimchi, Ruth; Razpurker-Apfeld, Irene

    2004-08-01

    We examined grouping under inattention using Driver, Davis, Russell, Turatto, & Freeman's (2001) method. On each trial, two successive displays were briefly presented, each comprising a central target square surrounded by elements. The task was to judge whether the two targets were the same or different. The organization of the background elements stayed the same or changed, independently of the targets. In different conditions, background elements grouped into columns/rows by color similarity, a shape (a triangle/arrow, a square/cross, or a vertical/horizontal line) by color similarity, and a shape with no other elements in the background. We measured the influence of the background on the target same-different judgments. The results imply that background elements grouped into columns/rows by color similarity and into a shape when no segregation from other elements was involved and the shape was relatively "good." In contrast, no background grouping was observed when resolving figure-ground relations for segregated units was required, as in grouping into a shape by color similarity. These results suggest that grouping is a multiplicity of processes that vary in their attentional demands. Regardless of attentional demands, the products of grouping are not available to awareness without attention.

  7. A biological hierarchical model based underwater moving object detection.

    PubMed

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

Underwater moving object detection is the key to many underwater computer vision tasks, such as object recognition, locating, and tracking. Given the superior visual sensing abilities of animals in underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are better adapted to underwater environments. However, low accuracy and the absence of prior-knowledge learning limit their adoption in underwater applications. Aiming to solve the problems originating from inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing mechanism of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. Firstly, the image is segmented into several subblocks, and intensity information is extracted to establish a background model that can roughly identify the object and background regions. The texture feature of each pixel in the rough object region is then further analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives a better performance: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results.
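The first, coarse stage of such a hierarchical model can be sketched simply: split the frame into subblocks, model the background by the robust (median) block intensity, and flag deviating blocks as rough object regions. The per-pixel texture refinement stage of the paper is omitted, and the threshold constant is an illustrative choice:

```python
import numpy as np

def rough_object_blocks(img, bs, k=8.0):
    """Stage 1 of a hierarchical background model (sketch): split the frame
    into bs x bs subblocks, model background as the median block intensity,
    and flag blocks deviating by more than k robust deviations."""
    h, w = img.shape
    means = img[:h - h % bs, :w - w % bs].reshape(h // bs, bs, w // bs, bs).mean(axis=(1, 3))
    med = np.median(means)
    mad = np.median(np.abs(means - med)) + 1e-9   # robust spread of block means
    return np.abs(means - med) > k * mad

rng = np.random.default_rng(4)
frame = rng.normal(100, 2, (32, 32))   # murky, roughly uniform background
frame[8:16, 16:24] += 50               # a bright object covering one 8x8 block
blocks = rough_object_blocks(frame, bs=8)
```

Exactly one of the sixteen blocks is flagged; the paper's second stage would then trace the object contour inside it pixel by pixel.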

  8. A Biological Hierarchical Model Based Underwater Moving Object Detection

    PubMed Central

    Shen, Jie; Fan, Tanghuai; Tang, Min; Zhang, Qian; Sun, Zhen; Huang, Fengchen

    2014-01-01

Underwater moving object detection is the key to many underwater computer vision tasks, such as object recognition, locating, and tracking. Given the superior visual sensing abilities of animals in underwater habitats, the visual mechanisms of aquatic animals are generally regarded as cues for establishing bionic models that are better adapted to underwater environments. However, low accuracy and the absence of prior-knowledge learning limit their adoption in underwater applications. Aiming to solve the problems originating from inhomogeneous illumination and unstable backgrounds, the visual information sensing and processing mechanism of the frog eye is imitated to produce a hierarchical background model for detecting underwater objects. Firstly, the image is segmented into several subblocks, and intensity information is extracted to establish a background model that can roughly identify the object and background regions. The texture feature of each pixel in the rough object region is then further analyzed to generate the object contour precisely. Experimental results demonstrate that the proposed method gives a better performance: compared to the traditional Gaussian background model, the completeness of the object detection is 97.92%, with only 0.94% of the background region included in the detection results. PMID:25140194

  9. Comparison of presbyopic additions determined by the fused cross-cylinder method using alternative target background colours.

    PubMed

    Wee, Sung-Hyun; Yu, Dong-Sik; Moon, Byeong-Yeon; Cho, Hyun Gug

    2010-11-01

To compare and contrast standard and alternative versions of refractor head (phoropter)-based charts used to determine reading addition. Forty-one presbyopic subjects aged between 42 and 60 years were tested. Tentative additions were determined using a red-green background letter chart and 4 cross-grid charts (with white, red, green, or red-green backgrounds) which were used with the fused cross cylinder (FCC) method. The final addition for a 40 cm working distance was determined for each subject by subjectively adjusting the tentative additions. There were significant differences among the tentative additions obtained using the 5 methods (repeated measures ANOVA, p < 0.001). The mean differences between the tentative and final additions were <0.10 D and were not clinically meaningful, with the exception of the red-green letter test and the red background in the FCC method. There were no significant differences between the tentative and final additions for the green background in the FCC method (p > 0.05). The intervals of the 95% limits of agreement were under ±0.50 D, and the narrowest interval (±0.26 D) was for the red-green background. The 3 FCC methods with a white, green, or red-green background provided a tentative addition close to the final addition. Compared with the other methods, the FCC method with the red-green background had a narrow range of error. Further, since this method combines the functions of both the fused cross-cylinder test and the duochrome test, it can be a useful technique for determining presbyopic additions. © 2010 The Authors. Ophthalmic and Physiological Optics © 2010 The College of Optometrists.
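The 95% limits of agreement quoted above are the standard Bland-Altman statistic: mean difference ± 1.96 times the standard deviation of the differences. A sketch with hypothetical addition values (dioptres):

```python
import numpy as np

def limits_of_agreement(tentative, final):
    """Bland-Altman 95% limits of agreement between tentative and final
    additions: mean difference +/- 1.96 SD of the differences, the
    statistic behind intervals like the abstract's +/-0.26 D."""
    d = np.asarray(tentative, float) - np.asarray(final, float)
    half = 1.96 * d.std(ddof=1)
    return d.mean() - half, d.mean() + half

# Hypothetical tentative vs final additions for six subjects:
tent = [1.50, 1.75, 2.00, 1.25, 2.25, 1.75]
fin  = [1.50, 1.50, 2.00, 1.50, 2.25, 1.75]
lo, hi = limits_of_agreement(tent, fin)
```

A method whose interval is narrow and centred near zero (as for the red-green FCC background in the study) yields tentative additions that rarely miss the final addition by a clinically meaningful amount.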

  10. In-mine testing of a natural background sensor, part B

    NASA Technical Reports Server (NTRS)

    Martzloff, F. D.

    1981-01-01

The capability of a natural background sensor for measuring the thickness of top coal on a longwall face was examined. Limitations on the time during which tests could be performed, and the roof conditions, did not allow readings of top coal thickness during shearer operation. It was demonstrated that the system is capable of surviving operating conditions in the mine environment, while the static tests confirmed that the natural background sensor approach is a valid method of measuring top coal thickness in mines where the roof rock provides a constant radiation level. It is concluded that the practical results will aid subsequent development of an integrated vertical control system that uses information from the natural background sensor.

  11. Estimation of channel parameters and background irradiance for free-space optical link.

    PubMed

    Khatoon, Afsana; Cowley, William G; Letzepis, Nick; Giggenbach, Dirk

    2013-05-10

Free-space optical communication can experience severe fading due to optical scintillation in long-range links. Channel estimation is also corrupted by background and electrical noise. Accurate estimation of channel parameters and the scintillation index (SI) depends on complete removal of the background irradiance. In this paper, we propose three different methods, the minimum-value (MV), mean-power (MP), and maximum-likelihood (ML) based methods, to remove the background irradiance from channel samples. The MV and MP methods do not require knowledge of the scintillation distribution. While the ML-based method assumes gamma-gamma scintillation, it can be easily modified to accommodate other distributions. Each estimator's performance is evaluated from low- to high-SI regimes using both simulation data and experimental measurements. The MV and MP methods have much lower complexity than the ML-based method; however, the ML-based method shows better SI and background-irradiance estimation performance.
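The two low-complexity estimators and the bias they correct can be sketched directly. SI = ⟨I²⟩/⟨I⟩² - 1, and an unremoved background offset inflates the mean and biases SI low. The simulation setup below (lognormal fading, a noiseless transmitter-off record for MP) is illustrative, not the paper's gamma-gamma experimental conditions:

```python
import numpy as np

def remove_background_mv(samples):
    """Minimum-value method: take the smallest received sample as the
    background estimate (best when deep fades drive the signal near zero)."""
    return samples - samples.min()

def remove_background_mp(samples, signal_off):
    """Mean-power method: estimate background as the mean of samples
    recorded with the transmitter off, then subtract it."""
    return samples - signal_off.mean()

def scintillation_index(signal):
    """SI = <I^2>/<I>^2 - 1, computed after background removal."""
    return (signal ** 2).mean() / signal.mean() ** 2 - 1

rng = np.random.default_rng(5)
irradiance = rng.lognormal(mean=0.0, sigma=0.3, size=10000)  # fading signal
rx = irradiance + 0.5                                        # received: signal + background
si_raw = scintillation_index(rx)                             # biased low by the offset
si_mp = scintillation_index(remove_background_mp(rx, np.full(1000, 0.5)))
```

For lognormal fading with sigma = 0.3 the true SI is exp(0.09) - 1, about 0.094; the MP-corrected estimate recovers it while the raw estimate falls well short. The MV method would overcorrect here because lognormal samples never fade to zero.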

  12. Fast and fully automatic phalanx segmentation using a grayscale-histogram morphology algorithm

    NASA Astrophysics Data System (ADS)

    Hsieh, Chi-Wen; Liu, Tzu-Chiang; Jong, Tai-Lang; Chen, Chih-Yen; Tiu, Chui-Mei; Chan, Din-Yuen

    2011-08-01

Bone age assessment is a common radiological examination used in pediatrics to diagnose discrepancies between the skeletal and chronological age of a child; therefore, it is beneficial to develop a computer-based bone age assessment to help junior pediatricians estimate bone age easily. Unfortunately, the phalanx on radiograms is not easily separated from the background and soft tissue. Therefore, we proposed a new method, called the grayscale-histogram morphology algorithm, to segment the phalanges quickly and precisely. The algorithm includes three parts: a tri-stage sieve algorithm used to eliminate the background of hand radiograms, a centroid-edge dual scanning algorithm to frame the phalanx region, and finally a segmentation algorithm based on a disk traverse-subtraction filter to segment the phalanx. Moreover, two further segmentation methods, adaptive two-mean and adaptive two-mean clustering, were performed, and their results were compared with those of the disk traverse-subtraction segmentation algorithm using five indices: misclassification error, relative foreground area error, modified Hausdorff distance, edge mismatch, and region nonuniformity. In addition, the CPU times of the three segmentation methods were compared. The results showed that our method outperformed the other two methods. Furthermore, satisfactory segmentation results were obtained with a low standard error.
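The simplest of the five evaluation indices, misclassification error, is just the fraction of pixels whose foreground/background assignment disagrees with a reference mask:

```python
import numpy as np

def misclassification_error(seg, ref):
    """Misclassification error: fraction of pixels whose foreground /
    background assignment disagrees with the reference (0 = perfect)."""
    seg, ref = np.asarray(seg, bool), np.asarray(ref, bool)
    return np.mean(seg != ref)

ref = np.zeros((10, 10), bool)
ref[2:8, 3:7] = True                 # reference phalanx mask
seg = ref.copy()
seg[2, 3] = False                    # one missed foreground pixel
seg[0, 0] = True                     # one false alarm in the background
me = misclassification_error(seg, ref)
```

Two wrong pixels out of 100 give an error of 0.02; the other four indices weight boundary and shape errors differently, which is why all five are reported together.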

  13. Automatic lesion boundary detection in dermoscopy images using gradient vector flow snakes

    PubMed Central

    Erkol, Bulent; Moss, Randy H.; Stanley, R. Joe; Stoecker, William V.; Hvatum, Erik

    2011-01-01

    Background Malignant melanoma has a good prognosis if treated early. Dermoscopy images of pigmented lesions are most commonly taken at × 10 magnification under lighting at a low angle of incidence while the skin is immersed in oil under a glass plate. Accurate skin lesion segmentation from the background skin is important because some of the features anticipated to be used for diagnosis deal with shape of the lesion and others deal with the color of the lesion compared with the color of the surrounding skin. Methods In this research, gradient vector flow (GVF) snakes are investigated to find the border of skin lesions in dermoscopy images. An automatic initialization method is introduced to make the skin lesion border determination process fully automated. Results Skin lesion segmentation results are presented for 70 benign and 30 melanoma skin lesion images for the GVF-based method and a color histogram analysis technique. The average errors obtained by the GVF-based method are lower for both the benign and melanoma image sets than for the color histogram analysis technique based on comparison with manually segmented lesions determined by a dermatologist. Conclusions The experimental results for the GVF-based method demonstrate promise as an automated technique for skin lesion segmentation in dermoscopy images. PMID:15691255

  14. [Improved opportunities for the identification of people with a migrant background for mortality research using the example of Bremen].

    PubMed

    Makarova, N; Reiss, K; Zeeb, H; Razum, O; Spallek, J

    2013-06-01

    19.6% of Germany's population has a "migrant" background. Comprehensive epidemiological research on the health and health development of this large, heterogeneous and increasingly important population group in Germany remains deficient. There is a lack of results on mortality and morbidity, particularly concerning chronic diseases and disease processes. The aim of this paper is to combine and compare previously applied methods with new methodological approaches for determining the vital status and mortality of immigrants from Turkey and the former Soviet Union. For this purpose we used data from the state of Bremen (666 709 residents, last updated 2010). We examined two methodological aspects: (i) possibilities for identifying a migrant background in residents' registration office data with different methods (onomastic, toponomastic, etc.) and (ii) opportunities for record linkage of the obtained data with the Bremen mortality index. Immigrants from Turkey and the former Soviet Union were successfully identified in the databases of the residents' registration office by a combination of different methods, which proved considerably better than using any single method. Through the application of a name-based algorithm we found that Turkish immigrants comprise 6.9% of the total population living in Bremen. By combining the variables "citizenship" and "country of birth", the proportion of immigrants from the former Soviet Union was found to be 5% of the total population. We also identified the deceased immigrant population in Bremen. The information obtained from the residents' registration office could be successfully linked by death register number with the data of the Bremen mortality index. This information can be used in further detailed mortality analyses.
The results of this analysis show the existing opportunities to consider the heterogeneity of the German population in mortality research, especially by combining different methods to identify migrant backgrounds. © Georg Thieme Verlag KG Stuttgart · New York.

  15. getimages: Background derivation and image flattening method

    NASA Astrophysics Data System (ADS)

    Men'shchikov, Alexander

    2017-05-01

    getimages performs background derivation and image flattening for high-resolution images obtained with space observatories. It is based on median filtering with sliding windows corresponding to a range of spatial scales from the observational beam size up to a maximum structure width X, the single free parameter of getimages, which can be evaluated manually from the observed image. The median filtering algorithm provides a background image for structures of all widths below X. The same median filtering procedure applied to an image of standard deviations derived from the background-subtracted image results in a flattening image. Finally, a flattened image is computed by dividing the background-subtracted image by the flattening image. Standard deviations in the flattened image are then uniform outside sources and filaments. Detecting structures in such radically simplified images results in much cleaner extractions that are more complete and reliable. getimages also reduces various observational and map-making artifacts and equalizes noise levels between independent tiles of mosaicked images. The code (a Bash script) uses FORTRAN utilities from getsources (ascl:1507.014), which must be installed.
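
    The pipeline described above can be sketched in a few lines. This is a hypothetical Python re-implementation of the idea with a brute-force median filter, not the actual Bash/FORTRAN code:

```python
import numpy as np

def median_filter2d(a, size):
    # Brute-force sliding-window median (size must be odd).
    pad = size // 2
    ap = np.pad(a, pad, mode="edge")
    win = np.lib.stride_tricks.sliding_window_view(ap, (size, size))
    return np.median(win, axis=(-2, -1))

def flatten_image(image, beam_px, max_width_px):
    # Median filtering with windows from the beam size up to the maximum
    # structure width X yields the background; the same filtering of the
    # squared residuals yields the flattening (local noise) image.
    background = image.copy()
    for size in range(beam_px | 1, (max_width_px | 1) + 1, 2):
        background = median_filter2d(background, size)
    residual = image - background
    local_sd = np.sqrt(median_filter2d(residual ** 2, max_width_px | 1))
    flattened = residual / np.maximum(local_sd, 1e-12)
    return background, flattened

rng = np.random.default_rng(0)
img = rng.normal(0.0, 1.0, (48, 48)) + 10.0   # uniform background + noise
img[20:24, 20:24] += 50.0                     # compact bright "source"
bg, flat = flatten_image(img, beam_px=3, max_width_px=15)
print(bg.shape)  # the source stands out strongly in the flattened image
```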

  16. Assessment of background particulate matter concentrations in small cities and rural locations--Prince George, Canada.

    PubMed

    Veira, Andreas; Jackson, Peter L; Ainslie, Bruce; Fudge, Dennis

    2013-07-01

    This study investigates the development and application of a simple method to calculate annual and seasonal PM2.5 and PM10 background concentrations in small cities and rural areas. The Low Pollution Sectors and Conditions (LPSC) method is based on existing measured long-term data sets and is designed for locations where particulate matter (PM) monitors are only influenced by local anthropogenic emission sources from particular wind sectors. The LPSC method combines the analysis of measured hourly meteorological data, PM concentrations, and geographical emission source distributions. PM background levels emerge from measured data for specific wind conditions, where air parcel trajectories measured at a monitoring station are assumed to have passed over geographic sectors with negligible local emissions. Seasonal and annual background levels were estimated for two monitoring stations in Prince George, Canada, and the method was also applied to four other small cities (Burns Lake, Houston, Quesnel, Smithers) in northern British Columbia. The analysis showed reasonable background concentrations for both monitoring stations in Prince George, whereas annual PM10 background concentrations at two of the other locations and PM2.5 background concentrations at one other location were implausibly high. For those locations where the LPSC method was successful, annual background levels ranged between 1.8 +/- 0.1 microg/m3 and 2.5 +/- 0.1 microg/m3 for PM2.5 and between 6.3 +/- 0.3 microg/m3 and 8.5 +/- 0.3 microg/m3 for PM10. Precipitation effects and patterns of seasonal variability in the estimated background concentrations were detectable for all locations where the method was successful. Overall the method was dependent on the configuration of local geography and sources with respect to the monitoring location, and may fail at some locations and under some conditions. 
Where applicable, the LPSC method can provide a fast and cost-efficient way to estimate background PM concentrations for small cities in sparsely populated regions like northern British Columbia. In rural areas like northern British Columbia, particulate matter (PM) monitoring stations are usually located close to emission sources and residential areas in order to assess the PM impact on human health. Thus there is a lack of accurate PM background concentration data that represent PM ambient concentrations in the absence of local emissions. The background calculation method developed in this study uses observed meteorological data as well as local source emission locations and provides annual, seasonal and precipitation-related PM background concentrations that are comparable to literature values for four out of six monitoring stations.
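
    The core of the LPSC idea, selecting hours whose wind direction falls in low-emission sectors and averaging the retained PM readings, can be sketched as follows (sector boundaries and data are illustrative, not the study's):

```python
import numpy as np

def lpsc_background(pm, wind_dir, clean_sectors):
    # Keep only hours whose wind direction lies in sectors with negligible
    # local emissions, then average the retained PM concentrations.
    pm = np.asarray(pm, float)
    wd = np.asarray(wind_dir, float) % 360.0
    mask = np.zeros_like(wd, dtype=bool)
    for lo, hi in clean_sectors:
        if lo <= hi:
            mask |= (wd >= lo) & (wd < hi)
        else:                      # sector wrapping through north
            mask |= (wd >= lo) | (wd < hi)
    clean = pm[mask & np.isfinite(pm)]
    return clean.mean(), clean.std(ddof=1) / np.sqrt(len(clean))

# Hours with wind from the 0-90 deg sector (assumed clean) read ~2 ug/m3;
# the remaining hours are influenced by local sources.
pm = [2.0, 2.1, 9.0, 8.5, 1.9, 7.7]
wd = [30, 60, 200, 220, 45, 250]
mean, se = lpsc_background(pm, wd, clean_sectors=[(0, 90)])
print(round(mean, 2))  # → 2.0
```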

  17. Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.

    PubMed

    Easlon, Hsien Ming; Bloom, Arnold J

    2014-07-01

    Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. • Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area to eliminate the need for camera distance calculations or manual ruler scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. • Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
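
    The color-ratio idea can be sketched as follows; the thresholds and the red-calibration workflow shown here are illustrative assumptions, not the published Easy Leaf Area values:

```python
import numpy as np

def leaf_area_cm2(rgb, cal_area_cm2):
    # Classify pixels by per-pixel color ratios, then convert the leaf pixel
    # count to area using the red calibration region of known size.
    r, g, b = (rgb[..., i].astype(float) + 1e-6 for i in range(3))
    leaf = (g / r > 1.2) & (g / b > 1.2)      # green-dominant pixels
    cal = (r / g > 1.5) & (r / b > 1.5)       # red calibration pixels
    return leaf.sum() / max(cal.sum(), 1) * cal_area_cm2

img = np.full((10, 10, 3), 128, np.uint8)     # gray background
img[0:2, :] = [0, 200, 0]                     # 20 green "leaf" pixels
img[5, :] = [200, 0, 0]                       # 10 red calibration pixels
print(leaf_area_cm2(img, cal_area_cm2=4.0))   # → 8.0
```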

  18. [Symptoms, disease models and treatment experiences of patients in psychosomatic rehabilitation with and without a history of migration].

    PubMed

    Gruner, Andrea; Oster, Jörg; Müller, Gottfried; von Wietersheim, Jörn

    2012-01-01

    Previous studies have shown that psychosomatic rehabilitation treatments are less successful for patients with a migration background. These findings were explored further with the help of interviews. The main aim of this study was to compare patients with and without a migration background with regard to socio-demographic variables, disease models, symptoms, and the course and outcome of psychosomatic rehabilitation treatment. 75 patients with and 75 without a migration background were analysed. Semi-structured interviews were carried out at admission, at discharge and three months after discharge from treatment. Patients with a migration background were "sicker" at the beginning of rehabilitation. Men with a migration background in particular benefited less from the treatment and often did not feel "in the right place" in psychosomatic rehabilitation. Patients with a migration background also had a more negative view of their work performance than patients without one. Patients with a migration background should therefore receive more information about psychosomatic disease models and the different treatment methods prior to their rehabilitation therapy.

  19. Cosmic microwave background bispectrum from recombination.

    PubMed

    Huang, Zhiqi; Vernizzi, Filippo

    2013-03-08

    We compute the cosmic microwave background temperature bispectrum generated by nonlinearities at recombination on all scales. We use CosmoLib2nd, a numerical second-order Boltzmann code, to compute cosmic microwave background bispectra on the full sky. We consistently include all effects except gravitational lensing, which can be added to our result using standard methods. The bispectrum is peaked on squeezed triangles and agrees with the analytic approximation in the squeezed limit at the few percent level on all scales where this is applicable. On smaller scales, we recover previous results on perturbed recombination. For cosmic-variance-limited data to l(max)=2000, its signal-to-noise ratio is S/N=0.47, corresponding to f(NL)(eff)=-2.79, and it will bias a local signal by f(NL)(loc) ≈ 0.82.

  20. Identification of the subthalamic nucleus in deep brain stimulation surgery with a novel wavelet-derived measure of neural background activity

    PubMed Central

    Snellings, André; Sagher, Oren; Anderson, David J.; Aldridge, J. Wayne

    2016-01-01

    Object A wavelet-based measure was developed to quantitatively assess neural background activity taken during surgical neurophysiological recordings to localize the boundaries of the subthalamic nucleus during target localization for deep brain stimulator implant surgery. Methods Neural electrophysiological data was recorded from 14 patients (20 tracks, n = 275 individual recording sites) with dopamine-sensitive idiopathic Parkinson’s disease during the target localization portion of deep brain stimulator implant surgery. During intraoperative recording the STN was identified based upon audio and visual monitoring of neural firing patterns, kinesthetic tests, and comparisons between neural behavior and known characteristics of the target nucleus. The quantitative wavelet-based measure was applied off-line using MATLAB software to measure the magnitude of the neural background activity, and the results of this analysis were compared to the intraoperative conclusions. Wavelet-derived estimates were compared to power spectral density measures. Results The wavelet-derived background levels were significantly higher in regions encompassed by the clinically estimated boundaries of the STN than in surrounding regions (STN: 225 ± 61 μV vs. ventral to STN: 112 ± 32 μV, and dorsal to STN: 136 ± 66 μV). In every track, the absolute maximum magnitude was found within the clinically identified STN. The wavelet-derived background levels provided a more consistent index with less variability than power spectral density. Conclusions The wavelet-derived background activity assessor can be calculated quickly, requires no spike sorting, and can be reliably used to identify the STN with very little subjective interpretation required. This method may facilitate rapid intraoperative identification of subthalamic nucleus borders. PMID:19344225
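
    A simple stand-in for a wavelet-derived background measure, using Haar detail coefficients in place of the authors' (unspecified) wavelet, might look like this:

```python
import numpy as np

def wavelet_background_level(x, levels=3):
    # Average magnitude of Haar detail coefficients at fine scales: tracks
    # broadband neural background activity while ignoring the slow trend.
    a = np.asarray(x, float)
    a = a[: len(a) - len(a) % 2]
    details = []
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)   # Haar detail coefficients
        details.append(np.abs(d).mean())
        a = (a[0::2] + a[1::2]) / np.sqrt(2)   # Haar approximation
        if len(a) % 2:
            a = a[:-1]
    return float(np.mean(details))

rng = np.random.default_rng(3)
quiet = rng.normal(0, 1, 4096)    # outside STN: low background activity
active = rng.normal(0, 3, 4096)   # inside STN: elevated background activity
print(wavelet_background_level(active) > wavelet_background_level(quiet))  # → True
```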

  1. Lowering the radioactivity of the photomultiplier tubes for the XENON1T dark matter experiment

    DOE PAGES

    Aprile, E.; Agostini, F.; Alfonsi, M.; ...

    2015-11-23

    The low-background, VUV-sensitive 3-inch diameter photomultiplier tube R11410 has been developed by Hamamatsu for dark matter direct detection experiments using liquid xenon as the target material. We present the results from the joint effort between the XENON collaboration and the Hamamatsu company to produce a highly radio-pure photosensor (version R11410-21) for the XENON1T dark matter experiment. After introducing the photosensor and its components, we show the methods and results of the radioactive contamination measurements of the individual materials employed in the photomultiplier production. We then discuss the adopted strategies to reduce the radioactivity of the various PMT versions. Finally, we detail the results from screening 286 tubes with ultra-low background germanium detectors, as well as their implications for the expected electronic and nuclear recoil background of the XENON1T experiment.

  2. Are South African Speech-Language Therapists adequately equipped to assess English Additional Language (EAL) speakers who are from an indigenous linguistic and cultural background? A profile and exploration of the current situation

    PubMed Central

    Mdlalo, Thandeka; Flack, Penelope

    2016-01-01

    This article presents the results of a survey conducted on Speech-Language Therapists (SLTs) regarding current practices in the assessment of English Additional Language (EAL) speakers in South Africa. It forms part of the rationale for a broader (PhD) study that critiques the use of assessment instruments on EAL speakers from an indigenous linguistic and cultural background. This article discusses an aspect of the broader research and presents the background, method, findings, discussion and implications of the survey. The results of this survey highlight the challenges of human and material resources to, and the dominance of English in, the profession in South Africa. The findings contribute to understanding critical factors for acquiring reliable and valid assessment results with diverse populations, particularly the implications from a cultural and linguistic perspective. PMID:27247254

  3. A depth enhancement strategy for kinect depth image

    NASA Astrophysics Data System (ADS)

    Quan, Wei; Li, Hua; Han, Cheng; Xue, Yaohong; Zhang, Chao; Hu, Hanping; Jiang, Zhengang

    2018-03-01

    Kinect is a motion sensing input device which is widely used in computer vision and related fields. However, Kinect depth images contain many inaccurate depth data, even in Kinect v2. In this paper, an algorithm is proposed to enhance Kinect v2 depth images. Following the principle of its depth measurement, the foreground and the background are treated separately. For the background, holes are filled according to the depth data in the neighborhood; for the foreground, a filling algorithm based on the color image, accounting for both spatial and color information, is proposed. An adaptive joint bilateral filter is used to reduce noise. Experimental results show that the processed depth images have a clean background and clear edges, and are better than those of traditional strategies. The method can be applied in 3D reconstruction to pretreat depth images in real time and obtain accurate results.
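
    The background hole-filling step can be sketched as a simple iterative neighborhood median, a simplified stand-in for the paper's algorithm:

```python
import numpy as np

def fill_background_holes(depth, ksize=3, max_iter=10):
    # Invalid pixels (depth == 0) take the median of their valid neighbors;
    # iterating lets larger holes close from the rim inward.
    d = depth.astype(float)
    for _ in range(max_iter):
        holes = np.argwhere(d == 0)
        if holes.size == 0:
            break
        for y, x in holes:
            nb = d[max(y - ksize // 2, 0):y + ksize // 2 + 1,
                   max(x - ksize // 2, 0):x + ksize // 2 + 1]
            valid = nb[nb > 0]
            if valid.size:
                d[y, x] = np.median(valid)
    return d

depth = np.full((5, 5), 1000.0)   # flat wall at 1000 mm
depth[2, 2] = 0                   # one invalid measurement
print(fill_background_holes(depth)[2, 2])  # → 1000.0
```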

  4. Fourier-space combination of Planck and Herschel images

    NASA Astrophysics Data System (ADS)

    Abreu-Vicente, J.; Stutz, A.; Henning, Th.; Keto, E.; Ballesteros-Paredes, J.; Robitaille, T.

    2017-08-01

    Context. Herschel has revolutionized our ability to measure column densities (NH) and temperatures (T) of molecular clouds thanks to its far-infrared multiwavelength coverage. However, the lack of a well defined background intensity level in the Herschel data limits the accuracy of the NH and T maps. Aims: We aim to provide a method that corrects the missing Herschel background intensity levels using the Planck model for foreground Galactic thermal dust emission. For the Herschel/PACS data, both the constant offset and the spatial dependence of the missing background must be addressed. For the Herschel/SPIRE data, the constant-offset correction has already been applied to the archival data, so we are primarily concerned with the spatial dependence, which is most important at 250 μm. Methods: We present a Fourier method that combines the publicly available Planck model on large angular scales with the Herschel images on smaller angular scales. Results: We have applied our method to two regions spanning a range of Galactic environments: Perseus and the Galactic plane region around l = 11° (HiGal-11). We post-processed the combined dust continuum emission images to generate column density and temperature maps and compared these to previously adopted constant-offset corrections. We find significant differences (≳20%) over substantial (∼15%) areas of the maps, at low column densities (NH ≲ 10²² cm⁻²) and relatively high temperatures (T ≳ 20 K). We have also applied our method to synthetic observations of a simulated molecular cloud to validate it. Conclusions: Our method successfully corrects the Herschel images, including both the constant-offset intensity level and the scale-dependent background variations measured by Planck. It improves on the previous constant-offset corrections, which did not account for variations in the background emission levels.
The image FITS files used in this paper are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/604/A65
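
    The Fourier-space combination amounts to feathering: low spatial frequencies come from the Planck model, high frequencies from the Herschel map. A toy sketch with a Gaussian taper (the paper's actual weighting may differ):

```python
import numpy as np

def fourier_combine(herschel, planck, cutoff_frac=0.1):
    # Blend low frequencies of the Planck-based model with high frequencies
    # of the Herschel map using a Gaussian taper in Fourier space.
    fy = np.fft.fftfreq(herschel.shape[0])[:, None]
    fx = np.fft.fftfreq(herschel.shape[1])[None, :]
    w_low = np.exp(-(np.hypot(fy, fx) / cutoff_frac) ** 2)
    combined = (np.fft.fft2(planck) * w_low
                + np.fft.fft2(herschel) * (1.0 - w_low))
    return np.fft.ifft2(combined).real

truth = np.zeros((32, 32))
truth[16, 16] = 50.0                  # a compact source
herschel = truth.copy()               # Herschel map: absolute zero level lost
planck = truth + 5.0                  # Planck model: correct absolute level
out = fourier_combine(herschel, planck)
print(round(out.mean(), 2))           # DC level recovered from Planck
```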

  5. Thermodynamics of Newman-Unti-Tamburino charged spaces

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mann, Robert; Stelea, Cristian

    We discuss and compare at length the results of two methods used recently to describe the thermodynamics of Taub-Newman-Unti-Tamburino (NUT) solutions in a de Sitter background. In the first approach (C approach), one deals with an analytically continued version of the metric, while in the second approach (R approach), the discussion is carried out using the unmodified metric with Lorentzian signature. No analytic continuation is performed on the coordinates and/or the parameters that appear in the metric. We find that the results of both approaches are completely equivalent modulo analytic continuation, and we provide the exact prescription that relates the results of the two methods. The extension of these results to the AdS/flat cases aims to give a physical interpretation of the thermodynamics of NUT-charged spacetimes in the Lorentzian sector. We also briefly discuss the higher-dimensional spaces and note that, analogous to the absence of hyperbolic NUTs in AdS backgrounds, there are no spherical Taub-NUT-dS solutions.

  6. Ethnicity-specific factors influencing childhood immunisation decisions among Black and Asian Minority Ethnic groups in the UK: a systematic review of qualitative research

    PubMed Central

    Forster, Alice S; Rockliffe, Lauren; Chorley, Amanda J; Marlow, Laura A V; Bedford, Helen; Smith, Samuel G; Waller, Jo

    2017-01-01

    Background Uptake of some childhood immunisations in the UK is lower among those from some Black and Asian Minority Ethnic (BAME) backgrounds. This systematic review of qualitative research sought to understand the factors that are associated with ethnicity that influence the immunisation decisions of parents from BAME backgrounds living in the UK. Methods Databases were searched on 2 December 2014 for studies published at any time using the terms ‘UK’ and ‘vaccination’ and ‘qualitative methods’ (and variations of these). Included articles comprised participants who were parents from BAME backgrounds. Thematic synthesis methods were used to develop descriptive and higher order themes. Themes specific to ethnicity and associated factors are reported. Results Eight papers were included in the review. Most participants were from Black (n=62) or Asian (n=38) backgrounds. Two ethnicity-related factors affected immunisation decisions. First, factors that are related to ethnicity itself (namely religion, upbringing and migration, and language) affected parents' perceived importance of immunisations, whether immunisations were permitted or culturally acceptable and their understanding of immunisation/the immunisation schedule. Second, perceived biological differences affected decision-making and demand for information. Conclusions Factors related to ethnicity must be considered when seeking to understand immunisation decisions among parents from BAME backgrounds. Where appropriate and feasible, vaccination information should be targeted to address beliefs about ethnic differences held by some individuals from some BAME backgrounds. PMID:27531844

  7. Video segmentation using keywords

    NASA Astrophysics Data System (ADS)

    Ton-That, Vinh; Vong, Chi-Tai; Nguyen-Dao, Xuan-Truong; Tran, Minh-Triet

    2018-04-01

    At the DAVIS-2016 Challenge, many state-of-the-art video segmentation methods achieved promising results, but they still depend heavily on annotated frames to distinguish between background and foreground, and creating these frames precisely takes considerable time and effort. In this paper, we introduce a method to segment objects from video based on keywords given by the user. First, we use a real-time object detection system, YOLOv2, to identify regions in the first frame containing objects whose labels match the given keywords. Then, for each region identified in the previous step, we use the Pyramid Scene Parsing Network to assign each pixel as foreground or background. These frames can be used as input frames for the Object Flow algorithm to perform segmentation on the entire video. We conduct experiments on a subset of the DAVIS-2016 dataset at half its original size, which show that our method can handle many popular classes in the PASCAL VOC 2012 dataset with acceptable accuracy, about 75.03%. We suggest wider testing and combining other methods to improve this result in the future.

  8. A Background Noise Reduction Technique Using Adaptive Noise Cancellation for Microphone Arrays

    NASA Technical Reports Server (NTRS)

    Spalt, Taylor B.; Fuller, Christopher R.; Brooks, Thomas F.; Humphreys, William M., Jr.

    2011-01-01

    Background noise in wind tunnel environments poses a challenge to acoustic measurements due to possible low or negative Signal-to-Noise Ratios (SNRs) present in the testing environment. This paper overviews the application of time-domain Adaptive Noise Cancellation (ANC) to microphone array signals, with an intended application of background noise reduction in wind tunnels. An experiment was conducted to simulate background noise from a wind tunnel circuit measured by an out-of-flow microphone array in the tunnel test section. A reference microphone was used to acquire a background noise signal which interfered with the desired primary noise source signal at the array. The technique's efficacy was investigated using frequency spectra from the array microphones, array beamforming of the point source region, and subsequent deconvolution using the Deconvolution Approach for the Mapping of Acoustic Sources (DAMAS) algorithm. Comparisons were made with the conventional SNR-improvement techniques of spectral subtraction and cross-spectral matrix subtraction. The method was seen to recover the primary signal level at SNRs as low as -29 dB and to outperform the conventional methods. A second processing approach, using the center array microphone as the noise reference, was investigated for more general applicability of the ANC technique. It outperformed the conventional methods at the -29 dB SNR but yielded less accurate results when coherence over the array dropped. This approach could possibly improve conventional testing methodology but must be investigated further under more realistic testing conditions.
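
    Time-domain ANC is classically implemented with an LMS adaptive filter; a textbook sketch (not the paper's exact configuration) is:

```python
import numpy as np

def lms_cancel(primary, reference, n_taps=8, mu=0.01):
    # The adaptive filter learns to predict the noise component of the
    # primary signal from the reference microphone; the error output is
    # the recovered source signal.
    w = np.zeros(n_taps)
    out = np.zeros_like(primary)
    for n in range(n_taps, len(primary)):
        x = reference[n - n_taps:n][::-1]
        e = primary[n] - w @ x
        out[n] = e
        w += 2 * mu * e * x
    return out

rng = np.random.default_rng(1)
noise = rng.normal(size=4000)
signal = np.sin(2 * np.pi * 0.05 * np.arange(4000))
primary = signal + 0.9 * np.roll(noise, 1)   # noise reaches array delayed
out = lms_cancel(primary, noise)
err_before = np.mean((primary - signal) ** 2)
err_after = np.mean((out[2000:] - signal[2000:]) ** 2)
print(err_after < err_before)  # noise power reduced after convergence
```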

  9. Probabilistic BPRRC: Robust Change Detection against Illumination Changes and Background Movements

    NASA Astrophysics Data System (ADS)

    Yokoi, Kentaro

    This paper presents Probabilistic Bi-polar Radial Reach Correlation (PrBPRRC), a change detection method that is robust against illumination changes and background movements. Most of the traditional change detection methods are robust against either illumination changes or background movements; BPRRC is one of the illumination-robust change detection methods. We introduce a probabilistic background texture model into BPRRC and add the robustness against background movements including foreground invasions such as moving cars, walking people, swaying trees, and falling snow. We show the superiority of PrBPRRC in the environment with illumination changes and background movements by using three public datasets and one private dataset: ATON Highway data, Karlsruhe traffic sequence data, PETS 2007 data, and Walking-in-a-room data.

  10. Parameter estimation for the exponential-normal convolution model for background correction of Affymetrix GeneChip data.

    PubMed

    McGee, Monnie; Chen, Zhongxue

    2006-01-01

    There are many methods of correcting microarray data for non-biological sources of error. Authors routinely supply software or code so that interested analysts can implement their methods. Even with a thorough reading of associated references, it is not always clear how requisite parts of the method are calculated in the software packages. However, it is important to have an understanding of such details, as this understanding is necessary for proper use of the output, or for implementing extensions to the model. In this paper, the calculation of parameter estimates used in Robust Multichip Average (RMA), a popular preprocessing algorithm for Affymetrix GeneChip brand microarrays, is elucidated. The background correction method for RMA assumes that the perfect match (PM) intensities observed result from a convolution of the true signal, assumed to be exponentially distributed, and a background noise component, assumed to have a normal distribution. A conditional expectation is calculated to estimate signal. Estimates of the mean and variance of the normal distribution and the rate parameter of the exponential distribution are needed to calculate this expectation. Simulation studies show that the current estimates are flawed; therefore, new ones are suggested. We examine the performance of preprocessing under the exponential-normal convolution model using several different methods to estimate the parameters.
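
    The conditional expectation in question has a closed form; a sketch of the standard RMA background adjustment E[S | O = o] for the convolution O = S + B, with S ~ Exp(α) and B ~ N(μ, σ²) (numbers in the example are illustrative):

```python
import math

def rma_background_correct(o, mu, sigma, alpha):
    # E[S | O = o] with a = o - mu - sigma^2*alpha and b = sigma:
    #   a + b * (phi(a/b) - phi((o-a)/b)) / (Phi(a/b) + Phi((o-a)/b) - 1)
    # where phi/Phi are the standard normal pdf/cdf.
    phi = lambda z: math.exp(-z * z / 2) / math.sqrt(2 * math.pi)
    Phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    a = o - mu - sigma * sigma * alpha
    num = phi(a / sigma) - phi((o - a) / sigma)
    den = Phi(a / sigma) + Phi((o - a) / sigma) - 1
    return a + sigma * num / den

# A strong probe intensity is shifted down by roughly the background mean
# plus the sigma^2*alpha term.
val = rma_background_correct(o=500.0, mu=100.0, sigma=20.0, alpha=1 / 200.0)
print(round(val, 1))  # → 398.0
```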

  11. Dual-wavelength digital holographic imaging with phase background subtraction

    NASA Astrophysics Data System (ADS)

    Khmaladze, Alexander; Matz, Rebecca L.; Jasensky, Joshua; Seeley, Emily; Holl, Mark M. Banaszak; Chen, Zhan

    2012-05-01

    Three-dimensional digital holographic microscopic phase imaging of objects that are thicker than the wavelength of the imaging light is ambiguous and results in phase wrapping. In recent years, several unwrapping methods that employed two or more wavelengths were introduced. These methods compare the phase information obtained from each of the wavelengths and extend the range of unambiguous height measurements. A straightforward dual-wavelength phase imaging method is presented which allows for a flexible tradeoff between the maximum height of the sample and the amount of noise the method can tolerate. For highly accurate phase measurements, phase unwrapping of objects with heights higher than the beat (synthetic) wavelength (i.e. the product of the original two wavelengths divided by their difference), can be achieved. Consequently, three-dimensional measurements of a wide variety of biological systems and microstructures become technically feasible. Additionally, an effective method of removing phase background curvature based on slowly varying polynomial fitting is proposed. This method allows accurate volume measurements of several small objects with the same image frame.
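
    The beat-wavelength relation can be verified numerically. The wavelengths below are an illustrative red/green pair (not from the paper), and lam1 > lam2 is assumed:

```python
import numpy as np

def beat_wavelength(lam1, lam2):
    # Synthetic wavelength: product of the two wavelengths over their difference.
    return lam1 * lam2 / abs(lam1 - lam2)

def dual_wavelength_height(phi1, phi2, lam1, lam2):
    # Assumes lam1 > lam2; the wrapped phase difference behaves like a
    # single-wavelength measurement at the (much longer) beat wavelength.
    dphi = (phi2 - phi1) % (2 * np.pi)
    return dphi / (2 * np.pi) * beat_wavelength(lam1, lam2)

lam1, lam2 = 0.633, 0.532                     # micrometers
print(round(beat_wavelength(lam1, lam2), 2))  # → 3.33
h_true = 2.0                                  # height exceeding both wavelengths
phi1 = (2 * np.pi * h_true / lam1) % (2 * np.pi)
phi2 = (2 * np.pi * h_true / lam2) % (2 * np.pi)
h = dual_wavelength_height(phi1, phi2, lam1, lam2)
print(round(h, 2))  # → 2.0
```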

  12. Experimental Evaluation of Concepts for Miqsture; An Online Interactive Language for Tactical Intelligence Processing

    DTIC Science & Technology

    1979-12-10

    Review Collection Plan File. Table 20 (Item 18): Items 76, 77, and 78 compared three different methods for recording the outcomes of a task. 3.2 Background; 3.3 Method Summary; aspects of descriptions of selected tasks from Army tactical intelligence processing. The results provided indications of what query methods have…

  13. Stroke-model-based character extraction from gray-level document images.

    PubMed

    Ye, X; Cheriet, M; Suen, C Y

    2001-01-01

    Global gray-level thresholding techniques such as Otsu's method, and local gray-level thresholding techniques such as edge-based segmentation or the adaptive thresholding method are powerful in extracting character objects from simple or slowly varying backgrounds. However, they are found to be insufficient when the backgrounds include sharply varying contours or fonts in different sizes. A stroke-model is proposed to depict the local features of character objects as double-edges in a predefined size. This model enables us to detect thin connected components selectively, while ignoring relatively large backgrounds that appear complex. Meanwhile, since the stroke width restriction is fully factored in, the proposed technique can be used to extract characters in predefined font sizes. To process large volumes of documents efficiently, a hybrid method is proposed for character extraction from various backgrounds. Using the measurement of class separability to differentiate images with simple backgrounds from those with complex backgrounds, the hybrid method can process documents with different backgrounds by applying the appropriate methods. Experiments on extracting handwriting from a check image, as well as machine-printed characters from scene images demonstrate the effectiveness of the proposed model.

  14. Current On-Campus Attitudes toward Energy Usage, Efficiency, and Emerging Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lennon, Liz; Sintov, Nicole; Orosz, Michael

    Context & Background for Energy Survey; Methods & Survey Overview; Respondent Demographics; Results; Demand Response; Current Environmental Comfort Perceptions; Smart Meters; Perceived Smart Meter Benefits; Motivators of Energy Efficient Practices; Summary & Implications.

  15. Structural models used in real-time biosurveillance outbreak detection and outbreak curve isolation from noisy background morbidity levels

    PubMed Central

    Cheng, Karen Elizabeth; Crary, David J; Ray, Jaideep; Safta, Cosmin

    2013-01-01

    Objective We discuss the use of structural models for the analysis of biosurveillance related data. Methods and results Using a combination of real and simulated data, we have constructed a data set that represents a plausible time series resulting from surveillance of a large scale bioterrorist anthrax attack in Miami. We discuss the performance of anomaly detection with structural models for these data using receiver operating characteristic (ROC) and activity monitoring operating characteristic (AMOC) analysis. In addition, we show that these techniques provide a method for predicting the level of the outbreak valid for approximately 2 weeks, post-alarm. Conclusions Structural models provide an effective tool for the analysis of biosurveillance data, in particular for time series with noisy, non-stationary background and missing data. PMID:23037798

  16. Scaling images using their background ratio. An application in statistical comparisons of images.

    PubMed

    Kalemis, A; Binnie, D; Bailey, D L; Flower, M A; Ott, R J

    2003-06-07

    Comparison of two medical images often requires image scaling as a pre-processing step. This is usually done with scaling-to-the-mean or scaling-to-the-maximum techniques, which under certain circumstances may introduce significant bias in quantitative applications. In this paper, we present a simple scaling method which assumes only that the most predominant values in the corresponding images belong to their background structure. The ratio of the two images to be compared is calculated and its frequency histogram is plotted. The scaling factor is given by the position of the histogram peak that belongs to the background structure. The method was tested against the traditional scaling-to-the-mean technique on simulated planar gamma-camera images which were compared using pixelwise statistical parametric tests. Both sensitivity and specificity were measured over a range of contrasts and sizes of inhomogeneity for the two scaling techniques. The new method was found to preserve sensitivity in all cases, while the traditional technique resulted in significant degradation of sensitivity in certain cases.
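
    The scaling rule described above (take the pixelwise ratio of the two images and read the scaling factor off the histogram peak contributed by the background) can be sketched in a few lines of numpy. The synthetic images and the function name are illustrative, not from the paper.

```python
import numpy as np

def background_ratio_scale(img_a, img_b, bins=200):
    """Estimate the scaling factor between two images as the mode of the
    pixelwise ratio histogram, assuming most pixels belong to background."""
    ratio = img_a / np.maximum(img_b, 1e-12)
    hist, edges = np.histogram(ratio, bins=bins)
    k = np.argmax(hist)                     # peak bin = background ratio
    return 0.5 * (edges[k] + edges[k + 1])  # bin centre

rng = np.random.default_rng(1)
background = rng.uniform(90, 110, size=(64, 64))
img_b = background.copy()
img_a = 0.7 * background                    # same scene, scaled counts
img_a[20:30, 20:30] += 50                   # a small lesion-like difference
scale = background_ratio_scale(img_a, img_b)
print(round(scale, 2))  # close to 0.7
```

    Because the mode is driven by the many background pixels, the local inhomogeneity barely perturbs the estimate, which is the stated advantage over scaling-to-the-mean.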

  17. [Study on trace elements of lake sediments by ICP-AES and XRF core scanning].

    PubMed

    Cheng, Ai-Ying; Yu, Jun-Qing; Gao, Chun-Liang; Zhang, Li-Sha; He, Xian-Hu

    2013-07-01

    This is the first study of the sediment of Toson Lake in the Qaidam Basin. Trace elements in the lake sediment, including Cd, Cr, Cu, Zn and Pb, were measured by ICP-AES. Several digestion procedures were compared and optimized, and an optimum pretreatment system for the Toson Lake sediment was determined, namely an HCl-HNO3-HF-HClO4-H2O2 system in the proportions 5 : 5 : 5 : 1 : 1. At the same time, the data were compared with measurements by XRF core scanning; a moisture-content correction method was applied, and the influence of moisture content on the scanning method was discussed. The results showed that, compared with background values, the Cd and Zn contents were slightly elevated, whereas the Cr, Cu and Pb contents were within the background range. XRF core scanning was controlled to some extent by the sediment elements as well as by the water content of the sediment. The results of the two methods showed a significant positive correlation, with correlation coefficients of 0.673-0.925, demonstrating good comparability.

  18. The Pedestrian Detection Method Using an Extension Background Subtraction about the Driving Safety Support Systems

    NASA Astrophysics Data System (ADS)

    Muranaka, Noriaki; Date, Kei; Tokumaru, Masataka; Imanishi, Shigeru

    In recent years, traffic accidents have occurred frequently as traffic density has grown. We therefore believe that a safe and comfortable transportation system that protects pedestrians, the most vulnerable road users, is necessary. First, we detect and recognize pedestrians (people crossing) by image processing. Next, we inform drivers turning right or left that a pedestrian is present, using sound, images, and so on. By prompting drivers to drive safely in this way, accidents involving pedestrians can be reduced. In this paper, we use a background subtraction method to detect moving objects. In background subtraction, the background update method is important; in the conventional approach, the threshold values for the subtraction processing and the background update were identical. That is, the mixing rate of the input image and the background image in the background update was a fixed value, and fine tuning in response to environmental changes such as the weather was difficult. We therefore propose a background update method in which estimation errors are difficult to amplify. We experiment and evaluate on five conditions: sunshine, cloud, evening, rain, and changing sunlight, excluding night. This technique can set the thresholds for the subtraction processing and the background update processing separately, to suit environmental conditions such as the weather; the mixing rate of the input image and the background image in the background update can therefore be tuned freely. Because parameter settings suited to the environmental conditions are important for minimizing the error rate, we also examine the choice of parameters.
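
    The core idea, a running-average background model with separate thresholds for foreground detection and for deciding which pixels are blended into the background, can be sketched as follows. The threshold values and mixing rate here are illustrative placeholders, not the paper's tuned parameters.

```python
import numpy as np

def update_background(background, frame, t_sub=25.0, t_bg=10.0, alpha=0.05):
    """Running-average background update with separate thresholds.

    t_sub : threshold for declaring a pixel foreground (subtraction step)
    t_bg  : threshold for deciding which pixels are blended into the
            background model (update step); decoupling the two lets the
            mixing rate alpha be tuned to weather/lighting conditions
    """
    diff = np.abs(frame - background)
    foreground = diff > t_sub
    stable = diff <= t_bg                    # only quiet pixels update
    background = np.where(stable,
                          alpha * frame + (1 - alpha) * background,
                          background)
    return foreground, background

# Simulated slow illumination drift with a moving "pedestrian" blob.
rng = np.random.default_rng(2)
bg = np.full((48, 48), 100.0)
model = bg.copy()
for t in range(60):
    frame = bg + 0.2 * t + rng.normal(0, 1, bg.shape)   # drift + noise
    frame[10:20, 5 + t // 3 : 10 + t // 3] += 80        # pedestrian
    fg, model = update_background(model, frame)
print(fg[15, 5 + 59 // 3 : 10 + 59 // 3].all())  # pedestrian detected: True
```

    Because pedestrian pixels fail the `t_bg` test, they never pollute the background model, while the model keeps tracking the slow illumination drift.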

  19. Sign Language Translator Application Using OpenCV

    NASA Astrophysics Data System (ADS)

    Triyono, L.; Pratisto, E. H.; Bawono, S. A. T.; Purnomo, F. A.; Yudhanto, Y.; Raharjo, B.

    2018-03-01

    This research focuses on the development of an Android-based sign language translator application using OpenCV; the application is based on color differences. The authors also use a support vector machine to predict the label. The results show that the fingertip-coordinate search method can recognize hand gestures with the fingers open, while gestures with a clenched hand are recognized using Hu moment values. The fingertip method is more robust in gesture recognition, with a higher success rate of 95% at distances of 35 cm and 55 cm, light intensities of approximately 90 lux and 100 lux, and a plain light-green background, compared with a 40% success rate for the Hu moments method under the same parameters. Against an outdoor background, however, the application still cannot be used reliably, with only 6 successful recognitions and the rest failing.
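
    The Hu-moments route for clenched-hand gestures rests on moment invariants; in OpenCV this is `cv2.HuMoments`. As a dependency-free illustration, here is a numpy sketch of just the first Hu invariant, h1 = eta20 + eta02, which is invariant to translation (the shapes and function name are illustrative, not from the paper).

```python
import numpy as np

def first_hu_moment(img):
    """First Hu invariant h1 = eta20 + eta02 of a binary/gray image,
    invariant to translation and scale of the shape."""
    img = np.asarray(img, dtype=float)
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00
    mu20 = ((x - xc) ** 2 * img).sum()
    mu02 = ((y - yc) ** 2 * img).sum()
    eta20 = mu20 / m00 ** 2      # normalized central moments (order 2)
    eta02 = mu02 / m00 ** 2
    return eta20 + eta02

# The invariant barely changes when the same shape is translated,
# which is what makes it usable as a gesture descriptor.
canvas1 = np.zeros((64, 64)); canvas1[10:20, 10:25] = 1
canvas2 = np.zeros((64, 64)); canvas2[30:40, 35:50] = 1   # same shape, moved
print(np.isclose(first_hu_moment(canvas1), first_hu_moment(canvas2)))  # True
```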

  20. Background suppression of infrared small target image based on inter-frame registration

    NASA Astrophysics Data System (ADS)

    Ye, Xiubo; Xue, Bindang

    2018-04-01

    We propose a multi-frame background suppression method for remote infrared small target detection. Inter-frame information is necessary when heavy background clutter makes it difficult to distinguish real targets from false alarms. A registration procedure based on point matching in image patches is used to compensate for the local deformation of the background. The target can then be separated by background subtraction. Experiments show that our method serves as an effective preliminary step for target detection.

  1. Fast cat-eye effect target recognition based on saliency extraction

    NASA Astrophysics Data System (ADS)

    Li, Li; Ren, Jianlin; Wang, Xingbin

    2015-09-01

    Background complexity is a main cause of false detections in cat-eye target recognition. Human vision has a selective attention property that helps it find salient targets in complex unknown scenes quickly and precisely. In this paper, we propose a novel cat-eye effect target recognition method named Multi-channel Saliency Processing before Fusion (MSPF). This method combines traditional cat-eye target recognition with the selective characteristics of visual attention. Furthermore, parallel processing enables fast recognition. Experimental results show that the proposed method outperforms other methods in accuracy, robustness and speed.

  2. BiFCROS: A Low-Background Fluorescence Repressor Operator System for Labeling of Genomic Loci.

    PubMed

    Milbredt, Sarah; Waldminghaus, Torsten

    2017-06-07

    Fluorescence-based methods are widely used to analyze elementary cell processes such as DNA replication or chromosomal folding and segregation. Labeling DNA with a fluorescent protein allows the visualization of its temporal and spatial organization. One popular approach is FROS (fluorescence repressor operator system). This method specifically labels DNA in vivo through binding of a fusion of a fluorescent protein and a repressor protein to an operator array, which contains numerous copies of the repressor binding site integrated into the genomic site of interest. Bound fluorescent proteins are then visible as foci in microscopic analyses and can be distinguished from the background fluorescence caused by unbound fusion proteins. Even though this method is widely used, no attempt has been made so far to decrease the background fluorescence to facilitate analysis of the actual signal of interest. Here, we present a new method that greatly reduces the background signal of FROS. BiFCROS (Bimolecular Fluorescence Complementation and Repressor Operator System) is based on fusions of repressor proteins to halves of a split fluorescent protein. Binding to a hybrid FROS array results in fluorescence signals due to bimolecular fluorescence complementation. Only proteins bound to the hybrid FROS array fluoresce, greatly improving the signal to noise ratio compared to conventional FROS. We present the development of BiFCROS and discuss its potential to be used as a fast and single-cell readout for copy numbers of genetic loci. Copyright © 2017 Milbredt and Waldminghaus.

  4. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster, and their properties, for the three cases in which the process is subcritical, critical, and supercritical. One direct use of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshock and non-foreshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events with one or more larger descendants among all events is found to be as high as about 15%. When differences between background and triggered events in their triggering behavior are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background and triggered events differ, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] using the conventional single-linked cluster declustering method.

  5. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique for quantifying protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method that fully automates absolute quantification and substitutes for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improved Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.

  6. Enhancements to the MCNP6 background source

    DOE PAGES

    McMath, Garrett E.; McKinney, Gregg W.

    2015-10-19

    The particle transport code MCNP has been used to produce a background radiation data file on a worldwide grid that can easily be sampled as a source in the code. Location-dependent cosmic showers were modeled by Monte Carlo methods to produce the resulting neutron and photon background flux at 2054 locations around Earth. An improved galactic-cosmic-ray feature was used to model the source term as well as data from multiple sources to model the transport environment through atmosphere, soil, and seawater. A new elevation scaling feature was also added to the code to increase the accuracy of the cosmic neutron background for user locations with off-grid elevations. Furthermore, benchmarking has shown the neutron integral flux values to be within experimental error.

  7. The perception of isoluminant coloured stimuli of amblyopic eye and defocused eye

    NASA Astrophysics Data System (ADS)

    Krumina, Gunta; Ozolinsh, Maris; Ikaunieks, Gatis

    2008-09-01

    In routine eye examinations, visual acuity is usually determined using standard charts with black letters on a white background; however, contrast and colour are important characteristics of visual perception. The purpose of this research was to study the perception of isoluminant coloured stimuli in cases of true and simulated amblyopia. We estimated the difference in visual acuity between isoluminant coloured stimuli and high-contrast black-white stimuli for true and simulated amblyopia. Tests were generated on a computer screen. Visual acuity was measured using different charts in two ways: standard achromatic stimuli (black symbols on a white background) and isoluminant coloured stimuli (white symbols on a yellow background; grey symbols on a blue, green or red background). Thus the isoluminant tests had colour contrast only, with no luminance contrast. Visual acuity with the standard method and with the colour tests was studied in subjects with good visual acuity, using the best vision correction where necessary. The same was done for subjects with a defocused eye and with true amblyopia. Defocus was produced with optical lenses placed in front of the normal eye. The results obtained with the isoluminant colour charts revealed a worsening of visual acuity compared with that estimated with the standard high-contrast method (black symbols on a white background).

  8. Lutetium oxyorthosilicate (LSO) intrinsic activity correction and minimal detectable target activity study for SPECT imaging with a LSO-based animal PET scanner

    NASA Astrophysics Data System (ADS)

    Yao, Rutao; Ma, Tianyu; Shao, Yiping

    2008-08-01

    This work is part of a feasibility study to develop SPECT imaging capability on a lutetium oxyorthosilicate (LSO) based animal PET system. SPECT acquisition was enabled by inserting a collimator assembly inside the detector ring and acquiring data in singles mode. The same LSO detectors were used for both PET and SPECT imaging. The intrinsic radioactivity of 176Lu in the LSO crystals, however, contaminates the SPECT data, and can generate image artifacts and introduce quantification error. The objectives of this study were to evaluate the effectiveness of an LSO background subtraction method, and to estimate the minimal detectable target activity (MDTA) of an image object for SPECT imaging. For LSO background correction, the LSO contribution in an imaging study was estimated from a pre-measured long LSO background scan and subtracted prior to image reconstruction. The MDTA was estimated in two ways: the empirical MDTA (eMDTA) was estimated by screening the tomographic images at different activity levels, and the calculated MDTA (cMDTA) was estimated using a formula based on applying a modified Currie equation to an average projection dataset. Two simulated and two experimental phantoms with different object activity distributions and levels were used in this study. The results showed that the LSO background adds concentric ring artifacts to the reconstructed image, and that the simple subtraction method can effectively remove these artifacts; the effect of the correction was more visible when the object activity level was near or above the eMDTA. For the four phantoms studied, the cMDTA was consistently about five times the corresponding eMDTA. In summary, we implemented a simple LSO background subtraction method and demonstrated its effectiveness. The projection-based calculation formula yielded MDTA results that correlate closely with those obtained empirically and may have predictive value for imaging applications.
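
    The subtraction step described above (scale a long pre-measured background scan to the acquisition time and subtract it from the projection before reconstruction) is simple enough to sketch directly. The count rates, bin counts, and function name below are invented for illustration; only the scale-and-subtract logic follows the abstract.

```python
import numpy as np

def subtract_lso_background(projection, long_bg_scan, t_acq, t_bg):
    """Subtract intrinsic 176Lu background from a SPECT projection.

    The long reference scan is scaled by the ratio of acquisition times;
    negative bins after subtraction are clipped to zero.
    """
    expected_bg = long_bg_scan * (t_acq / t_bg)
    return np.clip(projection - expected_bg, 0.0, None)

rng = np.random.default_rng(3)
rate = np.full(128, 2.0)                          # LSO counts/s per bin
long_bg = rng.poisson(rate * 3600.0)              # 1 h background scan
signal = np.zeros(128); signal[60:68] = 40.0      # object activity
proj = rng.poisson(signal * 600.0 + rate * 600.0) # 10 min object scan
corrected = subtract_lso_background(proj, long_bg, 600.0, 3600.0)
print(corrected[:50].mean() < proj[:50].mean())   # background removed: True
```

    The long reference scan keeps the statistical noise of the subtracted term small, which is why a single pre-measured scan can serve many short acquisitions.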

  9. Formal concept analysis with background knowledge: a case study in paleobiological taxonomy of belemnites

    NASA Astrophysics Data System (ADS)

    Belohlavek, Radim; Kostak, Martin; Osicka, Petr

    2013-05-01

    We present a case study in the identification of taxa in paleobiological data. Our approach utilizes formal concept analysis and is based on conceiving of a taxon as a group of individuals sharing a collection of attributes. In addition to the incidence relation between individuals and their attributes, the method uses expert background knowledge about the importance of attributes, which helps to filter out correctly formed but paleobiologically irrelevant taxa. We present results of experiments carried out with belemnites, a group of extinct cephalopods that seems particularly suitable for this purpose. We demonstrate that the methods are capable of revealing taxa, and relationships among them, that are relevant from a paleobiological point of view.

  10. DNDO Report: Predicting Solar Modulation Potentials for Modeling Cosmic Background Radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behne, Patrick Alan

    The modeling of the detectability of special nuclear material (SNM) at ports and border crossings requires accurate knowledge of the background radiation at those locations. Background radiation originates from two main sources, cosmic and terrestrial. Cosmic background is produced by high-energy galactic cosmic rays (GCR) entering the atmosphere and inducing a cascade of particles that eventually impact the earth’s surface. The solar modulation potential represents one of the primary inputs to modeling cosmic background radiation. Usoskin et al. formally define solar modulation potential as “the mean energy loss [per unit charge] of a cosmic ray particle inside the heliosphere…” Modulation potential, a function of elevation, location, and time, shares an inverse relationship with cosmic background radiation. As a result, radiation detector thresholds require adjustment to account for differing background levels, caused partly by differing solar modulation. Failure to do so can result in higher rates of false positives and failed detection of SNM at low and high levels of solar modulation potential, respectively. This study focuses on the time dependence of solar modulation, and seeks the best method to predict modulation for future dates using Python. To address the task of predicting future solar modulation, we utilize both non-linear least squares sinusoidal curve fitting and cubic spline interpolation. This material will be published in the transactions of the ANS winter meeting of November 2016.
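
    The two prediction tools named above, non-linear least-squares sinusoidal fitting and cubic spline interpolation, can be sketched in Python with scipy. The synthetic series, ~11-year cycle, and parameter values are illustrative assumptions, not the DNDO data or the report's actual fit.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.interpolate import CubicSpline

def sinusoid(t, a, period, phase, offset):
    return a * np.sin(2 * np.pi * t / period + phase) + offset

# Synthetic monthly modulation-potential series with an ~11-year cycle.
rng = np.random.default_rng(4)
t = np.arange(0, 40, 1 / 12)                       # years
phi = sinusoid(t, 300.0, 11.0, 0.3, 650.0) + rng.normal(0, 20, t.size)

# Non-linear least-squares sinusoidal fit, usable for extrapolation...
p0 = [np.ptp(phi) / 2, 11.0, 0.0, phi.mean()]      # initial guess
popt, _ = curve_fit(sinusoid, t, phi, p0=p0)

# ...and a cubic spline through the data for interpolation within the record.
spline = CubicSpline(t, phi)

print(abs(popt[1] - 11.0) < 0.5)   # fitted period near 11 years: True
```

    The sinusoid extrapolates plausibly to future dates because it encodes the solar cycle, whereas the spline is exact at the observed dates but is not trustworthy far outside the record.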

  11. A Catalog of Galaxy Clusters Observed by XMM-Newton

    NASA Technical Reports Server (NTRS)

    Snowden, S. L.; Mushotzky, R. M.; Kuntz, K. D.; Davis, David S.

    2007-01-01

    Images and the radial profiles of the temperature, abundance, and brightness for 70 clusters of galaxies observed by XMM-Newton are presented along with a detailed discussion of the data reduction and analysis methods, including background modeling, which were used in the processing. Proper consideration of the various background components is vital to extend the reliable determination of cluster parameters to the largest possible cluster radii. The various components of the background including the quiescent particle background, cosmic diffuse emission, soft proton contamination, and solar wind charge exchange emission are discussed along with suggested means of their identification, filtering, and/or their modeling and subtraction. Every component is spectrally variable, sometimes significantly so, and all components except the cosmic background are temporally variable as well. The distributions of the events over the FOV vary between the components, and some distributions vary with energy. The scientific results from observations of low surface brightness objects and the diffuse background itself can be strongly affected by these background components and therefore great care should be taken in their consideration.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aalseth, Craig E.; Day, Anthony R.; Fuller, Erin S.

    A new ultra-low-background proportional counter (ULBPC) design was recently developed at Pacific Northwest National Laboratory (PNNL). This design, along with an ultra-low-background counting system (ULBCS) which provides passive and active shielding with radon exclusion, has been developed to complement a new shallow underground laboratory (~30 meters water-equivalent) constructed at PNNL. After these steps to mitigate dominant backgrounds (cosmic rays, external gamma-rays, radioactivity in materials), remaining background events do not exclusively arise from ionization of the proportional counter gas. Digital pulse-shape discrimination (PSD) is thus employed to further improve measurement sensitivity. In this work, a template shape is generated for each individual sample measurement of interest, a "self-calibrating" template. Differences in event topology can also cause differences in pulse shape. In this work, the temporal region analyzed for each event is refined to maximize background discrimination while avoiding unwanted sensitivity to event topology. This digital PSD method is applied to sample and background data, and initial measurement results from a biofuel methane sample are presented in the context of low-background measurements currently being developed.

  13. Application of nonparametric regression methods to study the relationship between NO2 concentrations and local wind direction and speed at background sites.

    PubMed

    Donnelly, Aoife; Misstear, Bruce; Broderick, Brian

    2011-02-15

    Background concentrations of nitrogen dioxide (NO2) are not constant but vary temporally and spatially. The current paper presents a powerful tool for quantifying the effects of wind direction and wind speed on background NO2 concentrations, particularly in cases where monitoring data are limited. In contrast to previous studies, which applied similar methods to sites directly affected by local pollution sources, the current study focuses on background sites with the aim of improving the methods for predicting background concentrations adopted in air quality modelling studies. The relationship between the measured NO2 concentration in air at three such sites in Ireland and locally measured wind direction has been quantified using nonparametric regression methods. The major aim was to analyse a method for quantifying the effects of local wind direction on background levels of NO2 in Ireland; the method was then expanded to include wind speed as an added predictor variable. A Gaussian kernel function is used in the analysis, and circular statistics are employed for the wind direction variable. Wind direction and wind speed were both found to have a statistically significant effect on background levels of NO2 at all three sites. Environmental impact assessments are frequently based on short-term baseline monitoring that produces a limited dataset. The presented nonparametric regression methods, in contrast to frequently used methods such as binning of the data, allow concentrations for missing data pairs to be estimated and allow spurious peaks in concentration to be distinguished from true ones. The methods were found to provide a realistic estimate of long-term concentration variation with wind direction and speed, even when the dataset is limited. Accurate identification of the actual variation at each location and its causative factors could be made, thus supporting improved definition of background concentrations for use in air quality modelling studies. Copyright © 2010 Elsevier B.V. All rights reserved.
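
    The combination described above, a Gaussian kernel with circular statistics for the wind-direction variable, amounts to Nadaraya-Watson kernel regression on an angular predictor. This is a minimal sketch on synthetic data; the bandwidth, sample sizes, and concentration model are illustrative assumptions, not the paper's.

```python
import numpy as np

def nw_circular(theta_obs, y_obs, theta_grid, bandwidth=0.4):
    """Nadaraya-Watson kernel regression of concentration on wind direction.

    A Gaussian kernel is applied to the *circular* angular difference so
    that 350 degrees and 10 degrees are treated as 20 degrees apart.
    """
    d = np.angle(np.exp(1j * (theta_grid[:, None] - theta_obs[None, :])))
    w = np.exp(-0.5 * (d / bandwidth) ** 2)
    return (w * y_obs).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(5)
theta = rng.uniform(0, 2 * np.pi, 500)             # wind direction (rad)
no2 = 12 + 6 * np.cos(theta - 1.0) + rng.normal(0, 1.5, theta.size)
grid = np.linspace(0, 2 * np.pi, 73)
fit = nw_circular(theta, no2, grid)
print(abs(fit[0] - fit[-1]) < 0.5)  # estimate wraps smoothly at 0/360: True
```

    Unlike binning, the kernel estimate is defined at every direction, including directions with no observations, which is the advantage the abstract highlights for limited datasets.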

  14. Imperial County baseline health survey potential impact of geothermal energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deane, M.

    The purpose of the survey, its methods, and the statistical methods used are presented. Results are discussed in terms of: area differences in background variables, area differences in health variables, area differences in annoyance reactions, and comparison of symptom frequencies with age, smoking, and drinking. Appendices include tables of data, enumeration forms, the questionnaire, interviewer cards, and interviewer instructions. (MHR)

  15. Relativistic Corrections to the Sunyaev-Zeldovich Effect for Clusters of Galaxies. III. Polarization Effect

    NASA Astrophysics Data System (ADS)

    Itoh, Naoki; Nozawa, Satoshi; Kohyama, Yasuharu

    2000-04-01

    We extend the formalism of relativistic thermal and kinematic Sunyaev-Zeldovich effects and include the polarization of the cosmic microwave background photons. We consider the situation of a cluster of galaxies moving with a velocity β≡v/c with respect to the cosmic microwave background radiation. In the present formalism, polarization of the scattered cosmic microwave background radiation caused by the proper motion of a cluster of galaxies is naturally derived as a special case of the kinematic Sunyaev-Zeldovich effect. The relativistic corrections are also included in a natural way. Our results are in complete agreement with the recent results of relativistic corrections obtained by Challinor, Ford, & Lasenby with an entirely different method, as well as the nonrelativistic limit obtained by Sunyaev & Zeldovich. The relativistic correction becomes significant in the Wien region.

  16. New method for quantification of vuggy porosity from digital optical borehole images as applied to the karstic Pleistocene limestone of the Biscayne aquifer, southeastern Florida

    USGS Publications Warehouse

    Cunningham, K.J.; Carlson, J.I.; Hurley, N.F.

    2004-01-01

    Vuggy porosity is gas- or fluid-filled openings in rock matrix that are large enough to be seen with the unaided eye. Well-connected vugs can form major conduits for flow of ground water, especially in carbonate rocks. This paper presents a new method for quantification of vuggy porosity calculated from digital borehole images collected from 47 test coreholes that penetrate the karstic Pleistocene limestone of the Biscayne aquifer, southeastern Florida. Basically, the method interprets vugs and background based on the grayscale color of each in digital borehole images and calculates a percentage of vuggy porosity. Development of the method was complicated because environmental conditions created an uneven grayscale contrast in the borehole images that makes it difficult to distinguish vugs from background. The irregular contrast was produced by unbalanced illumination of the borehole wall, which was a result of eccentering of the borehole-image logging tool. Experimentation showed that a simple, single grayscale threshold would not realistically differentiate between the grayscale contrast of vugs and background. Therefore, an equation was developed for an effective subtraction of the changing grayscale contrast, due to uneven illumination, to produce a grayscale threshold that successfully identifies vugs. In the equation, a moving average calculated around the circumference of the borehole and expressed as the background grayscale intensity is defined as a baseline from which to identify a grayscale threshold for vugs. A constant was derived empirically by calibration with vuggy porosity values derived from digital images of slabbed-core samples and used to make the subtraction from the background baseline to derive the vug grayscale threshold as a function of azimuth. The method should be effective in estimating vuggy porosity in any carbonate aquifer. © 2003 Published by Elsevier B.V.
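
    The thresholding equation described above, a circular moving average of grayscale around the borehole circumference as the background baseline, minus an empirically calibrated constant, can be sketched as follows. The window length, offset constant, and synthetic illumination pattern are illustrative assumptions, not the calibrated values from the paper.

```python
import numpy as np

def vug_fraction(row, window=31, offset=25):
    """Flag vugs along one borehole-image row (one depth, all azimuths).

    The background baseline is a circular moving average of grayscale
    around the borehole circumference; pixels darker than
    (baseline - offset) are classified as vugs. The offset plays the
    role of the empirically calibrated constant in the paper.
    """
    kernel = np.ones(window) / window
    pad = window // 2
    wrapped = np.concatenate([row[-pad:], row, row[:pad]])  # circular pad
    baseline = np.convolve(wrapped, kernel, mode="valid")
    vugs = row < baseline - offset
    return vugs.mean(), vugs

# Uneven illumination: brightness varies smoothly with azimuth, and two
# dark vugs are superimposed. A single global threshold would miss one.
az = np.arange(360)
illum = 150 + 60 * np.sin(np.deg2rad(az))       # 90..210 grayscale
row = illum.copy()
row[40:50] -= 60                                # vug on the bright side
row[200:210] -= 60                              # vug on the dark side
frac, vugs = vug_fraction(row)
print(vugs[45], vugs[205])
```

    Because the threshold follows the baseline as a function of azimuth, both vugs are caught even though the dark-side vug is brighter than much of the well-lit background.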

  17. A method of reducing background fluctuation in tunable diode laser absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Yang, Rendi; Dong, Xiaozhou; Bi, Yunfeng; Lv, Tieliang

    2018-03-01

    Optical interference fringes are the main factor leading to background fluctuation in gas concentration detection based on tunable diode laser absorption spectroscopy. The interference fringes are generated by multiple reflections or scatterings from optical surfaces in the optical path, and they make the background signal present an approximately sinusoidal oscillation. To reduce the fluctuation of the background, a method that combines dual tone modulation (DTM) with a vibrating reflector (VR) is proposed in this paper. The combination of DTM and VR averages out the unwanted periodic interference fringes, and the effectiveness of the method in reducing background fluctuation has been verified by simulation and real experiments. In the detection system based on the proposed method, the standard deviation (STD) of the background signal is decreased to 0.0924 parts per million (ppm), a 16-fold reduction compared with wavelength modulation spectroscopy. The STD value of 0.0924 ppm corresponds to an absorption of 4.328 × 10^-6 Hz^-1/2 (with an effective optical path length of 4 m and an integration time of 0.1 s). Moreover, the proposed method shows stable performance in reducing background fluctuation in long-duration experiments.
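
    The averaging principle behind the vibrating reflector can be illustrated with a toy simulation: the vibration randomizes the fringe phase from sweep to sweep, so averaging many sweeps cancels the near-sinusoidal ripple while the absorption line survives. This is a simplified sketch of the principle only, not the paper's DTM implementation, and all amplitudes are invented.

```python
import numpy as np

rng = np.random.default_rng(6)

x = np.linspace(-1, 1, 400)                 # normalized wavelength scan
absorption = 0.5 * np.exp(-(x / 0.08) ** 2) # the gas line we want

def sweep():
    phase = rng.uniform(0, 2 * np.pi)       # shifted by the vibrating mirror
    fringe = 0.2 * np.sin(30 * x + phase)   # etalon-like background ripple
    return absorption + fringe + rng.normal(0, 0.01, x.size)

single = sweep()
averaged = np.mean([sweep() for _ in range(200)], axis=0)

resid_single = np.std(single - absorption)
resid_avg = np.std(averaged - absorption)
print(resid_avg < resid_single / 5)  # background fluctuation reduced: True
```

    A random-phase sinusoid averages toward zero at roughly 1/sqrt(N), whereas the phase-locked absorption feature is unaffected, which is the mechanism the paper exploits.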

  18. A marker-based watershed method for X-ray image segmentation.

    PubMed

    Zhang, Xiaodong; Jia, Fucang; Luo, Suhuai; Liu, Guiying; Hu, Qingmao

    2014-03-01

    Digital X-ray images are the most frequent modality for both screening and diagnosis in hospitals. To facilitate subsequent analysis such as quantification and computer-aided diagnosis (CAD), it is desirable to exclude the image background. A marker-based watershed segmentation method was proposed to segment the background of X-ray images. The method consists of six modules: image preprocessing, gradient computation, marker extraction, watershed segmentation from markers, region merging and background extraction. One hundred clinical direct-radiograph X-ray images were used to validate the method. Manual thresholding and a multiscale gradient based watershed method were implemented for comparison. The proposed method yielded a Dice coefficient of 0.964±0.069, better than that of manual thresholding (0.937±0.119) and of the multiscale gradient based watershed method (0.942±0.098). Special means were adopted to decrease the computational cost, including discarding the few pixels with the highest grayscale values via a percentile, calculating the gradient magnitude through simple operations, decreasing the number of markers by appropriate thresholding, and merging regions based on simple grayscale statistics. As a result, the processing time was at most 6 s even for a 3072×3072 image on a Pentium 4 PC with a 2.4 GHz CPU (4 cores) and 2 GB RAM, more than twice as fast as the multiscale gradient based watershed method. The proposed method could be a potential tool for diagnosis and quantification of X-ray images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  19. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, the Western blot has been widely used in molecular biology labs. It is a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step of a Western blot is critical for obtaining accurate and reproducible results. Because of the technical knowledge required for densitometry analysis and constraints on resource availability, standard office scanners are often used to image developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. Human Body 3D Posture Estimation Using Significant Points and Two Cameras

    PubMed Central

    Juang, Chia-Feng; Chen, Teng-Chang; Du, Wei-Chin

    2014-01-01

    This paper proposes a three-dimensional (3D) human posture estimation system that locates 3D significant body points based on 2D body contours extracted from two cameras, without using any depth sensors. The 3D significant body points located by this system include the head, the center of the body, the tips of the feet, the tips of the hands, the elbows, and the knees. First, a linear support vector machine- (SVM-) based segmentation method is proposed to distinguish the human body from the background in red, green, and blue (RGB) color space. The SVM-based segmentation method uses not only normalized color differences but also the included angle between pixels in the current frame and the background in order to reduce shadow influence. After segmentation, 2D significant points in each of the two extracted images are located. A significant point volume matching (SPVM) method is then proposed to reconstruct the 3D significant body point locations from the 2D posture estimation results. Experimental results show that the proposed SVM-based segmentation method performs better than other gray level- and RGB-based segmentation approaches. This paper also demonstrates the effectiveness of the 3D posture estimation results for different postures. PMID:24883422
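
    The two per-pixel features named in the abstract can be sketched as below. The exact feature definitions are assumed, and a plain least-squares linear classifier stands in for the paper's linear SVM purely for illustration.

```python
import numpy as np

def pixel_features(frame, background):
    """Per-pixel features: normalized RGB difference magnitude and the
    included angle between pixel and background color vectors (an
    angle near zero suggests a shadow: same hue, lower brightness).
    Exact definitions are assumptions, not the paper's formulas."""
    f = frame.reshape(-1, 3).astype(float)
    b = background.reshape(-1, 3).astype(float)
    diff = np.linalg.norm(f - b, axis=1) / (np.linalg.norm(b, axis=1) + 1e-9)
    cosang = np.sum(f * b, axis=1) / (
        np.linalg.norm(f, axis=1) * np.linalg.norm(b, axis=1) + 1e-9)
    angle = np.arccos(np.clip(cosang, -1.0, 1.0))
    return np.column_stack([diff, angle])

def train_linear(X, y):
    """Least-squares linear classifier (SVM stand-in); labels 0/1."""
    Xb = np.column_stack([X, np.ones(len(X))])   # add bias term
    w, *_ = np.linalg.lstsq(Xb, 2.0 * y - 1.0, rcond=None)  # map labels to ±1
    return w

def predict(w, X):
    Xb = np.column_stack([X, np.ones(len(X))])
    return (Xb @ w > 0).astype(int)              # 1 = foreground
```

    With these features, a shadowed background pixel (large difference but near-zero angle) can be separated from a truly foreground pixel of a different hue.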

  1. Motion detection and compensation in infrared retinal image sequences.

    PubMed

    Scharcanski, J; Schardosim, L R; Santos, D; Stuchi, A

    2013-01-01

    Infrared image data captured by non-mydriatic digital retinography systems are often used in the diagnosis and treatment of diabetic macular edema (DME). Infrared illumination is less aggressive to the patient's retina, and retinal studies can be carried out without pupil dilation. However, sequences of infrared eye fundus images of static scenes tend to present pixel intensity fluctuations in time, and noise and background illumination changes pose a challenge to most motion detection methods proposed in the literature. In this paper, we present a retinal motion detection method that is adaptive to background noise and illumination changes. Our experimental results indicate that this method is suitable for detecting retinal motion in infrared image sequences and for compensating the detected motion, which is relevant in retinal laser treatment systems for DME. Copyright © 2013 Elsevier Ltd. All rights reserved.

  2. Screen for Carbon Dioxide.

    ERIC Educational Resources Information Center

    Foster, John; And Others

    1986-01-01

    Presents a set of laboratory experiments that can assist students in the detection of carbon dioxide. Offers a variation of the supported drop method of carbon dioxide detection that provides readily visible positive results. Includes background information on carbon dioxide. (ML)

  3. Detection of admittivity anomaly on high-contrast heterogeneous backgrounds using frequency difference EIT.

    PubMed

    Jang, J; Seo, J K

    2015-06-01

    This paper describes a multiple background subtraction method in frequency difference electrical impedance tomography (fdEIT) to detect an admittivity anomaly within a high-contrast background conductivity distribution. The proposed method expands the use of the conventional weighted frequency difference EIT method, which has previously been limited to detecting admittivity anomalies in a roughly homogeneous background. The proposed method can be viewed as multiple weighted difference imaging in fdEIT. Although the spatial resolution of fdEIT images is very low due to the inherent ill-posedness, numerical simulations and phantom experiments demonstrate the feasibility of the proposed method for detecting anomalies. It has potential application in stroke detection in a head model, which is highly heterogeneous due to the skull.
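
    The conventional weighted frequency-difference step that the method builds on can be sketched as a projection: the high-frequency boundary voltages minus their component parallel to the low-frequency voltages. This is a minimal sketch of that single subtraction only; the paper's multiple-subtraction extension is not shown.

```python
import numpy as np

def weighted_frequency_difference(v_low, v_high):
    """Weighted frequency-difference data for fdEIT: subtract from the
    high-frequency voltage vector its projection onto the low-frequency
    vector, so the background component largely cancels even when the
    background admittivity is frequency dependent."""
    alpha = np.dot(v_high, v_low) / np.dot(v_low, v_low)
    return v_high - alpha * v_low
```

    By construction the result is orthogonal to the low-frequency data, and it vanishes entirely when the two frequencies see a proportional (anomaly-free) voltage pattern.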

  4. A generalized transport-velocity formulation for smoothed particle hydrodynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Chi; Hu, Xiangyu Y., E-mail: xiangyu.hu@tum.de; Adams, Nikolaus A.

    The standard smoothed particle hydrodynamics (SPH) method suffers from tensile instability. In fluid-dynamics simulations this instability leads to particle clumping and void regions when negative pressure occurs. In solid-dynamics simulations, it results in unphysical structure fragmentation. In this work the transport-velocity formulation of Adami et al. (2013) is generalized to provide a solution to this long-standing problem. Instead of imposing a global background pressure, a variable background pressure is used to modify the particle transport velocity and eliminate the tensile instability completely. Furthermore, such a modification is localized by defining a shortened smoothing length. The generalized formulation is suitable for fluid and solid materials with and without free surfaces. The results of extensive numerical tests on both fluid and solid dynamics problems indicate that the new method provides a unified approach for multi-physics SPH simulations.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saur, Sigrun; Frengen, Jomar; Department of Oncology and Radiotherapy, St. Olavs University Hospital, N-7006 Trondheim

    Film dosimetry using radiochromic EBT film in combination with a flatbed charge coupled device scanner is a useful method both for two-dimensional verification of intensity-modulated radiation treatment plans and for general quality assurance of treatment planning systems and linear accelerators. Unfortunately, the response over the scanner area is nonuniform, and when not corrected for, this results in a systematic error in the measured dose which is both dose and position dependent. In this study a novel method for background correction is presented. The method is based on the subtraction of a correction matrix, a matrix that is based on scans of films that are irradiated to nine dose levels in the range 0.08-2.93 Gy. Because the response of the film is dependent on the film's orientation with respect to the scanner, correction matrices for both landscape oriented and portrait oriented scans were made. In addition to the background correction method, a full dose uncertainty analysis of the film dosimetry procedure was performed. This analysis takes into account the fit uncertainty of the calibration curve, the variation in response for different film sheets, the nonuniformity after background correction, and the noise in the scanned films. The film analysis was performed for film pieces of size 16×16 cm, all with the same lot number, and all irradiations were done perpendicular onto the films. The results show that the 2-sigma dose uncertainty at 2 Gy is about 5% and 3.5% for landscape and portrait scans, respectively. The uncertainty gradually increases as the dose decreases, but at 1 Gy the 2-sigma dose uncertainty is still as good as 6% and 4% for landscape and portrait scans, respectively. The study shows that film dosimetry using GafChromic EBT film, an Epson Expression 1680 Professional scanner and a dedicated background correction technique gives precise and accurate results.
For the purpose of dosimetric verification, the calculated dose distribution can be compared with the film-measured dose distribution using a dose constraint of 4% (relative to the measured dose) for doses between 1 and 3 Gy. At lower doses, the dose constraint must be relaxed.

  6. Investigating the role of background and observation error correlations in improving a model forecast of forest carbon balance using four dimensional variational data assimilation.

    NASA Astrophysics Data System (ADS)

    Pinnington, Ewan; Casella, Eric; Dance, Sarah; Lawless, Amos; Morison, James; Nichols, Nancy; Wilkinson, Matthew; Quaife, Tristan

    2016-04-01

    Forest ecosystems play an important role in sequestering human emitted carbon-dioxide from the atmosphere and therefore greatly reduce the effect of anthropogenic induced climate change. For that reason understanding their response to climate change is of great importance. Efforts to implement variational data assimilation routines with functional ecology models and land surface models have been limited, with sequential and Markov chain Monte Carlo data assimilation methods being prevalent. When data assimilation has been used with models of carbon balance, background "prior" errors and observation errors have largely been treated as independent and uncorrelated. Correlations between background errors have long been known to be a key aspect of data assimilation in numerical weather prediction. More recently, it has been shown that accounting for correlated observation errors in the assimilation algorithm can considerably improve data assimilation results and forecasts. In this paper we implement a 4D-Var scheme with a simple model of forest carbon balance, for joint parameter and state estimation and assimilate daily observations of Net Ecosystem CO2 Exchange (NEE) taken at the Alice Holt forest CO2 flux site in Hampshire, UK. We then investigate the effect of specifying correlations between parameter and state variables in background error statistics and the effect of specifying correlations in time between observation error statistics. The idea of including these correlations in time is new and has not been previously explored in carbon balance model data assimilation. In data assimilation, background and observation error statistics are often described by the background error covariance matrix and the observation error covariance matrix. 
We outline novel methods for creating correlated versions of these matrices, using a set of previously postulated dynamical constraints to include correlations in the background error statistics and a Gaussian correlation function to include time correlations in the observation error statistics. The methods used in this paper will allow the inclusion of time correlations between many different observation types in the assimilation algorithm, meaning that previously neglected information can be accounted for. In our experiments we compared the results using our new correlated background and observation error covariance matrices and those using diagonal covariance matrices. We found that using the new correlated matrices reduced the root mean square error in the 14 year forecast of daily NEE by 44%, decreasing from 4.22 g C m⁻² day⁻¹ to 2.38 g C m⁻² day⁻¹.
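
    A Gaussian correlation function for the observation error covariance, as mentioned above, can be sketched in a few lines; the variance and correlation length below are illustrative, not the study's values.

```python
import numpy as np

def gaussian_obs_covariance(times, sigma, length_scale):
    """Observation error covariance with Gaussian time correlations:
    R_ij = sigma^2 * exp(-(t_i - t_j)^2 / (2 * L^2)).
    As the length scale L shrinks, R approaches the conventional
    diagonal (uncorrelated) matrix."""
    t = np.asarray(times, dtype=float)
    dt = t[:, None] - t[None, :]
    return sigma**2 * np.exp(-dt**2 / (2.0 * length_scale**2))
```

    The resulting matrix is symmetric positive definite, so it can be used directly in a 4D-Var cost function.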

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soffientini, Chiara Dolores, E-mail: chiaradolores.soffientini@polimi.it; Baselli, Giuseppe; De Bernardi, Elisabetta

    Purpose: Quantitative ¹⁸F-fluorodeoxyglucose positron emission tomography is limited by the uncertainty in lesion delineation due to poor SNR, low resolution, and partial volume effects, subsequently impacting oncological assessment, treatment planning, and follow-up. The present work develops and validates a segmentation algorithm based on statistical clustering. The introduction of constraints based on background features and contiguity priors is expected to improve robustness vs clinical image characteristics such as lesion dimension, noise, and contrast level. Methods: An eight-class Gaussian mixture model (GMM) clustering algorithm was modified by constraining the mean and variance parameters of four background classes according to the previous analysis of a lesion-free background volume of interest (background modeling). Hence, expectation maximization operated only on the four classes dedicated to lesion detection. To favor the segmentation of connected objects, a further variant was introduced by inserting priors relevant to the classification of neighbors. The algorithm was applied to simulated datasets and acquired phantom data. Feasibility and robustness toward initialization were assessed on a clinical dataset manually contoured by two expert clinicians. Comparisons were performed with respect to a standard eight-class GMM algorithm and to four different state-of-the-art methods in terms of volume error (VE), Dice index, classification error (CE), and Hausdorff distance (HD). Results: The proposed GMM segmentation with background modeling outperformed standard GMM and all the other tested methods. Medians of accuracy indexes were VE <3%, Dice >0.88, CE <0.25, and HD <1.2 in simulations; VE <23%, Dice >0.74, CE <0.43, and HD <1.77 in phantom data. Robustness toward image statistic changes (±15%) was shown by the low index changes: <26% for VE, <17% for Dice, and <15% for CE.
Finally, robustness toward the user-dependent volume initialization was demonstrated. The inclusion of the spatial prior improved segmentation accuracy only for lesions surrounded by heterogeneous background: in the relevant simulation subset, the median VE significantly decreased from 13% to 7%. Results on clinical data were found in accordance with simulations, with absolute VE <7%, Dice >0.85, CE <0.30, and HD <0.81. Conclusions: The sole introduction of constraints based on background modeling outperformed standard GMM and the other tested algorithms. Insertion of a spatial prior improved the accuracy for realistic cases of objects in heterogeneous backgrounds. Moreover, robustness against initialization supports the applicability in a clinical setting. In conclusion, application-driven constraints can generally improve the capabilities of GMM and statistical clustering algorithms.

  8. Background field Landau mode operators for the nucleon

    NASA Astrophysics Data System (ADS)

    Kamleh, Waseem; Bignell, Ryan; Leinweber, Derek B.; Burkardt, Matthias

    2018-03-01

    The introduction of a uniform background magnetic field breaks three-dimensional spatial symmetry for a charged particle and introduces Landau mode effects. Standard quark operators are inefficient at isolating the nucleon correlation function at nontrivial field strengths. We introduce novel quark operators constructed from the two-dimensional Laplacian eigenmodes that describe a charged particle on a finite lattice. These eigenmode-projected quark operators provide enhanced precision for calculating nucleon energy shifts in a magnetic field. Preliminary results are obtained for the neutron and proton magnetic polarisabilities using these methods.

  9. Space moving target detection and tracking method in complex background

    NASA Astrophysics Data System (ADS)

    Lv, Ping-Yue; Sun, Sheng-Li; Lin, Chang-Qing; Liu, Gao-Rui

    2018-06-01

    The background seen by space-borne detectors in a real space-based environment is extremely complex, and the signal-to-clutter ratio is very low (SCR ≈ 1), which increases the difficulty of detecting space moving targets. To solve this problem, an algorithm combining background suppression based on a two-dimensional least mean square filter (TDLMS) with target enhancement based on neighborhood gray-scale difference (GSD) is proposed in this paper. The latter can filter out most of the residual background clutter left by the former, such as cloud edges. Through this procedure, both global and local SCR are substantially improved, indicating that the target has been greatly enhanced. After removing the detector's inherent clutter region through connected-domain processing, the image contains only the target point and isolated noise, and the isolated noise can be filtered out effectively through multi-frame association. The proposed algorithm has been compared with some state-of-the-art algorithms for moving target detection and tracking tasks. The experimental results show that the proposed algorithm performs best in terms of SCR gain, background suppression factor (BSF) and detection results.
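
    The enhancement step can be illustrated with a simple neighborhood gray-scale difference: each pixel minus the mean of its surrounding window, which boosts point-like targets against residual clutter. This is an assumed, illustrative form; the paper's exact GSD definition may differ.

```python
import numpy as np

def gsd_enhance(image, half=2):
    """Neighborhood gray-scale difference enhancement: replace each
    pixel by its difference from the mean of a (2*half+1)^2 window
    (clipped at image borders)."""
    img = image.astype(float)
    rows, cols = img.shape
    out = np.zeros_like(img)
    for i in range(rows):
        for j in range(cols):
            r0, r1 = max(0, i - half), min(rows, i + half + 1)
            c0, c1 = max(0, j - half), min(cols, j + half + 1)
            out[i, j] = img[i, j] - img[r0:r1, c0:c1].mean()
    return out
```

    On a flat background a single bright pixel keeps nearly all of its amplitude, while smooth background regions map to values near zero.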

  10. Development and evaluation of a data-adaptive alerting algorithm for univariate temporal biosurveillance data.

    PubMed

    Elbert, Yevgeniy; Burkom, Howard S

    2009-11-20

    This paper discusses further advances in making robust predictions with the Holt-Winters forecasts for a variety of syndromic time series behaviors and introduces a control-chart detection approach based on these forecasts. Using three collections of time series data, we compare biosurveillance alerting methods with quantified measures of forecast agreement, signal sensitivity, and time-to-detect. The study presents practical rules for initialization and parameterization of biosurveillance time series. Several outbreak scenarios are used for detection comparison. We derive an alerting algorithm from forecasts using Holt-Winters-generalized smoothing for prospective application to daily syndromic time series. The derived algorithm is compared with simple control-chart adaptations and to more computationally intensive regression modeling methods. The comparisons are conducted on background data from both authentic and simulated data streams. Both types of background data include time series that vary widely by both mean value and cyclic or seasonal behavior. Plausible, simulated signals are added to the background data for detection performance testing at signal strengths calculated to be neither too easy nor too hard to separate the compared methods. Results show that both the sensitivity and the timeliness of the Holt-Winters-based algorithm proved to be comparable or superior to that of the more traditional prediction methods used for syndromic surveillance.
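
    The idea of deriving control-chart alerts from smoothing forecasts can be sketched with plain Holt (level + trend) smoothing; a seasonal Holt-Winters term and the paper's tuned parameters are omitted, and the smoothing constants and alert threshold below are illustrative assumptions.

```python
def holt_forecast_alerts(series, alpha=0.4, beta=0.1, k=3.0, warmup=10):
    """One-step-ahead Holt (level + trend) forecasts with a one-sided
    control-chart rule: alert at time t when the forecast residual
    exceeds k standard deviations of the residual history (appropriate
    for outbreak detection, where only increases matter)."""
    level, trend = series[0], 0.0
    residuals, alerts = [], []
    for t, y in enumerate(series[1:], start=1):
        forecast = level + trend
        resid = y - forecast
        if len(residuals) >= warmup:
            mu = sum(residuals) / len(residuals)
            sd = (sum((r - mu) ** 2 for r in residuals) / len(residuals)) ** 0.5
            if sd > 0 and (resid - mu) / sd > k:
                alerts.append(t)
        residuals.append(resid)
        # Holt smoothing update of level and trend
        new_level = alpha * y + (1 - alpha) * (level + trend)
        trend = beta * (new_level - level) + (1 - beta) * trend
        level = new_level
    return alerts
```

    On a stable series with a single injected spike, the algorithm flags only the spike day.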

  11. A new iterative triclass thresholding technique in image segmentation.

    PubMed

    Cai, Hongmin; Yang, Zhong; Cao, Xinhua; Xia, Weiming; Xu, Xiaoyin

    2014-03-01

    We present a new method in image segmentation that is based on Otsu's method but iteratively searches subregions of the image for segmentation, instead of treating the full image as a single region. The iterative method starts with Otsu's threshold and computes the mean values of the two classes separated by that threshold. Based on Otsu's threshold and the two mean values, the method separates the image into three classes instead of the two produced by the standard Otsu's method. The first two classes are determined as foreground and background and are not processed further. The third class is denoted a to-be-determined (TBD) region that is processed in the next iteration. At the succeeding iteration, Otsu's method is applied to the TBD region to calculate a new threshold and two class means, and the TBD region is again separated into three classes: foreground, background, and a new TBD region, which by definition is smaller than the previous TBD regions. The new TBD region is then processed in the same manner. The process stops when the difference between the Otsu thresholds calculated in two successive iterations is less than a preset value. Then, all the intermediate foreground and background regions are, respectively, combined to create the final segmentation result. Tests on synthetic and real images showed that the new iterative method can achieve better performance than the standard Otsu's method in many challenging cases, such as identifying weak objects and revealing fine structures of complex objects, while the added computational cost is minimal.
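
    The iteration described above can be sketched as follows. The logic follows the abstract; histogram binning, the stopping tolerance and the final split of the last TBD band are implementation assumptions.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Standard Otsu threshold on a 1-D array of gray values; returns
    the upper edge of the last histogram bin assigned to class 0."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    w = hist.astype(float)
    csum = np.cumsum(w)
    cmean = np.cumsum(w * centers)
    best_t, best_var = edges[1], -1.0
    for i in range(len(centers) - 1):
        w0, w1 = csum[i], csum[-1] - csum[i]
        if w0 == 0 or w1 == 0:
            continue
        mu0 = cmean[i] / w0
        mu1 = (cmean[-1] - cmean[i]) / w1
        between_var = w0 * w1 * (mu0 - mu1) ** 2
        if between_var > best_var:
            best_var, best_t = between_var, edges[i + 1]
    return best_t

def triclass_segment(image, eps=1.0, max_iter=20):
    """Iterative triclass thresholding: each pass splits the remaining
    to-be-determined (TBD) pixels into definite foreground (> mu1),
    definite background (< mu0) and a new, smaller TBD band [mu0, mu1],
    stopping when successive Otsu thresholds differ by less than eps."""
    img = image.astype(float)
    fg = np.zeros(img.shape, dtype=bool)
    tbd = np.ones(img.shape, dtype=bool)
    prev_t = None
    for _ in range(max_iter):
        vals = img[tbd]
        t = otsu_threshold(vals)
        if prev_t is not None and abs(t - prev_t) < eps:
            fg |= tbd & (img > t)     # final binary split of the last TBD band
            break
        prev_t = t
        mu0 = vals[vals <= t].mean()  # lower class mean
        mu1 = vals[vals > t].mean()   # upper class mean
        fg |= tbd & (img > mu1)
        tbd &= (img >= mu0) & (img <= mu1)
        if not tbd.any():
            break
    return fg
```

    On a synthetic image with a textured background and a bright square, the iteration converges in a few passes and recovers exactly the square.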

  12. Evaluating flow cytometer performance with weighted quadratic least squares analysis of LED and multi-level bead data

    PubMed Central

    Parks, David R.; Khettabi, Faysal El; Chase, Eric; Hoffman, Robert A.; Perfetto, Stephen P.; Spidlen, Josef; Wood, James C.S.; Moore, Wayne A.; Brinkman, Ryan R.

    2017-01-01

    We developed a fully automated procedure for analyzing data from LED pulses and multi-level bead sets to evaluate backgrounds and photoelectron scales of cytometer fluorescence channels. The method improves on previous formulations by fitting a full quadratic model with appropriate weighting and by providing standard errors and peak residuals as well as the fitted parameters themselves. Here we describe the details of the methods and procedures involved and present a set of illustrations and test cases that demonstrate the consistency and reliability of the results. The automated analysis and fitting procedure is generally quite successful in providing good estimates of the Spe (statistical photoelectron) scales and backgrounds for all of the fluorescence channels on instruments with good linearity. The precision of the results obtained from LED data is almost always better than for multi-level bead data, but the bead procedure is easy to carry out and provides results good enough for most purposes. Including standard errors on the fitted parameters is important for understanding the uncertainty in the values of interest. The weighted residuals give information about how well the data fits the model, and particularly high residuals indicate bad data points. Known photoelectron scales and measurement channel backgrounds make it possible to estimate the precision of measurements at different signal levels and the effects of compensated spectral overlap on measurement quality. Combining this information with measurements of standard samples carrying dyes of biological interest, we can make accurate comparisons of dye sensitivity among different instruments. Our method is freely available through the R/Bioconductor package flowQB. PMID:28160404
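
    A weighted quadratic fit of per-level variance against mean can be sketched as follows. The quadratic model and its interpretation are standard for this kind of cytometer characterization, but the weighting shown (inverse variance of a sample variance) is an assumption; flowQB's exact weighting may differ.

```python
import numpy as np

def fit_quadratic_weighted(means, variances, n_events):
    """Weighted quadratic fit  var ≈ c0 + c1*mean + c2*mean^2  to
    multi-level bead (or LED) summary statistics. Weights follow the
    approximate variance of a sample variance, Var(s^2) ≈ 2*s^4/(n-1).
    In the quadratic model, c0 estimates the background variance,
    1/c1 the photoelectrons per intensity unit, and sqrt(c2) the
    illumination CV."""
    m = np.asarray(means, dtype=float)
    v = np.asarray(variances, dtype=float)
    n = np.asarray(n_events, dtype=float)
    w = (n - 1) / (2.0 * v**2)            # inverse of Var(s^2)
    X = np.column_stack([np.ones_like(m), m, m**2])
    # Weighted normal equations: (X^T W X) c = X^T W v
    WX = X * w[:, None]
    c, *_ = np.linalg.lstsq(WX.T @ X, WX.T @ v, rcond=None)
    return c  # [c0, c1, c2]
```

    With exact synthetic data the fit recovers the generating coefficients, which is a useful sanity check before adding standard-error estimates.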

  13. Determining the 40K radioactivity in rocks using x-ray spectrometry

    NASA Astrophysics Data System (ADS)

    Pilakouta, M.; Kallithrakas-Kontos, N.; Nikolaou, G.

    2017-09-01

    In this paper we propose an experimental method for the determination of potassium-40 (40K) radioactivity in commercial granite samples using x-ray fluorescence (XRF). The method correlates the total potassium concentration (yield) in samples deduced by XRF analysis with the radioactivity of the sample due to the 40K radionuclide. This method can be used in an undergraduate student laboratory. A brief theoretical background and description of the method, as well as some results and their interpretation, are presented.
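
    The correlation between potassium yield and 40K activity rests on a standard nuclear-data calculation, which can be sketched as follows (abundance 0.0117%, half-life 1.248×10⁹ yr; published values put the specific activity near 31 Bq per gram of natural potassium).

```python
import math

AVOGADRO = 6.02214076e23           # atoms/mol
M_K = 39.0983                      # g/mol, natural potassium
ABUNDANCE_40K = 1.17e-4            # isotopic fraction of 40K
HALF_LIFE_S = 1.248e9 * 3.1557e7   # 40K half-life in seconds

def k40_activity(mass_k_grams):
    """Activity (Bq) of the 40K contained in a given mass of natural
    potassium: A = lambda * N, with lambda = ln(2) / T_half."""
    n_atoms = mass_k_grams / M_K * AVOGADRO * ABUNDANCE_40K
    decay_const = math.log(2) / HALF_LIFE_S
    return decay_const * n_atoms
```

    For example, a 1 kg granite sample containing 3% potassium by weight holds 30 g of K, giving roughly 950 Bq of 40K activity, which is the quantity the XRF-derived potassium yield is calibrated against.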

  14. Software Engineering Laboratory (SEL) Ada performance study report

    NASA Technical Reports Server (NTRS)

    Booth, Eric W.; Stark, Michael E.

    1991-01-01

    The goals and scope of the Ada Performance Study are described, along with the background of Ada development in the Flight Dynamics Division (FDD). The organization and overall purpose of each test are discussed, followed by the purpose, methods, and results of each test and analyses of those results. The approach used in the performance tests is explained, and guidelines for future Ada development efforts, based on the analysis of results from this study, are provided.

  15. A Dynamic Enhancement With Background Reduction Algorithm: Overview and Application to Satellite-Based Dust Storm Detection

    NASA Astrophysics Data System (ADS)

    Miller, Steven D.; Bankert, Richard L.; Solbrig, Jeremy E.; Forsythe, John M.; Noh, Yoo-Jeong; Grasso, Lewis D.

    2017-12-01

    This paper describes a Dynamic Enhancement Background Reduction Algorithm (DEBRA) applicable to multispectral satellite imaging radiometers. DEBRA uses ancillary information about the clear-sky background to reduce false detections of atmospheric parameters in complex scenes. Applied here to the detection of lofted dust, DEBRA enlists a surface emissivity database coupled with a climatological database of surface temperature to approximate the clear-sky equivalent signal for selected infrared-based multispectral dust detection tests. This background allows for suppression of false alarms caused by land surface features while retaining some ability to detect dust above those problematic surfaces. The algorithm is applicable to both day and nighttime observations and enables weighted combinations of dust detection tests. The results are provided quantitatively, as a detection confidence factor [0, 1], but are also readily visualized as enhanced imagery. Utilizing the DEBRA confidence factor as a scaling factor in false color red/green/blue imagery enables depiction of the targeted parameter in the context of the local meteorology and topography. In this way, the method holds utility to both automated clients and human analysts alike. Examples of DEBRA performance from notable dust storms and comparisons against other detection methods and independent observations are presented.

  16. Robust foreground detection: a fusion of masked grey world, probabilistic gradient information and extended conditional random field approach.

    PubMed

    Zulkifley, Mohd Asyraf; Moran, Bill; Rawlinson, David

    2012-01-01

    Foreground detection has been used extensively in many applications such as people counting, traffic monitoring and face recognition. However, most of the existing detectors can only work under limited conditions. This happens because of the inability of the detector to distinguish foreground and background pixels, especially in complex situations. Our aim is to improve the robustness of foreground detection under sudden and gradual illumination change, colour similarity issue, moving background and shadow noise. Since it is hard to achieve robustness using a single model, we have combined several methods into an integrated system. The masked grey world algorithm is introduced to handle sudden illumination change. Colour co-occurrence modelling is then fused with the probabilistic edge-based background modelling. Colour co-occurrence modelling is good in filtering moving background and robust to gradual illumination change, while an edge-based modelling is used for solving a colour similarity problem. Finally, an extended conditional random field approach is used to filter out shadow and afterimage noise. Simulation results show that our algorithm performs better compared to the existing methods, which makes it suitable for higher-level applications.

  17. Three-dimensional ionospheric tomography reconstruction using the model function approach in Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Wang, Sicheng; Huang, Sixun; Xiang, Jie; Fang, Hanxian; Feng, Jian; Wang, Yu

    2016-12-01

    Ionospheric tomography reconstructs three-dimensional electron density distributions from the slant total electron content (sTEC) observed along different satellite-receiver rays. Because of the incomplete measurements provided by the satellite-receiver geometry, it is a typical ill-posed problem, and how to overcome the ill-posedness remains a central research question. In this paper, the Tikhonov regularization method is used, and the model function approach is applied to determine the optimal regularization parameter. This algorithm not only balances the weights between sTEC observations and the background electron density field but also converges globally and rapidly. The background error covariance is given by multiplying the background model variance by a location-dependent spatial correlation, and the correlation model is developed using sample statistics from an ensemble of International Reference Ionosphere 2012 (IRI2012) model outputs. Global Navigation Satellite System (GNSS) observations in China are used to present the reconstruction results, and measurements from two ionosondes are used for independent validation. Both test cases using artificial sTEC observations and actual GNSS sTEC measurements show that the regularization method can effectively improve the background model outputs.
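
    The Tikhonov step with a background field can be sketched in closed form. Identity regularization stands in here for the full background error covariance, and choosing the regularization parameter via the model function approach is not shown.

```python
import numpy as np

def tikhonov_reconstruct(A, b, x_background, lam):
    """Tikhonov-regularized solution of A x = b anchored to a background
    field x_b: minimize ||A x - b||^2 + lam * ||x - x_b||^2, whose
    closed form is  x = x_b + (A^T A + lam I)^{-1} A^T (b - A x_b)."""
    r = b - A @ x_background                       # innovation vector
    n = A.shape[1]
    dx = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ r)
    return x_background + dx
```

    The key property for tomography is visible even in a toy case: components of x that the ray geometry does not observe are simply left at their background values rather than blowing up.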

  18. Legibility Evaluation with Oculomotor Analysis

    NASA Astrophysics Data System (ADS)

    Saito, Daisuke; Saito, Keiichi; Saito, Masao

    Web page legibility is important because of the dissemination of the WWW, and the color combination of foreground and background is a crucial factor in providing sufficient legibility. In our previous studies, the visibility of several web-safe color combinations was examined using a psychological method; simple stimuli were used because of experimental restrictions. In this study, the legibility of sentences on Web sites was examined using a psychophysiological method based on oculomotor analysis, and the effect of achromatic color combinations, that is of contrast, was examined through calculated reading time. The presented stimuli used either positive coloration, in which the font color luminance is lower than that of the background color, or negative coloration, in which the font color luminance is higher than that of the background color. The number of characters per line was the same on each page, and four achromatic color combinations, with contrasts between background and font color of 92.5, 75.0, 50.0 and 25.0 percent, were examined. The results showed that reading time became longer as the contrast decreased. However, in the negative coloration, there were great differences between individuals. Therefore, considering web accessibility, positive coloration was found to be preferable for legibility.

  19. Reexamining the Relationship between Verbal Knowledge Background and Keyword Training for Vocabulary Acquisition

    PubMed

    Hogben; Lawson

    1997-07-01

    The literature on keyword training presents a confusing picture of the usefulness of the keyword method for foreign language vocabulary learning by students with strong verbal knowledge backgrounds. This paper reviews research which notes the existence of conflicting sets of findings concerning the verbal background-keyword training relationship and presents the results of analyses which argue against the assertion made by McDaniel and Pressley (1984) that keyword training will have minimal effect on students with high verbal ability. Findings from regression analyses of data from two studies did not show that the relationship between keyword training and immediate recall performance was moderated by verbal knowledge background. The disparate sets of findings related to the keyword training-verbal knowledge relationship and themes emerging from other research suggest that this relationship requires further examination.

  20. Use and Nonuse of a Rail Trail Conversion for Physical Activity: Implications for Promoting Trail Use

    ERIC Educational Resources Information Center

    Price, Anna E.; Reed, Julian A.

    2014-01-01

    Background: There is limited research examining both use and nonuse of trails for physical activity. Purpose: Such research might enable health educators to better promote physical activity on trails. Methods: We used random digit dialing methods to survey 726 respondents in 2012. Results: The majority (75.1%) of respondents reported not using the…

  1. Development of a tagged source of Pb-206 nuclei

    NASA Astrophysics Data System (ADS)

    Cutter, J.; Godfrey, B.; Hillbrand, S.; Irving, M.; Manalaysay, A.; Minaker, Z.; Morad, J.; Tripathi, M.

    2018-02-01

    There is a particular class of unavoidable backgrounds that plague low-background experiments and rare event searches, particularly those searching for nuclear recoil event signatures: decaying daughters of the 238U nuclear decay chain, which result from radon plate-out on detector materials. One such daughter isotope, 210Po, undergoes α-decay and produces a recoiling 103 keV 206Pb nucleus. To characterize this important background in the context of noble element detectors, we have implemented a triggered source for these 206Pb recoils in a dual-phase xenon time projection chamber (Xe TPC) within the Davis Xenon R&D testbed system (DAX). By adhering 210Po to the surface of a PIN diode and electrically floating the diode on the cathode of the TPC, we tag the α signals produced in the PIN diode and trigger on the correlated nuclear recoils in the liquid xenon (LXe). We discuss our methods for 210Po deposition, electronic readout of the PIN diode signals at high voltage, and analysis methods for event selection.

  2. Infrared Spectroscopy of Deuterated Compounds.

    ERIC Educational Resources Information Center

    MacCarthy, Patrick

    1985-01-01

    Background information, procedures used, and typical results obtained are provided for an experiment (based on the potassium bromide pressed-pellet method) involving the infrared spectroscopy of deuterated compounds. Deuteration refers to deuterium-hydrogen exchange at active hydrogen sites in the molecule. (JN)

  3. Background field removal technique based on non-regularized variable kernels sophisticated harmonic artifact reduction for phase data for quantitative susceptibility mapping.

    PubMed

    Kan, Hirohito; Arai, Nobuyuki; Takizawa, Masahiro; Omori, Kazuyoshi; Kasai, Harumasa; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2018-06-11

    We developed a non-regularized, variable kernel, sophisticated harmonic artifact reduction for phase data (NR-VSHARP) method to accurately estimate local tissue fields without regularization for quantitative susceptibility mapping (QSM). We then used a digital brain phantom to evaluate the accuracy of the NR-VSHARP method, and compared it with the VSHARP and iterative spherical mean value (iSMV) methods through in vivo human brain experiments. Our proposed NR-VSHARP method, which uses variable spherical mean value (SMV) kernels, minimizes L2 norms only within the volume of interest to reduce phase errors and save cortical information without regularization. In a numerical phantom study, relative local field and susceptibility map errors were determined using NR-VSHARP, VSHARP, and iSMV. Additionally, various background field elimination methods were used to image the human brain. In a numerical phantom study, the use of NR-VSHARP considerably reduced the relative local field and susceptibility map errors throughout a digital whole brain phantom, compared with VSHARP and iSMV. In the in vivo experiment, the NR-VSHARP-estimated local field could sufficiently achieve minimal boundary losses and phase error suppression throughout the brain. Moreover, the susceptibility map generated using NR-VSHARP minimized the occurrence of streaking artifacts caused by insufficient background field removal. Our proposed NR-VSHARP method yields minimal boundary losses and highly precise phase data. Our results suggest that this technique may facilitate high-quality QSM. Copyright © 2017. Published by Elsevier Inc.
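
    The mean-value-property intuition behind SMV-kernel background removal can be illustrated in one dimension, where a harmonic background reduces to a linear ramp and the spherical mean reduces to a symmetric moving average. This toy sketch conveys only that intuition; it is not the NR-VSHARP algorithm, and all values are invented.

```python
# 1D analog of SMV background removal (SHARP-family idea): a harmonic
# background satisfies the mean value property, so it is unchanged by a
# symmetric mean filter; subtracting the filtered field removes it and
# leaves (a high-pass version of) the local field.

def mean_filter(f, r):
    # Symmetric moving average of radius r, interior points only
    return [sum(f[i - r:i + r + 1]) / (2 * r + 1)
            for i in range(r, len(f) - r)]

n, r = 101, 5
background = [0.02 * i + 1.0 for i in range(n)]            # linear = 1D harmonic
local = [1.0 if 45 <= i <= 55 else 0.0 for i in range(n)]  # local "tissue" field
total = [b + l for b, l in zip(background, local)]

residual = [t - m for t, m in zip(total[r:n - r], mean_filter(total, r))]
# The linear background cancels exactly at interior points, so residual
# approximates local - mean_filter(local): zero away from the bump,
# nonzero at its edges.
```

    In NR-VSHARP the kernel radius varies near the brain boundary, which is what preserves cortical information; the 1D analog keeps a single radius for clarity.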

  4. A new background distribution-based active contour model for three-dimensional lesion segmentation in breast DCE-MRI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Hui; Liu, Yiping; Qiu, Tianshuang

    2014-08-15

    Purpose: To develop and evaluate a computerized semiautomatic segmentation method for accurate extraction of three-dimensional lesions from dynamic contrast-enhanced magnetic resonance images (DCE-MRIs) of the breast. Methods: The authors propose a new background distribution-based active contour model using level set (BDACMLS) to segment lesions in breast DCE-MRIs. The method starts with manual selection of a region of interest (ROI) that contains the entire lesion in a single slice where the lesion is enhanced. The lesion volume is then separated from the automatically captured volume data of interest. The core idea of BDACMLS is a new signed pressure function based solely on the intensity distribution combined with a pathophysiological basis. To compare the algorithm results, two experienced radiologists delineated all lesions jointly to obtain the ground truth. In addition, results generated by other level set (LS) based methods are compared with the authors' method. Finally, the performance of the proposed method is evaluated with several region-based metrics such as the overlap ratio. Results: Forty-two studies with 46 lesions, comprising 29 benign and 17 malignant lesions, are evaluated. The dataset includes various typical pathologies of the breast such as invasive ductal carcinoma, ductal carcinoma in situ, scar carcinoma, phyllodes tumor, breast cysts, and fibroadenoma. The overlap ratio for BDACMLS with respect to manual segmentation is 79.55% ± 12.60% (mean ± s.d.). Conclusions: A new active contour model method has been developed and shown to successfully segment three-dimensional breast DCE-MRI lesions. The results from this model correspond more closely to manual segmentation, address the problem of contours leaking through weak edges, and improve the robustness of segmenting different lesions.

  5. Component separation for cosmic microwave background radiation

    NASA Astrophysics Data System (ADS)

    Fernández-Cobos, R.; Vielva, P.; Barreiro, R. B.; Martínez-González, E.

    2011-11-01

    Cosmic microwave background (CMB) radiation data obtained by different experiments contain, besides the desired signal, a superposition of microwave sky contributions, due mainly to synchrotron radiation, free-free emission, and re-emission from dust clouds in our galaxy on the one hand, and to extragalactic sources on the other. We present an analytical method, using a wavelet decomposition on the sphere, to recover the CMB signal from microwave maps. Applied to both temperature and polarization data, it proves to be a particularly powerful tool in heavily polluted regions of the sky. The applied wavelet has the advantages of requiring little computing time, being adapted to the HEALPix pixelization scheme (the format in which the community reports CMB data), and offering the possibility of multi-resolution analysis. The decomposition is implemented as part of a template fitting method that minimizes the variance of the resulting map. The method was tested with simulations of WMAP data and the results have been positive, with improvements of up to 12% in the variance of the resulting full-sky map and about 3% in less contaminated regions. Finally, we also present some preliminary results with WMAP data in the form of an angular cross power spectrum C_ℓ^{TE}, consistent with the spectrum published by the WMAP team.
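
    The variance-minimizing template fit at the heart of such methods can be sketched for a single foreground template on a flat 1D "map". The data, template, and the 0.7 amplitude are synthetic; a real pipeline would fit several templates simultaneously on wavelet coefficients of spherical maps.

```python
import random

# Choose the coefficient a that minimizes the variance of
# (map - a * template): a = cov(map, template) / var(template).

def mean(x):
    return sum(x) / len(x)

def fit_template(data, template):
    dm, tm = mean(data), mean(template)
    cov = sum((d - dm) * (t - tm) for d, t in zip(data, template)) / len(data)
    var = sum((t - tm) ** 2 for t in template) / len(template)
    return cov / var

random.seed(0)
cmb = [random.gauss(0.0, 1.0) for _ in range(1000)]        # "signal"
template = [random.gauss(0.0, 1.0) for _ in range(1000)]   # foreground template
observed = [c + 0.7 * t for c, t in zip(cmb, template)]    # contaminated map

a = fit_template(observed, template)        # recovers roughly 0.7
cleaned = [o - a * t for o, t in zip(observed, template)]
```

    Because the CMB and the template are nearly uncorrelated, the fitted amplitude lands close to the injected 0.7, and the cleaned map's variance drops toward that of the signal alone.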

  6. To BG or not to BG: Background Subtraction for EIT Coronal Loops

    NASA Astrophysics Data System (ADS)

    Beene, J. E.; Schmelz, J. T.

    2003-05-01

    One of the few observational tests for various coronal heating models is to determine the temperature profile along coronal loops. Since loops are such an abundant coronal feature, this method originally seemed quite promising - that the coronal heating problem might actually be solved by determining the temperature as a function of arc length and comparing these observations with predictions made by different models. But there are many instruments currently available to study loops, as well as various techniques used to determine their temperature characteristics. Consequently, there are many different, mostly conflicting temperature results. We chose data for ten coronal loops observed with the Extreme ultraviolet Imaging Telescope (EIT), and chose specific pixels along each loop, as well as corresponding nearby background pixels where the loop emission was not present. Temperature analysis from the 171-to-195 and 195-to-284 angstrom image ratios was then performed on three forms of the data: the original data alone, the original data with a uniform background subtraction, and the original data with a pixel-by-pixel background subtraction. The original results show loops of constant temperature, as other authors have found before us, but the 171-to-195 and 195-to-284 results are significantly different. Background subtraction does not change the constant-temperature result or the value of the temperature itself. This does not mean that loops are isothermal, however, because the background pixels, which are not part of any contiguous structure, also produce a constant-temperature result with the same value as the loop pixels. These results indicate that EIT temperature analysis should not be trusted, and the isothermal loops that result from EIT (and TRACE) analysis may be an artifact of the analysis process. Solar physics research at the University of Memphis is supported by NASA grants NAG5-9783 and NAG5-12096.
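
    The pixel-by-pixel background subtraction and band-ratio step described above can be sketched as follows. The intensities are invented, and the instrument response needed to convert a ratio into a temperature is omitted.

```python
# Invented DN intensities at three positions along a loop, plus nearby
# off-loop background pixels in each EUV band
i171 = [120.0, 135.0, 128.0]
i195 = [150.0, 160.0, 155.0]
bg171 = [40.0, 42.0, 41.0]
bg195 = [70.0, 71.0, 69.0]

def band_ratios(a, b, bg_a=None, bg_b=None):
    # Band ratio with optional pixel-by-pixel background subtraction
    if bg_a is None:
        return [x / y for x, y in zip(a, b)]
    return [(x - u) / (y - v) for x, u, y, v in zip(a, bg_a, b, bg_b)]

raw = band_ratios(i171, i195)                       # no subtraction
corrected = band_ratios(i171, i195, bg171, bg195)   # pixel-by-pixel
# A nearly constant ratio along the loop is what yields the
# constant-temperature result discussed in the abstract.
```

    Running the same ratio on the background pixels themselves is the abstract's control test: if off-loop pixels give the same "temperature" as the loop, the isothermal result is suspect.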

  7. Application of Research on the Metallogenic Background in the Assessment of Mineral Resources Potentiality

    NASA Astrophysics Data System (ADS)

    Jia, D.; Feng, Y.; Liu, J.; Yao, X.; Zhang, Z.; Ye, T.

    2017-12-01

    1. Working Background. The current status of geological prospecting has several problems: exploration tends to detect boundaries and bottoms and to search for ore only near known deposits ("seeing the stars but not the Moon"), and deep prospecting has yielded undesirable results. These problems arise because the regional metallogenic background is unclear and the metallogenic background of the exploration regions is unknown. Accordingly, the Development and Research Center, CGS, organized a geological setting research program to investigate metallogenic geological features in detail and acquire mineralization information. 2. Technical Scheme. The core research content is the prediction elements of metallogenic structure. Unified technical requirements were adopted from the top down, with a technical route built from the bottom up. The elements of mineral forecasting and the characteristics of geological structure were divided into five elements for research and expression, making full use of geophysical, geochemical, and remote sensing inferences for the interpretation of macro-scale information. The project was completed after eight years. 3. Main Achievements. Innovation in the basic map compilation of geological backgrounds, reinforcing the geological structure database for potentiality evaluation. Preparation of geotectonic facies maps at different scales and for different professions, providing a new geologic background for potentiality assessment and advancing Chinese geotectonic research. Preparation of 3,375 thematic geological structure base maps of the working areas using 6 kinds of prediction methods, providing base working maps, rock assemblages, and the structure, mineralization, and ore-controlling characteristics of protoliths for the mineral prediction of 25 ores. Enrichment and development of the geotectonic facies analysis method, establishing for the first time a metallogenic background research approach for the assessment of national mineral resources potentiality. 4. Application Effect. Orientation: more and better results with less effort. Positioning: a definite object in view. Confidence: assurance in the results.

  8. Inhibition of recombinase polymerase amplification by background DNA: a lateral flow-based method for enriching target DNA.

    PubMed

    Rohrman, Brittany; Richards-Kortum, Rebecca

    2015-02-03

    Recombinase polymerase amplification (RPA) may be used to detect a variety of pathogens, often after minimal sample preparation. However, previous work has shown that whole blood inhibits RPA. In this paper, we show that the concentrations of background DNA found in whole blood prevent the amplification of target DNA by RPA. First, using an HIV-1 RPA assay with known concentrations of nonspecific background DNA, we show that RPA tolerates more background DNA when higher HIV-1 target concentrations are present. Then, using three additional assays, we demonstrate that the maximum amount of background DNA that may be tolerated in RPA reactions depends on the DNA sequences used in the assay. We also show that changing the RPA reaction conditions, such as incubation time and primer concentration, has little effect on the ability of RPA to function when high concentrations of background DNA are present. Finally, we develop and characterize a lateral flow-based method for enriching the target DNA concentration relative to the background DNA concentration. This sample processing method enables RPA of 10^4 copies of HIV-1 DNA in a background of 0-14 μg of background DNA. Without lateral flow sample enrichment, the maximum amount of background DNA tolerated is 2 μg when 10^6 copies of HIV-1 DNA are present. This method requires no heating or other external equipment, may be integrated with upstream DNA extraction and purification processes, is compatible with the components of lysed blood, and has the potential to detect HIV-1 DNA in infant whole blood with high proviral loads.

  9. Directional Histogram Ratio at Random Probes: A Local Thresholding Criterion for Capillary Images

    PubMed Central

    Lu, Na; Silva, Jharon; Gu, Yu; Gerber, Scott; Wu, Hulin; Gelbard, Harris; Dewhurst, Stephen; Miao, Hongyu

    2013-01-01

    With the development of micron-scale imaging techniques, capillaries can be conveniently visualized using methods such as two-photon and whole mount microscopy. However, the presence of background staining, leaky vessels and the diffusion of small fluorescent molecules can lead to significant complexity in image analysis and loss of information necessary to accurately quantify vascular metrics. One solution to this problem is the development of accurate thresholding algorithms that reliably distinguish blood vessels from surrounding tissue. Although various thresholding algorithms have been proposed, our results suggest that without appropriate pre- or post-processing, the existing approaches may fail to obtain satisfactory results for capillary images that include areas of contamination. In this study, we propose a novel local thresholding algorithm, called directional histogram ratio at random probes (DHR-RP). This method explicitly considers the geometric features of tube-like objects in conducting image binarization, and has a reliable performance in distinguishing small vessels from either clean or contaminated background. Experimental and simulation studies suggest that our DHR-RP algorithm is superior over existing thresholding methods. PMID:23525856
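
    For comparison, the baseline idea of local thresholding, deciding foreground per pixel from its neighborhood statistics, can be sketched as follows. This is not the DHR-RP algorithm itself, and the image values are invented.

```python
# Local (adaptive) mean thresholding: a pixel is foreground if it
# exceeds the mean of its neighborhood by more than `offset`. DHR-RP
# replaces this generic criterion with directional histogram ratios
# suited to tube-like vessels.

def local_threshold(img, radius=1, offset=0.0):
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[yy][xx]
                    for yy in range(max(0, y - radius), min(h, y + radius + 1))
                    for xx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = 1 if img[y][x] > sum(vals) / len(vals) + offset else 0
    return out

# Bright vertical "vessel" at column 3 on a darker, uneven background
img = [[10 + x + (90 if x == 3 else 0) for x in range(7)] for y in range(5)]
mask = local_threshold(img, radius=1, offset=5.0)
```

    A single global threshold would fail once the background ramp exceeds the vessel-to-background gap; the local criterion tracks the ramp, which is why adaptive methods cope better with uneven staining.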

  10. Automatic vehicle counting using background subtraction method on gray scale images and morphology operation

    NASA Astrophysics Data System (ADS)

    Adi, K.; Widodo, A. P.; Widodo, C. E.; Pamungkas, A.; Putranto, A. B.

    2018-05-01

    Traffic monitoring requires counting the number of vehicles passing on the road, particularly for highway transportation management. It is therefore necessary to develop a system that can count the number of vehicles automatically, and video processing methods make this possible. This research developed a vehicle counting system for a toll road. The system includes video acquisition, frame extraction, and image processing for each frame. Video acquisition was conducted in the morning, at noon, in the afternoon, and in the evening. The system employs background subtraction and morphology methods on gray scale images for vehicle counting. The best vehicle counting results were obtained in the morning, with a counting accuracy of 86.36%, whereas the lowest accuracy, 21.43%, occurred in the evening. The difference between the morning and evening results is caused by the different illumination, which changes the image pixel values.
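
    The pipeline described above (background subtraction on grayscale frames, a morphological step, and blob counting) can be sketched as follows. Frame sizes, thresholds, and pixel values are invented; a real system would read video frames with an imaging library and calibrate the thresholds.

```python
# Background subtraction -> binary mask -> erosion (the first half of a
# morphological opening, which removes isolated noise) -> blob count.

def subtract(frame, background, thresh=30):
    h, w = len(frame), len(frame[0])
    return [[1 if abs(frame[y][x] - background[y][x]) > thresh else 0
             for x in range(w)] for y in range(h)]

def erode(mask):
    # 3x3 erosion: a pixel survives only if its whole neighborhood is set
    h, w = len(mask), len(mask[0])
    return [[1 if all(0 <= y + dy < h and 0 <= x + dx < w and mask[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)) else 0
             for x in range(w)] for y in range(h)]

def count_blobs(mask):
    # Count 4-connected components with an iterative flood fill
    h, w = len(mask), len(mask[0])
    seen, blobs = set(), 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and (y, x) not in seen:
                blobs += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if (cy, cx) in seen or not (0 <= cy < h and 0 <= cx < w) \
                            or not mask[cy][cx]:
                        continue
                    seen.add((cy, cx))
                    stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return blobs

H, W = 10, 12
bg = [[100] * W for _ in range(H)]
frame = [row[:] for row in bg]
for y in range(1, 4):
    for x in range(1, 4):
        frame[y][x] = 200          # "vehicle" 1
for y in range(5, 8):
    for x in range(7, 10):
        frame[y][x] = 200          # "vehicle" 2
frame[0][11] = 200                 # isolated noise pixel

mask = erode(subtract(frame, bg))  # erosion removes the lone noise pixel
vehicles = count_blobs(mask)       # -> 2
```

    The evening accuracy drop reported above corresponds to the fixed `thresh` failing under low illumination, where vehicle-background differences shrink below the threshold.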

  11. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  12. Discriminative object tracking via sparse representation and online dictionary learning.

    PubMed

    Xie, Yuan; Zhang, Wensheng; Li, Cuihua; Lin, Shuyang; Qu, Yanyun; Zhang, Yinghua

    2014-04-01

    We propose a robust tracking algorithm based on local sparse coding with discriminative dictionary learning and a new keypoint matching scheme. This algorithm consists of two parts: local sparse coding with an online-updated discriminative dictionary for tracking (SOD part), and keypoint matching refinement for enhancing the tracking performance (KP part). In the SOD part, the local image patches of the target object and background are represented by their sparse codes using an over-complete discriminative dictionary. Such a discriminative dictionary, which encodes the information of both the foreground and the background, may provide more discriminative power. Furthermore, in order to adapt the dictionary to the variation of the foreground and background during tracking, an online learning method is employed to update the dictionary. The KP part utilizes a refined keypoint matching scheme to improve the performance of the SOD part. With the help of sparse representation and the online-updated discriminative dictionary, the KP part is more robust than traditional methods at rejecting incorrect matches and eliminating outliers. The proposed method is embedded into a Bayesian inference framework for visual tracking. Experimental results on several challenging video sequences demonstrate the effectiveness and robustness of our approach.

  13. Numerical experiment to estimate the validity of negative ion diagnostic using photo-detachment combined with Langmuir probing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oudini, N.; Sirse, N.; Ellingboe, A. R.

    2015-07-15

    This paper presents a critical assessment of the theory of the photo-detachment diagnostic method used to probe the negative ion density and electronegativity α = n_-/n_e. In this method, a laser pulse is used to photo-detach all negative ions located within the electropositive channel (laser spot region). The negative ion density is estimated based on the assumption that the increase of the current collected by an electrostatic probe biased positively with respect to the plasma results only from the creation of photo-detached electrons, while the background electron density and temperature are treated as constant during the diagnostic. The numerical experiments performed here show, however, that the background electron density and temperature increase due to the formation of an electrostatic potential barrier around the electropositive channel. The time scale of the potential barrier rise is about 2 ns, comparable to the time required to completely photo-detach the negative ions in the electropositive channel (∼3 ns). We find that neglecting the effect of the potential barrier on the background plasma leads to an erroneous determination of the negative ion density. Moreover, the background electron velocity distribution function within the electropositive channel is not Maxwellian, owing to the acceleration of these electrons through the electrostatic potential barrier. In this work, the validity of the photo-detachment diagnostic assumptions is questioned, and our results illustrate the weakness of these assumptions.

  14. Relative contributions of three descriptive methods: implications for behavioral assessment.

    PubMed

    Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods (the ABC method, the conditional probability method, and the conditional and background probability method) to one another and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n = 2), social negative reinforcement (n = 2), or automatic reinforcement (n = 2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations.
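
    The contrast that the conditional and background probability method draws can be sketched with invented interval data: compare the probability of an environmental event (say, caregiver attention) given the behavior with its overall background probability. The encoding and numbers below are hypothetical, not the study's coding scheme.

```python
# Each interval is (behavior_occurred, attention_followed); data invented.
intervals = [
    (True, True), (True, True), (True, False),
    (False, True), (False, False), (False, False),
    (False, True), (True, True), (False, False), (False, False),
]

# Background probability: how often attention occurs at all
attention_any = sum(1 for _, a in intervals if a)
background_p = attention_any / len(intervals)           # P(attention)

# Conditional probability: attention given that the behavior occurred
after_behavior = [a for b, a in intervals if b]
conditional_p = sum(after_behavior) / len(after_behavior)  # P(attention | behavior)

# conditional_p well above background_p suggests attention is delivered
# contingent on the behavior, hinting at social positive reinforcement.
```

    Comparing against the background rate is what distinguishes this method from the plain conditional probability method: a high conditional probability is only informative if it exceeds how often the event happens anyway.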

  15. Detection of enhancement in number densities of background galaxies due to magnification by massive galaxy clusters

    DOE PAGES

    Chiu, I.; Dietrich, J. P.; Mohr, J.; ...

    2016-02-18

    We present a detection of the enhancement in the number densities of background galaxies induced by lensing magnification and use it to test the Sunyaev-Zel'dovich effect (SZE) inferred masses in a sample of 19 galaxy clusters with median redshift z ≃ 0.42 selected from the South Pole Telescope SPT-SZ survey. Two background galaxy populations are selected for this study through their photometric colours; they have median redshifts z_median ≃ 0.9 (low-z background) and z_median ≃ 1.8 (high-z background). Stacking these populations, we detect the magnification bias effect at 3.3σ and 1.3σ for the low- and high-z backgrounds, respectively. We fit NFW models simultaneously to all observed magnification bias profiles to estimate the multiplicative factor η that describes the ratio of the weak lensing mass to the mass inferred from the SZE observable-mass relation. We further quantify systematic uncertainties in η resulting from the photometric noise and bias, the cluster galaxy contamination and the estimations of the background properties. The resulting η for the combined background populations with 1σ uncertainties is 0.83 ± 0.24(stat) ± 0.074(sys), indicating good consistency between the lensing and the SZE-inferred masses. We also use our best-fit η to predict the weak lensing shear profiles and compare these predictions with observations, showing agreement between the magnification and shear mass constraints. Our work demonstrates the promise of using the magnification as a complementary method to estimate cluster masses in large surveys.

  16. Implementation of a flow-dependent background error correlation length scale formulation in the NEMOVAR OSTIA system

    NASA Astrophysics Data System (ADS)

    Fiedler, Emma; Mao, Chongyuan; Good, Simon; Waters, Jennifer; Martin, Matthew

    2017-04-01

    OSTIA is the Met Office's Operational Sea Surface Temperature (SST) and Ice Analysis system, which produces L4 (globally complete, gridded) analyses on a daily basis. Work is currently being undertaken to replace the original OI (Optimal Interpolation) data assimilation scheme with NEMOVAR, a 3D-Var data assimilation method developed for use with the NEMO ocean model. A dual background error correlation length scale formulation is used for SST in OSTIA, as implemented in NEMOVAR. Short and long length scales are combined according to the ratio of the decomposition of the background error variances into short and long spatial correlations. The pre-defined background error variances vary spatially and seasonally, but not on shorter time-scales. If the derived length scales applied to the daily analysis are too long, SST features may be smoothed out. Therefore a flow-dependent component to determining the effective length scale has also been developed. The total horizontal gradient of the background SST field is used to identify regions where the length scale should be shortened. These methods together have led to an improvement in the resolution of SST features compared to the previous OI analysis system, without the introduction of spurious noise. This presentation will show validation results for feature resolution in OSTIA using the OI scheme, the dual length scale NEMOVAR scheme, and the flow-dependent implementation.
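
    The dual-length-scale idea can be sketched as follows: blend short and long scales by the ratio of the background error variances, then shorten the result where the background SST gradient is strong. The formula and constants below are illustrative assumptions, not the NEMOVAR implementation.

```python
# Effective correlation length from a variance-weighted blend of a short
# and a long scale, with a flow-dependent cap in frontal regions.
# Units: lengths in km, gradient in K per grid cell (values invented).

def effective_length(var_short, var_long, L_short, L_long,
                     sst_gradient, grad_crit=0.5):
    w = var_short / (var_short + var_long)   # weight of short-scale errors
    L = w * L_short + (1.0 - w) * L_long
    if abs(sst_gradient) > grad_crit:        # flow-dependent shortening
        L = min(L, L_short)
    return L

# Quiet open ocean: long-scale errors dominate, weak gradient
L_quiet = effective_length(0.2, 0.8, 25.0, 300.0, 0.05)
# Frontal region: a strong gradient forces the short scale
L_front = effective_length(0.2, 0.8, 25.0, 300.0, 1.2)
```

    The cap is what prevents sharp SST features from being smoothed out by a long blended scale while still allowing broad-scale corrections elsewhere.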

  17. Past Tense Marking by African American English–Speaking Children Reared in Poverty

    PubMed Central

    Pruitt, Sonja; Oetting, Janna

    2012-01-01

    Purpose: This study examined past tense marking by African American English (AAE)-speaking children from low- and middle-income backgrounds to determine if poverty affects children's marking of past tense in ways that mirror the clinical condition of specific language impairment (SLI). Method: Participants were 15 AAE-speaking 6-year-olds from low-income backgrounds, 15 AAE-speaking 6-year-olds from middle-income backgrounds who served as age-matched controls, and 15 AAE-speaking 5-year-olds from middle-income backgrounds who served as language-matched controls. Data were drawn from language samples and probes. Results: Results revealed high rates of regular marking, variable rates of irregular marking, high rates of over-regularizations, and absence of dialect-inappropriate errors of commission. For some analyses, marking was affected by the phonological characteristics of the items and the children's ages, but none of the analyses revealed effects for the children's socioeconomic level. Conclusions: Within AAE, poverty status as a variable affects past tense marking in ways that are different from the clinical condition of SLI. PMID:18695014

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Priel, Nadav; Landsman, Hagar; Manfredini, Alessandro

    We propose a safeguard procedure for statistical inference that provides universal protection against mismodeling of the background. The method quantifies and incorporates the signal-like residuals of the background model into the likelihood function, using information available in a calibration dataset. This prevents possible false discovery claims that may arise through unknown mismodeling, and corrects the bias in limit setting created by overestimated or underestimated background. We demonstrate how the method removes the bias created by an incomplete background model using three realistic case studies.

  19. Magnetic imager and method

    DOEpatents

    Powell, J.; Reich, M.; Danby, G.

    1997-07-22

    A magnetic imager includes a generator for practicing a method of applying a background magnetic field over a concealed object, with the object being effective to locally perturb the background field. The imager also includes a sensor for measuring perturbations of the background field to detect the object. In one embodiment, the background field is applied quasi-statically. And, the magnitude or rate of change of the perturbations may be measured for determining location, size, and/or condition of the object. 25 figs.

  20. LArGe: active background suppression using argon scintillation for the Gerda 0ν β β -experiment

    NASA Astrophysics Data System (ADS)

    Agostini, M.; Barnabé-Heider, M.; Budjáš, D.; Cattadori, C.; Gangapshev, A.; Gusev, K.; Heisel, M.; Junker, M.; Klimenko, A.; Lubashevskiy, A.; Pelczar, K.; Schönert, S.; Smolnikov, A.; Zuzel, G.

    2015-10-01

    LArGe is a Gerda low-background test facility to study novel background suppression methods in a low-background environment, for future application in the Gerda experiment. Similar to Gerda, LArGe operates bare germanium detectors submersed in liquid argon (1 m^3, 1.4 tons), which in addition is instrumented with photomultipliers to detect argon scintillation light. The scintillation signals are used in anti-coincidence with the germanium detectors to effectively suppress background events that deposit energy in the liquid argon. The background suppression efficiency was studied in combination with a pulse shape discrimination (PSD) technique using a BEGe detector for various sources, which represent characteristic backgrounds to Gerda. Suppression factors of a few times 10^3 have been achieved. First background data of LArGe with a coaxial HPGe detector (without PSD) yield a background index of (0.12-4.6) × 10^{-2} cts/(keV kg year) (90% C.L.), which is at the level of Gerda Phase I. Furthermore, for the first time we monitor the natural ^{42}Ar abundance (in parallel to Gerda), and find an indication of the 2νββ decay in natural germanium. These results show the effectiveness of an active liquid argon veto in an ultra-low background environment. As a consequence, the implementation of a liquid argon veto in Gerda Phase II is pursued.

  1. Factors Contributing to Background Television Exposure in Low-Income Mexican-American Preschoolers.

    PubMed

    Thompson, Darcy A; Tschann, Jeanne M

    2016-09-01

    Objective Background television (TV) exposure is harmful to young children, yet few studies have focused on predictors of exposure. This study's objectives were to elucidate demographic, environmental, and behavioral correlates of background TV exposure in low-income Mexican-American preschoolers and to explore caregiver beliefs about the impact of such exposure. Methods A convenience sample of low-income Mexican-American female primary caregivers of preschoolers (3-5 years old, n = 309), recruited in safety-net clinics, were surveyed by phone. Caregivers reported the frequency of their child's exposure to background TV and responded to questions on the home media environment, TV use, and whether they had thought about background TV exposure and its impact on their child. Results Background TV exposure was common; 43 % reported that their child was often, very often, or always exposed to background TV. More hours of TV viewing by the caregiver and greater frequency of TV viewing during meals were associated with an increased frequency of exposure to background TV. Only 49 % of participants had ever thought about the impact of background TV. Believing that background TV is not harmful was associated with higher levels of background TV exposure. Conclusions Findings suggest that background TV exposure is frequent and caregiver awareness of its potential impact is low in low-income Mexican-American families. Beliefs that background TV is not harmful may predict risk of exposure. Potential targets for interventions focused on reducing background TV exposure in this population include increasing caregiver awareness of the potential negative impact of such TV exposure.

  2. Factors contributing to background television exposure in low-income Mexican American preschoolers

    PubMed Central

    Thompson, Darcy A.; Tschann, Jeanne M.

    2016-01-01

    Objective Background television (TV) exposure is harmful to young children, yet few studies have focused on predictors of exposure. This study’s objectives were to elucidate demographic, environmental, and behavioral correlates of background TV exposure in low-income Mexican American preschoolers and to explore caregiver beliefs about the impact of such exposure. Methods A convenience sample of low-income Mexican American female primary caregivers of preschoolers (3–5 years old, n=309), recruited in safety-net clinics, were surveyed by phone. Caregivers reported the frequency of their child’s exposure to background TV and responded to questions on the home media environment, TV use, and whether they had thought about background TV exposure and its impact on their child. Results Background TV exposure was common; 43% reported that their child was often, very often, or always exposed to background TV. More hours of TV viewing by the caregiver and greater frequency of TV viewing during meals were associated with an increased frequency of exposure to background TV. Only 49% of participants had ever thought about the impact of background TV. Believing that background TV is not harmful was associated with higher levels of background TV exposure. Conclusions Findings suggest that background TV exposure is frequent and caregiver awareness of its potential impact is low in low-income Mexican American families. Beliefs that background TV is not harmful may predict risk of exposure. Potential targets for interventions focused on reducing background TV exposure in this population include increasing caregiver awareness of the potential negative impact of such TV exposure. PMID:27007983

  3. A longitudinal daily diary study of family assistance and academic achievement among adolescents from Mexican, Chinese, and European backgrounds.

    PubMed

    Telzer, Eva H; Fuligni, Andrew J

    2009-04-01

    A longitudinal daily diary method was employed to examine the implications of family assistance for the academic achievement of 563 adolescents (53% female) from Mexican (n = 217), Chinese (n = 206), and European (n = 140) backgrounds during the high school years (mean age 14.9 years in 9th grade to 17.8 years in 12th grade). Although changes in family assistance time within individual adolescents were not associated with simultaneous changes in their Grade Point Averages (GPAs), increases in the proportion of days spent helping the family were linked to declines in the GPAs of students from Mexican and Chinese backgrounds. The negative implications of spending more days helping the family among these two groups were not explained by family background factors or changes in study time or school problems. These results suggest that the chronicity, rather than the amount, of family assistance may be what is difficult for adolescents from Mexican and Chinese backgrounds.

  4. Document segmentation for high-quality printing

    NASA Astrophysics Data System (ADS)

    Ancin, Hakan

    1997-04-01

    A technique to segment dark text on the light background of mixed-mode color documents is presented. The process does not perceptually change graphics and photo regions. Color documents are scanned and printed from various media which usually do not have a clean background. This is especially the case for printouts generated from thin magazine samples; these printouts usually include text and figures from the back of the page, an artifact called bleeding. Removal of bleeding artifacts improves the perceptual quality of the printed document and reduces color ink usage. By detecting the light background of the document, these artifacts are removed from background regions. Detection of dark text regions also enables the halftoning algorithms to use true black ink for black text pixels instead of composite black. The processed document contains sharp black text on a white background, resulting in improved perceptual quality and better ink utilization. The described method is memory efficient, requiring only a small number of scan lines of the high-resolution color document during processing.
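    The background/text separation described above can be sketched as a simple two-threshold mapping. This is a minimal sketch: the threshold values are illustrative, whereas the actual method detects the light background rather than assuming fixed cut-offs.

```python
import numpy as np

def clean_scan(gray, bg_thresh=200, text_thresh=80):
    """Push light background (including bleed-through from the back of the
    page) to pure white, and dark text to pure black, leaving mid-tones
    (graphics and photo regions) untouched."""
    out = gray.copy()
    out[gray >= bg_thresh] = 255  # light background -> white (removes bleeding)
    out[gray <= text_thresh] = 0  # dark text -> true black, not composite black
    return out

# Toy scan line: bleed-through at 220, text at 40, a photo pixel at 128.
page = np.array([[220, 40, 128, 240]], dtype=np.uint8)
cleaned = clean_scan(page)
```

    In a halftoning pipeline, the pixels mapped to 0 would then be printed with true black ink, as the abstract describes.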

  5. Background noise cancellation for improved acoustic detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Yan, Zheng; Niezrecki, Christopher; Beusse, Diedrich O.

    2005-06-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of an increase in the number of collisions with boats. A device to alert boaters of the presence of manatees, so that a collision can be avoided, is desired. A practical implementation of the technology is dependent on the hydrophone spacing and range of detection. These parameters are primarily dependent on the manatee vocalization strength, the decay of the signal's strength with distance, and the background noise levels. An efficient method to extend the detection range by using background noise cancellation is proposed in this paper. An adaptive line enhancer (ALE) that can detect and track narrow band signals buried in broadband noise is implemented to cancel the background noise. The results indicate that the ALE algorithm can efficiently extract the manatee calls from the background noise. The improved signal-to-noise ratio of the signal can be used to extend the range of detection of manatee vocalizations and reduce the false alarm and missing detection rate in their natural habitat.

  6. Background noise cancellation for improved acoustic detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Yan, Zheng; Niezrecki, Christopher; Beusse, Diedrich O.

    2005-04-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of an increase in the number of collisions with boats. A device to alert boaters of the presence of manatees, so that a collision can be avoided, is desired. Practical implementation of the technology is dependent on the hydrophone spacing and range of detection. These parameters are primarily dependent on the manatee vocalization strength, the decay of the signal strength with distance, and the background noise levels. An efficient method to extend the detection range by using background noise cancellation is proposed in this paper. An adaptive line enhancer (ALE) that can detect and track narrowband signals buried in broadband noise is implemented to cancel the background noise. The results indicate that the ALE algorithm can efficiently extract the manatee calls from the background noise. The improved signal-to-noise ratio of the signal can be used to extend the range of detection of manatee vocalizations and reduce the false alarm and missing detection rate in their natural habitat.
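    The adaptive line enhancer at the heart of this abstract can be sketched with a plain LMS predictor: a delayed copy of the input feeds an adaptive FIR filter, whose output retains the correlated narrowband call while broadband noise, being unpredictable, is rejected. This is a minimal single-channel sketch under assumed parameters (tap count, step size), not the authors' implementation.

```python
import numpy as np

def adaptive_line_enhancer(x, n_taps=32, delay=1, mu=0.002):
    """LMS-based ALE: predict x[n] from the delayed samples
    x[n-delay] ... x[n-delay-n_taps+1]. The prediction y retains the
    narrowband (correlated) component; broadband noise is suppressed."""
    w = np.zeros(n_taps)
    y = np.zeros(len(x))
    for n in range(n_taps + delay, len(x)):
        u = x[n - delay - n_taps + 1:n - delay + 1][::-1]  # delayed tap vector
        y[n] = w @ u
        e = x[n] - y[n]        # prediction error (mostly broadband noise)
        w += 2 * mu * e * u    # LMS weight update
    return y

# A tonal "call" buried in broadband noise (roughly 0 dB input SNR).
rng = np.random.default_rng(0)
t = np.arange(4000)
tone = np.sin(2 * np.pi * 0.05 * t)
noisy = tone + rng.normal(scale=1.0, size=t.size)
enhanced = adaptive_line_enhancer(noisy)
```

    After convergence, the ALE output correlates far better with the underlying tone than the raw input does, which is what extends the usable detection range.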

  7. Research on infrared dim-point target detection and tracking under sea-sky-line complex background

    NASA Astrophysics Data System (ADS)

    Dong, Yu-xing; Li, Yan; Zhang, Hai-bo

    2011-08-01

    Target detection and tracking in infrared imagery is an important component of modern military defense systems. Detecting and recognizing dim point targets against complex backgrounds is a difficult, strategically valuable, and challenging research topic. The main objects detected by a carrier-borne infrared vigilance system are sea-skimming aircraft and missiles. Because of the system's wide field of view, the target usually lies within sea clutter, which greatly complicates detection and recognition. Traditional point-target detection algorithms, such as adaptive background-prediction methods, work well when the background has a dispersion-decreasing structure; but where the background has a large gray-level gradient, such as at the sea-sky line or on sea waves, these local areas produce a high false-alarm rate and the results are unsatisfactory. Because a dim point target has no obvious geometric or texture features, from a mathematical perspective its detection is a problem of singular-function analysis, and from an image-processing perspective the key problem is identifying isolated singularities in the image. In essence, dim-point-target detection is the separation of target and background according to their different singularity characteristics. Images from infrared sensors are usually accompanied by various kinds of noise, caused by the complicated background or by the sensor itself, which can degrade detection and tracking. The purpose of preprocessing is therefore to reduce the effect of this noise, raise the signal-to-noise ratio of the image, and increase the contrast between target and background. Given the characteristics of low-altitude sea-skimming infrared targets, a median filter is first used to eliminate noise and improve the signal-to-noise ratio; a multi-point, multi-storey vertical Sobel operator is then used to detect the sea-sky line, so that sea and sky can be segmented in the image; finally, a centroid tracking method captures and traces the target. This method has been used successfully to track targets against complex sea-sky backgrounds.
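    The preprocessing chain in this abstract (median filtering followed by a vertical Sobel-style edge search for the sea-sky line) can be sketched as follows. This is a simplified single-kernel stand-in for the multi-point, multi-storey operator the authors use; the frame and noise level are synthetic.

```python
import numpy as np

def detect_sea_sky_line(img):
    """Median-filter each column (3-sample window) to suppress impulsive
    noise, take a vertical central difference, and return the row index
    with the strongest total horizontal-edge response (the sea-sky line)."""
    stacked = np.stack([img[:-2], img[1:-1], img[2:]])
    den = np.median(stacked, axis=0)       # vertical 3x1 median filter
    gy = den[2:] - den[:-2]                # central vertical difference
    strength = np.abs(gy).sum(axis=1)      # edge energy per row
    return int(np.argmax(strength)) + 2    # offset for trimmed border rows

# Synthetic frame: bright sky above row 40, dark sea below, mild noise.
rng = np.random.default_rng(1)
frame = np.where(np.arange(80)[:, None] < 40, 200.0, 50.0) * np.ones((80, 120))
frame += rng.normal(scale=5.0, size=frame.shape)
horizon = detect_sea_sky_line(frame)
```

    Once the horizon row is known, detection can be restricted to the sky region, reducing the false alarms that sea clutter would otherwise cause.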

  8. Towards standardization of 18F-FET PET imaging: do we need a consistent method of background activity assessment?

    PubMed

    Unterrainer, Marcus; Vettermann, Franziska; Brendel, Matthias; Holzgreve, Adrien; Lifschitz, Michael; Zähringer, Matthias; Suchorska, Bogdana; Wenter, Vera; Illigens, Ben M; Bartenstein, Peter; Albert, Nathalie L

    2017-12-01

    PET with O-(2-[18F]fluoroethyl)-L-tyrosine (18F-FET) has reached increasing clinical significance for patients with brain neoplasms. For quantification of standard PET-derived parameters such as the tumor-to-background ratio, the background activity is assessed using a region of interest (ROI) or volume of interest (VOI) in unaffected brain tissue. However, there is no standardized approach regarding the assessment of the background reference. Therefore, we evaluated the intra- and inter-reader variability of commonly applied approaches for clinical 18F-FET PET reading. The background activity of 20 18F-FET PET scans was independently evaluated by 6 readers using a (i) simple 2D-ROI, (ii) spherical VOI with 3.0 cm diameter, and (iii) VOI consisting of crescent-shaped ROIs; each in the contralateral, non-affected hemisphere including white and gray matter, in line with the European Association of Nuclear Medicine (EANM) and German guidelines. To assess intra-reader variability, each scan was evaluated 10 times by each reader. The coefficient of variation (CoV) was assessed for determination of intra- and inter-reader variability. In a second step, the best method was refined by instructions for a guided background activity assessment and validated on 10 further scans. Compared to the other approaches, the crescent-shaped VOIs revealed the most stable results, with the lowest intra-reader variabilities (median CoV 1.52%, spherical VOI 4.20%, 2D-ROI 3.69%; p < 0.001) and inter-reader variabilities (median CoV 2.14%, spherical VOI 4.02%, 2D-ROI 3.83%; p = 0.001). Using the guided background assessment, both intra-reader variabilities (median CoV 1.10%) and inter-reader variabilities (median CoV 1.19%) could be reduced even further. The commonly applied methods for background activity assessment show different variability, which might hamper 18F-FET PET quantification and comparability in multicenter settings. The proposed background activity assessment using a (guided) crescent-shaped VOI minimizes both intra- and inter-reader variability and might facilitate comprehensive methodological standardization of amino acid PET, which is of interest in the light of the anticipated EANM technical guidelines.
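    The variability figures quoted above are coefficients of variation. For repeated readings of the background activity this is just the sample standard deviation expressed as a percentage of the mean; the readings below are hypothetical, for illustration only.

```python
import numpy as np

def coefficient_of_variation(readings):
    """CoV (%) of repeated background-activity readings: sample standard
    deviation divided by the mean. A lower CoV means a more reproducible
    background VOI."""
    r = np.asarray(readings, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

# Ten hypothetical repeated readings of mean background uptake by one
# reader, for a tightly defined VOI vs. a more freely placed one.
crescent = [1.00, 1.01, 0.99, 1.00, 1.02, 0.98, 1.01, 1.00, 0.99, 1.00]
sphere   = [1.00, 1.06, 0.93, 1.02, 1.08, 0.95, 1.04, 0.98, 0.92, 1.02]
```

    Comparing the two CoVs reproduces the qualitative finding: the more constrained placement yields the smaller intra-reader variability.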

  9. The Antimicrobial Properties of Red Algae. The Fight of Your Life: Battling Bacteria.

    ERIC Educational Resources Information Center

    Case, Christine L.; Warner, Michael

    2001-01-01

    Describes a research project in which a professor and a student collaborated in the screening of macroscopic algae for antimicrobial properties. Includes background information, materials and methods, results, and a discussion of the experiment. (SAH)

  10. Comprehensive metabolomic profiling and incident cardiovascular disease: a systematic review

    USDA-ARS?s Scientific Manuscript database

    Background: Metabolomics is a promising tool of cardiovascular biomarker discovery. We systematically reviewed the literature on comprehensive metabolomic profiling in association with incident cardiovascular disease (CVD). Methods and Results: We searched MEDLINE and EMBASE from inception to Janua...

  11. Background of SAM atom-fraction profiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ernst, Frank

    Atom-fraction profiles acquired by SAM (scanning Auger microprobe) have important applications, e.g. in the context of alloy surface engineering by infusion of carbon or nitrogen through the alloy surface. However, such profiles often exhibit an artifact in the form of a background whose level anti-correlates with the local atom fraction. This article presents a theory explaining this phenomenon as a consequence of the way in which random noise in the spectrum propagates into the discretized differentiated spectrum that is used for quantification. The resulting model of “energy channel statistics” leads to a useful semi-quantitative background reduction procedure, which is validated by applying it to simulated data. Subsequently, the procedure is applied to an example of experimental SAM data. The analysis leads to conclusions regarding optimum experimental acquisition conditions. The proposed method of background reduction is based on general principles and should be useful for a broad variety of applications. - Highlights: • Atom-fraction-depth profiles of carbon measured by scanning Auger microprobe. • Strong background that varies with the local carbon concentration. • Correction is needed, e.g. for quantitative comparison with simulations. • A quantitative theory explains the background. • Provides a background removal strategy and practical advice for acquisition.

  12. Target detection method by airborne and spaceborne images fusion based on past images

    NASA Astrophysics Data System (ADS)

    Chen, Shanjing; Kang, Qing; Wang, Zhenggang; Shen, ZhiQiang; Pu, Huan; Han, Hao; Gu, Zhongzheng

    2017-11-01

    To address the low utilization of past remote sensing data of a target area and the difficulty of accurately recognizing camouflaged targets, a target detection method based on the fusion of airborne and spaceborne images with past imagery is proposed in this paper. A past spaceborne remote sensing image of the target area is taken as the background. The airborne and spaceborne remote sensing data are fused and target features are extracted through airborne-spaceborne image registration, target change-feature extraction, background noise suppression, and artificial-target feature extraction based on real-time aerial optical remote sensing images. Finally, a support vector machine is used to detect and recognize the target from the fused feature data. The experimental results establish that the proposed method combines the change features of the target area in airborne and spaceborne remote sensing images with a target detection algorithm, and achieves good detection and recognition performance for both camouflaged and non-camouflaged targets.

  13. A New Multi-Spectral Threshold Normalized Difference Water Index (MST-NDWI) Water Extraction Method - A Case Study in the Yanhe Watershed

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Zhao, H.; Hao, H.; Wang, C.

    2018-05-01

    Accurate remote sensing water extraction is one of the primary tasks in watershed ecological environment studies. The Yanhe water system is characterized by a small water volume and narrow river channels, which makes conventional water extraction methods such as the Normalized Difference Water Index (NDWI) difficult to apply. A new Multi-Spectral Threshold segmentation of the NDWI (MST-NDWI) water extraction method is proposed to achieve accurate water extraction in the Yanhe watershed. In the MST-NDWI method, the spectral characteristics of water bodies and typical backgrounds in the Yanhe watershed were evaluated on Landsat/TM images. Multi-spectral thresholds (TM1, TM4, TM5) based on maximum likelihood were applied before NDWI water extraction to segment out built-up land and to capture small linear rivers. With the proposed method, a water map was extracted from Landsat/TM images of 2010 in China. An accuracy assessment compared the proposed method with conventional water indexes such as NDWI, the Modified NDWI (MNDWI), the Enhanced Water Index (EWI), and the Automated Water Extraction Index (AWEI). The results show that the MST-NDWI method achieves better water extraction accuracy in the Yanhe watershed and can effectively suppress confusing background objects compared to the conventional water indexes. By integrating NDWI with multi-spectral threshold segmentation, the MST-NDWI method exploits richer spectral information and yields markedly better water extraction in the Yanhe watershed.
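    The two-stage idea (per-band thresholds to screen out confusing backgrounds, then an NDWI cut) can be sketched as below. The band roles follow Landsat TM (TM1 blue, TM2 green, TM4 NIR, TM5 SWIR), but the threshold values are illustrative placeholders, not the paper's maximum-likelihood estimates.

```python
import numpy as np

def mst_ndwi(green, nir, blue, swir,
             t_blue=0.12, t_nir=0.15, t_swir=0.10, t_ndwi=0.0):
    """Multi-spectral pre-segmentation (water is darker in blue/NIR/SWIR
    than built-up land) followed by the classic NDWI threshold,
    NDWI = (green - nir) / (green + nir)."""
    ndwi = (green - nir) / (green + nir + 1e-12)  # avoid divide-by-zero
    mask = (blue < t_blue) & (nir < t_nir) & (swir < t_swir)
    return mask & (ndwi > t_ndwi)

# Toy reflectance pixels: [water, built-up land, vegetation]
blue  = np.array([0.06, 0.18, 0.05])
green = np.array([0.08, 0.20, 0.10])
nir   = np.array([0.02, 0.25, 0.40])
swir  = np.array([0.01, 0.22, 0.15])
water = mst_ndwi(green, nir, blue, swir)
```

    The pre-segmentation mask is what lets the method reject built-up pixels that a plain NDWI threshold would confuse with water.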

  14. Moving target detection method based on improved Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Ma, J. Y.; Jie, F. R.; Hu, Y. J.

    2017-07-01

    A Gaussian mixture model is often employed to build the background model in background-difference methods for moving target detection. This paper puts forward an adaptive moving target detection algorithm based on an improved Gaussian mixture model. According to the gray-level convergence of each pixel, the number of Gaussian distributions used to learn and update the background model is chosen adaptively. A morphological reconstruction method is adopted to eliminate shadows. Experiments show that the proposed method not only has good robustness and detection performance, but also good adaptability; even in special cases, such as large grayscale changes, it performs well.
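    A minimal per-pixel sketch of the background-difference idea is below, using a single running Gaussian per pixel rather than the paper's adaptive mixture (the abstract's contribution is precisely to vary the number of Gaussians per pixel; one suffices to show the update rule). The learning rate and match threshold are illustrative.

```python
import numpy as np

class RunningGaussianBackground:
    """Single-Gaussian-per-pixel background model: a pixel is foreground
    when it deviates from its running mean by more than k standard
    deviations; mean and variance are updated only where the pixel
    matched the background."""
    def __init__(self, shape, alpha=0.05, k=2.5):
        self.mean = np.zeros(shape)
        self.var = np.full(shape, 15.0 ** 2)   # initial variance guess
        self.alpha, self.k = alpha, k
        self._init = False

    def apply(self, frame):
        frame = frame.astype(float)
        if not self._init:                      # first frame seeds the model
            self.mean[...] = frame
            self._init = True
            return np.zeros(frame.shape, dtype=bool)
        d = frame - self.mean
        fg = np.abs(d) > self.k * np.sqrt(self.var)
        bg = ~fg                                # update only matched pixels
        self.mean[bg] += self.alpha * d[bg]
        self.var[bg] += self.alpha * (d[bg] ** 2 - self.var[bg])
        return fg
```

    Feeding a stable scene followed by a frame containing a bright intruding block yields a clean foreground mask of just that block.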

  15. Novel statistical framework to identify differentially expressed genes allowing transcriptomic background differences.

    PubMed

    Ling, Zhi-Qiang; Wang, Yi; Mukaisho, Kenichi; Hattori, Takanori; Tatsuta, Takeshi; Ge, Ming-Hua; Jin, Li; Mao, Wei-Min; Sugihara, Hiroyuki

    2010-06-01

    Tests of differentially expressed genes (DEGs) from microarray experiments are based on the null hypothesis that genes that are irrelevant to the phenotype/stimulus are expressed equally in the target and control samples. However, this strict hypothesis is not always true, as there can be several transcriptomic background differences between target and control samples, including different cell/tissue types, different cell cycle stages and different biological donors. These differences lead to increased false positives, which have little biological/medical significance. In this article, we propose a statistical framework to identify DEGs between target and control samples from expression microarray data allowing transcriptomic background differences between these samples by introducing a modified null hypothesis that the gene expression background difference is normally distributed. We use an iterative procedure to perform robust estimation of the null hypothesis and identify DEGs as outliers. We evaluated our method using our own triplicate microarray experiment, followed by validations with reverse transcription-polymerase chain reaction (RT-PCR) and on the MicroArray Quality Control dataset. The evaluations suggest that our technique (i) results in fewer false positives and false negatives, as measured by the degree of agreement with RT-PCR of the same samples, (ii) can be applied to different microarray platforms and results in better reproducibility as measured by the degree of DEG identification concordance both intra- and inter-platform and (iii) can be applied efficiently with only a few microarray replicates. Based on these evaluations, we propose that this method not only identifies more reliable and biologically/medically significant DEGs, but also reduces the power-cost tradeoff problem in the microarray field. Source code and binaries freely available for download at http://comonca.org.cn/fdca/resources/softwares/deg.zip.
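    The modified null hypothesis can be sketched as an iterative trimmed fit: assume most genes follow a normal background-difference distribution, re-estimate its mean and spread from the non-outlying genes, and call the remaining outliers DEGs. This is a minimal sketch of the idea, not the authors' code; the z cut-off and iteration count are assumptions.

```python
import numpy as np

def robust_deg_outliers(log_ratios, z_cut=3.0, n_iter=10):
    """Iteratively estimate the background normal (mean, sd) from the
    non-outlying genes, then flag genes outside mean +/- z_cut*sd as
    differentially expressed."""
    x = np.asarray(log_ratios, dtype=float)
    keep = np.ones(x.size, dtype=bool)
    for _ in range(n_iter):
        mu, sd = x[keep].mean(), x[keep].std()
        new_keep = np.abs(x - mu) <= z_cut * sd
        if np.array_equal(new_keep, keep):  # converged
            break
        keep = new_keep
    return ~keep  # True where the gene is called a DEG
```

    Note how the background distribution is allowed a nonzero mean: a global shift between samples is absorbed into the null rather than flagged.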

  16. Cluster signal-to-noise analysis for evaluation of the information content in an image.

    PubMed

    Weerawanich, Warangkana; Shimizu, Mayumi; Takeshita, Yohei; Okamura, Kazutoshi; Yoshida, Shoko; Yoshiura, Kazunori

    2018-01-01

    (1) To develop an observer-free method of analysing image quality related to observer performance in a detection task and (2) to analyse observer behaviour patterns in the detection of small mass changes in cone-beam CT images. Thirteen observers detected holes in a Teflon phantom in cone-beam CT images. Using the same images, we developed a new method, cluster signal-to-noise analysis, to detect the holes by applying various cut-off values using ImageJ and reconstructing cluster signal-to-noise curves. We then evaluated the correlation between cluster signal-to-noise analysis and the observer performance test. We measured the background noise in each image to evaluate the relationship with the false positive rates (FPRs) of the observers. Correlations between mean FPRs and intra- and interobserver variations were also evaluated. Moreover, we calculated true positive rates (TPRs) and accuracies from background noise and evaluated their correlations with TPRs from observers. Cluster signal-to-noise curves were derived in cluster signal-to-noise analysis. They express the detection of signals (true holes) relative to noise (false holes). This method correlated highly with the observer performance test (R² = 0.9296). In noisy images, increasing background noise resulted in higher FPRs and larger intra- and interobserver variations. TPRs and accuracies calculated from background noise had high correlations with actual TPRs from observers; R² was 0.9244 and 0.9338, respectively. Cluster signal-to-noise analysis can simulate the detection performance of observers and thus replace the observer performance test in the evaluation of image quality. Erroneous decision-making increased with increasing background noise.

  17. Study of improving signal-noise ratio for fluorescence channel

    NASA Astrophysics Data System (ADS)

    Wang, Guoqing; Li, Xin; Lou, Yue; Chen, Dong; Zhao, Xin; Wang, Ran; Yan, Debao; Zhao, Qi

    2017-10-01

    Laser-induced fluorescence spectroscopy (LIFS), one of the most effective discrimination methods for identifying materials at the molecular level by inducing a fluorescence spectrum, has become popular for its fast and accurate probing results. Violet or ultraviolet lasers are generally used as the excitation source. However, there is no atmospheric window for violet and ultraviolet light, so the laser is attenuated along its propagation path. Moreover, when the laser reaches the sample, part of the light is reflected; the excitation energy that actually produces fluorescence is therefore very small, and when the system is used outdoors the weak fluorescence is mingled with the background light collected by the LIFS processing unit. To extend LIFS to remote probing against complex backgrounds, improving the signal-to-noise ratio of the fluorescence channel is meaningful work. Both enhancing the fluorescence intensity and suppressing the background light can improve the fluorescence signal-to-noise ratio. In this article, three approaches to suppressing background light are discussed. The first is to increase the proportion of the collection field occupied by the fluorescence excitation area by expanding the laser beam when the collection field is fixed. The second is to change the field angle to match the laser divergence angle. The third is to use a very narrow gating circuit to control the acquisition circuit, which opens only briefly when the fluorescence arrives. All of these methods can reduce the background light to some extent, but the discussion shows the third to be the best: adding a gated acquisition circuit is effective and economical and requires no change to the light path.

  18. Accuracy of neutron self-activation method with iodine-containing scintillators for quantifying 128I generation using decay-fitting technique

    NASA Astrophysics Data System (ADS)

    Nohtomi, Akihiro; Wakabayashi, Genichiro

    2015-11-01

    We evaluated the accuracy of a self-activation method with iodine-containing scintillators in quantifying 128I generation in an activation detector; the self-activation method was recently proposed for photo-neutron on-line measurements around X-ray radiotherapy machines. Here, we consider the accuracy of determining the initial count rate R0, observed just after termination of neutron irradiation of the activation detector. The value R0 is directly related to the amount of activity generated by incident neutrons; the detection efficiency of radiation emitted from the activity should be taken into account for such an evaluation. Decay curves of 128I activity were numerically simulated by a computer program for various conditions including different initial count rates (R0) and background rates (RB), as well as counting statistical fluctuations. The data points sampled at minute intervals and integrated over the same period were fit by a non-linear least-squares fitting routine to obtain the value R0 as a fitting parameter with an associated uncertainty. The corresponding background rate RB was simultaneously calculated in the same fitting routine. Identical data sets were also evaluated by a well-known integration algorithm used for conventional activation methods and the results were compared with those of the proposed fitting method. When we fixed RB = 500 cpm, the relative uncertainty σR0/R0 ≤ 0.02 was achieved for R0/RB ≥ 20 with 20 data points from 1 min to 20 min following the termination of neutron irradiation used in the fitting; σR0/R0 ≤ 0.01 was achieved for R0/RB ≥ 50 with the same data points. Reasonable relative uncertainties to evaluate initial count rates were reached by the decay-fitting method using practically realistic sampling numbers. These results clarified the theoretical limits of the fitting method.
The integration method was found to be potentially vulnerable to short-term variations in background levels, especially transient contamination by spike-like noise. The fitting method easily detects and removes such spike-like noise.
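    Because the 128I decay constant is known (half-life about 25 min), the decay model R(t) = R0·exp(-λt) + RB is linear in the parameters R0 and RB, so the fit the abstract describes can be sketched with ordinary least squares. The simulation below mirrors the abstract's conditions (RB = 500 cpm, 20 one-minute samples) but is otherwise illustrative.

```python
import numpy as np

I128_HALF_LIFE_MIN = 24.99                 # known half-life of 128I
LAM = np.log(2) / I128_HALF_LIFE_MIN       # decay constant, per minute

def fit_initial_rate(t_min, counts):
    """With the decay constant fixed, R(t) = R0*exp(-lam*t) + RB is linear
    in (R0, RB); ordinary least squares recovers the initial count rate
    and the constant background simultaneously."""
    A = np.column_stack([np.exp(-LAM * t_min), np.ones_like(t_min)])
    (r0, rb), *_ = np.linalg.lstsq(A, counts, rcond=None)
    return r0, rb

# Simulated decay curve: R0 = 10000 cpm, RB = 500 cpm, Poisson counting
# noise, sampled at 1-min intervals from 1 to 20 min after irradiation.
rng = np.random.default_rng(3)
t = np.arange(1, 21, dtype=float)
true_rate = 10000 * np.exp(-LAM * t) + 500
obs = rng.poisson(true_rate).astype(float)
r0, rb = fit_initial_rate(t, obs)
```

    The fit recovers R0 and the constant background RB together from the same 20 samples the abstract uses, which is the property that distinguishes it from simple integration of the counts.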

  19. Magnetic imager and method

    DOEpatents

    Powell, James; Reich, Morris; Danby, Gordon

    1997-07-22

    A magnetic imager 10 includes a generator 18 for practicing a method of applying a background magnetic field over a concealed object, with the object being effective to locally perturb the background field. The imager 10 also includes a sensor 20 for measuring perturbations of the background field to detect the object. In one embodiment, the background field is applied quasi-statically. And, the magnitude or rate of change of the perturbations may be measured for determining location, size, and/or condition of the object.

  20. Reducing background contributions in fluorescence fluctuation time-traces for single-molecule measurements in solution.

    PubMed

    Földes-Papp, Zeno; Liao, Shih-Chu Jeff; You, Tiefeng; Barbieri, Beniamino

    2009-08-01

    We first report on the development of new microscope means that reduce background contributions in fluorescence fluctuation methods: i) excitation shutter, ii) electronic switches, and iii) early and late time-gating. The elements allow for measuring molecules at low analyte concentrations. We first found conditions of early and late time-gating with time-correlated single-photon counting that made the fluorescence signal as bright as possible compared with the fluctuations in the background count rate in a diffraction-limited optical set-up. We measured about a 140-fold increase in the amplitude of autocorrelated fluorescence fluctuations at the lowest analyte concentration of about 15 pM, which gave a signal-to-background advantage of more than two-orders of magnitude. The results of this original article pave the way for single-molecule detection in solution and in live cells without immobilization or hydrodynamic/electrokinetic focusing at longer observation times than are currently available.

  1. Verb Form Indicates Discourse Segment Type in Biological Research Papers: Experimental Evidence

    ERIC Educational Resources Information Center

    de Waard, Anita; Maat, Henk Pander

    2012-01-01

    Corpus studies suggest that verb tense is a differentiating feature between, on the one hand, text pertaining to experimental results (involving methods and results) and on the other hand, text pertaining to more abstract concepts (i.e. regarding background knowledge in a field, hypotheses, problems or claims). In this paper, we describe a user…

  2. Children of Parents with Intellectual Disability: Facing Poor Outcomes or Faring Okay?

    ERIC Educational Resources Information Center

    Collings, Susan; Llewellyn, Gwynnyth

    2012-01-01

    Background: Children of parents with intellectual disability are assumed to be at risk of poor outcomes but a comprehensive review of the literature has not previously been undertaken. Method: A database and reference search from March 2010 to March 2011 resulted in 26 studies for review. Results: Two groups of studies were identified. The first…

  3. Responses to Positive Results from Suspicionless Random Drug Tests in US Public School Districts

    ERIC Educational Resources Information Center

    Ringwalt, Chris; Vincus, Amy A.; Ennett, Susan T.; Hanley, Sean; Bowling, J. Michael; Yacoubian, George S., Jr.; Rohrbach, Louise A.

    2009-01-01

    Background: Little is known about the context in which school-based suspicionless random drug testing (SRDT) occurs. The primary purpose of the current study was to describe school districts' responses to students' first positive result in districts with SRDT programs. Methods: Data were collected in spring 2005 from 1612 drug prevention…

  4. B-spline based image tracking by detection

    NASA Astrophysics Data System (ADS)

    Balaji, Bhashyam; Sithiravel, Rajiv; Damini, Anthony; Kirubarajan, Thiagalingam; Rajan, Sreeraman

    2016-05-01

    Visual image tracking involves the estimation of the motion of any desired targets in a surveillance region using a sequence of images. A standard method of isolating moving targets in image tracking uses background subtraction. The standard background subtraction method is often impacted by irrelevant information in the images, which can lead to poor performance in image-based target tracking. In this paper, a B-spline based image tracking method is implemented. The method models the background and foreground using B-splines, followed by a tracking-by-detection algorithm. The effectiveness of the proposed algorithm is demonstrated.
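For context, the standard background-subtraction baseline that the abstract contrasts against can be sketched as a running-average model (the threshold and frame values are illustrative):

```python
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running-average background model."""
    return (1.0 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, thresh=20.0):
    """Pixels deviating from the background model are flagged as moving."""
    return np.abs(frame - bg) > thresh

# Synthetic 8x8 scene: static background plus a bright moving target.
bg = np.full((8, 8), 50.0)
frame = bg.copy()
frame[2:4, 2:4] = 200.0          # target pixels

mask = foreground_mask(bg, frame)
bg = update_background(bg, frame)
```

Any clutter that deviates from the model (the "irrelevant information" noted above) also triggers the mask, which is the weakness a model-based foreground/background representation aims to address.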

  5. Improved silver staining of nucleolar organiser regions in paraffin wax sections using an inverted incubation technique.

    PubMed Central

    Coghill, G; Grant, A; Orrell, J M; Jankowski, J; Evans, A T

    1990-01-01

    A new simple modification to the silver staining of nucleolar organiser regions (AgNORs) was devised which, by performing the incubation with the slide inverted, results in minimal undesirable background staining, a persistent problem. Inverted incubation is facilitated by the use of a commercially available plastic coverplate. This technique has several additional advantages over other published staining protocols. In particular, the method is straightforward, fast, and maintains a high degree of contrast between the background and the AgNORs. PMID:1702451

  6. Reconfigurable Control Design with Neural Network Augmentation for a Modified F-15 Aircraft

    NASA Technical Reports Server (NTRS)

    Burken, John J.

    2007-01-01

    The viewgraphs present background information about reconfigurable control design, the design methods used in the paper, control failure survivability results, and time histories of tests. Topics examined include control reconfiguration, general information about adaptive controllers, model reference adaptive control (MRAC), the utility of neural networks, radial basis function (RBF) neural network outputs, neurons, and results of investigations of failures.

  7. Effects of elevated temperature on growth and reproduction of biofuels crops

    EPA Science Inventory

    Background/Questions/Methods Cellulosic biofuels crops have considerable potential to reduce our carbon footprint, and to be at least neutral in terms of carbon production. However, their widespread cultivation may result in unintended ecological and health effects. We report...

  8. Holistic approach for automated background EEG assessment in asphyxiated full-term infants

    NASA Astrophysics Data System (ADS)

    Matic, Vladimir; Cherian, Perumpillichira J.; Koolen, Ninah; Naulaers, Gunnar; Swarte, Renate M.; Govaert, Paul; Van Huffel, Sabine; De Vos, Maarten

    2014-12-01

    Objective. To develop an automated algorithm to quantify background EEG abnormalities in full-term neonates with hypoxic ischemic encephalopathy. Approach. The algorithm classifies 1 h of continuous neonatal EEG (cEEG) into a mild, moderate or severe background abnormality grade. These classes are well established in the literature and a clinical neurophysiologist labeled 272 1 h cEEG epochs selected from 34 neonates. The algorithm is based on adaptive EEG segmentation and mapping of the segments into the so-called segments’ feature space. Three features are suggested and further processing is obtained using a discretized three-dimensional distribution of the segments’ features represented as a 3-way data tensor. Further classification has been achieved using recently developed tensor decomposition/classification methods that reduce the size of the model and extract a significant and discriminative set of features. Main results. Effective parameterization of cEEG data has been achieved resulting in high classification accuracy (89%) to grade background EEG abnormalities. Significance. For the first time, the algorithm for the background EEG assessment has been validated on an extensive dataset which contained major artifacts and epileptic seizures. The demonstrated high robustness, while processing real-case EEGs, suggests that the algorithm can be used as an assistive tool to monitor the severity of hypoxic insults in newborns.
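The discretized three-dimensional distribution of segment features can be sketched as follows; the three feature names are hypothetical stand-ins, since the abstract does not specify them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical features for 500 adaptive EEG segments from one 1 h epoch:
# amplitude, duration, sharpness (placeholder names, not the paper's features).
features = np.column_stack([
    rng.normal(50.0, 10.0, 500),   # amplitude (uV)
    rng.uniform(0.5, 4.0, 500),    # duration (s)
    rng.normal(0.0, 1.0, 500),     # sharpness (a.u.)
])

# Discretize the joint feature distribution into a 3-way data tensor.
tensor, edges = np.histogramdd(features, bins=(8, 8, 8))

# Normalizing the counts turns the tensor into a distribution; stacking one
# such tensor per epoch gives the array that tensor decomposition
# and classification methods operate on.
epoch_tensor = tensor / tensor.sum()
```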

  9. Variance analysis of x-ray CT sinograms in the presence of electronic noise background

    PubMed Central

    Ma, Jianhua; Liang, Zhengrong; Fan, Yi; Liu, Yan; Huang, Jing; Chen, Wufan; Lu, Hongbing

    2012-01-01

    Purpose: Low-dose x-ray computed tomography (CT) is clinically desired. Accurate noise modeling is a fundamental issue for low-dose CT image reconstruction via statistics-based sinogram restoration or statistical iterative image reconstruction. In this paper, the authors analyzed the statistical moments of low-dose CT data in the presence of electronic noise background. Methods: The authors first studied the statistical moment properties of detected signals in CT transmission domain, where the noise of detected signals is considered as quanta fluctuation upon electronic noise background. Then the authors derived, via the Taylor expansion, a new formula for the mean–variance relationship of the detected signals in CT sinogram domain, wherein the image formation becomes a linear operation between the sinogram data and the unknown image, rather than a nonlinear operation in the CT transmission domain. To get insight into the derived new formula by experiments, an anthropomorphic torso phantom was scanned repeatedly by a commercial CT scanner at five different mAs levels from 100 down to 17. Results: The results demonstrated that the electronic noise background is significant when low-mAs (or low-dose) scan is performed. Conclusions: The influence of the electronic noise background should be considered in low-dose CT imaging. PMID:22830738
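The role of the electronic noise background in the mean-variance behaviour can be illustrated with a toy model of detected signals (Poisson quanta on top of Gaussian electronic noise; all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

mean_quanta = 50.0     # mean detected photon count (illustrative)
sigma_e = 5.0          # electronic noise standard deviation (illustrative)
n = 200_000

# Detected signal = quanta fluctuation plus electronic noise background.
signal = rng.poisson(mean_quanta, n) + rng.normal(0.0, sigma_e, n)

sample_mean = signal.mean()
sample_var = signal.var()

# For this model, variance ~ mean + sigma_e**2: the electronic term becomes
# relatively more important as the dose (mean quanta) drops, which is why
# the background matters most in low-mAs scans.
predicted_var = mean_quanta + sigma_e**2
```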

  10. Label-free fluorescence strategy for sensitive detection of adenosine triphosphate using a loop DNA probe with low background noise.

    PubMed

    Lin, Chunshui; Cai, Zhixiong; Wang, Yiru; Zhu, Zhi; Yang, Chaoyong James; Chen, Xi

    2014-07-15

    A simple, rapid, label-free, and ultrasensitive fluorescence strategy for adenosine triphosphate (ATP) detection was developed using a loop DNA probe with low background noise. In this strategy, a loop DNA probe, which is the substrate for both the ligation and digestion enzyme reactions, was designed. SYBR Green I (SG I), a double-strand-specific dye, was applied as the fluorescence signal readout. Exonuclease I (Exo I) and exonuclease III (Exo III), sequence-independent nucleases, were selected to digest the loop DNA probe in order to minimize the background fluorescence signal. As a result, in the absence of ATP, the loop DNA was completely digested by Exo I and Exo III, leading to low background fluorescence owing to the weak electrostatic interaction between SG I and mononucleotides. On the other hand, ATP induced the ligation of the nicking site, and the sealed loop DNA resisted digestion by Exo I and Exo III, resulting in a remarkable increase in fluorescence response. Upon background noise reduction, the sensitivity of the ATP determination was improved significantly, and the detection limit was found to be 1.2 pM, much lower than those of almost all previously reported methods. This strategy has promise for wide application in the determination of ATP.

  11. Background noise analysis in urban airport surroundings of Brazilian cities, Congonhas Airport, São Paulo

    PubMed Central

    Scatolini, Fabio; Alves, Cláudio Jorge Pinto

    2016-01-01

    OBJECTIVE To perform a quantitative analysis of the background noise at Congonhas Airport surroundings based on large sampling and uninterrupted measurements. METHODS Measuring sites were chosen from the 62 and 72 DNL (day-night level) noise contours, in urban sites compatible with residential use. Fifteen sites were monitored for at least 168 hours (seven consecutive days) without interruption. Data compilation was based on cross-referencing noise measurements with air traffic control records, and results were validated by airport meteorological reports. Preliminary diagnoses were established using the standard NBR-13368. Background noise values were calculated based on the Sound Exposure Level (SEL). Statistical parameters were calculated in one-hour intervals. RESULTS Only four of the fifteen sites assessed presented aircraft operations as a clear cause of noise annoyance. Even so, it is possible to detect background noise levels above regulation limits during periods of low airport activity or when the airport closes at night. CONCLUSIONS All the sites monitored showed background noise levels above regulation limits between 7:00 and 21:00. In the intervals between 6:00-6:59 and 21:00-22:59 the noise data, when analyzed with the current airport operational characteristics, still allow the development of additional mitigating measures. PMID:28099658
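The energy averages underlying such an analysis (hourly equivalent level and SEL) can be sketched as follows; the level values are illustrative:

```python
import math

def leq(levels_db):
    """Equivalent continuous level: energy average of sound levels in dB."""
    mean_energy = sum(10 ** (l / 10.0) for l in levels_db) / len(levels_db)
    return 10.0 * math.log10(mean_energy)

def sel(level_db, duration_s):
    """Sound Exposure Level: normalizes an event's energy to 1 second."""
    return level_db + 10.0 * math.log10(duration_s)

# One hour of per-second levels: steady 55 dB background with a short
# 70 dB event (e.g. an overflight); the event dominates the energy average.
hour = [55.0] * 3590 + [70.0] * 10
hourly_leq = leq(hour)
```

Because the average is taken over energies rather than decibel values, a few loud seconds can raise the hourly figure noticeably, which is why cross-referencing with air traffic records is needed to attribute the background level correctly.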

  12. Synchronization of video recording and laser pulses including background light suppression

    NASA Technical Reports Server (NTRS)

    Kalshoven, Jr., James E. (Inventor); Tierney, Jr., Michael (Inventor); Dabney, Philip W. (Inventor)

    2004-01-01

    An apparatus for and a method of triggering a pulsed light source, in particular a laser light source, for predictable capture of the source by video equipment. A frame synchronization signal is derived from the video signal of a camera to trigger the laser and position the resulting laser light pulse in the appropriate field of the video frame and during the opening of the electronic shutter, if such shutter is included in the camera. Positioning of the laser pulse in the proper video field allows, after recording, for the viewing of the laser light image with a video monitor using the pause mode on a standard cassette-type VCR. This invention also allows for fine positioning of the laser pulse to fall within the electronic shutter opening. For cameras with externally controllable electronic shutters, the invention provides for background light suppression by increasing shutter speed during the frame in which the laser light image is captured. This results in the laser light appearing in one frame in which the background scene is suppressed with the laser light being unaffected, while in all other frames, the shutter speed is slower, allowing for the normal recording of the background scene. This invention also allows for arbitrary (manual or external) triggering of the laser with full video synchronization and background light suppression.

  13. Lessons learned in preparing method 29 filters for compliance testing audits.

    PubMed

    Martz, R F; McCartney, J E; Bursey, J T; Riley, C E

    2000-01-01

    Companies conducting compliance testing are required to analyze audit samples at the time they collect and analyze the stack samples if audit samples are available. Eastern Research Group (ERG) provides technical support to the EPA's Emission Measurements Center's Stationary Source Audit Program (SSAP) for developing, preparing, and distributing performance evaluation samples and audit materials. These audit samples are requested via the regulatory Agency and include spiked audit materials for EPA Method 29-Metals Emissions from Stationary Sources, as well as other methods. To provide appropriate audit materials to federal, state, tribal, and local governments, as well as agencies performing environmental activities and conducting emission compliance tests, ERG has recently performed testing of blank filter materials and preparation of spiked filters for EPA Method 29. For sampling stationary sources using an EPA Method 29 sampling train, the use of filters without organic binders containing less than 1.3 microg/in.2 of each of the metals to be measured is required. Risk Assessment testing imposes even stricter requirements for clean filter background levels. Three vendor sources of quartz fiber filters were evaluated for background contamination to ensure that audit samples would be prepared using filters with the lowest metal background levels. A procedure was developed to test new filters, and a cleaning procedure was evaluated to see if a greater level of cleanliness could be achieved using an acid rinse with new filters. Background levels for filters supplied by different vendors and within lots of filters from the same vendor showed a wide variation, confirmed through contact with several analytical laboratories that frequently perform EPA Method 29 analyses. It has been necessary to repeat more than one compliance test because of suspect metals background contamination levels. 
An acid cleaning step produced an improvement in contamination levels, but the difference was not significant for most of the Method 29 target metals. From our studies, we conclude the following. Filters for Method 29 testing should be purchased in lots as large as possible. Testing firms should pre-screen new boxes and/or new lots of filters used for Method 29 testing; random analysis of three filters (top, middle, and bottom of the box) from a new box of vendor filters before allowing them to be used in field tests is a prudent approach. A box of filters from a given vendor should be screened, and filters from this screened box should be used both for testing and as field blanks in each test scenario to provide the level of quality assurance required for stationary source testing.

  14. [Target volume segmentation of PET images by an iterative method based on threshold value].

    PubMed

    Castro, P; Huerga, C; Glaría, L A; Plaza, R; Rodado, S; Marín, M D; Mañas, A; Serrada, A; Núñez, L

    2014-01-01

    An automatic segmentation method is presented for PET images, based on an iterative threshold-value approximation that includes the influence of both lesion size and the background present during acquisition. Optimal threshold values representing a correct segmentation of volumes were determined from a PET phantom study containing spheres of different sizes in different known background environments. These optimal values were normalized to background and adjusted by regression techniques to a function of two variables: lesion volume and signal-to-background ratio (SBR). This adjustment function was used to build an iterative segmentation method and then, based on this function, a procedure of automatic delineation was proposed. This procedure was validated on phantom images and its viability was confirmed by applying it retrospectively to two oncology patients. The resulting adjustment function had a linear dependence on the SBR and was inversely proportional and negative with respect to the volume. During the validation of the proposed method, it was found that the volume deviations with respect to the real value and the CT volume were below 10% and 9%, respectively, except for lesions with a volume below 0.6 ml. The proposed automatic segmentation method can be applied in clinical practice to tumor radiotherapy treatment planning in a simple and reliable way, with a precision close to the resolution of PET images. Copyright © 2013 Elsevier España, S.L.U. and SEMNIM. All rights reserved.
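The iteration between threshold and segmented volume can be sketched as below; the adjustment-function coefficients are hypothetical placeholders, not the fitted values from the phantom study:

```python
import numpy as np

def optimal_threshold(volume_ml, sbr, a=0.50, b=0.60, c=-0.01):
    """Hypothetical adjustment function: threshold as a fraction of SUVmax.

    Linear in 1/SBR and (weakly, negatively) in volume, mimicking the
    reported dependence; a, b, c are illustrative, not fitted values.
    """
    return np.clip(a + b / sbr + c * volume_ml, 0.2, 0.9)

def iterative_segmentation(img, background, voxel_ml=0.1, n_iter=20):
    """Iterate threshold <-> segmented volume until the result stabilizes."""
    sbr = img.max() / background
    thr = 0.5                                 # initial guess: 50% of max
    for _ in range(n_iter):
        mask = img >= thr * img.max()
        volume = mask.sum() * voxel_ml
        thr = float(optimal_threshold(volume, sbr))
    return mask, volume

# Synthetic 1-D "lesion" profile on a uniform background.
x = np.linspace(-3, 3, 121)
img = 1.0 + 4.0 * np.exp(-x**2)              # background 1.0, peak 5.0
mask, volume = iterative_segmentation(img, background=1.0)
```

Each pass re-evaluates the threshold from the current volume estimate and the SBR; for well-behaved adjustment functions the loop settles after a few iterations.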

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Guang, E-mail: lig2@mskcc.org; Schmidtlein, C. Ross; Humm, John L.

    Purpose: To assess and account for the impact of respiratory motion on the variability of activity and volume determination of liver tumor in positron emission tomography (PET) through a comparison between free-breathing (FB) and respiration-suspended (RS) PET images. Methods: As part of a PET/computed tomography (CT) guided percutaneous liver ablation procedure performed on a PET/CT scanner, a patient's breathing is suspended on a ventilator, allowing the acquisition of near-motionless PET and CT reference images of the liver. In this study, baseline RS and FB PET/CT images of 20 patients undergoing thermal ablation were acquired. The RS PET provides a near-motionless reference in a human study, and thereby allows a quantitative evaluation of the effect of respiratory motion on PET images obtained under FB conditions. Two methods were applied to calculate tumor activity and volume: (1) threshold-based segmentation (TBS), estimating the total lesion glycolysis (TLG) and the segmented volume and (2) histogram-based estimation (HBE), yielding the background-subtracted lesion (BSL) activity and associated volume. The TBS method employs 50% of the maximum standardized uptake value (SUVmax) as the threshold for tumors with SUVmax ≥ 2× SUVliver-bkg, and tumor activity above this threshold yields TLG50%. The HBE method determines the local PET background based on a Gaussian fit of the low-SUV peak in a SUV-volume histogram, which is generated within a user-defined and optimized volume of interest containing both local background and lesion uptakes. Voxels with PET intensity above the fitted background were considered to have originated from the tumor and used to calculate the BSL activity and its associated lesion volume. Results: Respiratory motion caused SUVmax to decrease from RS to FB by −15% ± 11% (p = 0.01).
Using the TBS method, there was also a decrease in SUVmean (−18% ± 9%, p = 0.01), but an increase in TLG50% (18% ± 36%) and in the segmented volume (47% ± 52%, p = 0.01) from RS to FB PET images. The background uptake in normal liver was stable, 1% ± 9%. In contrast, using the HBE method, the differences in both BSL activity and BSL volume from RS to FB were −8% ± 10% (p = 0.005) and 0% ± 16% (p = 0.94), respectively. Conclusions: This is the first time that almost motion-free PET images of the human liver were acquired and compared to free-breathing PET. The BSL method's results are more consistent, for the calculation of both tumor activity and volume in RS and FB PET images, than those using conventional TBS. This suggests that the BSL method might be less sensitive to motion blurring and provides an improved estimation of tumor activity and volume in the presence of respiratory motion.
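The HBE idea can be sketched on synthetic data; here the Gaussian fit of the low-SUV peak is approximated by simple moments of the sub-population, and the mu + 3*sigma cut is an illustrative choice rather than the paper's exact rule:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic SUV values inside a volume of interest: normal-liver background
# plus a hot lesion (all numbers illustrative).
suv = np.concatenate([
    rng.normal(1.0, 0.1, 9000),   # low-SUV background peak
    rng.normal(5.0, 0.5, 1000),   # lesion uptake
])

# Estimate the low-SUV background peak. The paper fits a Gaussian to the
# SUV-volume histogram peak; here that fit is approximated by the moments
# of the sub-population below SUV 3 (a simplification of the method).
low = suv[suv < 3.0]
mu, sigma = low.mean(), low.std()

# Voxels above the fitted background are attributed to the lesion; the
# background mean is then subtracted from their total activity.
cut = mu + 3.0 * sigma
bsl_voxels = suv[suv > cut]
bsl_activity = bsl_voxels.sum() - mu * bsl_voxels.size
```

Because the background is estimated locally and subtracted, the result depends less on the absolute peak value than a fixed-percentage-of-SUVmax threshold does, which is consistent with the reduced motion sensitivity reported above.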

  16. Evaluating flow cytometer performance with weighted quadratic least squares analysis of LED and multi-level bead data.

    PubMed

    Parks, David R; El Khettabi, Faysal; Chase, Eric; Hoffman, Robert A; Perfetto, Stephen P; Spidlen, Josef; Wood, James C S; Moore, Wayne A; Brinkman, Ryan R

    2017-03-01

    We developed a fully automated procedure for analyzing data from LED pulses and multilevel bead sets to evaluate backgrounds and photoelectron scales of cytometer fluorescence channels. The method improves on previous formulations by fitting a full quadratic model with appropriate weighting and by providing standard errors and peak residuals as well as the fitted parameters themselves. Here we describe the details of the methods and procedures involved and present a set of illustrations and test cases that demonstrate the consistency and reliability of the results. The automated analysis and fitting procedure is generally quite successful in providing good estimates of the Spe (statistical photoelectron) scales and backgrounds for all the fluorescence channels on instruments with good linearity. The precision of the results obtained from LED data is almost always better than that from multilevel bead data, but the bead procedure is easy to carry out and provides results good enough for most purposes. Including standard errors on the fitted parameters is important for understanding the uncertainty in the values of interest. The weighted residuals give information about how well the data fits the model, and particularly high residuals indicate bad data points. Known photoelectron scales and measurement channel backgrounds make it possible to estimate the precision of measurements at different signal levels and the effects of compensated spectral overlap on measurement quality. Combining this information with measurements of standard samples carrying dyes of biological interest, we can make accurate comparisons of dye sensitivity among different instruments. Our method is freely available through the R/Bioconductor package flowQB. © 2017 International Society for Advancement of Cytometry.
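A weighted quadratic fit of bead mean-variance data can be sketched with NumPy; the coefficients, bead levels, and event count below are illustrative, not values from flowQB:

```python
import numpy as np

# Multi-level bead data: fluorescence mean vs. variance per bead population.
# Model: variance = c0 + c1*mean + c2*mean**2, where c0 reflects the channel
# background and 1/c1 gives the statistical-photoelectron (Spe) scale.
means = np.array([50.0, 200.0, 800.0, 3200.0, 12800.0])
true = (400.0, 2.0, 1e-4)                       # illustrative coefficients
variances = true[0] + true[1] * means + true[2] * means**2

# Weight each point by the precision of its variance estimate: with n events
# per population, var(s^2) ~ 2*sigma^4/(n-1). np.polyfit weights multiply
# the residuals, so w should be ~ 1/std of each y value.
n_events = 10_000
w = 1.0 / (variances * np.sqrt(2.0 / (n_events - 1)))

c2, c1, c0 = np.polyfit(means, variances, deg=2, w=w)
spe_scale = 1.0 / c1          # statistical photoelectrons per intensity unit
background = c0
```

Weighting matters because the variance estimates at high bead levels are much larger (and proportionally noisier) than those near background; an unweighted fit would let the brightest populations dominate.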

  17. An improved approach for the segmentation of starch granules in microscopic images

    PubMed Central

    2010-01-01

    Background Starches are the main storage polysaccharides in plants and are distributed widely throughout plants, including seeds, roots, tubers, leaves and stems. Currently, microscopic observation is one of the most important ways to investigate and analyze the structure of starches. The position, shape, and size of the starch granules are the main measurements for quantitative analysis. In order to obtain these measurements, segmentation of starch granules from the background is very important. However, automatic segmentation of starch granules is still a challenging task because of the limitations of imaging conditions and the complex scenarios of overlapping granules. Results We propose a novel method to segment starch granules in microscopic images. In the proposed method, we first separate starch granules from the background using automatic thresholding and then roughly segment the image using the watershed algorithm. In order to reduce the oversegmentation of the watershed algorithm, we use the roundness of each segment and analyze the gradient vector field to find the critical points and thereby identify oversegments. After oversegments are found, we extract their features, such as position and intensity, and use fuzzy c-means clustering to merge the oversegments into objects with similar features. Experimental results demonstrate that the proposed method successfully alleviates the oversegmentation of the watershed segmentation algorithm. Conclusions We present a new scheme for starch granule segmentation. The proposed scheme aims to alleviate the oversegmentation in the watershed algorithm. We use the shape information and critical points of the gradient vector flow (GVF) of starch granules to identify oversegments, and use fuzzy c-means clustering based on prior knowledge to merge these oversegments into objects. Experimental results on twenty microscopic starch images demonstrate the effectiveness of the proposed scheme. PMID:21047380
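The merging step can be sketched with a minimal fuzzy c-means implementation over hypothetical oversegment features (centroid coordinates and mean intensity; the feature choice and values are illustrative):

```python
import numpy as np

def fuzzy_cmeans(X, n_clusters, m=2.0, n_iter=100, seed=0):
    """Minimal fuzzy c-means: returns membership matrix U and cluster centers."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(n_clusters), size=len(X))  # random memberships
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))    # u_ij ~ d_ij^(-2/(m-1)), normalized
        U = inv / inv.sum(axis=1, keepdims=True)
    return U, centers

# Hypothetical oversegment features (centroid x, centroid y, mean intensity):
# two watershed fragments per true granule; fragments of the same granule
# share similar features and should be merged.
X = np.array([[10.0, 10.0, 0.80], [11.0, 10.5, 0.82],   # granule A fragments
              [40.0, 42.0, 0.50], [41.0, 41.0, 0.48]])  # granule B fragments
U, centers = fuzzy_cmeans(X, n_clusters=2)
labels = U.argmax(axis=1)   # fragments with the same label are merged
```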

  18. Background Error Covariance Estimation using Information from a Single Model Trajectory with Application to Ocean Data Assimilation into the GEOS-5 Coupled Model

    NASA Technical Reports Server (NTRS)

    Keppenne, Christian L.; Rienecker, Michele M.; Kovach, Robin M.; Vernieres, Guillaume; Koster, Randal D. (Editor)

    2014-01-01

    An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.
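The FAST idea of drawing covariance information from a moving window along a single trajectory can be sketched on a toy two-variable system (the coupling, variable names, and window length are illustrative, not the MOM4.1 configuration):

```python
import numpy as np

rng = np.random.default_rng(4)

# A single model trajectory of two coupled variables: T (observed, like Argo
# temperature) and S (unobserved, like salinity). Values are illustrative.
n_steps = 400
T = np.cumsum(rng.normal(0, 0.1, n_steps)) + 15.0
S = 35.0 + 0.2 * (T - 15.0) + rng.normal(0, 0.02, n_steps)  # coupled to T

def fast_gain(T, S, t, window=50):
    """FAST-style gain: covariances from a moving window along the trajectory."""
    lo = max(0, t - window)
    Tw, Sw = T[lo:t], S[lo:t]
    cov_TS = np.cov(Tw, Sw)[0, 1]
    var_T = Tw.var(ddof=1)
    return cov_TS / var_T          # regression of S on T from the window

# Update the unobserved S from an observed-minus-background T increment,
# using only samples drawn from the single trajectory.
t = 300
innovation = 0.5                   # (observed T) - (model T), illustrative
S_update = fast_gain(T, S, t) * innovation
```

The window plays the role of the ensemble: no extra model integrations are needed, which is the cost advantage noted in the abstract.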

  19. Background Error Covariance Estimation Using Information from a Single Model Trajectory with Application to Ocean Data Assimilation

    NASA Technical Reports Server (NTRS)

    Keppenne, Christian L.; Rienecker, Michele; Kovach, Robin M.; Vernieres, Guillaume

    2014-01-01

    An attractive property of ensemble data assimilation methods is that they provide flow dependent background error covariance estimates which can be used to update fields of observed variables as well as fields of unobserved model variables. Two methods to estimate background error covariances are introduced which share the above property with ensemble data assimilation methods but do not involve the integration of multiple model trajectories. Instead, all the necessary covariance information is obtained from a single model integration. The Space Adaptive Forecast error Estimation (SAFE) algorithm estimates error covariances from the spatial distribution of model variables within a single state vector. The Flow Adaptive error Statistics from a Time series (FAST) method constructs an ensemble sampled from a moving window along a model trajectory. SAFE and FAST are applied to the assimilation of Argo temperature profiles into version 4.1 of the Modular Ocean Model (MOM4.1) coupled to the GEOS-5 atmospheric model and to the CICE sea ice model. The results are validated against unassimilated Argo salinity data. They show that SAFE and FAST are competitive with the ensemble optimal interpolation (EnOI) used by the Global Modeling and Assimilation Office (GMAO) to produce its ocean analysis. Because of their reduced cost, SAFE and FAST hold promise for high-resolution data assimilation applications.

  20. Operation of an InGrid based X-ray detector at the CAST experiment

    NASA Astrophysics Data System (ADS)

    Krieger, Christoph; Desch, Klaus; Kaminski, Jochen; Lupberger, Michael

    2018-02-01

    The CERN Axion Solar Telescope (CAST) is searching for axions and other particles which could be candidates for Dark Matter and even Dark Energy. These particles could be produced in the Sun and detected by a conversion into soft X-ray photons inside a strong magnetic field. In order to increase the sensitivity for physics beyond the Standard Model, detectors with a threshold below 1 keV as well as efficient background rejection methods are required to compensate for the low energies and weak couplings, which result in very low detection rates. These criteria are fulfilled by a detector utilizing the combination of a pixelized readout chip with an integrated Micromegas stage. These InGrid (Integrated Grid) devices can be built by photolithographic postprocessing techniques, resulting in a close to perfect match of grid and pixels, facilitating the detection of single electrons on the chip surface. The high spatial resolution allows for energy determination by simple electron counting as well as for an event-shape based analysis as a background rejection method. Tests at an X-ray generator revealed the energy threshold of an InGrid based X-ray detector to be well below the carbon Kα line at 277 eV. After the successful demonstration of the detector's key features, the detector was mounted at one of CAST's four detector stations behind an X-ray telescope in 2014. After several months of successful operation without any detector-related interruptions, the InGrid based X-ray detector continued data taking at CAST in 2015. During operation at the experiment, background rates on the order of 10^-5 keV^-1 cm^-2 s^-1 have been achieved by application of a likelihood-based method discriminating the non-photon background originating mostly from cosmic rays. For continued operation in 2016, an upgraded InGrid based detector is to be installed, among other improvements including decoupling and sampling of the signal induced on the grid as well as a veto scintillator, to further lower the observed background rates and improve sensitivity.

  1. Implementation of a chemical background method (OH-CHEM) for measurements of OH using the Leeds FAGE instrument: Characterisation and observations from a coastal location

    NASA Astrophysics Data System (ADS)

    Woodward-Massey, R.; Cryer, D. R.; Whalley, L. K.; Ingham, T.; Seakins, P. W.; Heard, D. E.; Stimpson, L. M.

    2015-12-01

    The removal of pollutants and greenhouse gases in the troposphere is dominated by reactions with the hydroxyl radical (OH), which is closely coupled to the hydroperoxy radical (HO2). Comparisons of the levels of OH and HO2 observed during field campaigns to the results of detailed chemical box models serve as a vital tool to assess our understanding of the underlying chemical mechanisms involved in tropospheric oxidation. Recent measurements of OH and HO2 radicals are significantly higher than those predicted by models for some instruments measuring in certain environments, especially those influenced by high emissions of biogenic compounds such as isoprene, prompting intense laboratory research to account for such discrepancies. While current chemical mechanisms are likely incomplete, it is also possible that, at least in part, these elevated radical observations have been influenced by instrumental biases from interfering species. Recent studies have suggested that fluorescence assay by gas expansion (FAGE) instruments may be susceptible to an unknown interference in the measurement of OH. This hypothesis can be tested through the implementation of an alternative method to determine the OH background signal, whereby OH is removed by the addition of a chemical scavenger prior to sampling by FAGE. The Leeds FAGE instrument was modified to facilitate this method by the construction of an inlet pre-injector (IPI), where OH is removed through reaction with propane. The modified Leeds FAGE instrument was deployed at a coastal location in southeast England during summer 2015 as part of the ICOZA (Integrated Chemistry of OZone in the Atmosphere) project. Measurements of OH made using both background methods will be presented, alongside results from laboratory characterisation experiments and details of the IPI design.

  2. Estimation of background noise level on seismic station using statistical analysis for improved analysis accuracy

    NASA Astrophysics Data System (ADS)

    Han, S. M.; Hahm, I.

    2015-12-01

    We evaluated the background noise levels of seismic stations in order to collect high-quality observation data and produce accurate seismic information. The background noise level was determined using the power spectral density (PSD) method of McNamara and Buland (2004). Because this method uses long-term data, it is influenced not only by the sensor's intrinsic electronic noise and pulse transients produced during stabilization, but also by missing data and by irregular signals at particular frequencies that do not reflect site characteristics. Implementing a process that filters out such abnormal signals within an automated system is difficult and inefficient. To solve these problems, we devised a method that, at each period, retains only the data falling within 90 to 99% confidence intervals of a normal distribution. The applicability of the method was verified using 62 seismic stations with broadband and short-period sensors operated by the KMA (Korea Meteorological Administration). The evaluation standards were the NHNM (New High Noise Model) and NLNM (New Low Noise Model) published by the USGS (United States Geological Survey). Those models were designed based on data from the western United States; the Korean Peninsula, however, is surrounded by ocean on three sides and has a complicated geological structure and a high population density. We therefore re-designed an appropriate model for the Korean Peninsula from the statistically combined results. An important feature is that the secondary-microseism peak appears at a higher frequency band. Acknowledgements: This research was carried out as a part of "Research for the Meteorological and Earthquake Observation Technology and Its Application" supported by the 2015 National Institute of Meteorological Research (NIMR) in the Korea Meteorological Administration.
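The per-period percentile screening can be sketched as follows; the simulated noise, contamination, and 1st-99th percentile band are illustrative stand-ins for the 90 to 99% confidence intervals described above:

```python
import numpy as np

rng = np.random.default_rng(5)

fs = 100.0                       # sampling rate (Hz), illustrative
seg_len = 1024
n_segs = 200

# Segments of simulated station noise; a few segments carry spikes
# (e.g. stabilization pulses) that should not enter the background model.
segments = rng.normal(0.0, 1.0, (n_segs, seg_len))
segments[:5, 100] += 200.0       # contaminated segments

# Per-segment power spectral density via a windowed periodogram.
win = np.hanning(seg_len)
spec = np.abs(np.fft.rfft(segments * win, axis=1)) ** 2
psd = spec / (fs * (win ** 2).sum())
psd_db = 10.0 * np.log10(psd + 1e-30)

# Keep, at each period, only values inside a central percentile band, so
# outlier segments cannot distort the station's background noise model.
lo = np.percentile(psd_db, 1.0, axis=0)
hi = np.percentile(psd_db, 99.0, axis=0)
clean = np.where((psd_db >= lo) & (psd_db <= hi), psd_db, np.nan)
background_level = np.nanmedian(clean, axis=0)   # station noise model (dB)
```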

  3. Non-perturbative background field calculations

    NASA Astrophysics Data System (ADS)

    Stephens, C. R.

    1988-01-01

    New methods are developed for calculating one loop functional determinants in quantum field theory. Instead of relying on a calculation of all the eigenvalues of the small fluctuation equation, these techniques exploit the ability of the proper time formalism to reformulate an infinite dimensional field theoretic problem into a finite dimensional covariant quantum mechanical analog, thereby allowing powerful tools such as the method of Jacobi fields to be used advantageously in a field theory setting. More generally the methods developed herein should be extremely valuable when calculating quantum processes in non-constant background fields, offering a utilitarian alternative to the two standard methods of calculation—perturbation theory in the background field or taking the background field into account exactly. The formalism developed also allows for the approximate calculation of covariances of partial differential equations from a knowledge of the solutions of a homogeneous ordinary differential equation.

  4. Chinese Herbal Medicine Image Recognition and Retrieval by Convolutional Neural Network

    PubMed Central

    Sun, Xin; Qian, Huinan

    2016-01-01

    Chinese herbal medicine image recognition and retrieval have great potential for practical applications. Several previous studies have focused on recognition with hand-crafted image features, but they have two limitations. Firstly, most hand-crafted features provide low-level image representations that are easily affected by noise and background. Secondly, the medicine images used in previous studies are very clean, without any background, which makes the methods difficult to apply in practice. Therefore, designing high-level image representations for recognition and retrieval in real-world medicine images is a great challenge. Inspired by the recent progress of deep learning in computer vision, we realize that deep learning methods may provide robust medicine image representations. In this paper, we propose to use a Convolutional Neural Network (CNN) for Chinese herbal medicine image recognition and retrieval. For the recognition problem, we use the softmax loss to optimize the recognition network; for the retrieval problem, we fine-tune the recognition network by adding a triplet loss to search for the most similar medicine images. To evaluate our method, we construct a public database of herbal medicine images with cluttered backgrounds, containing in total 5523 images in 95 popular Chinese medicine categories. Experimental results show that our method achieves an average recognition precision of 71% and an average retrieval precision of 53% over all 95 medicine categories, which is quite promising given that the real-world images contain occluded herb pieces and cluttered backgrounds. Moreover, our proposed method improves on previous studies by a large margin, achieving state-of-the-art performance. PMID:27258404
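    The retrieval fine-tuning objective described above can be sketched numerically; a minimal NumPy version of the standard triplet loss (the function name and margin value are illustrative, not taken from the paper):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Hinge-style triplet loss on embedding vectors: pull the anchor
    toward the positive (same herb category) and push it away from the
    negative (different category) by at least `margin`."""
    d_pos = np.sum((anchor - positive) ** 2)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2)  # squared distance to negative
    return max(0.0, d_pos - d_neg + margin)
```

    During fine-tuning the loss is averaged over sampled (anchor, positive, negative) image triplets, so embeddings of the same category cluster together for nearest-neighbour retrieval.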

  5. A phantom design for assessment of detectability in PET imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wollenweber, Scott D., E-mail: scott.wollenweber@g

    2016-09-15

    Purpose: The primary clinical role of positron emission tomography (PET) imaging is the detection of anomalous regions of {sup 18}F-FDG uptake, which are often indicative of malignant lesions. The goal of this work was to create a task-configurable fillable phantom for realistic measurements of detectability in PET imaging. Design goals included simplicity, adjustable feature size, realistic size and contrast levels, and inclusion of a lumpy (i.e., heterogeneous) background. Methods: The detection targets were hollow 3D-printed dodecahedral nylon features. The exostructure sphere-like features created voids in a background of small, solid non-porous plastic (acrylic) spheres inside a fillable tank. The features filled at full concentration while the background concentration was reduced due to filling only between the solid spheres. Results: Multiple iterations of feature size and phantom construction were used to determine a configuration at the limit of detectability for a PET/CT system. A full-scale design used a 20 cm uniform cylinder (head-size) filled with a fixed pattern of features at a contrast of approximately 3:1. Known signal-present and signal-absent PET sub-images were extracted from multiple scans of the same phantom and with detectability in a challenging (i.e., useful) range. These images enabled calculation and comparison of the quantitative observer detectability metrics between scanner designs and image reconstruction methods. The phantom design has several advantages including filling simplicity, wall-less contrast features, the control of the detectability range via feature size, and a clinically realistic lumpy background. Conclusions: This phantom provides a practical method for testing and comparison of lesion detectability as a function of imaging system, acquisition parameters, and image reconstruction methods and parameters.
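    The ~3:1 contrast can be rationalized from the sphere packing alone. A small sketch, assuming the acrylic spheres pack near the random-close-packing fraction of about 0.64 (an assumed value, not stated in the abstract):

```python
def expected_contrast(packing_fraction=0.64):
    """Features fill completely (relative concentration 1.0) while the
    background activity occupies only the voids between the solid spheres,
    so its apparent concentration is (1 - packing_fraction)."""
    return 1.0 / (1.0 - packing_fraction)

# With an assumed random-close-packing fraction of 0.64 the predicted
# feature-to-background contrast is about 2.8:1, consistent with the
# "approximately 3:1" reported in the abstract.
```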

  6. On the solution of the complex eikonal equation in acoustic VTI media: A perturbation plus optimization scheme

    NASA Astrophysics Data System (ADS)

    Huang, Xingguo; Sun, Jianguo; Greenhalgh, Stewart

    2018-04-01

    We present methods for obtaining numerical and analytic solutions of the complex eikonal equation in inhomogeneous acoustic VTI media (transversely isotropic media with a vertical symmetry axis). The key and novel point of the method for obtaining numerical solutions is to transform the problem of solving the highly nonlinear acoustic VTI eikonal equation into one of solving the relatively simple eikonal equation for the background (isotropic) medium and a system of linear partial differential equations. Specifically, to obtain the real and imaginary parts of the complex traveltime in inhomogeneous acoustic VTI media, we generalize a perturbation theory, which was developed earlier for solving the conventional real eikonal equation in inhomogeneous anisotropic media, to the complex eikonal equation in such media. After the perturbation analysis, we obtain two types of equations. One is the complex eikonal equation for the background medium and the other is a system of linearized partial differential equations for the coefficients of the corresponding complex traveltime formulas. To solve the complex eikonal equation for the background medium, we employ an optimization scheme that we developed for solving the complex eikonal equation in isotropic media. Then, to solve the system of linearized partial differential equations for the coefficients of the complex traveltime formulas, we use the finite difference method based on the fast marching strategy. Furthermore, by applying the complex source point method and the paraxial approximation, we develop the analytic solutions of the complex eikonal equation in acoustic VTI media, both for the isotropic and elliptical anisotropic background medium. Our numerical results demonstrate the effectiveness of our derivations and illustrate the influence of the beam widths and the anisotropic parameters on the complex traveltimes.
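    The skeleton of the scheme can be sketched as follows (the notation is assumed for illustration and is not taken from the paper: τ is the complex traveltime, v₀ the background velocity, and ε a small parameter tied to the anisotropy):

```latex
% Expand the complex traveltime about the background solution
\tau = \tau_0 + \epsilon\,\tau_1 + \epsilon^2\,\tau_2 + \cdots
% Zeroth order: the eikonal equation in the isotropic background medium
\left(\nabla\tau_0\right)^2 = \frac{1}{v_0^2}
% Order n >= 1: linear first-order PDEs for the correction terms,
% solvable by finite differences with a fast-marching strategy
2\,\nabla\tau_0 \cdot \nabla\tau_n = F_n\!\left(\tau_0,\dots,\tau_{n-1}\right)
```

    Because each correction term satisfies a linear equation driven by previously computed terms, only the background eikonal equation itself requires a nonlinear (here, optimization-based) solver.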

  7. Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data

    PubMed Central

    2014-01-01

    Background High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high-dimensional experimental data pose significant challenges to transcriptomics and metabolomics data analysis methods, which may produce spurious rather than biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge into the analysis, used to reduce the solution space and/or to focus the analysis on biologically meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We organize the reviewed methods into three groups based on the underlying mathematical model: exploratory methods, supervised methods and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated and how it modifies the mathematical properties of the underlying methods. PMID:25033193

  8. Dual-threshold segmentation using Arimoto entropy based on chaotic bee colony optimization

    NASA Astrophysics Data System (ADS)

    Li, Li

    2018-03-01

    In order to extract targets from complex backgrounds more quickly and accurately, and to further improve the detection of defects, a dual-threshold segmentation method using Arimoto entropy based on chaotic bee colony optimization is proposed. Firstly, single-threshold selection based on Arimoto entropy was extended to dual-threshold selection in order to separate the target from the background more accurately. Then, the intermediate variables in the Arimoto entropy dual-threshold selection formulae were calculated by recursion to eliminate redundant computation and reduce the amount of calculation. Finally, the local search phase of the artificial bee colony algorithm was improved with a chaotic sequence based on the tent map, so the fast search for the two optimal thresholds is achieved by the improved bee colony optimization algorithm and the search is markedly accelerated. Extensive experimental results show that, compared with existing segmentation methods such as multi-threshold segmentation using maximum Shannon entropy, two-dimensional Shannon entropy segmentation, two-dimensional Tsallis gray entropy segmentation and multi-threshold segmentation using reciprocal gray entropy, the proposed method segments targets more quickly and accurately, with superior segmentation quality, and proves to be a fast and effective method for image segmentation.
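    The tent-map chaotic sequence used to drive the improved local search can be sketched as follows (parameter values are illustrative assumptions):

```python
def tent_map_sequence(x0=0.37, n=10, mu=2.0):
    """Generate a chaotic sequence with the tent map,
    x_{k+1} = mu*x_k for x_k < 0.5, else mu*(1 - x_k),
    used here in place of uniform random perturbations in the
    bee colony's local search to improve coverage of the search space."""
    xs, x = [], x0
    for _ in range(n):
        x = mu * x if x < 0.5 else mu * (1.0 - x)
        xs.append(x)
    return xs
```

    Each candidate threshold is then perturbed by the next value of the sequence, which is ergodic over [0, 1] and avoids the clustering that pseudo-random perturbations can exhibit.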

  9. Detection of enhancement in number densities of background galaxies due to magnification by massive galaxy clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiu, I.; Dietrich, J. P.; Mohr, J.

    2016-02-18

    We present a detection of the enhancement in the number densities of background galaxies induced by lensing magnification and use it to test the Sunyaev-Zel'dovich effect (SZE-) inferred masses in a sample of 19 galaxy clusters with median redshift z ≈ 0.42 selected from the South Pole Telescope SPT-SZ survey. These clusters were observed by Megacam on the Magellan Clay Telescope through gri filters. Two background galaxy populations are selected for this study through their photometric colours; they have median redshifts z_median ≈ 0.9 (low-z background) and z_median ≈ 1.8 (high-z background). Stacking these populations, we detect the magnification bias effect at 3.3σ and 1.3σ for the low- and high-z backgrounds, respectively. We fit Navarro, Frenk and White models simultaneously to all observed magnification bias profiles to estimate the multiplicative factor η that describes the ratio of the weak lensing mass to the mass inferred from the SZE observable-mass relation. We further quantify systematic uncertainties in η resulting from the photometric noise and bias, the cluster galaxy contamination and the estimation of the background properties. The resulting η for the combined background populations with 1σ uncertainties is 0.83 ± 0.24(stat) ± 0.074(sys), indicating good consistency between the lensing and the SZE-inferred masses. We use our best-fitting η to predict the weak lensing shear profiles and compare these predictions with observations, showing agreement between the magnification and shear mass constraints. This work demonstrates the promise of using magnification as a complementary method to estimate cluster masses in large surveys.

  10. Attractiveness Compensates for Low Status Background in the Prediction of Educational Attainment

    PubMed Central

    Bauldry, Shawn; Shanahan, Michael J.; Russo, Rosemary; Roberts, Brent W.; Damian, Rodica

    2016-01-01

    Background People who are perceived as good looking or as having a pleasant personality enjoy many advantages, including higher educational attainment. This study examines (1) whether associations between physical/personality attractiveness and educational attainment vary by parental socioeconomic resources and (2) whether parental socioeconomic resources predict these forms of attractiveness. Based on the theory of resource substitution with structural amplification, we hypothesized that both types of attractiveness would have a stronger association with educational attainment for people from disadvantaged backgrounds (resource substitution), but also that people from disadvantaged backgrounds would be less likely to be perceived as attractive (amplification). Methods This study draws on data from the National Longitudinal Study of Adolescent to Adult Health—including repeated interviewer ratings of respondents’ attractiveness—and trait-state structural equation models to examine the moderation (substitution) and mediation (amplification) of physical and personality attractiveness in the link between parental socioeconomic resources and educational attainment. Results Both perceived personality and physical attractiveness have stronger associations with educational attainment for people from families with lower levels of parental education (substitution). Further, parental education and income are associated with both dimensions of perceived attractiveness, and personality attractiveness is positively associated with educational attainment (amplification). Results do not differ by sex and race/ethnicity. Further, associations between perceived attractiveness and educational attainment remain after accounting for unmeasured family-level confounders using a sibling fixed-effects model. 
Conclusions Perceived attractiveness, particularly personality attractiveness, is a more important psychosocial resource for educational attainment for people from disadvantaged backgrounds than for people from advantaged backgrounds. People from disadvantaged backgrounds, however, are less likely to be perceived as attractive than people from advantaged backgrounds. PMID:27249216

  11. Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.

    PubMed

    Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M

    2016-05-05

    Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.

  12. IMPROVING DETECTION METHODS FOR ENTERIC WATERBORNE VIRUSES

    EPA Science Inventory

    Waterborne viruses are a significant cause of illness, both within the US and worldwide. These illnesses can occur as the result of outbreaks, potentially affecting hundreds or thousands of people, or as a part of a background level of endemic infection. While many of these out...

  13. IMPACT OF AN INDOOR COOK STOVE INTERVENTION ON MEASURES OF SYSTEMIC INFLAMMATION

    EPA Science Inventory

    Background and Aims: Approximately three billion people use inefficient and poorly-vented indoor cook stoves, which can result in high indoor air pollution concentrations. Few studies have evaluated the cardiovascular effects of indoor biomass burning. Methods: In this pilot s...

  14. An Investigation of Milk Sugar.

    ERIC Educational Resources Information Center

    Smith, Christopher A.; Dawson, Maureen M.

    1987-01-01

    Describes an experiment to identify lactose and estimate the concentration of lactose in a sample of milk. Gives a background of the investigation. Details the experimental method, results and calculations. Discusses the implications of the experiment to students. Suggests further experiments using the same technique used in…

  15. Safer staining method for acid fast bacilli.

    PubMed Central

    Ellis, R C; Zabrowarny, L A

    1993-01-01

    To develop a method for staining acid fast bacilli which excludes highly toxic phenol from the staining solution, a lipophilic agent, a liquid organic detergent (LOC High Suds, distributed by Amway), was substituted. The acid fast bacilli stained red; nuclei, cytoplasm, and cytoplasmic elements stained blue on a clear background. These results compare very favourably with acid fast bacilli stained by the traditional method. Detergents are efficient lipophilic agents and safer to handle than phenol. The method described here stains acid fast bacilli as efficiently as traditional carbol fuchsin methods. LOC High Suds is considerably cheaper than phenol. PMID:7687254

  16. Safer staining method for acid fast bacilli.

    PubMed

    Ellis, R C; Zabrowarny, L A

    1993-06-01

    To develop a method for staining acid fast bacilli which excludes highly toxic phenol from the staining solution, a lipophilic agent, a liquid organic detergent (LOC High Suds, distributed by Amway), was substituted. The acid fast bacilli stained red; nuclei, cytoplasm, and cytoplasmic elements stained blue on a clear background. These results compare very favourably with acid fast bacilli stained by the traditional method. Detergents are efficient lipophilic agents and safer to handle than phenol. The method described here stains acid fast bacilli as efficiently as traditional carbol fuchsin methods. LOC High Suds is considerably cheaper than phenol.

  17. Calculation of the detection limits for radionuclides identified in gamma-ray spectra based on post-processing peak analysis results.

    PubMed

    Korun, M; Vodenik, B; Zorko, B

    2018-03-01

    A new method for calculating the detection limits of gamma-ray spectrometry measurements is presented. The method is applicable to gamma-ray emitters irrespective of the influence of the peaked background, the origin of the background and the overlap with other peaks. For multi-gamma-ray emitters it offers the opportunity to calculate a common detection limit corresponding to several peaks. The detection limit is calculated by approximating the dependence of the uncertainty in the indication on its value with a second-order polynomial. In this approach the relation between the input quantities and the detection limit is described by an explicit expression and can be easily investigated. The detection limit is calculated from the data usually provided in the reports of peak-analyzing programs, the peak areas and their uncertainties; as a result, the need to use individual channel contents for calculating the detection limit is bypassed. Copyright © 2017 Elsevier Ltd. All rights reserved.
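    A minimal sketch of the underlying idea, assuming an ISO 11929 / Currie-style formulation in which the detection limit solves a fixed-point equation involving the uncertainty function (the uncertainty model and coverage factor below are illustrative, not the paper's values):

```python
def detection_limit(u, k=1.645, iters=50):
    """ISO 11929-style detection limit: the decision threshold is
    y* = k*u(0); the detection limit y# solves y# = y* + k*u(y#),
    where `u` maps an assumed true indication value to its standard
    uncertainty (approximated, as in the paper, by a low-order
    polynomial). Solved here by simple fixed-point iteration."""
    y_star = k * u(0.0)          # decision threshold
    y = y_star                   # starting guess
    for _ in range(iters):
        y = y_star + k * u(y)
    return y
```

    With a constant uncertainty u(y) = σ this reduces to the familiar y# = 2kσ; an uncertainty that grows with the indication pushes the detection limit higher.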

  18. LAI inversion from optical reflectance using a neural network trained with a multiple scattering model

    NASA Technical Reports Server (NTRS)

    Smith, James A.

    1992-01-01

    The inversion of the leaf area index (LAI) canopy parameter from optical spectral reflectance measurements is obtained using a backpropagation artificial neural network trained using input-output pairs generated by a multiple scattering reflectance model. The problem of LAI estimation over sparse canopies (LAI < 1.0) with varying soil reflectance backgrounds is particularly difficult. Standard multiple regression methods applied to canopies within a single homogeneous soil type yield good results but perform unacceptably when applied across soil boundaries, resulting in absolute percentage errors of >1000 percent for low LAI. Minimization methods applied to merit functions constructed from differences between measured reflectances and predicted reflectances using multiple-scattering models are unacceptably sensitive to a good initial guess for the desired parameter. In contrast, the neural network reported generally yields absolute percentage errors of <30 percent when weighting coefficients trained on one soil type were applied to predicted canopy reflectance at a different soil background.
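    The inversion scheme can be illustrated with a toy version: a one-hidden-layer network trained by backpropagation on input-output pairs generated from a simple forward reflectance model (the forward model, network size and learning rate below are assumptions for illustration, not the paper's multiple-scattering model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model (an assumption): reflectance mixes soil and canopy
# contributions through a Beer-Lambert-style canopy gap fraction.
def reflectance(lai, soil):
    gap = np.exp(-0.5 * lai)              # fraction of soil visible
    return soil * gap + 0.45 * (1.0 - gap)

# Training pairs generated from the forward model, as in the paper's setup
lai = rng.uniform(0.0, 1.0, (512, 1))     # sparse canopies, LAI < 1
soil = rng.uniform(0.05, 0.35, (512, 1))  # varying soil backgrounds
x = np.hstack([reflectance(lai, soil), soil])
y = lai                                   # network inverts reflectance -> LAI

# One-hidden-layer network trained by plain backpropagation
w1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
w2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ w1 + b1)
    return h, h @ w2 + b2

losses = []
for _ in range(2000):
    h, pred = forward(x)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # backpropagated gradients of the mean-squared error
    g2, gb2 = h.T @ err / len(x), err.mean(0)
    gh = (err @ w2.T) * (1 - h ** 2)
    g1, gb1 = x.T @ gh / len(x), gh.mean(0)
    for p, g in ((w2, g2), (b2, gb2), (w1, g1), (b1, gb1)):
        p -= 0.5 * g                      # gradient-descent update
```

    Because the soil reflectance is an input rather than a fixed regression context, the trained weights can be applied across soil types, which is the behaviour the abstract contrasts with per-soil regression.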

  19. Gross beta determination in drinking water using scintillating fiber array detector.

    PubMed

    Lv, Wen-Hui; Yi, Hong-Chang; Liu, Tong-Qing; Zeng, Zhi; Li, Jun-Li; Zhang, Hui; Ma, Hao

    2018-04-04

    A scintillating fiber array detector for gross beta counting was developed to monitor the real-time radioactivity in drinking water. The detector, placed in a stainless-steel tank, consists of 1096 scintillating fibers, both sides of which are connected to a photomultiplier tube. The detector parameters, including working voltage, background counting rate and stability, were tested, and the detection efficiency was calibrated using a standard potassium chloride solution. Water samples were measured with the detector and the results were consistent with those obtained by the evaporation method. The background counting rate of the detector is 38.131 ± 0.005 cps, and the detection efficiency for β particles is 0.37 ± 0.01 cps/(Bq/l). The MDAC of this system can be less than 1.0 Bq/l for β particles in 120 min without pre-concentration. Copyright © 2018 Elsevier Ltd. All rights reserved.
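    Assuming the MDAC follows the standard Currie formula (an assumption; the abstract does not state which formula was used), the reported figures are mutually consistent:

```python
import math

def mdac(background_cps, efficiency, t_seconds):
    """Currie-style minimum detectable activity concentration (Bq/l):
    MDAC = (2.71 + 4.65*sqrt(B*t)) / (eff * t),
    with B the background rate in cps and eff in cps/(Bq/l)."""
    counts_bg = background_cps * t_seconds
    ld = 2.71 + 4.65 * math.sqrt(counts_bg)   # detection limit in counts
    return ld / (efficiency * t_seconds)

# Using the reported B = 38.131 cps, eff = 0.37 cps/(Bq/l) and t = 120 min
# gives roughly 0.9 Bq/l, consistent with the stated "< 1.0 Bq/l".
```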

  20. Spectral characterization of natural backgrounds

    NASA Astrophysics Data System (ADS)

    Winkelmann, Max

    2017-10-01

    As the distribution and use of hyperspectral sensors are constantly increasing, the exploitation of spectral features is a threat to camouflaged objects. To improve camouflage materials, the spectral behavior of backgrounds first has to be known so that the spectral reflectance of camouflage materials can be adjusted and optimized. In an international effort, the NATO CSO working group SCI-295 "Development of Methods for Measurements and Evaluation of Natural Background EO Signatures" is developing a method for how this characterization of backgrounds should be done. It is obvious that the spectral characterization of a background is quite an effort. To compare and exchange data internationally, the measurements have to be done in a similar way. To test and further improve this method, an international field trial was performed in Storkow, Germany. In the following we present first impressions and lessons learned from this field campaign and describe the data that were measured.

  1. [The Impact of Immigrant Status on the Mode of Hospital Referral, Impression of Disease Severity and Length of Stay in Inpatient General Pediatric Care].

    PubMed

    Ullrich, Sebastian; Nesterko, Yuriy; Briel, Diana; Hiemisch, Andreas; Brähler, Elmar; Glaesmer, Heide

    2017-08-10

    Background National as well as international research often shows differences in health and health behavior between children and youth with and without a migration background. Differences in the use of health services depending on migration background have also been noted. Patients and Methods Data from 266 pediatric patients regarding their hospitalization, the general impression of their health status and their length of stay in hospital were analyzed in relation to migration background. Information on migration background and treatment-related data were obtained from the patients' parents or from the hospital information system (SAP). 20.7% of the patients (n=55) had a migration background. Results Migrants were hospitalized via the emergency room more often than non-migrants and presented with a more severe clinical picture. Regarding the number of diagnoses, the length of stay and the distribution of the main diagnoses, no differences were found. Discussion Language barriers, culture-specific ideas about illness and insufficient knowledge of the German health care system are discussed as possible reasons for the differences between migrants and non-migrants. Conclusion This study confirms already known differences in the use of health services by people with and without a migration background. The results indicate a worse health status at hospitalization in migrant patients compared with non-migrants. Future research with larger numbers of participants should investigate this point specifically. © Georg Thieme Verlag KG Stuttgart · New York.

  2. The Use of Religious Coping Methods in a Secular Society: A Survey Study Among Cancer Patients in Sweden.

    PubMed

    Ahmadi, Nader; Ahmadi, Fereshteh

    2017-07-01

    In the present article, based on results from a survey study in Sweden among 2,355 cancer patients, the role of religion in coping is discussed. The survey study, in turn, was based on earlier findings from a qualitative study of cancer patients in Sweden. The purpose of the present survey study was to determine to what extent results obtained in the qualitative study can be applied to a wider population of cancer patients in Sweden. The present study shows that use of religious coping methods is infrequent among cancer patients in Sweden. Besides the two methods that are ranked in 12th and 13th place, that is, in the middle (Listening to religious music and Praying to God to make things better), the other religious coping methods receive the lowest rankings, showing how nonsignificant such methods are in coping with cancer in Sweden. However, the question of who turns to God and who is self-reliant in a critical situation is too complicated to be resolved solely in terms of the strength of individuals' religious commitments. In addition to background and situational factors, the culture in which the individual was socialized is an important factor. Regarding the influence of background variables, the present results show that gender, age, and area of upbringing played an important role in almost all of the religious coping methods our respondents used. In general, people in the oldest age-group, women, and people raised in places with 20,000 or fewer residents had a higher average use of religious coping methods than did younger people, men, and those raised in larger towns.

  3. The Use of Religious Coping Methods in a Secular Society

    PubMed Central

    Ahmadi, Nader

    2015-01-01

    In the present article, based on results from a survey study in Sweden among 2,355 cancer patients, the role of religion in coping is discussed. The survey study, in turn, was based on earlier findings from a qualitative study of cancer patients in Sweden. The purpose of the present survey study was to determine to what extent results obtained in the qualitative study can be applied to a wider population of cancer patients in Sweden. The present study shows that use of religious coping methods is infrequent among cancer patients in Sweden. Besides the two methods that are ranked in 12th and 13th place, that is, in the middle (Listening to religious music and Praying to God to make things better), the other religious coping methods receive the lowest rankings, showing how nonsignificant such methods are in coping with cancer in Sweden. However, the question of who turns to God and who is self-reliant in a critical situation is too complicated to be resolved solely in terms of the strength of individuals’ religious commitments. In addition to background and situational factors, the culture in which the individual was socialized is an important factor. Regarding the influence of background variables, the present results show that gender, age, and area of upbringing played an important role in almost all of the religious coping methods our respondents used. In general, people in the oldest age-group, women, and people raised in places with 20,000 or fewer residents had a higher average use of religious coping methods than did younger people, men, and those raised in larger towns. PMID:28690385

  4. SU-E-I-96: A Study About the Influence of ROI Variation On Tumor Segmentation in PET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, L; Tan, S; Lu, W

    2014-06-01

    Purpose: To study the influence of different regions of interest (ROIs) on tumor segmentation in PET. Methods: The experiments were conducted on a cylindrical phantom. Six spheres of different volumes (0.5 ml, 1 ml, 6 ml, 12 ml, 16 ml and 20 ml) were placed inside a cylindrical container to mimic tumors of different sizes. The spheres were filled with 11C solution as sources and the cylindrical container was filled with 18F-FDG solution as the background. The phantom was continuously scanned in a Biograph-40 True Point/True View PET/CT scanner, and 42 images were reconstructed with source-to-background ratios (SBRs) ranging from 16:1 to 1.8:1. We took a large and a small ROI for each sphere, both of which contained the whole sphere and no other spheres. Six further ROIs of intermediate sizes were then taken between the large and the small ROI. For each ROI, all images were segmented by eight thresholding methods and eight advanced methods, respectively. The segmentation results were evaluated by the dice similarity index (DSI), classification error (CE) and volume error (VE). The robustness of the different methods to ROI variation was quantified using the inter-run variation and a generalized Cohen's kappa. Results: As the ROI changed, the segmentation results of all tested methods changed to varying degrees. Compared with the advanced methods, the thresholding methods were less affected by the ROI change. In addition, most of the thresholding methods produced more accurate segmentation results for all sphere sizes. Conclusion: The results showed that the segmentation performance of all tested methods was affected by the change of ROI. Thresholding methods were more robust to this change and segmented the PET images more accurately. This work was supported in part by the National Natural Science Foundation of China (NNSFC), under Grant Nos. 60971112 and 61375018, and the Fundamental Research Funds for the Central Universities, under Grant No. 2012QN086. Wei Lu was supported in part by National Institutes of Health (NIH) Grant No. R01 CA172638.
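    The DSI and VE evaluation metrics mentioned above can be sketched directly on binary masks (a minimal version; variable names are illustrative):

```python
import numpy as np

def dice_similarity(seg, truth):
    """Dice similarity index between two binary masks:
    DSI = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    seg, truth = np.asarray(seg, bool), np.asarray(truth, bool)
    inter = np.logical_and(seg, truth).sum()
    return 2.0 * inter / (seg.sum() + truth.sum())

def volume_error(seg, truth):
    """Relative volume error |V_seg - V_truth| / V_truth."""
    seg, truth = np.asarray(seg, bool), np.asarray(truth, bool)
    return abs(int(seg.sum()) - int(truth.sum())) / truth.sum()
```

    Note that the two metrics are complementary: a segmentation can have zero volume error yet poor Dice overlap if it covers the right number of voxels in the wrong place.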

  5. Research on vehicle detection based on background feature analysis in SAR images

    NASA Astrophysics Data System (ADS)

    Zhang, Bochuan; Tang, Bo; Zhang, Cong; Hu, Ruiguang; Yun, Hongquan; Xiao, Liping

    2017-10-01

    Aiming at detecting vehicles on the ground in low-resolution SAR images, a method is proposed that first determines the region containing the vehicles and then detects the targets within that specific region. The experimental results show that this method not only reduces the target detection area, but also reduces the influence of terrain clutter on detection, which greatly improves the reliability of target detection.

  6. Synthesis, Transfer, and Characterization of Nanoscale 2-Dimensional Materials

    DTIC Science & Technology

    2015-09-01

    …deposition systems, leading to reduced operating costs. Transfer has been achieved using polymer-assisted methods, and material quality has been… Contents fragment: 1. Introduction and Background; 2. Materials and Methods; 3. Results and Discussion; 3.1 Copper Foil Preparation; 3.2 Graphene Synthesis and… Figure caption fragment: images at increasing magnification, a) 10 K, b) 18 K, c) 20 K, and d) 40 K; the red arrows indicate wrinkles in the film.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Narciso, L

    Purpose: Commercial workstations usually have their own software to calculate dynamic renal function. However, they usually offer low flexibility and introduce subjectivity in delimiting the kidney and background areas. The aim of this paper is to present a public domain software tool, called RenalQuant, capable of semi-automatically drawing regions of interest on dynamic renal scintigraphies, extracting data and generating renal function quantification parameters. Methods: The software was developed in Java and written as an ImageJ-based plugin. The preprocessing and segmentation steps include the user's selection of one time frame with higher activity in the kidneys' region, compared with the background, and low activity in the liver. Next, the chosen time frame is smoothed using a Gaussian low-pass spatial filter (σ = 3) for noise reduction and better delimitation of the kidneys. The maximum entropy thresholding method is used for segmentation. A background area is automatically placed below each kidney, and the user confirms whether these regions are correctly segmented and positioned. Quantitative data are extracted, and each renogram and relative renal function (RRF) value is calculated and displayed. Results: The RenalQuant plugin was validated using 20 patients' retrospective 99mTc-DTPA exams and compared with results produced by commercial workstation software, referred to as the reference. Intraclass correlation coefficients (ICC) were calculated for the renograms, and false-negative and false-positive RRF values were analyzed. The results showed that ICC values between the RenalQuant plugin and the reference software for both kidneys' renograms were higher than 0.75, indicating excellent reliability. Conclusion: Our results indicated that the RenalQuant plugin can be reliably used to generate renograms, using DICOM dynamic renal scintigraphy exams as input. It is user friendly, and user interaction is kept to a minimum.
Further studies should investigate how to increase RRF accuracy and how to address limitations in the segmentation step, mainly when the background region has higher activity than the kidneys. Financial support by CAPES.

  8. Characteristics of ovulation method acceptors: a cross-cultural assessment.

    PubMed

    Klaus, H; Labbok, M; Barker, D

    1988-01-01

    Five programs of instruction in the ovulation method (OM) in diverse geographic and cultural settings are described, and characteristics of approximately 200 consecutive OM acceptors in each program are examined. Major findings include: the religious background and family size of acceptors are variable, as is the level of previous contraceptive use. Acceptors are drawn from a wide range of socioeconomic and religious backgrounds; however, family planning intention was similarly distributed in all five countries. In sum, the ovulation method is accepted by persons from a variety of backgrounds, both within and between cultural settings.

  9. Comparing transformation methods for DNA microarray data

    PubMed Central

    Thygesen, Helene H; Zwinderman, Aeilko H

    2004-01-01

    Background When DNA microarray data are used for gene clustering, genotype/phenotype correlation studies, or tissue classification, the signal intensities are usually transformed and normalized in several steps in order to improve comparability and signal/noise ratio. These steps may include subtraction of an estimated background signal, subtracting the reference signal, smoothing (to account for nonlinear measurement effects), and more. Different authors use different approaches, and it is generally not clear to users which method they should prefer. Results We used the ratio between biological variance and measurement variance (which is an F-like statistic) as a quality measure for transformation methods, and we demonstrate a method for maximizing that variance ratio on real data. We explore a number of transformation issues, including Box-Cox transformation, baseline shift, partial subtraction of the log-reference signal and smoothing. It appears that the optimal choice of parameters for the transformation methods depends on the data. Further, the behavior of the variance ratio, under the null hypothesis of zero biological variance, appears to depend on the choice of parameters. Conclusions The use of replicates in microarray experiments is important. Adjustment for the null-hypothesis behavior of the variance ratio is critical to the selection of transformation method. PMID:15202953
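    The biological-to-measurement variance ratio used as the quality measure above can be sketched for a genes × replicates matrix of transformed intensities. This is a minimal reading of the F-like statistic (between-gene variance over within-gene replicate variance); the paper's estimator details may differ:

```python
import numpy as np

def variance_ratio(x):
    """F-like quality score: biological (between-gene) variance divided by
    measurement (within-gene, between-replicate) variance.
    x: genes x replicates matrix of transformed intensities."""
    gene_means = x.mean(axis=1)
    biological = gene_means.var(ddof=1)          # spread of the underlying signals
    measurement = x.var(axis=1, ddof=1).mean()   # average replicate noise
    return biological / measurement

# Synthetic example: 100 genes, 4 replicates, known signal and noise levels.
rng = np.random.default_rng(0)
true_signal = rng.normal(0.0, 2.0, size=(100, 1))            # biological spread (sd 2)
data = true_signal + rng.normal(0.0, 0.5, size=(100, 4))     # replicate noise (sd 0.5)
ratio = variance_ratio(data)    # roughly (2/0.5)^2 = 16 for this setup
```

    A candidate transformation would be applied to `data` before computing the ratio, and the transformation maximizing the ratio preferred, which is the selection principle the abstract describes.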

  10. Improved background suppression in ¹H MAS NMR using composite pulses.

    PubMed

    Odedra, Smita; Wimperis, Stephen

    2012-08-01

    A well known feature of ¹H MAS NMR spectroscopy, particularly of solids where the concentration of ¹H nuclei is low, is the presence in the spectrum of a significant broad "background" signal arising from ¹H nuclei that are outside the MAS rotor and radiofrequency coil, probably located on the surfaces of the static components of the probehead. A popular method of suppressing this unwanted signal is the "depth pulse" method, consisting of a 90° pulse followed by one or two 180° pulses that are phase cycled according to the "Exorcycle" scheme, which removes signal associated with imperfect 180° pulses. Consequently, only spins in the centre of the radiofrequency coil contribute to the ¹H MAS spectrum, while those experiencing a low B₁ field outside the coil are suppressed. Although very effective at removing background signal from the spectrum, one drawback with this approach is that significant loss of the desired signal from the sample also occurs. Here we investigate the ¹H background suppression problem and, in particular, the use of novel antisymmetric passband composite pulses to replace the simple pulses in a depth pulse experiment. We show that it is possible to improve the intensity of the ¹H signals of interest while still maintaining effective background suppression. We expect that these results will be relevant to ¹H MAS NMR studies of, for example, nominally perdeuterated biological samples or nominally anhydrous inorganic materials. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Improved background suppression in 1H MAS NMR using composite pulses

    NASA Astrophysics Data System (ADS)

    Odedra, Smita; Wimperis, Stephen

    2012-08-01

    A well known feature of 1H MAS NMR spectroscopy, particularly of solids where the concentration of 1H nuclei is low, is the presence in the spectrum of a significant broad "background" signal arising from 1H nuclei that are outside the MAS rotor and radiofrequency coil, probably located on the surfaces of the static components of the probehead. A popular method of suppressing this unwanted signal is the "depth pulse" method, consisting of a 90° pulse followed by one or two 180° pulses that are phase cycled according to the "Exorcycle" scheme, which removes signal associated with imperfect 180° pulses. Consequently, only spins in the centre of the radiofrequency coil contribute to the 1H MAS spectrum, while those experiencing a low B1 field outside the coil are suppressed. Although very effective at removing background signal from the spectrum, one drawback with this approach is that significant loss of the desired signal from the sample also occurs. Here we investigate the 1H background suppression problem and, in particular, the use of novel antisymmetric passband composite pulses to replace the simple pulses in a depth pulse experiment. We show that it is possible to improve the intensity of the 1H signals of interest while still maintaining effective background suppression. We expect that these results will be relevant to 1H MAS NMR studies of, for example, nominally perdeuterated biological samples or nominally anhydrous inorganic materials.

  12. Automated tumour boundary delineation on 18F-FDG PET images using active contour coupled with shifted-optimal thresholding method

    NASA Astrophysics Data System (ADS)

    Khamwan, Kitiwat; Krisanachinda, Anchali; Pluempitiwiriyawej, Charnchai

    2012-10-01

    This study presents an automatic method to trace the boundary of the tumour in positron emission tomography (PET) images. It has been discovered that Otsu's threshold value is biased when the within-class variances of the object and the background are significantly different. To solve the problem, a double-stage threshold search that minimizes the energy between the first Otsu threshold and the maximum intensity value is introduced. Such shifted-optimal thresholding is embedded into a region-based active contour so that both algorithms are performed consecutively. The efficiency of the method is validated using six sphere inserts (0.52-26.53 cc volume) of the IEC/2001 torso phantom. Both the spheres and the phantom background were filled with 18F solution, and PET images were acquired at four source-to-background ratios (SBRs). The results illustrate that the tumour volumes segmented by the combined algorithm are of higher accuracy than those from the traditional active contour. The method was then applied clinically in ten oesophageal cancer patients. The results are evaluated and compared with manual tracing by an experienced radiation oncologist. The advantage of the algorithm is the reduced erroneous delineation, which improves the precision and accuracy of PET tumour contouring. Moreover, the combined method is robust, independent of the SBR threshold-volume curves, and it does not require prior lesion size measurement.
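    The double-stage search can be sketched as: run Otsu once, then run it again restricted to the pixels above the first threshold. This is a hypothetical reading of the shifted-optimal step (the paper embeds it in an active contour, which is omitted here):

```python
import numpy as np

def otsu_threshold(values):
    """Plain Otsu: pick the level maximizing between-class variance."""
    v = np.asarray(values, float)
    levels = np.unique(v)
    best_t, best_sigma = levels[0], -1.0
    for t in levels[:-1]:
        bg, fg = v[v <= t], v[v > t]
        w0, w1 = bg.size / v.size, fg.size / v.size
        sigma = w0 * w1 * (bg.mean() - fg.mean()) ** 2   # between-class variance
        if sigma > best_sigma:
            best_t, best_sigma = t, sigma
    return best_t

def shifted_optimal_threshold(image):
    """Second-stage search over pixels above the first Otsu threshold,
    countering the bias toward the large, low-variance background class."""
    t1 = otsu_threshold(image)
    return otsu_threshold(image[image > t1])

# Background-dominated "PET slice": a first Otsu pass settles on the dim
# background level, while the second pass isolates the hot lesion.
img = np.concatenate([np.full(900, 10.0),   # background
                      np.full(90, 50.0),    # mild uptake
                      np.full(10, 100.0)])  # hot lesion
t1 = otsu_threshold(img)            # biased low by the big background class
t2 = shifted_optimal_threshold(img) # separates mild uptake from the lesion
```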

  13. Quantitative trace analysis of polyfluorinated alkyl substances (PFAS) in ambient air samples from Mace Head (Ireland): A method intercomparison

    NASA Astrophysics Data System (ADS)

    Jahnke, Annika; Barber, Jonathan L.; Jones, Kevin C.; Temme, Christian

    A method intercomparison study of analytical methods for the determination of neutral, volatile polyfluorinated alkyl substances (PFAS) was carried out in March 2006. Environmental air samples were collected in triplicate at the European background site Mace Head on the west coast of Ireland, a site dominated by 'clean' westerly winds coming across the Atlantic. Extraction and analysis were performed at two laboratories active in PFAS research using their in-house methods. Airborne polyfluorinated telomer alcohols (FTOHs), fluorooctane sulfonamides and sulfonamidoethanols (FOSAs/FOSEs) as well as additional polyfluorinated compounds were investigated. Different native and isotope-labelled internal standards (IS) were applied at various steps in the analytical procedure to evaluate the different quantification strategies. Field blanks revealed no major blank problems. European background concentrations observed at Mace Head were found to be in a similar range to Arctic data reported in the literature. Due to the trace levels at the remote site, only the FTOH data sets were complete and could therefore be compared between the laboratories. Additionally, FOSEs could partly be included. Data comparison revealed that despite the challenges inherent in the analysis of airborne PFAS and the low concentrations, all methods applied in this study obtained similar results. However, application of isotope-labelled IS early in the analytical procedure leads to more precise results and is therefore recommended.

  14. Evidence and Clinical Trials.

    NASA Astrophysics Data System (ADS)

    Goodman, Steven N.

    1989-11-01

    This dissertation explores the use of a mathematical measure of statistical evidence, the log likelihood ratio, in clinical trials. The methods and thinking behind the use of an evidential measure are contrasted with traditional methods of analyzing data, which depend primarily on a p-value as an estimate of the statistical strength of an observed data pattern. It is contended that neither the behavioral dictates of Neyman-Pearson hypothesis testing methods, nor the coherency dictates of Bayesian methods are realistic models on which to base inference. The use of the likelihood alone is applied to four aspects of trial design or conduct: the calculation of sample size, the monitoring of data, testing for the equivalence of two treatments, and meta-analysis--the combining of results from different trials. Finally, a more general model of statistical inference, using belief functions, is used to see if it is possible to separate the assessment of evidence from our background knowledge. It is shown that traditional and Bayesian methods can be modeled as two ends of a continuum of structured background knowledge, methods which summarize evidence at the point of maximum likelihood assuming no structure, and Bayesian methods assuming complete knowledge. Both schools are seen to be missing a concept of ignorance- -uncommitted belief. This concept provides the key to understanding the problem of sampling to a foregone conclusion and the role of frequency properties in statistical inference. The conclusion is that statistical evidence cannot be defined independently of background knowledge, and that frequency properties of an estimator are an indirect measure of uncommitted belief. Several likelihood summaries need to be used in clinical trials, with the quantitative disparity between summaries being an indirect measure of our ignorance. This conclusion is linked with parallel ideas in the philosophy of science and cognitive psychology.

  15. Oxidative DNA damage background estimated by a system model of base excision repair

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sokhansanj, B A; Wilson, III, D M

    Human DNA can be damaged by natural metabolism through free radical production. It has been suggested that the equilibrium between innate damage and cellular DNA repair results in an oxidative DNA damage background that potentially contributes to disease and aging. Efforts to quantitatively characterize the human oxidative DNA damage background level based on measuring 8-oxoguanine lesions as a biomarker have led to estimates varying over 3-4 orders of magnitude, depending on the method of measurement. We applied a previously developed and validated quantitative pathway model of human DNA base excision repair, integrating experimentally determined endogenous damage rates and model parameters from multiple sources. Our estimates of at most 100 8-oxoguanine lesions per cell are consistent with the low end of data from biochemical and cell biology experiments, a result robust to model limitations and parameter variation. Our results show the power of quantitative system modeling to interpret composite experimental data and make biologically and physiologically relevant predictions for complex human DNA repair pathway mechanisms and capacity.

  16. Application research for 4D technology in flood forecasting and evaluation

    NASA Astrophysics Data System (ADS)

    Li, Ziwei; Liu, Yutong; Cao, Hongjie

    1998-08-01

    In order to monitor regions of China where disastrous floods occur frequently, to meet provincial governments' need for high-accuracy monitoring and damage-evaluation data, and to improve the efficiency of disaster response, a method for flood forecasting and evaluation using satellite and aerial remotely sensed imagery together with ground monitoring data was researched under the Ninth Five-Year National Key Technologies Programme. An effective and practicable flood forecasting and evaluation system was established, with DongTing Lake selected as the test site. Modern digital photogrammetry, remote sensing and GIS technology are used in this system; disastrous floods can be forecast and losses evaluated based on a '4D' (DEM: Digital Elevation Model; DOQ: Digital Orthophoto Quads; DRG: Digital Raster Graph; DTI: Digital Thematic Information) disaster background database. The technology for gathering and establishing the '4D' disaster environment background database, the application of '4D' background data to flood forecasting and evaluation, and experimental results for the DongTing Lake test site are presented in detail in this paper.

  17. Detection of Foreign Matter in Transfusion Solution Based on Gaussian Background Modeling and an Optimized BP Neural Network

    PubMed Central

    Zhou, Fuqiang; Su, Zhen; Chai, Xinghua; Chen, Lipeng

    2014-01-01

    This paper proposes a new method to detect and identify foreign matter mixed in a plastic bottle filled with transfusion solution. A spin-stop mechanism and mixed illumination style are applied to obtain high contrast images between moving foreign matter and a static transfusion background. The Gaussian mixture model is used to model the complex background of the transfusion image and to extract moving objects. A set of features of moving objects are extracted and selected by the ReliefF algorithm, and optimal feature vectors are fed into the back propagation (BP) neural network to distinguish between foreign matter and bubbles. The mind evolutionary algorithm (MEA) is applied to optimize the connection weights and thresholds of the BP neural network to obtain a higher classification accuracy and faster convergence rate. Experimental results show that the proposed method can effectively detect visible foreign matter in 250-mL transfusion bottles. The misdetection rate and false alarm rate are low, and the detection accuracy and detection speed are satisfactory. PMID:25347581
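    The Gaussian background-modeling step above can be sketched with a per-pixel running Gaussian model. This is a single-mode simplification for brevity (the paper uses a full Gaussian mixture, and its feature extraction and BP-network classification are omitted):

```python
import numpy as np

class GaussianBackground:
    """Per-pixel running Gaussian background model. Pixels farther than
    k standard deviations from the background mean are flagged as moving
    foreground; matched pixels update the background statistics."""
    def __init__(self, first_frame, alpha=0.05, k=3.0, min_var=4.0):
        self.mean = first_frame.astype(float)
        self.var = np.full(first_frame.shape, 25.0)   # assumed initial variance
        self.alpha, self.k, self.min_var = alpha, k, min_var

    def apply(self, frame):
        d = frame.astype(float) - self.mean
        foreground = d * d > self.k ** 2 * self.var
        m = ~foreground                               # update only matched pixels
        self.mean[m] += self.alpha * d[m]
        self.var[m] = np.maximum(
            (1 - self.alpha) * self.var[m] + self.alpha * d[m] ** 2, self.min_var)
        return foreground

bg = np.full((6, 6), 100.0)                 # static transfusion background
model = GaussianBackground(bg)
for _ in range(5):                          # static frames: nothing detected
    mask = model.apply(bg)
frame = bg.copy()
frame[2, 3] = 200.0                         # a moving particle appears
mask = model.apply(frame)                   # flags exactly that pixel
```

    In the paper's pipeline, the connected components of such a mask are then described by shape/motion features and classified (particle vs. bubble) by the optimized BP network.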

  18. Use of capillary gas chromatography with negative ion-chemical ionization mass spectrometry for the determination of perfluorocarbon tracers in the atmosphere.

    PubMed

    Cooke, K M; Simmonds, P G; Nickless, G; Makepeace, A P

    2001-09-01

    A sensitive and selective technique for the quantitative measurement of atmospheric perfluorocarbon trace species at sub-part-per-quadrillion (10(-15)) levels is presented. The method utilizes advances in adsorbent enrichment techniques coupled with benchtop capillary gas chromatography and negative ion-chemical ionization mass spectrometry. The development and enhancement of sampling technology for tracer experiments is described, and the results from background measurements and a preliminary field experiment are presented. The overall precision of the analytical method with respect to the preferred tracer for these atmospheric transport studies, perfluoromethylcyclohexane, was +/-1.7%. The background concentrations of perfluorodimethylcyclobutane, perfluoromethylcyclopentane, and perfluoromethylcyclohexane at a remote coastal location (Mace Head, Ireland, 53 degrees N, 10 degrees W) were found to be 2.5 (+/-0.4), 6.8 (+/-1.0), and 5.2 fL L(-1) (+/-1.3), respectively. Background concentrations within an urban conurbation (Bristol, U.K.) were slightly greater at 3.0 (+/-1.5), 8.1 (+/-1.8), and 6.3 fL L(-1) (+/-1.1), respectively.

  19. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    NASA Astrophysics Data System (ADS)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
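    The block-wise thresholding idea can be sketched in a few lines: one threshold per block, expanded back to per-pixel size. This minimal version uses the block mean as the threshold and nearest-neighbour expansion, standing in for the paper's saliency-weighted histogram criterion and between-block interpolation:

```python
import numpy as np

def blockwise_threshold(image, block=4):
    """Local thresholding: compute one threshold per block (here the block
    mean, a stand-in for the paper's histogram-based criterion), expand to
    per-pixel size, and compare."""
    h, w = image.shape
    blocks = image.reshape(h // block, block, w // block, block)
    t = blocks.mean(axis=(1, 3))                              # per-block threshold
    t_full = np.repeat(np.repeat(t, block, axis=0), block, axis=1)
    return image > t_full                                     # foreground mask

# Two "nuclei", each 50 units above its local background, on a scene whose
# background level jumps from 10 (left half) to 100 (right half).
img = np.zeros((8, 8))
img[:, :4] = 10.0
img[:, 4:] = 100.0
img[2, 1] = 60.0
img[5, 6] = 150.0
mask = blockwise_threshold(img, block=4)      # finds exactly the two nuclei
global_mask = img > img.mean()                # a single global threshold fails:
                                              # it flags the whole bright half
```

    The contrast with the global threshold is the motivation for the local scheme: with strongly varying background, no single threshold separates foreground everywhere.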

  20. Recovery of intrinsic fluorescence from single-point interstitial measurements for quantification of doxorubicin concentration.

    PubMed

    Baran, Timothy M; Foster, Thomas H

    2013-10-01

    We developed a method for the recovery of intrinsic fluorescence from single-point measurements in highly scattering and absorbing samples without a priori knowledge of the sample optical properties. The goal of the study was to demonstrate accurate recovery of fluorophore concentration in samples with widely varying background optical properties, while simultaneously recovering the optical properties. Tissue-simulating phantoms containing doxorubicin, MnTPPS, and Intralipid-20% were created, and fluorescence measurements were performed using a single isotropic probe. The resulting spectra were analyzed using a forward-adjoint fluorescence model in order to recover the fluorophore concentration and background optical properties. We demonstrated recovery of doxorubicin concentration with a mean error of 11.8%. The concentration of the background absorber was recovered with an average error of 23.2% and the scattering spectrum was recovered with a mean error of 19.8%. This method will allow for the determination of local concentrations of fluorescent drugs, such as doxorubicin, from minimally invasive fluorescence measurements. This is particularly interesting in the context of transarterial chemoembolization (TACE) treatment of liver cancer. © 2013 Wiley Periodicals, Inc.

  1. Heterogeneous CPU-GPU moving targets detection for UAV video

    NASA Astrophysics Data System (ADS)

    Li, Maowen; Tang, Linbo; Han, Yuqi; Yu, Chunlei; Zhang, Chao; Fu, Huiquan

    2017-07-01

    Moving target detection is gaining popularity in civilian and military applications. On some motion-detection monitoring platforms, low-resolution stationary cameras are being replaced by moving HD cameras mounted on UAVs. The pixels belonging to moving targets in HD video taken by a UAV are always a small minority of each frame, and the background is usually moving because of the motion of the UAV. The high computational cost of detection algorithms prevents running them in real time at high frame resolutions. Hence, to solve the problem of moving target detection in UAV video, we propose a heterogeneous CPU-GPU moving target detection algorithm. More specifically, we use background registration to eliminate the impact of the moving background and frame differencing to detect small moving targets. In order to achieve real-time processing, we design a heterogeneous CPU-GPU framework for our method. The experimental results show that our method can detect the main moving targets in HD video taken by a UAV, with an average processing time of 52.16 ms per frame, which is fast enough to solve the problem.
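    The registration-then-difference pipeline can be illustrated on the CPU side with a crude integer-pixel registration (a hypothetical stand-in for the paper's background registration; the GPU offload is omitted):

```python
import numpy as np

def register_shift(prev, curr, max_shift=3):
    """Brute-force integer-pixel registration: find the (dy, dx) that best
    aligns curr to prev, undoing the camera's ego-motion."""
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            err = np.abs(np.roll(curr, (dy, dx), axis=(0, 1)) - prev).mean()
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

def moving_targets(prev, curr, thresh=20.0):
    """Frame difference after background registration."""
    dy, dx = register_shift(prev, curr)
    aligned = np.roll(curr, (dy, dx), axis=(0, 1))
    return np.abs(aligned - prev) > thresh

rng = np.random.default_rng(1)
scene = rng.uniform(0, 10, (16, 16))
prev = scene.copy()
curr = np.roll(scene, (0, 2), axis=(0, 1))    # camera pans 2 px to the right
curr[8, 8] += 50.0                            # one small moving target
mask = moving_targets(prev, curr)             # only the target survives differencing
```

    Without the registration step, the pan alone would light up the whole difference image; the brute-force search here is exactly the kind of per-pixel, embarrassingly parallel work the paper moves onto the GPU.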

  2. Spatiotemporal models for the simulation of infrared backgrounds

    NASA Astrophysics Data System (ADS)

    Wilkes, Don M.; Cadzow, James A.; Peters, R. Alan, II; Li, Xingkang

    1992-09-01

    It is highly desirable for designers of automatic target recognizers (ATRs) to be able to test their algorithms on targets superimposed on a wide variety of background imagery. Background imagery in the infrared spectrum is expensive to gather from real sources, consequently, there is a need for accurate models for producing synthetic IR background imagery. We have developed a model for such imagery that will do the following: Given a real, infrared background image, generate another image, distinctly different from the one given, that has the same general visual characteristics as well as the first and second-order statistics of the original image. The proposed model consists of a finite impulse response (FIR) kernel convolved with an excitation function, and histogram modification applied to the final solution. A procedure for deriving the FIR kernel using a signal enhancement algorithm has been developed, and the histogram modification step is a simple memoryless nonlinear mapping that imposes the first order statistics of the original image onto the synthetic one, thus the overall model is a linear system cascaded with a memoryless nonlinearity. It has been found that the excitation function relates to the placement of features in the image, the FIR kernel controls the sharpness of the edges and the global spectrum of the image, and the histogram controls the basic coloration of the image. A drawback to this method of simulating IR backgrounds is that a database of actual background images must be collected in order to produce accurate FIR and histogram models. If this database must include images of all types of backgrounds obtained at all times of the day and all times of the year, the size of the database would be prohibitive. In this paper we propose improvements to the model described above that enable time-dependent modeling of the IR background. 
This approach can greatly reduce the number of actual IR backgrounds that are required to produce a sufficiently accurate mathematical model for synthesizing a similar IR background for different times of the day. Original and synthetic IR backgrounds will be presented. Previous research in simulating IR backgrounds was performed by Strenzwilk, et al., Botkin, et al., and Rapp. The most recent work of Strenzwilk, et al. was based on the use of one-dimensional ARMA models for synthesizing the images. Their results were able to retain the global statistical and spectral behavior of the original image, but the synthetic image was not visually very similar to the original. The research presented in this paper is the result of an attempt to improve upon their results, and represents a significant improvement in quality over previously obtained results.
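    The model's structure (a linear FIR filtering of an excitation, cascaded with a memoryless histogram-imposing nonlinearity) can be sketched directly. The kernel here is a toy low-pass filter; in the paper the FIR kernel is derived from a real IR image via a signal enhancement algorithm:

```python
import numpy as np

def histogram_match(source, reference):
    """Memoryless nonlinearity: impose the reference image's first-order
    statistics (histogram) onto the source by rank mapping."""
    order = np.argsort(source.ravel())
    matched = np.empty(source.size)
    matched[order] = np.sort(reference.ravel())
    return matched.reshape(source.shape)

def synthesize(reference, kernel, rng):
    """Linear system (FIR kernel applied to a random excitation) cascaded
    with histogram modification, per the model described above."""
    noise = rng.standard_normal(reference.shape)
    h, w = kernel.shape
    padded = np.pad(noise, ((h // 2,), (w // 2,)), mode="wrap")
    filtered = np.zeros_like(noise)
    for i in range(h):        # direct-form 2-D FIR filtering (correlation;
        for j in range(w):    # identical to convolution for a symmetric kernel)
            filtered += kernel[i, j] * padded[i:i + reference.shape[0],
                                              j:j + reference.shape[1]]
    return histogram_match(filtered, reference)

rng = np.random.default_rng(2)
ref = rng.uniform(20, 250, (32, 32))     # stand-in for a real IR background
smooth = np.ones((5, 5)) / 25.0          # toy low-pass FIR kernel: soft edges
synth = synthesize(ref, smooth, rng)     # same histogram, different layout
```

    By construction the synthetic image reproduces the reference's first-order statistics exactly, while the kernel controls edge sharpness and spectrum, matching the roles the abstract assigns to each model component.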

  3. Improved tomographic reconstructions using adaptive time-dependent intensity normalization.

    PubMed

    Titarenko, Valeriy; Titarenko, Sofya; Withers, Philip J; De Carlo, Francesco; Xiao, Xianghui

    2010-09-01

    The first processing step in synchrotron-based micro-tomography is the normalization of the projection images against the background, also referred to as a white field. Owing to time-dependent variations in illumination and defects in detection sensitivity, the white field is different from the projection background. In this case standard normalization methods introduce ring and wave artefacts into the resulting three-dimensional reconstruction. In this paper the authors propose a new adaptive technique accounting for these variations and allowing one to obtain cleaner normalized data and to suppress ring and wave artefacts. The background is modelled by the product of two time-dependent terms representing the illumination and detection stages. These terms are written as unknown functions, one scaled and shifted along a fixed direction (describing the illumination term) and one translated by an unknown two-dimensional vector (describing the detection term). The proposed method is applied to two sets (a stem Salix variegata and a zebrafish Danio rerio) acquired at the parallel beam of the micro-tomography station 2-BM at the Advanced Photon Source showing significant reductions in both ring and wave artefacts. In principle the method could be used to correct for time-dependent phenomena that affect other tomographic imaging geometries such as cone beam laboratory X-ray computed tomography.

  4. Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency.

    PubMed

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-05-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on a Riemannian manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary on the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, an improvement of the initial result is achieved by calculating reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to the first stage, but is able to effectively highlight the salient objects uniformly against the background. Finally, three post-processing methods (highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms the state-of-the-art methods in terms of precision, recall and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Figures of merit for detectors in digital radiography. II. Finite number of secondaries and structured backgrounds.

    PubMed

    Pineda, Angel R; Barrett, Harrison H

    2004-02-01

    The current paradigm for evaluating detectors in digital radiography relies on Fourier methods. Fourier methods rely on a shift-invariant and statistically stationary description of the imaging system. The theoretical justification for the use of Fourier methods is based on a uniform background fluence and an infinite detector. In practice, the background fluence is not uniform and detector size is finite. We study the effect of stochastic blurring and structured backgrounds on the correlation between Fourier-based figures of merit and Hotelling detectability. A stochastic model of the blurring leads to behavior similar to what is observed by adding electronic noise to the deterministic blurring model. Background structure does away with the shift invariance. Anatomical variation makes the covariance matrix of the data less amenable to Fourier methods by introducing long-range correlations. It is desirable to have figures of merit that can account for all the sources of variation, some of which are not stationary. For such cases, we show that the commonly used figures of merit based on the discrete Fourier transform can provide an inaccurate estimate of Hotelling detectability.
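    The Hotelling detectability against which the Fourier figures of merit are compared is the quadratic form SNR² = Δs̄ᵀ K⁻¹ Δs̄, where Δs̄ is the mean signal difference and K the data covariance. A toy 1-D sketch showing why structured (anatomical) backgrounds matter: adding background covariance can only lower detectability, and a stationary (Fourier-diagonalizable) model of K cannot capture the long-range structure:

```python
import numpy as np

def hotelling_snr2(delta_s, K):
    """Hotelling observer detectability: SNR^2 = ds^T K^{-1} ds."""
    return float(delta_s @ np.linalg.solve(K, delta_s))

n = 16
ds = np.exp(-0.5 * ((np.arange(n) - n / 2) / 1.5) ** 2)  # small Gaussian signal

K_flat = np.eye(n)                      # uniform background, white noise

# "Anatomical" variation as a few random background patterns (lumpy model);
# outer products guarantee a valid (positive semidefinite) covariance term.
rng = np.random.default_rng(3)
lumps = rng.standard_normal((3, n))
K_struct = np.eye(n) + 4.0 * lumps.T @ lumps / 3

snr_flat = hotelling_snr2(ds, K_flat)       # equals ||ds||^2 for white noise
snr_struct = hotelling_snr2(ds, K_struct)   # strictly lower: structure masks signal
```

    Since K_struct − K_flat is positive semidefinite, snr_struct ≤ snr_flat always; the long-range correlations in K_struct are also not diagonalized by the DFT, which is the abstract's point about Fourier-based figures of merit becoming inaccurate.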

  6. The International College of Neuropsychopharmacology (CINP) Treatment Guidelines for Bipolar Disorder in Adults (CINP-BD-2017), Part 1: Background and Methods of the Development of Guidelines

    PubMed Central

    Young, Allan; Yatham, Lakshmi; Grunze, Heinz; Vieta, Eduard; Blier, Pierre; Moeller, Hans Jurgen; Kasper, Siegfried

    2017-01-01

    Abstract Background: This paper includes a short description of the important clinical aspects of Bipolar Disorder with emphasis on issues that are important for the therapeutic considerations, including mixed and psychotic features, predominant polarity, and rapid cycling as well as comorbidity. Methods: The workgroup performed a review and critical analysis of the literature concerning grading methods and methods for the development of guidelines. Results: The workgroup arrived at a consensus to base the development of the guideline on randomized controlled trials and related meta-analyses alone in order to follow a strict evidence-based approach. A critical analysis of the existing methods for the grading of treatment options was followed by the development of a new grading method to arrive at efficacy and recommendation levels after the analysis of 32 distinct scenarios of available data for a given treatment option. Conclusion: The current paper reports details on the design, method, and process for the development of CINP guidelines for the treatment of Bipolar Disorder. The rationale and the method with which all data and opinions are combined in order to produce an evidence-based operationalized but also user-friendly guideline and a specific algorithm are described in detail in this paper. PMID:27815414

  7. A deviation display method for visualising data in mobile gamma-ray spectrometry.

    PubMed

    Kock, Peder; Finck, Robert R; Nilsson, Jonas M C; Ostlund, Karl; Samuelsson, Christer

    2010-09-01

    A real-time visualisation method for use in mobile gamma-spectrometric search operations with standard detector systems is presented. The new method, called deviation display, uses a modified waterfall display to present relative changes in spectral data over energy and time. Using unshielded (137)Cs and (241)Am point sources and different natural background environments, the behaviour of the deviation displays is demonstrated and analysed for two standard detector types (NaI(Tl) and HPGe). The deviation display enhances significant positive changes while suppressing the natural background fluctuations. After an initialization time of about 10 min this technique leads to a homogeneous display dominated by the background colour, in which even small changes in spectral data are easy to discover. As this paper shows, the deviation display method works well for all tested gamma energies and natural background radiation levels and with both tested detector systems.
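    The core of such a deviation display — comparing each new spectrum, bin by bin, against an accumulated background estimate — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the Gaussian significance measure, bin count, and synthetic Poisson data are all assumptions:

    ```python
    import numpy as np

    def deviation_row(spectrum, bg_mean, bg_std, eps=1e-9):
        """One waterfall-display row: significance of each energy bin
        relative to the accumulated background estimate (assumed
        Gaussian fluctuation model); negative deviations are suppressed."""
        z = (spectrum - bg_mean) / (bg_std + eps)
        return np.clip(z, 0, None)

    # Background estimate accumulated over an initialization period
    rng = np.random.default_rng(0)
    bg_history = rng.poisson(50, size=(100, 1024)).astype(float)  # 100 spectra, 1024 bins
    bg_mean = bg_history.mean(axis=0)
    bg_std = bg_history.std(axis=0)

    new_spectrum = rng.poisson(50, size=1024).astype(float)
    new_spectrum[300] += 200  # injected source peak
    row = deviation_row(new_spectrum, bg_mean, bg_std)
    print(int(row.argmax()))  # the injected peak dominates the row
    ```

    With the fixed seed, the bin holding the injected peak stands far above the pure-background bins, which stay near zero — the enhancement-plus-suppression behaviour the abstract describes.
    
    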

  8. Background noise measurements from jet exit vanes designed to reduce flow pulsations in an open-jet wind tunnel

    NASA Technical Reports Server (NTRS)

    Hoad, D. R.; Martin, R. M.

    1985-01-01

    Many open-jet wind tunnels experience pulsations of the flow, which are typically characterized by periodic low-frequency velocity and pressure variations. One method of reducing these fluctuations is to install vanes around the perimeter of the jet exit that protrude into the flow. Although these vanes were shown to be effective in reducing the fluctuation content, they can also increase the test section background noise level. The results of an experimental acoustic program in the Langley 4- by 7-Meter Tunnel are presented, evaluating the effect of such modifications to the jet exit nozzle on tunnel background noise. Noise levels for the baseline tunnel configuration are compared with those for three jet exit nozzle modifications, including an enhanced noise reduction configuration that minimizes the effect of the vanes on the background noise. Although the noise levels for this modified vane configuration were comparable to baseline tunnel background noise levels in this facility, installation of these modified vanes in an acoustic tunnel may be of concern because the noise levels for the vanes could be well above background noise levels in a quiet facility.

  9. Optimization of cDNA microarrays procedures using criteria that do not rely on external standards

    PubMed Central

    Bruland, Torunn; Anderssen, Endre; Doseth, Berit; Bergum, Hallgeir; Beisvag, Vidar; Lægreid, Astrid

    2007-01-01

    Background The measurement of gene expression using microarray technology is a complicated process in which a large number of factors can be varied. Due to the lack of standard calibration samples, such as are used in traditional chemical analysis, it may be a problem to evaluate whether changes made to the microarray procedure actually improve the identification of truly differentially expressed genes. The purpose of the present work is to report the optimization of several steps in the microarray process, both in laboratory practices and in data processing, using criteria that do not rely on external standards. Results We performed a cDNA microarray experiment including RNA from samples with high expected differential gene expression, termed "high contrasts" (rat cell lines AR42J and NRK52E), compared to self-self hybridization, and optimized a pipeline to maximize the number of genes found to be differentially expressed in the "high contrasts" RNA samples by estimating the false discovery rate (FDR) using a null distribution obtained from the self-self experiment. The proposed high-contrast versus self-self method (HCSSM) requires only four microarrays per evaluation. The effects of blocking reagent dose, filtering, and background correction methodologies were investigated. In our experiments a dose of 250 ng LNA (locked nucleic acid) dT blocker, no background correction, and weight-based filtering gave the largest number of differentially expressed genes. The choice of background correction method had a stronger impact on the estimated number of differentially expressed genes than the choice of filtering method. Cross-platform microarray (Illumina) analysis was used to validate that the increase in the number of differentially expressed genes found by HCSSM was real. Conclusion The results show that HCSSM can be a useful and simple approach to optimize microarray procedures without including external standards. Our optimizing method is highly applicable to long oligo-probe microarrays, which have become commonly used for well-characterized organisms such as man, mouse, and rat, as well as to cDNA microarrays, which are still of importance for organisms with incomplete genome sequence information such as many bacteria, plants, and fish. PMID:17949480
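    The FDR logic behind HCSSM — using the self-self hybridization as an empirical null distribution — can be sketched as follows. The test statistic, array sizes, and Gaussian mixtures below are illustrative assumptions, not the study's actual data or pipeline:

    ```python
    import numpy as np

    def fdr_at_threshold(contrast_stats, selfself_stats, t):
        """Estimated FDR when calling genes with |stat| >= t differential:
        the self-self experiment contains no truly differential genes, so
        its exceedance count estimates the number of false calls."""
        called = np.sum(np.abs(contrast_stats) >= t)
        false = np.sum(np.abs(selfself_stats) >= t)
        return false / max(called, 1)

    rng = np.random.default_rng(1)
    null = rng.normal(0, 1, 10000)                    # self-self: noise only
    real = np.concatenate([rng.normal(0, 1, 9000),
                           rng.normal(3, 1, 1000)])   # high contrast: 10% shifted
    print(round(float(fdr_at_threshold(real, null, 2.5)), 3))
    ```

    Raising the threshold trades fewer called genes for a lower estimated FDR; the optimization in the paper amounts to choosing the protocol variant that calls the most genes at a fixed FDR.
    
    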

  10. [Situation and assessment of heavy metal pollution in river and mud in one city in Henan Province].

    PubMed

    Xi, Jingzhuan; Li, Cuimei; Wang, Shouying; Jiang, Zhigang; Zhang, Miaomiao; Han, Guangliang

    2010-11-01

    To study the heavy metal contamination status of river water and mud in the suburb of a city in Henan Province, a typical sampling method was used to select a farmland irrigation river in the suburb. The heavy metals cadmium (Cd), copper (Cu), and lead (Pb) in river water samples and mud samples were measured by atomic absorption spectrophotometry, using the graphite furnace and flame methods, respectively. The water results were compared with GB 3838-2002, Environmental Quality Standards for Surface Water, and GB 5084-2005, Standards for Irrigation Water Quality; the mud results were compared with national soil background values. The contents of Cu and Cd in the river samples do not exceed the standards, while that of Pb is 3 to 6 times higher than the standard. According to the single-factor pollution index method, the single-factor pollution indices of Cu and Cd in the river are less than 0.2, at the clean level, while that of Pb reaches 6.84, indicating severe Pb pollution in the river water. In the mud, Cu is more than 4 times the soil background value, Cd more than 69 times, and Pb more than 2 times. The single-factor pollution index of Pb in mud is 2.5, indicating medium-level pollution; the indices of Cu and Cd in mud exceed 3, indicating severe pollution, with Cd pollution especially heavy (single-factor index 67.76). The comprehensive pollution indices of the river and the mud are 5.346 and 84.115, respectively, indicating that both are at a heavy pollution level. The main pollutant of the river is Pb, and that of the mud is Cd; control measures should be taken as early as possible.
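    The single-factor index used above is simply the ratio of the measured concentration to the reference value. A common way to combine several such indices into a comprehensive index is the Nemerow formula; the abstract does not state which comprehensive formula was used, so the combination step below is an assumption, and the concentrations are illustrative rather than the study's data:

    ```python
    def single_factor_index(concentration, standard):
        """P_i = C_i / S_i: measured concentration over the reference value."""
        return concentration / standard

    def nemerow_index(indices):
        """Nemerow comprehensive index: combines the mean and the worst
        single-factor index, so one severe pollutant dominates the score."""
        p_max = max(indices)
        p_avg = sum(indices) / len(indices)
        return ((p_max ** 2 + p_avg ** 2) / 2) ** 0.5

    # Hypothetical (concentration, standard) pairs for Cu, Pb, Cd
    p = [single_factor_index(c, s) for c, s in [(0.5, 2.5), (6.84, 1.0), (0.15, 1.0)]]
    print(round(nemerow_index(p), 2))
    ```

    Because the worst index enters squared, a single heavily exceeded metal (here the Pb-like value 6.84) pulls the comprehensive index up, mirroring how the mud's extreme Cd index drives its comprehensive score.
    
    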

  11. Nonlocal interactions in color perception: nonlinear processing of chromatic signals from remote inducers.

    PubMed

    Wachtler, T; Albright, T D; Sejnowski, T J

    2001-05-01

    The perceived color of an object depends on the chromaticity of its immediate background. But color appearance is also influenced by remote chromaticities. To quantify these influences, the effects of remote color fields on the appearance of a fixated 2 degrees test field were measured using a forced-choice method. Changes in the appearance of the test field were induced by chromaticity changes of the background and of 2 degrees color fields not adjacent to the test field. The appearance changes induced by the color of the background corresponded to a fraction of between 0.5 and 0.95 of the cone contrast of the background change, depending on the observer. The magnitude of induction by the background color was modulated on average by 7.6% by chromaticity changes in the remote color fields. Chromaticity changes in the remote fields had virtually no inducing effect when they occurred without a change in background color. The spatial range of these chromatic interactions extended over at least 10 degrees from the fovea. They were established within the first few hundred milliseconds after the change of background color and depended only weakly on the number of inducing fields. These results may be interpreted as reflecting rapid chromatic interactions that support robustness of color vision under changing viewing conditions.

  12. Super-resolved all-refocused image with a plenoptic camera

    NASA Astrophysics Data System (ADS)

    Wang, Xiang; Li, Lin; Hou, Guangqi

    2015-12-01

    This paper proposes an approach to produce super-resolved all-refocused images with a plenoptic camera. A plenoptic camera is made by placing a micro-lens array between the lens and the sensor of a conventional camera. This kind of camera captures both the angular and spatial information of the scene in a single shot. A sequence of digitally refocused images, focused at different depths, can be produced by processing the 4D light field captured by the plenoptic camera. The number of pixels in a refocused image equals the number of micro-lenses in the array, so the limited number of micro-lenses results in low-resolution refocused images that lack detail. These lost details, mostly high-frequency information, matter most for the in-focus part of each refocused image, so we super-resolve these in-focus parts. An image segmentation method based on random walks, applied to the depth map produced from the 4D light field data, is used to separate the foreground and background in the refocused image, and a focus evaluation function determines which refocused image has the sharpest foreground and which has the sharpest background. Subsequently, we apply a single-image super-resolution method based on sparse signal representation to the in-focus parts of these selected refocused images. Finally, we obtain the super-resolved all-in-focus image by digitally merging the in-focus background and the in-focus foreground, so that more spatial detail is kept in the output images. Our method enhances the resolution of the refocused image, and only the refocused images with the sharpest foreground and background need to be super-resolved.

  13. Numerical Simulation of Vortex Ring Formation in the Presence of Background Flow: Implications for Squid Propulsion

    NASA Astrophysics Data System (ADS)

    Jiang, Houshuo; Grosenbaugh, Mark A.

    2002-11-01

    Numerical simulations are used to study the laminar vortex ring formation in the presence of background flow. The numerical setup includes a round-headed axisymmetric body with a sharp-wedged opening at the posterior end where a column of fluid is pushed out by a piston inside the body. The piston motion is explicitly included into the simulations by using a deforming mesh. The numerical method is verified by simulating the standard vortex ring formation process in quiescent fluid for a wide range of piston stroke to cylinder diameter ratios (Lm/D). The results from these simulations confirm the existence of a universal formation time scale (formation number) found by others from experimental and numerical studies. For the case of vortex ring formation by the piston/cylinder arrangement in a constant background flow (i.e. the background flow is in the direction of the piston motion), the results show that a smaller fraction of the ejected circulation is delivered into the leading vortex ring, thereby decreasing the formation number. The mechanism behind this reduction is believed to be related to the modification of the shear layer profile between the jet flow and the background flow by the external boundary layer on the outer surface of the cylinder. In effect, the vorticity in the jet is cancelled by the opposite signed vorticity in the external boundary layer. Simulations using different end geometries confirm the general nature of the phenomenon. The thrust generated from the jet and the drag forces acting on the body are calculated with and without background flow for different piston programs. The implications of these results for squid propulsion are discussed.

  14. Low Altitude AVIRIS Data for Mapping Landform Types on West Ship Island, Mississippi

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph; Otvos, Ervin; Giardino, Marco

    2002-01-01

    This paper presents a viewgraph presentation on low altitude AVIRIS data for mapping landform types on West Ship Island, Mississippi. The topics of discussion include: 1) Project background; 2) Mapping methods; 3) Examples of results; 4) Apparent trends; and 5) Final remarks.

  15. Does Entrepreneurship Education Matter? Business Students' Perspectives

    ERIC Educational Resources Information Center

    Egerová, Dana; Eger, Ludvík; Micík, Michal

    2017-01-01

    The paper presents the findings of a mixed-methods study investigating the perceptions of business students in the Czech Republic towards entrepreneurship education, and examining the factors influencing their level of intention to be entrepreneurs. The results indicate that family background significantly influences the student's entrepreneurial…

  16. Pretend Play of Children with Cerebral Palsy

    ERIC Educational Resources Information Center

    Pfeifer, Luzia Iara; Pacciulio, Amanda Mota; dos Santos, Camila Abrao; dos Santos, Jair Licio; Stagnitti, Karen Ellen

    2011-01-01

    Background and Purpose: Evaluate self-initiated pretend play of children with cerebral palsy. Method: Twenty preschool children participated in the study. Pretend play ability was measured by using the child-initiated pretend play assessment culturally adapted to Brazil. Results: There were significant negative correlations between the children's…

  17. Extracurricular Activities and Bullying Perpetration: Results from a Nationally Representative Sample

    ERIC Educational Resources Information Center

    Riese, Alison; Gjelsvik, Annie; Ranney, Megan L.

    2015-01-01

    Background: Bullying is a widespread problem for school-aged children and adolescents. Interventions to reduce bullying are not well disseminated. Extracurricular involvement is, however, common. This study aims to examine the relationship between parent-reported participation in extracurricular activities and bullying perpetration. Methods: Using…

  18. Practitioner Review: Non-Pharmacological Treatments for ADHD: A Lifespan Approach

    ERIC Educational Resources Information Center

    Young, Susan; Amarasinghe, J. Myanthi

    2010-01-01

    Background: Attention-deficit/hyperactivity disorder (ADHD) is a chronic and pervasive developmental disorder that is not restricted to the childhood years. Methods: This paper reviews non-pharmacological interventions that are available at present for preschoolers, school-age children, adolescents and adults. Results: The most appropriate…

  19. The Low Energy Neutrino Spectrometry (LENS) Experiment and LENS prototype, μLENS, initial results

    NASA Astrophysics Data System (ADS)

    Yokley, Zachary

    2012-03-01

    LENS is a low energy solar neutrino detector that will measure the solar neutrino spectrum above 115 keV, covering >95% of the solar neutrino flux, in real time. The fundamental neutrino reaction in LENS is charged-current capture on 115-In, detected in a liquid scintillator medium. The reaction yields the prompt emission of an electron and the delayed emission of 2 gamma rays that serve as a time and space coincidence tag. Sufficient spatial resolution is used to exploit this signature and suppress background, particularly that due to 115-In beta decay. A novel design of optical segmentation (Scintillation Lattice, or SL) channels the signal light along the three primary axes. The channeling is achieved via total internal reflection by suitable low-index gaps in the segmentation. The spatial resolution of a nuclear event is obtained digitally, much more precisely than is possible with common time-of-flight methods. Advanced Geant4 analysis methods have been developed to adequately suppress the severe background due to 115-In beta decay while achieving high detection efficiency. LENS physics and detection methods, along with initial results characterizing light transport in the as-built μLENS prototype, will be presented.

  20. Gastroenterology Curriculum in the Canadian Medical School System.

    PubMed

    Dang, ThucNhi Tran; Wong, Clarence; Bistritz, Lana

    2017-01-01

    Background and Purpose. Gastroenterology is a diverse subspecialty that covers a wide array of topics. The preclinical gastroenterology curriculum is often the only formal training that medical students receive prior to becoming residents. There is no Canadian consensus on learning objectives or instructional methods, and there is a general lack of awareness of the curriculum at other institutions. This results in variable background knowledge for residents and a lack of guidance for course development. Objectives. (1) Elucidate gastroenterology topics being taught at the preclinical level. (2) Determine instructional methods employed to teach gastroenterology content. Results. A curriculum map of gastroenterology topics was constructed from 10 of the medical schools that responded. Topics often not taught included pediatric GI diseases, surgery and trauma, food allergies/intolerances, and obesity. Gastroenterology was taught primarily by gastroenterologists and surgeons. Didactic and small-group teaching were the most employed instructional methods. Conclusion. This study is the first step in examining the Canadian gastroenterology curriculum at the preclinical level. The data can be used to inform curriculum development so that topics generally lacking are better incorporated in the curriculum. The study can also be used as a guide for further curriculum design and alignment across the country.

  1. Segmentation of the Speaker's Face Region with Audiovisual Correlation

    NASA Astrophysics Data System (ADS)

    Liu, Yuyu; Sato, Yoichi

    The ability to find the speaker's face region in a video is useful for various applications. In this work, we develop a novel technique to find this region within different time windows that is robust against changes of view, scale, and background. The main thrust of our technique is to integrate audiovisual correlation analysis into a video segmentation framework. We analyze the audiovisual correlation locally by computing quadratic mutual information between our audiovisual features. The computation of quadratic mutual information is based on probability density functions estimated by kernel density estimation with adaptive kernel bandwidth. The results of this audiovisual correlation analysis are incorporated into graph-cut-based video segmentation to achieve a globally optimal extraction of the speaker's face region. The setting of any heuristic threshold in this segmentation is avoided by learning the correlation distributions of speaker and background via expectation maximization. Experimental results demonstrate that our method can detect the speaker's face region accurately and robustly for different views, scales, and backgrounds.

  2. Image Segmentation Using Minimum Spanning Tree

    NASA Astrophysics Data System (ADS)

    Dewi, M. P.; Armiati, A.; Alvini, S.

    2018-04-01

    This research aims to segment digital images. Segmentation separates the object from the background so that the main object can be processed for other purposes. With the development of technology in digital image processing applications, the segmentation process becomes increasingly necessary, and the segmented image must be accurate because subsequent processing depends on interpreting the information in the image. This article discusses the application of the minimum spanning tree of a graph to the segmentation of digital images. The method separates an object from the background, converting the image to a binary image: the object of interest is set to white, while the background is black, or vice versa.
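    A minimal variant of graph-based segmentation in this spirit can be sketched with Kruskal's algorithm and a union-find structure: merging pixels across the cheapest edges up to a threshold is equivalent to cutting the minimum spanning tree at its heavy edges. The 4-neighbour grid, intensity-difference weights, and threshold value below are illustrative assumptions, not the article's exact procedure:

    ```python
    import numpy as np

    class DSU:
        """Union-find over pixel indices, with path compression."""
        def __init__(self, n):
            self.parent = list(range(n))
        def find(self, x):
            while self.parent[x] != x:
                self.parent[x] = self.parent[self.parent[x]]
                x = self.parent[x]
            return x
        def union(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra != rb:
                self.parent[rb] = ra

    def mst_segment(img, threshold):
        """Kruskal-style segmentation: process 4-neighbour edges in order of
        intensity difference and merge components only across cheap edges,
        i.e. cut the spanning tree at edges heavier than the threshold."""
        h, w = img.shape
        idx = lambda r, c: r * w + c
        edges = []
        for r in range(h):
            for c in range(w):
                if c + 1 < w:
                    edges.append((abs(int(img[r, c]) - int(img[r, c + 1])),
                                  idx(r, c), idx(r, c + 1)))
                if r + 1 < h:
                    edges.append((abs(int(img[r, c]) - int(img[r + 1, c])),
                                  idx(r, c), idx(r + 1, c)))
        dsu = DSU(h * w)
        for wgt, a, b in sorted(edges):
            if wgt <= threshold:
                dsu.union(a, b)
        return np.array([dsu.find(i) for i in range(h * w)]).reshape(h, w)

    img = np.array([[10, 12, 200, 205],
                    [11, 13, 198, 202]], dtype=np.uint8)
    labels = mst_segment(img, threshold=10)
    binary = (labels == labels[0, 2]).astype(np.uint8)  # object white (1), background black (0)
    print(binary)
    ```

    On the toy 2x4 image the bright block separates cleanly from the dark block, and relabeling the components yields the binary image the abstract describes.
    
    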

  3. Countermeasure against blinding attacks on low-noise detectors with a background-noise-cancellation scheme

    NASA Astrophysics Data System (ADS)

    Lee, Min Soo; Park, Byung Kwon; Woo, Min Ki; Park, Chang Hoon; Kim, Yong-Su; Han, Sang-Wook; Moon, Sung

    2016-12-01

    We developed a countermeasure against blinding attacks on low-noise detectors with a background-noise-cancellation scheme in quantum key distribution (QKD) systems. Background-noise cancellation includes self-differencing and balanced avalanche photon diode (APD) schemes and is considered a promising solution for low-noise APDs, which are critical components in high-performance QKD systems. However, its vulnerability to blinding attacks has been recently reported. In this work, we propose a countermeasure that prevents this potential security loophole from being used in detector blinding attacks. An experimental QKD setup is implemented and various tests are conducted to verify the feasibility and performance of the proposed method. The obtained measurement results show that the proposed scheme successfully detects occurring blinding-attack-based hacking attempts.

  4. Strain-Specific Induction of Experimental Autoimmune Prostatitis (EAP) in Mice

    PubMed Central

    Jackson, Christopher M.; Flies, Dallas B.; Mosse, Claudio A.; Parwani, Anil; Hipkiss, Edward L.; Drake, Charles G.

    2013-01-01

    BACKGROUND Prostatitis, a clinical syndrome characterized by pelvic pain and inflammation, is common in adult males. Although several induced and spontaneous murine models of prostatitis have been explored, the role of genetic background on induction has not been well-defined. METHODS Using a standard methodology for the induction of experimental autoimmune prostatitis (EAP), we investigated both acute and chronic inflammation on several murine genetic backgrounds. RESULTS In our colony, nonobese diabetic (NOD) mice evinced spontaneous prostatitis that was not augmented by immunization with rat prostate extract (RPE). In contrast, the standard laboratory strain Balb/c developed chronic inflammation in response to RPE immunization. Development of EAP in other strains was variable. CONCLUSIONS These data suggest that Balb/c mice injected with RPE may provide a useful model for chronic prostatic inflammation. PMID:23129407

  5. Continuous Glucose Monitoring in Subjects with Type 1 Diabetes: Improvement in Accuracy by Correcting for Background Current

    PubMed Central

    Youssef, Joseph El; Engle, Julia M.; Massoud, Ryan G.; Ward, W. Kenneth

    2010-01-01

    Abstract Background A cause of suboptimal accuracy in amperometric glucose sensors is the presence of a background current (current produced in the absence of glucose) that is not accounted for. We hypothesized that a mathematical correction for the estimated background current of a commercially available sensor would lead to greater accuracy compared to a situation in which we assumed the background current to be zero. We also tested whether increasing the frequency of sensor calibration would improve sensor accuracy. Methods This report includes analysis of 20 sensor datasets from seven human subjects with type 1 diabetes. Data were divided into a training set for algorithm development and a validation set on which the algorithm was tested. A range of potential background currents was tested. Results Use of the background current correction of 4 nA led to a substantial improvement in accuracy (improvement of absolute relative difference or absolute difference of 3.5–5.5 units). An increase in calibration frequency led to a modest accuracy improvement, with an optimum at every 4 h. Conclusions Compared to no correction, a correction for the estimated background current of a commercially available glucose sensor led to greater accuracy and better detection of hypoglycemia and hyperglycemia. The accuracy-optimizing scheme presented here can be implemented in real time. PMID:20879968
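    The correction itself is a small algebraic step: subtract the assumed background current before both calibration and estimation. A sketch under assumed numbers follows — the 4 nA correction is from the abstract, while the calibration point and measured currents are hypothetical:

    ```python
    def calibrate_sensitivity(current_cal, glucose_ref, i_bg):
        """Sensitivity (nA per mg/dL) from a single reference calibration
        point, after subtracting the assumed background current i_bg."""
        return (current_cal - i_bg) / glucose_ref

    def estimate_glucose(current, sensitivity, i_bg):
        """Glucose estimate with background-current correction."""
        return (current - i_bg) / sensitivity

    i_bg = 4.0  # nA; the correction value the study found to work well
    s = calibrate_sensitivity(current_cal=24.0, glucose_ref=100.0, i_bg=i_bg)

    corrected = estimate_glucose(14.0, s, i_bg)               # low measured current
    uncorrected = estimate_glucose(14.0, 24.0 / 100.0, 0.0)   # same current, no correction
    print(corrected, uncorrected)
    ```

    At low currents the uncorrected estimate reads higher than the corrected one, consistent with the abstract's finding that the correction improves detection of hypoglycemia.
    
    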

  6. US Forest Service and National Park Service Wilderness Aircraft Overflight Study: Sociological background and study plans

    NASA Technical Reports Server (NTRS)

    Harrison, Robin T.; Hartmann, Lawrence

    1990-01-01

    The background and sociological aspects of the combined U.S. Forest Service and National Park Service Wilderness Aircraft Overflight Study (WACOS) are presented. The WACOS broaches a new area of research by combining aspects of outdoor recreation sociology and aircraft noise response studies. The tasks faced create new challenges and require innovative solutions. Background information on the WACOS is presented with special emphasis on sociological considerations. At the time of this writing, no data have yet been collected, so this paper will present background information, related issues, and plans for data collection. Some recent studies indicate that managers of Forest Service wildernesses and National Park Service areas consider aircraft overflights to be a problem to their users in some areas. Additional relevant background research from outdoor recreation sociology is discussed, followed by presentation of the authors' opinions of the most salient sociological issues faced by this study. The goals and desired end products are identified next, followed by a review of the methods anticipated to be used to obtain these results. Finally, a discussion and conclusion section is provided.

  7. Comparison of Standard Link Color Visibility Between Young Adults and Elderly Adults

    NASA Astrophysics Data System (ADS)

    Saito, Daisuke; Saito, Keiichi; Notomi, Kazuhiro; Saito, Masao

    The rapid dissemination of the World Wide Web raises the issue of Web accessibility, and one important factor is the combination of a foreground color and a background color. In our previous study, the visibility of web-safe colors on a white background was examined, and the blue used for unvisited standard links was found to have high visibility over a wide range of ages. Since the use of blue and an underline is recommended for links, in this study we examined high-visibility background colors for the unvisited standard link color, i.e., blue. One hundred and twenty-three background colors were examined using the pair comparison method, and the relationship between visibility and color difference was discussed on the uniform color space CIELAB (L*a*b* color space). As a result, effective background colors for the standard link color were determined on CIELAB: L* larger than 68, a* smaller than 50, and b* larger than -50 provided high visibility over a wide range of ages.
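    The reported region can be expressed as a simple predicate; the boundary values are taken from the abstract, while the sample colors are illustrative:

    ```python
    def high_visibility_background(L, a, b):
        """True when a background color (CIELAB coordinates) falls in the
        region the study reports as highly visible behind the standard
        blue link: L* > 68, a* < 50, b* > -50."""
        return L > 68 and a < 50 and b > -50

    print(high_visibility_background(100.0, 0.0, 0.0))     # white background
    print(high_visibility_background(32.3, 79.2, -107.9))  # blue-like background
    ```

    A white background satisfies all three bounds, while a dark blue-like background fails both the lightness and the b* bound — matching the intuition that blue text on a blue background is hard to read.
    
    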

  8. Central Stars of Planetary Nebulae in the LMC

    NASA Technical Reports Server (NTRS)

    Bianchi, Luciana

    2004-01-01

    In FUSE cycle 2's program B001 we studied Central Stars of Planetary Nebulae (CSPN) in the Large Magellanic Cloud. All FUSE observations have been successfully completed and have been reduced, analyzed, and published. The analysis and the results are summarized below. The FUSE data were reduced using the latest available version of the FUSE calibration pipeline (CALFUSE v2.2.2). The flux of these LMC post-AGB objects is at the threshold of FUSE's sensitivity, and thus special care in the background subtraction was needed during the reduction. Because of their faintness, the targets required many orbit-long exposures, each of which typically had low (target) count rates. Each calibrated extracted sequence was checked for unacceptable count-rate variations (a sign of detector drift), misplaced extraction windows, and other anomalies. All the good calibrated exposures were combined using FUSE pipeline routines. The default FUSE pipeline attempts to model the background measured off-target and subtracts it from the target spectrum. We found that, for these faint objects, the background appeared to be over-estimated by this method, particularly at shorter wavelengths (i.e., < 1000 A). We therefore tried two other reductions. In the first method, subtraction of the measured background is turned off and the background is taken to be the model scattered light scaled by the exposure time. In the second, the first few steps of the pipeline were run on the individual exposures (correcting for effects unique to each exposure such as Doppler shift, grating motions, etc.). Then the photon lists from the individual exposures were combined, and the remaining steps of the pipeline were run on the combined file. Thus, more total counts for both the target and background allowed for a better extraction.

  9. A new, fast and semi-automated size determination method (SASDM) for studying multicellular tumor spheroids

    PubMed Central

    Monazzam, Azita; Razifar, Pasha; Lindhe, Örjan; Josephsson, Raymond; Långström, Bengt; Bergström, Mats

    2005-01-01

    Background Considering the widespread use and importance of Multicellular Tumor Spheroids (MTS) in oncology research, size determination of MTSs by an accurate and fast method is essential. In the present study an effective, fast and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested on MTSs of three different cell lines. Frozen-section autoradiography and Hematoxylin-Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly, and time-efficient, to be more precise than the traditional methods, and to be applicable to MTSs of different cell lines. Furthermore, the results of image analysis showed high correspondence to the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments. PMID:16283948

  10. 3-D imaging of large scale buried structure by 1-D inversion of very early time electromagnetic (VETEM) data

    USGS Publications Warehouse

    Aydmer, A.A.; Chew, W.C.; Cui, T.J.; Wright, D.L.; Smith, D.V.; Abraham, J.D.

    2001-01-01

    A simple and efficient method for large scale three-dimensional (3-D) subsurface imaging of inhomogeneous background is presented. One-dimensional (1-D) multifrequency distorted Born iterative method (DBIM) is employed in the inversion. Simulation results utilizing synthetic scattering data are given. Calibration of the very early time electromagnetic (VETEM) experimental waveforms is detailed along with major problems encountered in practice and their solutions. This discussion is followed by the results of a large scale application of the method to the experimental data provided by the VETEM system of the U.S. Geological Survey. The method is shown to have a computational complexity that is promising for on-site inversion.

  11. Verification of chemistry reference ranges using a simple method in sub-Saharan Africa

    PubMed Central

    Taylor, Douglas; Mandala, Justin; Nanda, Kavita; Van Campenhout, Christel; Agingu, Walter; Madurai, Lorna; Barsch, Eva-Maria; Deese, Jennifer; Van Damme, Lut; Crucitti, Tania

    2016-01-01

    Background Chemistry safety assessments are interpreted by using chemistry reference ranges (CRRs). Verification of CRRs is time consuming and often requires a statistical background. Objectives We report on an easy and cost-saving method to verify CRRs. Methods Using a former method introduced by Sigma Diagnostics, three study sites in sub-Saharan Africa, Bondo, Kenya, and Pretoria and Bloemfontein, South Africa, verified the CRRs for hepatic and renal biochemistry assays performed during a clinical trial of HIV antiretroviral pre-exposure prophylaxis. The aspartate aminotransferase/alanine aminotransferase, creatinine and phosphorus results from 10 clinically-healthy participants at the screening visit were used. In the event the CRRs did not pass the verification, new CRRs had to be calculated based on 40 clinically-healthy participants. Results Within a few weeks, the study sites accomplished verification of the CRRs without additional costs. The aspartate aminotransferase reference ranges for the Bondo, Kenya site and the alanine aminotransferase reference ranges for the Pretoria, South Africa site required adjustment. The phosphorus CRR passed verification and the creatinine CRR required adjustment at every site. The newly-established CRR intervals were narrower than the CRRs used previously at these study sites due to decreases in the upper limits of the reference ranges. As a result, more toxicities were detected. Conclusion To ensure the safety of clinical trial participants, verification of CRRs should be standard practice in clinical trials conducted in settings where the CRR has not been validated for the local population. This verification method is simple, inexpensive, and can be performed by any medical laboratory. PMID:28879112

  12. A comparison of a two-dimensional variational analysis method and a median filter for NSCAT ambiguity removal

    NASA Astrophysics Data System (ADS)

    Henderson, J. M.; Hoffman, R. N.; Leidner, S. M.; Atlas, R.; Brin, E.; Ardizzone, J. V.

    2003-06-01

    The ocean surface vector wind can be measured from space by scatterometers. For a set of measurements observed from several viewing directions and collocated in space and time, there will usually exist two, three, or four consistent wind vectors. These multiple wind solutions are known as ambiguities. Ambiguity removal procedures select one ambiguity at each location. We compare results of two different ambiguity removal algorithms, the operational median filter (MF) used by the Jet Propulsion Laboratory (JPL) and a two-dimensional variational analysis method (2d-VAR). We applied 2d-VAR to the entire NASA Scatterometer (NSCAT) mission, orbit by orbit, using European Centre for Medium-Range Weather Forecasts (ECMWF) 10-m wind analyses as background fields. We also applied 2d-VAR to a 51-day subset of the NSCAT mission using National Centers for Environmental Prediction (NCEP) 1000-hPa wind analyses as background fields. This second data set uses the same background fields as the MF data set. When both methods use the same NCEP background fields as a starting point for ambiguity removal, agreement is very good: Approximately only 3% of the wind vector cells (WVCs) have different ambiguity selections; however, most of the WVCs with changes occur in coherent patches. Since at least one of the selections is in error, this implies that errors due to ambiguity selection are not isolated, but are horizontally correlated. When we examine ambiguity selection differences at synoptic scales, we often find that the 2d-VAR selections are more meteorologically reasonable and more consistent with cloud imagery.
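    The median-filter style of ambiguity removal can be sketched roughly as follows. This is a simplified illustration, not JPL's operational algorithm: for brevity it uses the local mean wind rather than a true vector median, and all names and array shapes are assumptions.

```python
import numpy as np

def median_filter_select(ambiguities, selected, window=3, n_iter=10):
    """ambiguities: (ny, nx, k, 2) candidate wind vectors (u, v) per
    wind vector cell; selected: (ny, nx) initial ambiguity indices
    (e.g. from a background field).  Iteratively re-select, at each
    cell, the ambiguity closest to the locally filtered wind."""
    ny, nx, k, _ = ambiguities.shape
    h = window // 2
    for _ in range(n_iter):
        # currently selected wind field, shape (ny, nx, 2)
        field = np.take_along_axis(
            ambiguities, selected[..., None, None], axis=2)[:, :, 0, :]
        new = selected.copy()
        for j in range(ny):
            for i in range(nx):
                patch = field[max(0, j - h):j + h + 1,
                              max(0, i - h):i + h + 1]
                ref = patch.reshape(-1, 2).mean(axis=0)  # local filter value
                d = np.linalg.norm(ambiguities[j, i] - ref, axis=1)
                new[j, i] = int(np.argmin(d))
        if np.array_equal(new, selected):   # converged
            break
        selected = new
    return selected
```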

  13. Global universe anisotropy probed by the alignment of structures in the cosmic microwave background.

    PubMed

    Wiaux, Y; Vielva, P; Martínez-González, E; Vandergheynst, P

    2006-04-21

    We question the global universe isotropy by probing the alignment of local structures in the cosmic microwave background (CMB) radiation. The original method proposed relies on a steerable wavelet decomposition of the CMB signal on the sphere. The analysis of the first-year Wilkinson Microwave Anisotropy Probe data identifies a mean preferred plane with a normal direction close to the CMB dipole axis, and a mean preferred direction in this plane, very close to the ecliptic poles axis. Previous statistical anisotropy results are thereby synthesized, but further analyses are still required to establish their origin.

  14. Accuracy of Cochlear Implant Recipients on Speech Reception in Background Music

    PubMed Central

    Gfeller, Kate; Turner, Christopher; Oleson, Jacob; Kliethermes, Stephanie; Driscoll, Virginia

    2012-01-01

    Objectives This study (a) examined speech recognition abilities of cochlear implant (CI) recipients in the spectrally complex listening condition of three contrasting types of background music, and (b) compared performance based upon listener groups: CI recipients using conventional long-electrode (LE) devices, Hybrid CI recipients (acoustic plus electric stimulation), and normal-hearing (NH) adults. Methods We tested 154 LE CI recipients using varied devices and strategies, 21 Hybrid CI recipients, and 49 NH adults on closed-set recognition of spondees presented in three contrasting forms of background music (piano solo, large symphony orchestra, vocal solo with small combo accompaniment) in an adaptive test. Outcomes Signal-to-noise thresholds for speech in music (SRTM) were examined in relation to measures of speech recognition in background noise and multi-talker babble, pitch perception, and music experience. Results SRTM thresholds varied as a function of category of background music, group membership (LE, Hybrid, NH), and age. Thresholds for speech in background music were significantly correlated with measures of pitch perception and speech in background noise thresholds; auditory status was an important predictor. Conclusions Evidence suggests that speech reception thresholds in background music change as a function of listener age (with more advanced age being detrimental), structural characteristics of different types of music, and hearing status (residual hearing). These findings have implications for everyday listening conditions such as communicating in social or commercial situations in which there is background music. PMID:23342550

  15. Automatic background updating for video-based vehicle detection

    NASA Astrophysics Data System (ADS)

    Hu, Chunhai; Li, Dongmei; Liu, Jichuan

    2008-03-01

    Video-based vehicle detection is one of the most valuable techniques for the Intelligent Transportation System (ITS). The widely used video-based vehicle detection technique is the background subtraction method. The key problem of this method is how to subtract and update the background effectively. In this paper an efficient background updating scheme based on Zone-Distribution for vehicle detection is proposed to resolve the problems caused by sudden camera perturbation, sudden or gradual illumination change and the sleeping person problem. The proposed scheme is robust and fast enough to satisfy the real-time constraints of vehicle detection.
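    The background-subtraction baseline that such schemes build on can be sketched as a running average that is frozen wherever motion is detected (one common way to avoid absorbing stopped vehicles into the background). This shows only the generic method, not the paper's Zone-Distribution scheme; the parameter values are assumptions.

```python
import numpy as np

def update_background(bg, frame, alpha=0.05, thresh=30):
    """Running-average background subtraction: pixels differing from
    the background by more than `thresh` are flagged as foreground;
    the background is updated only where the scene looks static."""
    diff = np.abs(frame.astype(float) - bg)
    foreground = diff > thresh
    bg = np.where(foreground, bg, (1 - alpha) * bg + alpha * frame)
    return bg, foreground
```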

  16. Detection of telomerase on upconversion nanoparticle modified cellulose paper.

    PubMed

    Wang, Faming; Li, Wen; Wang, Jiasi; Ren, Jinsong; Qu, Xiaogang

    2015-07-25

    Herein we report a convenient and sensitive method for the detection of telomerase activity based on upconversion nanoparticle (UCNP) modified cellulose paper. Compared with many solution-phase systems, this paper chip is more stable and stores the test results easily. Moreover, the low background fluorescence of the UCNPs increases the sensitivity of the method, and the low telomerase levels in different cell lines can be clearly discriminated by the naked eye.

  17. Factor analysis as a tool for spectral line component separation: 21cm emission in the direction of L1780

    NASA Technical Reports Server (NTRS)

    Toth, L. V.; Mattila, K.; Haikala, L.; Balazs, L. G.

    1992-01-01

    The spectra of the 21cm HI radiation from the direction of L1780, a small high-galactic latitude dark/molecular cloud, were analyzed by multivariate methods. Factor analysis was performed on HI (21cm) spectra in order to separate the different components responsible for the spectral features. The rotated, orthogonal factors explain the spectra as a sum of radiation from the background (an extended HI emission layer), and from the L1780 dark cloud. The coefficients of the cloud-indicator factors were used to locate the HI 'halo' of the molecular cloud. Our statistically derived 'background' and 'cloud' spectral profiles, as well as the spatial distribution of the HI halo emission distribution were compared to the results of a previous study which used conventional methods analyzing nearly the same data set.
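    The decomposition idea can be sketched with plain principal-component analysis, a close relative of the factor analysis used in the record (the rotation step, e.g. varimax, is omitted here; the shapes and names are illustrative assumptions).

```python
import numpy as np

def principal_components(spectra, n_comp=2):
    """spectra: (n_positions, n_channels) array of 21cm line profiles.
    Returns the mean profile, the leading orthogonal components
    (candidate 'background' and 'cloud' profiles), and the
    per-position coefficients."""
    mean = spectra.mean(axis=0)
    centered = spectra - mean
    # SVD of the centered data: rows project onto orthogonal profiles
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_comp]            # orthogonal spectral profiles
    coeffs = centered @ components.T    # coefficients per sky position
    return mean, components, coeffs
```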

  18. Ocean Predictability and Uncertainty Forecasts Using the Local Ensemble Transform Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, M.; Hogan, P. J.; Rowley, C. D.; Smedstad, O. M.; Wallcraft, A. J.; Penny, S. G.

    2017-12-01

    Ocean predictability and uncertainty are studied with an ensemble system that has been developed from the US Navy's operational HYCOM using the Local Ensemble Transform Kalman Filter (LETKF) technology. One advantage of this method is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates operational observations using an ensemble method. The background covariance during this assimilation process is supplied implicitly by the ensemble, avoiding the difficult task of developing tangent-linear and adjoint models of HYCOM, with its complicated hybrid isopycnal vertical coordinate, for 4D-Var. The flow-dependent background covariance from the ensemble will be an indispensable part of the next-generation hybrid 4D-Var/ensemble data assimilation system. The predictability and uncertainty of the ocean forecasts are studied initially for the Gulf of Mexico. The results are compared with another ensemble system using the Ensemble Transform (ET) method, which has been used in the Navy's operational center. The advantages and disadvantages are discussed.
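    The core idea, estimating the background covariance from the ensemble instead of linearising the model, can be sketched with a basic stochastic ensemble Kalman update. This is not the LETKF itself (which performs the update in the ensemble subspace, separately for each local region); it is a reduced illustration, and all names are assumptions.

```python
import numpy as np

def enkf_update(ensemble, H, y, r):
    """ensemble: (n_members, n_state); H: (n_obs, n_state) observation
    operator; y: (n_obs,) observations; r: observation error variance.
    The background covariance is estimated from the ensemble itself,
    so no tangent-linear or adjoint model is needed."""
    rng = np.random.default_rng(0)
    n = ensemble.shape[0]
    X = ensemble - ensemble.mean(axis=0)       # ensemble anomalies
    Pb = X.T @ X / (n - 1)                     # background covariance
    S = H @ Pb @ H.T + r * np.eye(len(y))
    K = Pb @ H.T @ np.linalg.inv(S)            # Kalman gain
    # perturbed observations (stochastic EnKF variant)
    yp = y + rng.normal(0, np.sqrt(r), size=(n, len(y)))
    innov = yp - ensemble @ H.T
    return ensemble + innov @ K.T
```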

  19. ArtDeco: a beam-deconvolution code for absolute cosmic microwave background measurements

    NASA Astrophysics Data System (ADS)

    Keihänen, E.; Reinecke, M.

    2012-12-01

    We present a method for beam-deconvolving cosmic microwave background (CMB) anisotropy measurements. The code takes as input the time-ordered data along with the corresponding detector pointings and known beam shapes, and produces as output the harmonic a^T_lm, a^E_lm, and a^B_lm coefficients of the observed sky. From these one can derive temperature and Q and U polarisation maps. The method is applicable to absolute CMB measurements with wide sky coverage, and is independent of the scanning strategy. We tested the code with extensive simulations, mimicking the resolution and data volume of the Planck 30 GHz and 70 GHz channels, but with exaggerated beam asymmetry. We applied it to multipoles up to l = 1700 and examined the results in both pixel space and harmonic space. We also tested the method in the presence of white noise. The code is released under the terms of the GNU General Public License and can be obtained from http://sourceforge.net/projects/art-deco/

  20. Image processing of metal surface with structured light

    NASA Astrophysics Data System (ADS)

    Luo, Cong; Feng, Chang; Wang, Congzheng

    2014-09-01

    In a structured light vision measurement system, the ideal image of a structured light stripe contains, apart from the black background, only the gray-level information at the position of the stripe. The actual image, however, contains image noise, a complex background, and other content that does not belong to the stripe and interferes with the useful information. To extract the stripe center on a metal surface accurately, a new processing method is presented. Adaptive median filtering first removes most of the noise, and the noise introduced by the CCD camera and the measurement environment is further removed with a difference-image method. To highlight fine details and enhance the blurred regions between the stripe and the background, a sharpening algorithm is used that combines the best features of the Laplacian and Sobel operators. Morphological opening and closing operations then compensate for the loss of information. Experimental results show that the method is effective: it not only suppresses the unwanted information but also heightens contrast, which benefits the subsequent processing.
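    The sharpening step described above can be sketched with NumPy. The kernels are the standard Laplacian and Sobel masks, but the way they are blended (the weight `w`) is an assumption, not the paper's exact combination.

```python
import numpy as np

def filter2d(img, k):
    """Naive 'same' 3x3 cross-correlation with zero padding."""
    out = np.zeros(img.shape, dtype=float)
    p = np.pad(img.astype(float), 1)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

def sharpen(img, w=0.5):
    """Subtract the Laplacian response (fine detail) and add a weighted
    Sobel gradient magnitude (edge strength) to enhance the stripe."""
    lap = filter2d(img, np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]]))
    gx = filter2d(img, np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]))
    gy = filter2d(img, np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]]))
    return img - lap + w * np.hypot(gx, gy)
```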

  1. Probabilistic segmentation and intensity estimation for microarray images.

    PubMed

    Gottardo, Raphael; Besag, Julian; Stephens, Matthew; Murua, Alejandro

    2006-01-01

    We describe a probabilistic approach to simultaneous image segmentation and intensity estimation for complementary DNA microarray experiments. The approach overcomes several limitations of existing methods. In particular, it (a) uses a flexible Markov random field approach to segmentation that allows for a wider range of spot shapes than existing methods, including relatively common 'doughnut-shaped' spots; (b) models the image directly as background plus hybridization intensity, and estimates the two quantities simultaneously, avoiding the common logical error that estimates of foreground may be less than those of the corresponding background if the two are estimated separately; and (c) uses a probabilistic modeling approach to simultaneously perform segmentation and intensity estimation, and to compute spot quality measures. We describe two approaches to parameter estimation: a fast algorithm, based on the expectation-maximization and the iterated conditional modes algorithms, and a fully Bayesian framework. These approaches produce comparable results, and both appear to offer some advantages over other methods. We use an HIV experiment to compare our approach to two commercial software products: Spot and Arrayvision.
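    A heavily reduced sketch of the intensity-model side is a two-component EM fit (background vs. spot intensities). This omits the paper's Markov random field over pixel labels and its fully Bayesian variant; it is only meant to show the expectation-maximisation loop, and all names are assumptions.

```python
import numpy as np

def em_two_gaussians(x, n_iter=50):
    """Fit a two-component Gaussian mixture to pixel intensities x
    (1-D array): one component for background, one for spot signal."""
    mu = np.array([x.min(), x.max()], dtype=float)
    sigma = np.array([x.std(), x.std()]) + 1e-9
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each pixel
        d = (x[:, None] - mu) / sigma
        logp = -0.5 * d**2 - np.log(sigma) + np.log(pi)
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and standard deviations
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / nk) + 1e-9
    return pi, mu, sigma
```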

  2. Physical Activity Engagement in Young People with Down Syndrome: Investigating Parental Beliefs

    ERIC Educational Resources Information Center

    Alesi, Marianna; Pepi, Annamaria

    2017-01-01

    Background: Despite the wide documentation of the physical/psychological benefits derived from regular physical activity (PA), high levels of inactivity are reported among people with Down syndrome. Method: Semi-structured interviews were conducted with 13 parents of young people with Down syndrome. Results: Three facilitation themes were…

  3. Providing Counseling for Individuals with Alzheimer's Disease and Their Caregivers

    ERIC Educational Resources Information Center

    Granello, Paul F.; Fleming, Matthew S.

    2008-01-01

    Alzheimer's disease is a progressive condition that results in brain wasting and eventual death. With its increasing diagnosis rate, counselors will likely acquire clients with Alzheimer's disease or their caregivers. Important background information and several practical counseling methods are provided that may assist counselors working with this…

  4. Association of objectively measured physical activity with cardiovascular risk in mobility-limited older adults

    USDA-ARS?s Scientific Manuscript database

    Background: Data are sparse regarding the impacts of habitual physical activity (PA) and sedentary behavior on cardiovascular (CV) risk in older adults with mobility limitations. Methods and Results: This study examined the baseline, cross-sectional association between CV risk and objectively measur...

  5. Evaluation of the Special Tertiary Admissions Test (STAT)

    ERIC Educational Resources Information Center

    Coates, Hamish; Friedman, Tim

    2010-01-01

    This paper reports findings from the first national Australian study of the predictive validity of the Special Tertiary Admissions Test (STAT). Background on tertiary admissions procedures in Australia is presented, followed by information on STAT and the research methods. The results affirm that STAT, through the provision of baseline and…

  6. Predictors of Care-Giver Stress in Families of Preschool-Aged Children with Developmental Disabilities

    ERIC Educational Resources Information Center

    Plant, K. M.; Sanders, M. R.

    2007-01-01

    Background: This study examined the predictors, mediators and moderators of parent stress in families of preschool-aged children with developmental disability. Method: One hundred and five mothers of preschool-aged children with developmental disability completed assessment measures addressing the key variables. Results: Analyses demonstrated that…

  7. Annotation: Childhood-Onset Schizophrenia--Clinical and Treatment Issues

    ERIC Educational Resources Information Center

    Asarnow, Joan Rosenbaum; Tompson, Martha C.; McGrath, Emily P.

    2004-01-01

    Background: In the past 10 years, there has been increased research on childhood-onset schizophrenia and clear advances have been achieved. Method: This annotation reviews the recent clinical and treatment literature on childhood-onset schizophrenia. Results: There is now strong evidence that the syndrome of childhood-onset schizophrenia exists…

  8. High dose simvastatin exhibits enhanced lipid lowering effects relative to simvastatin/ezetimibe combination therapy

    USDA-ARS?s Scientific Manuscript database

    Technical Abstract: Background: Statins are the frontline in cholesterol reduction therapies; however, use in combination with agents that possess complementary mechanisms of action may achieve further reductions in LDL-C. Methods and Results: Thirty-nine patients were treated with either 80mg simvasta...

  9. Multi-year effects of feral sorghum spp under ambient and global change conditions in sunlit mesocosms

    EPA Science Inventory

    Background/Questions/Methods Biofuel crops, proposed as a means to reduce dependence on fossil fuels, raise concerns regarding ecological risks of their escape from cultivation. We report here second year results of our study on potential effects of feral biofuel crops on nati...

  10. An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.

    ERIC Educational Resources Information Center

    Moehs, Peter J.; Levine, Samuel

    1982-01-01

    A simple isotopic dilution analysis whose principles apply to methods of more complex radioanalyses is described. Suitable for clinical and instrumental analysis chemistry students, the experiment keeps manipulations to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…
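    The arithmetic behind an isotope-dilution analysis is compact enough to show directly. This is the textbook relation, not this experiment's specific protocol, and the example numbers are invented.

```python
def isotope_dilution_mass(m_tracer, s_tracer, s_mixed):
    """Classic isotope-dilution relation: adding m_tracer of a labelled
    compound with specific activity s_tracer to an unknown mass m of
    unlabelled compound dilutes the activity to s_mixed, so
    m = m_tracer * (s_tracer / s_mixed - 1)."""
    return m_tracer * (s_tracer / s_mixed - 1.0)

# e.g. 10 mg of tracer at 5000 cpm/mg, diluted to 1000 cpm/mg,
# implies 40 mg of unlabelled analyte in the sample
m = isotope_dilution_mass(10.0, 5000.0, 1000.0)
```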

  11. Exergame Apps and Physical Activity: The Results of the ZOMBIE Trial

    ERIC Educational Resources Information Center

    Cowdery, Joan; Majeske, Paul; Frank, Rebecca; Brown, Devin

    2015-01-01

    Background: Although there are thousands of health and fitness smartphone apps currently available, little research exists regarding the effects of mobile app technology on physical activity behavior. Purpose: The purpose of this study was to test whether Exergame smartphone applications increase physical activity levels. Methods: This was a…

  12. A whole-genome assembly of the domestic cow, Bos taurus

    USDA-ARS?s Scientific Manuscript database

    Background: The genome of the domestic cow, Bos taurus, was sequenced using a mixture of hierarchical and whole-genome shotgun sequencing methods. Results: We have assembled the 35 million sequence reads and applied a variety of assembly improvement techniques, creating an assembly of 2.86 billion b...

  13. Aetiology of Autism: Findings and Questions

    ERIC Educational Resources Information Center

    Rutter, M.

    2005-01-01

    Background Although there is good evidence that autism is a multifactorial disorder, an adequate understanding of the genetic and non-genetic causes has yet to be achieved. Methods Empirical research findings and conceptual reviews are reviewed with respect to evidence on possible causal influences. Results Much the strongest evidence concerns the…

  14. Happiness in Midlife Parental Roles: A Contextual Mixed Methods Analysis

    ERIC Educational Resources Information Center

    Mitchell, Barbara A.

    2010-01-01

    This article focuses on midlife parental role satisfaction using data from a culturally diverse sample of 490 Metro Vancouver, British Columbia, Canada, parents. Results show that most parents are happy in their roles. Income satisfaction, intergenerational relationship quality, parents' main activity, health, age, ethnic background, and…

  15. Graph-based analysis of connectivity in spatially-explicit population models: HexSim and the Connectivity Analysis Toolkit

    EPA Science Inventory

    Background / Question / Methods Planning for the recovery of threatened species is increasingly informed by spatially-explicit population models. However, using simulation model results to guide land management decisions can be difficult due to the volume and complexity of model...

  16. Emanuel Miller Lecture: Confusions and Controversies about Asperger Syndrome

    ERIC Educational Resources Information Center

    Frith, Uta

    2004-01-01

    Background: Hans Asperger drew attention to individuals who show the core symptoms of autism in the presence of high verbal intelligence. Methods: A review of the literature explores current issues concerning the diagnosis and nature of Asperger syndrome. Results: The behavioural and neurophysiological evidence to date suggests that Asperger…

  17. Silviculture of southwestern mixed conifers and aspen: the status of our knowledge

    Treesearch

    John R. Jones

    1974-01-01

    Describes the status of our knowledge about mixed conifer silviculture in the interior Southwest. Ecological background is reviewed first, followed by description of silvicultural methods. Relevant literature is discussed, along with observations, experience, and results of unpublished research. Contains unpublished input by subject-matter specialists and southwestern...

  18. Rapidly measured indicators of recreational water quality andswimming-associated illness at marine beaches: a prospective cohort study

    EPA Science Inventory

    Background: In the United States and elsewhere, recreational water is monitored for fecal indicator bacteria to prevent illness. Standard methods to measure fecal indicator bacteria take at least 24 hours to obtain results. Molecular approaches such as quantitative polymerase cha...

  19. Configurations of Common Childhood Psychosocial Risk Factors

    ERIC Educational Resources Information Center

    Copeland, William; Shanahan, Lilly; Costello, E. Jane; Angold, Adrian

    2009-01-01

    Background: Co-occurrence of psychosocial risk factors is commonplace, but little is known about psychiatrically-predictive configurations of psychosocial risk factors. Methods: Latent class analysis (LCA) was applied to 17 putative psychosocial risk factors in a representative population sample of 920 children ages 9 to 17. The resultant class…

  20. Selenium geochemistry in reclaimed phosphate mine soils and its relationship with plant bioavailability

    USDA-ARS?s Scientific Manuscript database

    Background and Aims Selenium contamination and accumulation in vegetation have resulted in Se toxicity in livestock and wildlife in reclaimed phosphate mine soils in Southeastern Idaho. Methods Plant and soil samples were collected from five study sites near phosphate mines. Soil physiochemical pr...

  1. Apples to Oranges: Benchmarking Vocational Education and Training Programmes

    ERIC Educational Resources Information Center

    Bogetoft, Peter; Wittrup, Jesper

    2017-01-01

    This paper discusses methods for benchmarking vocational education and training colleges and presents results from a number of models. It is conceptually difficult to benchmark vocational colleges. The colleges typically offer a wide range of course programmes, and the students come from different socioeconomic backgrounds. We solve the…

  2. Prediction of Transport Properties of Permeants through Polymer Films. A Simple Gravimetric Experiment.

    ERIC Educational Resources Information Center

    Britton, L. N.; And Others

    1988-01-01

    Considers the applicability of the simple immersion/weight-gain method for predicting diffusion coefficients, solubilities, and permeation rates of chemicals in polymers that do not undergo physical and chemical deterioration. Presents the theoretical background, procedures and typical results related to this activity. (CW)

  3. The Relations among Cumulative Risk, Parenting, and Behavior Problems during Early Childhood

    ERIC Educational Resources Information Center

    Trentacosta, Christopher J.; Hyde, Luke W.; Shaw, Daniel S.; Dishion, Thomas J.; Gardner, Frances; Wilson, Melvin

    2008-01-01

    Background: This study examined relations among cumulative risk, nurturant and involved parenting, and behavior problems across early childhood. Methods: Cumulative risk, parenting, and behavior problems were measured in a sample of low-income toddlers participating in a family-centered program to prevent conduct problems. Results: Path analysis…

  4. Instruments Assessing Anxiety in Adults with Intellectual Disabilities: A Systematic Review

    ERIC Educational Resources Information Center

    Hermans, Heidi; van der Pas, Femke H.; Evenhuis, Heleen M.

    2011-01-01

    Background: In the last decades several instruments measuring anxiety in adults with intellectual disabilities have been developed. Aim: To give an overview of the characteristics and psychometric properties of self-report and informant-report instruments measuring anxiety in this group. Method: Systematic review of the literature. Results:…

  5. Effects of biocontrol on short-term nutrient dynamics in a tamarix-invaded riparian ecosystem

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods Saltcedar (Tamarix ramosissima) invasion and subsequent dominance in biologically and functionally diverse riparian ecosystems across the western U.S. has led to the release of the leaf beetle (Diorhabda elongata) as a biological control agent, and has resulted in large-sca...

  6. An Experiment on Isomerism in Metal-Amino Acid Complexes.

    ERIC Educational Resources Information Center

    Harrison, R. Graeme; Nolan, Kevin B.

    1982-01-01

    Background information, laboratory procedures, and discussion of results are provided for syntheses of cobalt (III) complexes, I-III, illustrating three possible bonding modes of glycine to a metal ion (the complex cations II and III being linkage/geometric isomers). Includes spectrophotometric and potentiometric methods to distinguish among the…

  7. A Community Study of Association between Parenting Dimensions and Externalizing Behaviors

    ERIC Educational Resources Information Center

    Sharma, Vandana; Sandhu, Gurpreet K.

    2006-01-01

    Background: Association between parenting dimensions and externalizing behaviors in children was examined. Method: Data on children from the middle class families of Patiala (N = 240) were collected from schools and families. Parents completed questionnaires on parenting dimensions and externalizing behaviors of children. Results: Analysis of…

  8. Differential Outcomes in Agency-Based Mental Health Care between Minority and Majority Youth

    ERIC Educational Resources Information Center

    Patterson, David A.; Dulmus, Catherine N.; Maguin, Eugene; Perkins, Jacob

    2016-01-01

    Background: Childhood mental health problems represent a significant public health concern globally. There is a converging discussion among researchers and practitioners alike that the research results of effectiveness studies are not fully generalizable and applicable to ethnoracial minority groups in real-world practice settings. Methods:…

  9. Cosmic 21 cm delensing of microwave background polarization and the minimum detectable energy scale of inflation.

    PubMed

    Sigurdson, Kris; Cooray, Asantha

    2005-11-18

    We propose a new method for removing gravitational lensing from maps of cosmic microwave background (CMB) polarization anisotropies. Using observations of anisotropies or structures in the cosmic 21 cm radiation, emitted or absorbed by neutral hydrogen atoms at redshifts 10 to 200, the CMB can be delensed. We find this method could allow CMB experiments to have increased sensitivity to a background of inflationary gravitational waves (IGWs) compared to methods relying on the CMB alone and may constrain models of inflation which were heretofore considered to have undetectable IGW amplitudes.

  10. Identification of source velocities on 3D structures in non-anechoic environments: Theoretical background and experimental validation of the inverse patch transfer functions method

    NASA Astrophysics Data System (ADS)

    Aucejo, M.; Totaro, N.; Guyader, J.-L.

    2010-08-01

    In noise control, identification of the source velocity field remains a major open problem. Consequently, methods such as nearfield acoustical holography (NAH), principal source projection, the inverse frequency response function and hybrid NAH have been developed. However, these methods require free-field conditions that are often difficult to achieve in practice. This article presents an alternative method known as inverse patch transfer functions (iPTF), designed to identify source velocities and developed in the framework of the European SILENCE project. The method is based on the definition of a virtual cavity; the measurement of both the pressure and particle velocity fields on the aperture surfaces of this volume, divided into elementary areas called patches; and the inversion of impedance matrices numerically computed from a modal basis obtained by FEM. Theoretically, the method is applicable to sources with complex 3D geometries, and measurements can be carried out in a non-anechoic environment, even in the presence of other stationary sources outside the virtual cavity. In the present paper, the theoretical background of the iPTF method is described, and the results (numerical and experimental) for a source with simple geometry (two baffled pistons driven in antiphase) are presented and discussed.
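    The inversion step at the heart of such methods is often ill-conditioned, so a regularised least-squares solve is typical. A minimal sketch, assuming Tikhonov regularisation (the paper's actual regularisation choice is not stated here):

```python
import numpy as np

def tikhonov_solve(Z, p, lam=1e-3):
    """Regularised inversion of a (possibly ill-conditioned) impedance
    matrix: solve Z v ~= p for the patch velocities v via
    v = (Z^T Z + lam I)^-1 Z^T p."""
    n = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(n), Z.T @ p)
```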

  11. Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.

    2018-02-01

    The method called "PVI" (Partial Variance of Increments) has been increasingly used in analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper will summarize key features of the method and provide a synopsis of the main results obtained by various groups using the method. This will enable new users or those considering methods of this type to find details and background collected in one place.
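    The PVI statistic itself is simple to state: the magnitude of a field increment at a given lag, normalised by its root-mean-square value. A minimal sketch (the array shapes and averaging over the whole interval are assumptions):

```python
import numpy as np

def pvi(b, lag=1):
    """PVI time series for a vector field b of shape (n_samples, 3):
    PVI(t) = |db(t)| / sqrt(<|db|^2>), with db(t) = b(t+lag) - b(t)
    and <...> a time average.  Large PVI values flag coherent
    structures such as current sheets."""
    db = b[lag:] - b[:-lag]
    mag = np.linalg.norm(db, axis=1)
    return mag / np.sqrt(np.mean(mag ** 2))
```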

  12. Compressive sensing method for recognizing cat-eye effect targets.

    PubMed

    Li, Li; Li, Hui; Dang, Ersheng; Liu, Bo

    2013-10-01

    This paper proposes a cat-eye effect target recognition method with compressive sensing (CS) and presents a recognition method (sample processing before reconstruction based on compressed sensing, or SPCS) for image processing. In this method, the linear projections of original image sequences are applied to remove dynamic background distractions and extract cat-eye effect targets. Furthermore, the corresponding imaging mechanism for acquiring active and passive image sequences is put forward. This method uses fewer images to recognize cat-eye effect targets, reduces data storage, and translates the traditional target identification, based on original image processing, into measurement vectors processing. The experimental results show that the SPCS method is feasible and superior to the shape-frequency dual criteria method.
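    The acquisition side of compressed sensing can be sketched as a random linear projection of the image. Because the measurements are linear, removing a background frame by differencing commutes with measurement, which is the kind of property the measurement-vector processing above relies on. The names and the Gaussian measurement matrix are illustrative assumptions.

```python
import numpy as np

def cs_measure(image, n_measurements, seed=0):
    """Project the flattened image onto far fewer random linear
    measurements than pixels (y = Phi @ x); later processing can then
    operate on the measurement vector instead of the full image."""
    rng = np.random.default_rng(seed)
    x = image.ravel().astype(float)
    phi = rng.normal(size=(n_measurements, x.size)) / np.sqrt(n_measurements)
    return phi @ x, phi
```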

  13. Evaluation of a Method Using Three Genomic Guided Escherichia coli Markers for Phylogenetic Typing of E. coli Isolates of Various Genetic Backgrounds

    PubMed Central

    Hamamoto, Kouta; Ueda, Shuhei; Yamamoto, Yoshimasa

    2015-01-01

    Genotyping and characterization of bacterial isolates are essential steps in the identification and control of antibiotic-resistant bacterial infections. Recently, a novel genotyping method using three genomic guided Escherichia coli markers (GIG-EM), dinG, tonB, and dipeptide permease (DPP), was reported. Because GIG-EM has not been fully evaluated using clinical isolates, we assessed this typing method with 72 environmental reference strains from the E. coli Collection of Reference (ECOR) and 63 E. coli isolates of various genetic backgrounds. In this study, we designated 768 bp of dinG, 745 bp of tonB, and 655 bp of DPP target sequences for use in the typing method. Concatenations of the processed marker sequences were used to draw GIG-EM phylogenetic trees. E. coli isolates with identical sequence types as identified by the conventional multilocus sequence typing (MLST) method were localized to the same branch of the GIG-EM phylogenetic tree. Sixteen clinical E. coli isolates were utilized as test isolates without prior characterization by conventional MLST and phylogenetic grouping before GIG-EM typing. Of these, 14 clinical isolates were assigned to a branch including only isolates of a pandemic clone, E. coli B2-ST131-O25b, and these results were confirmed by conventional typing methods. Our results suggested that the GIG-EM typing method and its application to phylogenetic trees might be useful tools for the molecular characterization and determination of the genetic relationships among E. coli isolates. PMID:25809972
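    The concatenation step the record describes can be sketched directly. The dictionary layout and the use of a plain Hamming distance (rather than a full alignment and tree construction) are simplifications for illustration.

```python
def concat_markers(isolate):
    """Join the three GIG-EM marker sequences (dinG, tonB, DPP) into a
    single typing sequence for comparison between isolates."""
    return isolate["dinG"] + isolate["tonB"] + isolate["DPP"]

def hamming(a, b):
    """Count of mismatched positions between two equal-length
    concatenated sequences; closely related isolates differ little."""
    return sum(x != y for x, y in zip(a, b))
```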

  14. Development of wheelchair caster testing equipment and preliminary testing of caster models

    PubMed Central

    Mhatre, Anand; Ott, Joseph

    2017-01-01

    Background Because of the adverse environmental conditions present in less-resourced environments (LREs), the World Health Organization (WHO) has recommended that specialised wheelchair test methods may need to be developed to support product quality standards in these environments. A group of experts identified caster test methods as a high priority because of their common failure in LREs, and the insufficiency of existing test methods described in the International Organization for Standardization (ISO) Wheelchair Testing Standards (ISO 7176). Objectives To develop and demonstrate the feasibility of a caster system test method. Method Background literature and expert opinions were collected to identify existing caster test methods, caster failures common in LREs and environmental conditions present in LREs. Several conceptual designs for the caster testing method were developed, and through an iterative process using expert feedback, a final concept and a design were developed and a prototype was fabricated. Feasibility tests were conducted by testing a series of caster systems from wheelchairs used in LREs, and failure modes were recorded and compared to anecdotal reports about field failures. Results The new caster testing system was developed and it provides the flexibility to expose caster systems to typical conditions in LREs. Caster failures such as stem bolt fractures, fork fractures, bearing failures and tire cracking occurred during testing trials and are consistent with field failures. Conclusion The new caster test system has the capability to incorporate necessary test factors that degrade caster quality in LREs. Future work includes developing and validating a testing protocol that results in failure modes common during wheelchair use in LRE. PMID:29062762

  15. Evaluation of a Method Using Three Genomic Guided Escherichia coli Markers for Phylogenetic Typing of E. coli Isolates of Various Genetic Backgrounds.

    PubMed

    Hamamoto, Kouta; Ueda, Shuhei; Yamamoto, Yoshimasa; Hirai, Itaru

    2015-06-01

    Genotyping and characterization of bacterial isolates are essential steps in the identification and control of antibiotic-resistant bacterial infections. Recently, a novel genotyping method using three genomic guided Escherichia coli markers (GIG-EM), dinG, tonB, and dipeptide permease (DPP), was reported. Because GIG-EM has not been fully evaluated using clinical isolates, we assessed this typing method with 72 environmental E. coli reference strains from the E. coli collection of reference (ECOR) and 63 E. coli isolates of various genetic backgrounds. In this study, we designated 768 bp of dinG, 745 bp of tonB, and 655 bp of DPP target sequences for use in the typing method. Concatenations of the processed marker sequences were used to draw GIG-EM phylogenetic trees. E. coli isolates with identical sequence types as identified by the conventional multilocus sequence typing (MLST) method were localized to the same branch of the GIG-EM phylogenetic tree. Sixteen clinical E. coli isolates were utilized as test isolates without prior characterization by conventional MLST and phylogenetic grouping before GIG-EM typing. Of these, 14 clinical isolates were assigned to a branch including only isolates of a pandemic clone, E. coli B2-ST131-O25b, and these results were confirmed by conventional typing methods. Our results suggested that the GIG-EM typing method and its application to phylogenetic trees might be useful tools for the molecular characterization and determination of the genetic relationships among E. coli isolates. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  16. Modeling of proton-induced radioactivation background in hard X-ray telescopes: Geant4-based simulation and its demonstration by Hitomi's measurement in a low Earth orbit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odaka, Hirokazu; Asai, Makoto; Hagino, Kouichi

    Hard X-ray astronomical observatories in orbit suffer from a significant amount of background due to radioactivation induced by cosmic-ray protons and/or geomagnetically trapped protons. Within the framework of a full Monte Carlo simulation, we present modeling of in-orbit instrumental background which is dominated by radioactivation. To reduce the computation time required by straightforward simulations of delayed emissions from activated isotopes, we insert a semi-analytical calculation that converts production probabilities of radioactive isotopes by interaction of the primary protons into decay rates at measurement time of all secondary isotopes. Therefore, our simulation method is separated into three steps: (1) simulation of isotope production, (2) semi-analytical conversion to decay rates, and (3) simulation of decays of the isotopes at measurement time. This method is verified by a simple setup that has a CdTe semiconductor detector, and shows a 100-fold improvement in efficiency over the straightforward simulation. To demonstrate its experimental performance, the simulation framework was tested against data measured with a CdTe sensor in the Hard X-ray Imager onboard the Hitomi X-ray Astronomy Satellite, which was put into a low Earth orbit with an altitude of 570 km and an inclination of 31°, and thus experienced a large amount of irradiation from geomagnetically trapped protons during its passages through the South Atlantic Anomaly. The simulation is able to treat full histories of the proton irradiation and multiple measurement windows. As a result, the simulation results agree very well with the measured data, showing that the measured background is well described by the combination of proton-induced radioactivation of the CdTe detector itself and thick Bi4Ge3O12 scintillator shields, leakage of cosmic X-ray background and albedo gamma-ray radiation, and emissions from naturally contaminated isotopes in the detector system.
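
    The core of the semi-analytical step (2) is the conversion of production yields into decay rates at measurement time. A minimal sketch, assuming all isotopes are produced at a single irradiation time t = 0 (the isotope names, yields, and half-lives below are purely illustrative):

```python
import math

def decay_rate_at(n_produced, half_life_s, t_measure_s):
    """Decay rate (decays/s) at measurement time for n_produced nuclei
    created at t = 0: A(t) = lambda * N0 * exp(-lambda * t)."""
    lam = math.log(2.0) / half_life_s
    return lam * n_produced * math.exp(-lam * t_measure_s)

# Hypothetical yields from step (1); names and half-lives are illustrative.
produced = {"isotope_A": (1.0e6, 2.0 * 3600.0),     # (N0, T1/2 = 2 h)
            "isotope_B": (5.0e5, 40.0 * 86400.0)}   # (N0, T1/2 = 40 d)
t = 3600.0  # measure one hour after irradiation
rates = {iso: decay_rate_at(n0, t12, t) for iso, (n0, t12) in produced.items()}
```

    Step (3) would then simulate only the decays occurring at these rates inside each measurement window, instead of tracking every nucleus from production onward.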

  17. Modeling of proton-induced radioactivation background in hard X-ray telescopes: Geant4-based simulation and its demonstration by Hitomi's measurement in a low Earth orbit

    DOE PAGES

    Odaka, Hirokazu; Asai, Makoto; Hagino, Kouichi; ...

    2018-02-19

    Hard X-ray astronomical observatories in orbit suffer from a significant amount of background due to radioactivation induced by cosmic-ray protons and/or geomagnetically trapped protons. Within the framework of a full Monte Carlo simulation, we present modeling of in-orbit instrumental background which is dominated by radioactivation. To reduce the computation time required by straightforward simulations of delayed emissions from activated isotopes, we insert a semi-analytical calculation that converts production probabilities of radioactive isotopes by interaction of the primary protons into decay rates at measurement time of all secondary isotopes. Therefore, our simulation method is separated into three steps: (1) simulation of isotope production, (2) semi-analytical conversion to decay rates, and (3) simulation of decays of the isotopes at measurement time. This method is verified by a simple setup that has a CdTe semiconductor detector, and shows a 100-fold improvement in efficiency over the straightforward simulation. To demonstrate its experimental performance, the simulation framework was tested against data measured with a CdTe sensor in the Hard X-ray Imager onboard the Hitomi X-ray Astronomy Satellite, which was put into a low Earth orbit with an altitude of 570 km and an inclination of 31°, and thus experienced a large amount of irradiation from geomagnetically trapped protons during its passages through the South Atlantic Anomaly. The simulation is able to treat full histories of the proton irradiation and multiple measurement windows. As a result, the simulation results agree very well with the measured data, showing that the measured background is well described by the combination of proton-induced radioactivation of the CdTe detector itself and thick Bi4Ge3O12 scintillator shields, leakage of cosmic X-ray background and albedo gamma-ray radiation, and emissions from naturally contaminated isotopes in the detector system.

  18. Retrievals of Thick Cloud Optical Depth from the Geoscience Laser Altimeter System (GLAS) by Calibration of Solar Background Signal

    NASA Technical Reports Server (NTRS)

    Yang, Yuekui; Marshak, Alexander; Chiu, J. Christine; Wiscombe, Warren J.; Palm, Stephen P.; Davis, Anthony B.; Spangenberg, Douglas A.; Nguyen, Louis; Spinhirne, James D.; Minnis, Patrick

    2008-01-01

    Laser beams emitted from the Geoscience Laser Altimeter System (GLAS), as well as other space-borne laser instruments, can only penetrate clouds to a limit of a few optical depths. As a result, only optical depths of thinner clouds (< about 3 for GLAS) are retrieved from the reflected lidar signal. This paper presents a comprehensive study of possible retrievals of optical depth of thick clouds using solar background light and treating GLAS as a solar radiometer. To do so we first calibrate the reflected solar radiation received by the photon-counting detectors of GLAS' 532 nm channel, which is the primary channel for atmospheric products. The solar background radiation is regarded as noise to be subtracted in the retrieval process of the lidar products. However, once calibrated, it becomes a signal that can be used in studying the properties of optically thick clouds. In this paper, three calibration methods are presented: (1) calibration with coincident airborne and GLAS observations; (2) calibration with coincident Geostationary Operational Environmental Satellite (GOES) and GLAS observations of deep convective clouds; (3) calibration from first principles using optical depth of thin water clouds over ocean retrieved by GLAS active remote sensing. Results from the three methods agree well with each other. Cloud optical depth (COD) is retrieved from the calibrated solar background signal using a one-channel retrieval. Comparison with COD retrieved from GOES during GLAS overpasses shows that the average difference between the two retrievals is 24%. As an example, the COD values retrieved from GLAS solar background are illustrated for a marine stratocumulus cloud field that is too thick to be penetrated by the GLAS laser. Based on this study, optical depths for thick clouds will be provided as a supplementary product to the existing operational GLAS cloud products in future GLAS data releases.
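
    Calibration against a coincident sensor, as in methods (1) and (2), amounts to fitting a linear relationship between background counts and a reference radiance. A sketch with entirely hypothetical numbers (the pairs below are invented for illustration, not GLAS data):

```python
import numpy as np

# Entirely hypothetical coincident pairs: 532 nm solar background counts
# versus a reference radiance from a coincident sensor.
counts = np.array([120.0, 240.0, 410.0, 555.0])
radiance = np.array([10.0, 21.0, 36.5, 49.0])

gain, offset = np.polyfit(counts, radiance, 1)  # linear calibration fit
calibrated = gain * counts + offset             # counts -> radiance
```

    Once the gain and offset are known, every background count measurement becomes a radiometer reading usable in a one-channel COD retrieval.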

  19. The Background to Current Theories of Scuffing

    DTIC Science & Technology

    1973-01-01

    attention because, by neglecting axial flow, it can be treated in two dimensions. This has resulted in a fairly complete theoretical analysis... the contact. This method was essentially one of measuring the volume rate of flow through the contact, which was directly related to the pad... exit constriction. The pressure and temperature were also measured in the axial direction (105) and the results indicated that side leakage was

  20. Ground-to-air flow visualization using Solar Calcium-K line Background-Oriented Schlieren

    NASA Astrophysics Data System (ADS)

    Hill, Michael A.; Haering, Edward A.

    2017-01-01

    The Calcium-K Eclipse Background-Oriented Schlieren experiment was performed as a proof of concept test to evaluate the effectiveness of using the solar disk as a background to perform the Background-Oriented Schlieren (BOS) method of flow visualization. A ground-based imaging system was equipped with a Calcium-K line optical etalon filter to enable the use of the chromosphere of the sun as the irregular background to be used for BOS. A US Air Force T-38 aircraft performed three supersonic runs which eclipsed the sun as viewed from the imaging system. The images were successfully post-processed using optical flow methods to qualitatively reveal the density gradients in the flow around the aircraft.

  1. genRE: A Method to Extend Gridded Precipitation Climatology Data Sets in Near Real-Time for Hydrological Forecasting Purposes

    NASA Astrophysics Data System (ADS)

    van Osnabrugge, B.; Weerts, A. H.; Uijlenhoet, R.

    2017-11-01

    To enable operational flood forecasting and drought monitoring, reliable and consistent methods for precipitation interpolation are needed. Such methods need to deal with the deficiencies of sparse operational real-time data compared to quality-controlled offline data sources used in historical analyses. In particular, often only a fraction of the measurement network reports in near real-time. For this purpose, we present an interpolation method, generalized REGNIE (genRE), which makes use of climatological monthly background grids derived from existing gridded precipitation climatology data sets. We show how genRE can be used to mimic and extend climatological precipitation data sets in near real-time using (sparse) real-time measurement networks in the Rhine basin upstream of the Netherlands (approximately 160,000 km2). In the process, we create a 1.2 × 1.2 km transnational gridded hourly precipitation data set for the Rhine basin. Precipitation gauge data are collected, spatially interpolated for the period 1996-2015 with genRE and inverse-distance squared weighting (IDW), and then evaluated on the yearly and daily time scale against the HYRAS and EOBS climatological data sets. Hourly fields are compared qualitatively with RADOLAN radar-based precipitation estimates. Two sources of uncertainty are evaluated: station density and the impact of different background grids (HYRAS versus EOBS). The results show that the genRE method successfully mimics climatological precipitation data sets (HYRAS/EOBS) over daily, monthly, and yearly time frames. We conclude that genRE is a good interpolation method of choice for real-time operational use. genRE has the largest added value over IDW for cases with a low real-time station density and a high-resolution background grid.
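
    The core idea, interpolating station-to-background ratios and rescaling by the climatological grid, can be sketched as below. This is a much simplified stand-in for genRE (the function names and toy coordinates are this sketch's own, and the inverse-distance-squared weighting matches the IDW baseline mentioned above):

```python
import numpy as np

def idw(xy_sta, vals, xy_grid, power=2.0):
    """Inverse-distance(-squared) weighting of station values onto grid points."""
    d2 = ((xy_grid[:, None, :] - xy_sta[None, :, :]) ** 2).sum(-1)
    w = 1.0 / np.maximum(d2, 1e-12) ** (power / 2.0)
    return (w * vals).sum(axis=1) / w.sum(axis=1)

def genre_like(obs, bg_sta, bg_grid, xy_sta, xy_grid):
    """Interpolate station/background ratios with IDW, then rescale by the
    climatological background grid (the genRE idea, much simplified)."""
    ratio = obs / np.maximum(bg_sta, 1e-6)
    return idw(xy_sta, ratio, xy_grid) * bg_grid

# Two stations each measuring twice their climatological value: every grid
# cell then receives twice its own background value.
xy_sta = np.array([[0.0, 0.0], [1.0, 0.0]])
obs = np.array([2.0, 4.0])
bg_sta = np.array([1.0, 2.0])
xy_grid = np.array([[0.5, 0.0]])
bg_grid = np.array([3.0])
print(genre_like(obs, bg_sta, bg_grid, xy_sta, xy_grid))  # [6.]
```

    The background grid carries the climatological spatial pattern, which is why the method retains skill even when only a sparse real-time network reports.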

  2. Attitudes and Perceptions of Patients, Caregivers, and Health Care Providers toward Background Music in Patient Care Areas: An Exploratory Study

    PubMed Central

    Perez-Cruz, Pedro; Nguyen, Linh; Rhondali, Wadih; Hui, David; Palmer, J. Lynn; Sevy, Ingrid; Richardson, Michael

    2012-01-01

    Background Background music can be used to distract from ordinary sounds and improve wellbeing in patient care areas. Little is known about individuals' attitudes and beliefs about music versus ordinary sound in this setting. Objectives To assess the preferences of patients, caregivers and healthcare providers regarding background music or ordinary sound in outpatient and inpatient care areas, and to explore their attitudes and perceptions towards music in general. Methods All participants were exposed to background music in outpatient or inpatient clinical settings. 99 consecutive patients, 101 caregivers and 65 of 70 eligible healthcare providers (93%) completed a survey about music attitudes and preferences. The primary outcome was a preference for background music over ordinary sound in patient care areas. Results Preference for background music was high and similar across groups (70 patients (71%), 71 caregivers (71%) and 46 providers (71%), p=0.58). The three groups had very low disapproval of background music in patient care areas (10%, 9% and 12%, respectively; p=0.91). Black ethnicity independently predicted lower preference for background music (OR: 0.47, 95%CI: 0.23, 0.98). Patients, caregivers and providers reported recent use of music for themselves for the purpose of enjoyment (69%, 80% and 86%, respectively; p=0.02). Age, gender, religion and education level significantly predicted preferences for specific music styles. Conclusion Background music in patient care areas was preferred to ordinary sound by patients, caregivers and providers. Demographics of the population are strong determinants of music style preferences. PMID:22957677

  3. Migration background is associated with caries in Viennese school children, even if parents have received a higher education

    PubMed Central

    2014-01-01

    Background A low level of education and the migration background of parents are associated with the development of caries in children. The aim of this study was to evaluate whether a higher educational level of parents can overcome risks for the development of caries in immigrants in Vienna, Austria. Methods The educational level of the parents, the school type, and the caries status of 736 randomly selected twelve-year-old children with and without migration background were determined in this cross-sectional study. In children attending school in Vienna, the decayed, missing, and filled teeth (DMFT) index was determined. For statistical analysis, a mixed negative-binomial model was used. Results The caries status of the children with migration background was significantly worse than that of the native Viennese population. A significant interaction was found between migration background and the educational level of the parents (p = 0.045). No interaction was found between the school type and either the migration background (p = 0.220) or the education level of the parents (p = 0.08). In parents with a higher education level, migration background (p < 0.01) and school type (p = 0.018) showed an association with DMFT values. In parents with a low education level, however, migration background and school type had no significant association with DMFT values. Conclusion These data indicate that children with a migration background are at higher risk of acquiring caries than other Viennese children, even when the parents have received a higher education. PMID:24886105

  4. Fingerprint Liveness Detection in the Presence of Capable Intruders.

    PubMed

    Sequeira, Ana F; Cardoso, Jaime S

    2015-06-19

    Fingerprint liveness detection methods have been developed as an attempt to overcome the vulnerability of fingerprint biometric systems to spoofing attacks. Traditional approaches have been quite optimistic about the behavior of the intruder, assuming the use of a previously known material. This assumption has led to the use of supervised techniques to estimate the performance of the methods, using both live and spoof samples to train the predictive models and evaluating each type of fake sample individually. Additionally, the background was often included in the sample representation, completely distorting the decision process. Therefore, we propose that an automatic segmentation step should be performed to isolate the fingerprint from the background and truly decide on the liveness of the fingerprint, not on the characteristics of the background. Also, we argue that one cannot aim to model the fake samples completely, since the material used by the intruder is unknown beforehand. We approach the design by modeling the distribution of the live samples and predicting as fake the samples that are very unlikely according to that model. Our experiments compare the performance of the supervised approaches with semi-supervised ones that rely solely on the live samples. The results obtained differ from those obtained by the more standard approaches, which reinforces our conviction that the results in the literature misleadingly estimate the true vulnerability of the biometric system.
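
    The live-only modeling idea can be sketched with a Gaussian density and a Mahalanobis-distance cutoff; this is a generic stand-in for whatever density model the authors actually use, and the 2-D feature vectors are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-D feature vectors extracted from segmented live fingerprints.
live = rng.normal(0.0, 1.0, size=(500, 2))

mu = live.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(live.T))

def is_fake(x, thresh=9.21):  # chi-square(2) 99% quantile
    """Flag a sample as fake if it is very unlikely under the live-only model."""
    d = x - mu
    return float(d @ inv_cov @ d) > thresh

print(is_fake(np.array([0.1, -0.2])), is_fake(np.array([6.0, 6.0])))
```

    No spoof samples appear anywhere in the training step, which is exactly what makes the scheme robust to spoofing materials never seen before.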

  5. Tools for Implementing Science Practice in a Large Introductory Class

    NASA Astrophysics Data System (ADS)

    Prothero, W. A.

    2008-12-01

    Scientists must have in-depth background knowledge of their subject area and know where current knowledge can be advanced. They perform experiments that gather data to test new or existing theories, present their findings at meetings, publish their results, critically review the results of others, and respond to the reviews of their own work. In the context of a course, these activities correspond to learning the background material by listening to lectures or reading a text, formulating a problem, exploring data using student-friendly data access and plotting software, giving brief talks to classmates in a small class or lab setting, writing a science paper or lab report, reviewing the writing of their peers, and receiving feedback (and grades) from their instructors and/or peers. These activities can be supported using course management software and online resources. The "LearningWithData" software system supports exploration and plotting of solid Earth data (with a focus on plate tectonics). Ocean data access, display, and plotting are also supported. Background material is delivered using animations and slide-show-type displays. Students are accountable for their learning through included homework assignments. Lab and small group activities provide support for data exploration and interpretation. Writing is most efficiently implemented using the "Calibrated Peer Review" method. This methodology is available at http://cpr.molsci.ucla.edu/. These methods have been successfully implemented in a large oceanography class at UCSB.

  6. Fingerprint Liveness Detection in the Presence of Capable Intruders

    PubMed Central

    Sequeira, Ana F.; Cardoso, Jaime S.

    2015-01-01

    Fingerprint liveness detection methods have been developed as an attempt to overcome the vulnerability of fingerprint biometric systems to spoofing attacks. Traditional approaches have been quite optimistic about the behavior of the intruder, assuming the use of a previously known material. This assumption has led to the use of supervised techniques to estimate the performance of the methods, using both live and spoof samples to train the predictive models and evaluating each type of fake sample individually. Additionally, the background was often included in the sample representation, completely distorting the decision process. Therefore, we propose that an automatic segmentation step should be performed to isolate the fingerprint from the background and truly decide on the liveness of the fingerprint, not on the characteristics of the background. Also, we argue that one cannot aim to model the fake samples completely, since the material used by the intruder is unknown beforehand. We approach the design by modeling the distribution of the live samples and predicting as fake the samples that are very unlikely according to that model. Our experiments compare the performance of the supervised approaches with semi-supervised ones that rely solely on the live samples. The results obtained differ from those obtained by the more standard approaches, which reinforces our conviction that the results in the literature misleadingly estimate the true vulnerability of the biometric system. PMID:26102491

  7. A noninvasive method for measuring the velocity of diffuse hydrothermal flow by tracking moving refractive index anomalies

    NASA Astrophysics Data System (ADS)

    Mittelstaedt, Eric; Davaille, Anne; van Keken, Peter E.; Gracias, Nuno; Escartin, Javier

    2010-10-01

    Diffuse flow velocimetry (DFV) is introduced as a new, noninvasive, optical technique for measuring the velocity of diffuse hydrothermal flow. The technique uses images of a motionless, random medium (e.g., rocks) obtained through the lens of a moving refraction index anomaly (e.g., a hot upwelling). The method works in two stages. First, the changes in apparent background deformation are calculated using particle image velocimetry (PIV). The deformation vectors are determined by a cross correlation of pixel intensities across consecutive images. Second, the 2-D velocity field is calculated by cross correlating the deformation vectors between consecutive PIV calculations. The accuracy of the method is tested with laboratory and numerical experiments of a laminar, axisymmetric plume in fluids with both constant and temperature-dependent viscosity. Results show that average RMS errors are ˜5%-7% and are most accurate in regions of pervasive apparent background deformation which is commonly encountered in regions of diffuse hydrothermal flow. The method is applied to a 25 s video sequence of diffuse flow from a small fracture captured during the Bathyluck'09 cruise to the Lucky Strike hydrothermal field (September 2009). The velocities of the ˜10°C-15°C effluent reach ˜5.5 cm/s, in strong agreement with previous measurements of diffuse flow. DFV is found to be most accurate for approximately 2-D flows where background objects have a small spatial scale, such as sand or gravel.
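
    The matching step in both stages boils down to finding the displacement that maximizes a cross-correlation of intensities. A 1-D toy version of that search (the signal and window sizes are this sketch's own, not parameters from the paper):

```python
import numpy as np

def best_shift(ref, cur, max_shift):
    """Return the integer shift maximizing the normalized cross-correlation
    of two 1-D intensity profiles (the core of a PIV-style matching step)."""
    a = ref[max_shift:len(ref) - max_shift]
    a = a - a.mean()
    best, score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        b = cur[max_shift + s : len(cur) - max_shift + s]
        b = b - b.mean()
        c = float(a @ b / np.sqrt((a @ a) * (b @ b)))
        if c > score:
            best, score = s, c
    return best

sig = np.sin(np.linspace(0.0, 6.0, 200))
print(best_shift(sig, np.roll(sig, 3), max_shift=5))  # 3
```

    In DFV this search runs twice: once on pixel intensities to get apparent background deformation, and once on the deformation vectors themselves to get the flow velocity.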

  8. MADGiC: a model-based approach for identifying driver genes in cancer

    PubMed Central

    Korthauer, Keegan D.; Kendziorski, Christina

    2015-01-01

    Motivation: Identifying and prioritizing somatic mutations is an important and challenging area of cancer research that can provide new insights into gene function as well as new targets for drug development. Most methods for prioritizing mutations rely primarily on frequency-based criteria, where a gene is identified as having a driver mutation if it is altered in significantly more samples than expected according to a background model. Although useful, frequency-based methods are limited in that all mutations are treated equally. It is well known, however, that some mutations have no functional consequence, while others may have a major deleterious impact. The spatial pattern of mutations within a gene provides further insight into their functional consequence. Properly accounting for these factors improves both the power and accuracy of inference. Also important is an accurate background model. Results: Here, we develop a Model-based Approach for identifying Driver Genes in Cancer (termed MADGiC) that incorporates both frequency and functional impact criteria and accommodates a number of factors to improve the background model. Simulation studies demonstrate advantages of the approach, including a substantial increase in power over competing methods. Further advantages are illustrated in an analysis of ovarian and lung cancer data from The Cancer Genome Atlas (TCGA) project. Availability and implementation: R code to implement this method is available at http://www.biostat.wisc.edu/ kendzior/MADGiC/. Contact: kendzior@biostat.wisc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25573922
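
    The frequency-based criterion that MADGiC builds on can be sketched as a binomial tail test of a gene's mutation count against a background rate; the flat rate and the counts below are hypothetical stand-ins for MADGiC's much richer background model:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): is a gene mutated in more samples
    than the background model predicts?"""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# Hypothetical gene mutated in 12 of 200 tumors against a 1% background rate.
p_value = binom_sf(12, 200, 0.01)
print(p_value < 1e-5)  # True: far more mutations than background predicts
```

    MADGiC improves on exactly this kind of test by additionally weighting each mutation's predicted functional impact and spatial position within the gene.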

  9. A novel visual saliency detection method for infrared video sequences

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Zhang, Yuzhen; Ning, Chen

    2017-12-01

    Infrared video applications such as target detection and recognition, moving target tracking, and so forth can benefit greatly from visual saliency detection, which is essentially a method to automatically localize the "important" content in videos. In this paper, a novel visual saliency detection method for infrared video sequences is proposed. Specifically, for infrared video saliency detection, both spatial saliency and temporal saliency are considered. For spatial saliency, we adopt a mutual consistency-guided spatial cues combination-based method to capture the regions with obvious luminance contrast and contour features. For temporal saliency, a multi-frame symmetric difference approach is proposed to discriminate salient moving regions of interest from background motions. Then, the spatial saliency and temporal saliency are combined to compute the spatiotemporal saliency using an adaptive fusion strategy. Besides, to highlight the spatiotemporal salient regions uniformly, a multi-scale fusion approach is embedded into the spatiotemporal saliency model. Finally, a Gestalt theory-inspired optimization algorithm is designed to further improve the reliability of the final saliency map. Experimental results demonstrate that our method outperforms many state-of-the-art saliency detection approaches for infrared videos under various backgrounds.
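
    A minimal form of the symmetric-difference idea for temporal saliency is sketched below; the three-frame window and toy image are assumptions for illustration, not the paper's exact formulation:

```python
import numpy as np

def symmetric_difference(frames, k):
    """Temporal saliency of the middle frame: pixelwise minimum of its
    absolute differences with the frames k steps before and after, which
    suppresses changes that persist on only one side (background motion)."""
    prev, cur, nxt = frames[0], frames[k], frames[2 * k]
    return np.minimum(np.abs(cur - prev), np.abs(cur - nxt))

frames = np.zeros((3, 8, 8))
frames[1, 4, 4] = 1.0  # object present only in the middle frame
sal = symmetric_difference(frames, k=1)
print(sal[4, 4], sal[0, 0])  # 1.0 0.0
```

    The adaptive fusion step would then combine such a temporal map with the spatial saliency map, pixel by pixel.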

  10. A new method for detecting small and dim targets in starry background

    NASA Astrophysics Data System (ADS)

    Yao, Rui; Zhang, Yanning; Jiang, Lei

    2011-08-01

    Detection of small visible optical space targets is one of the key issues in research on long-range early warning and space debris surveillance. The SNR (signal-to-noise ratio) of the target is very low because of the influence of the imaging device. Random noise and background movement also increase the difficulty of target detection. In order to detect small visible optical space targets effectively and rapidly, we propose a novel detection method based on statistical theory. First, we establish a reasonable statistical model of the visible optical space image. Second, we extract SIFT (scale-invariant feature transform) features of the image frames, calculate the transform relationship, and use it to compensate for the motion of the whole visual field. Third, the influence of stars is removed using an interframe difference method, and a segmentation threshold separating candidate targets from noise is found with the OTSU method. Finally, we compute a statistic at every pixel position to judge whether the target is present. Theoretical analysis gives the relationship between false alarm probability and detection probability at different SNRs. Experimental results show that the method can detect targets efficiently, even when a target passes in front of stars.
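
    The OTSU thresholding step mentioned above chooses the gray level that maximizes the between-class variance of the histogram. A self-contained sketch on a synthetic dim-target image (the image itself is illustrative):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: choose the gray level maximizing between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (np.arange(t) * p[:t]).sum() / w0            # background mean
        m1 = (np.arange(t, 256) * p[t:]).sum() / w1       # foreground mean
        var = w0 * w1 * (m0 - m1) ** 2                    # between-class variance
        if var > best_var:
            best_t, best_var = t, var
    return best_t

img = np.full((16, 16), 30, dtype=np.uint8)
img[6:10, 6:10] = 200          # dim target on a dark background
t = otsu_threshold(img)
print(30 < t <= 200)           # True: threshold separates the two classes
```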

  11. A method of measuring gold nanoparticle concentrations by x-ray fluorescence for biomedical applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu Di; Li Yuhua; Wong, Molly D.

    Purpose: This paper reports a technique that enables the quantitative determination of the concentration of gold nanoparticles (GNPs) through the accurate detection of their fluorescence radiation in the diagnostic x-ray spectrum. Methods: Experimentally, x-ray fluorescence spectra of 1.9 and 15 nm GNP solutions are measured using an x-ray spectrometer, individually and within chicken breast tissue samples. An optimal combination of excitation and emission filters is determined to segregate the fluorescence spectra at 66.99 and 68.80 keV from the background scattering. A roadmap method is developed that subtracts the scattered radiation (acquired before the insertion of GNP solutions) from the signal radiation acquired after the GNP solutions are inserted. Results: The methods effectively minimize the background scattering in the spectrum measurements, showing linear relationships between GNP solutions from 0.1% to 10% weight concentration and from 0.1% to 1.0% weight concentration inside a chicken breast tissue sample. Conclusions: The investigation demonstrated the potential of imaging gold nanoparticles quantitatively in vivo for in-tissue studies, but future studies will be needed to investigate the ability to apply this method to clinical applications.
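
    The roadmap subtraction is a channel-by-channel difference of two spectra. A sketch on synthetic data: the scatter continuum and the Gaussian fluorescence line below are invented for illustration, with only the 68.80 keV line position taken from the abstract:

```python
import numpy as np

def roadmap_subtract(signal_spec, background_spec):
    """Subtract the pre-insertion scatter spectrum from the spectrum acquired
    with the GNP solution in place, clipping negatives to zero."""
    return np.clip(signal_spec - background_spec, 0.0, None)

energy = np.linspace(60.0, 75.0, 151)                    # keV bins (illustrative)
scatter = 100.0 * np.exp(-(energy - 60.0) / 10.0)        # assumed scatter continuum
fluor = 40.0 * np.exp(-((energy - 68.8) ** 2) / 0.05)    # Au line at 68.80 keV
net = roadmap_subtract(scatter + fluor, scatter)
print(abs(energy[np.argmax(net)] - 68.8) < 0.1)          # peak survives subtraction
```

    The area under the recovered line is what scales linearly with GNP weight concentration in the reported results.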

  12. Dissipation function and adaptive gradient reconstruction based smoke detection in video

    NASA Astrophysics Data System (ADS)

    Li, Bin; Zhang, Qiang; Shi, Chunlei

    2017-11-01

    A method for smoke detection in video is proposed. The camera monitoring the scene is assumed to be stationary. In the atmospheric scattering model, the dissipation function describes the transmissivity between the background objects in the scene and the camera. The dark channel prior and a fast bilateral filter are used to estimate the dissipation function, which depends only on the depth of field. Based on the dissipation function, the visual background extractor (ViBe) can detect smoke, owing to its motion characteristics, as well as other moving targets. Since smoke has semi-transparent parts, the regions covered by these parts can be recovered adaptively via the Poisson equation. The similarity between each recovered part and the original background part at the same position is calculated by normalized cross correlation (NCC), with the original background value taken from the frame nearest to the current frame. Parts with high similarity are considered smoke.
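
    The NCC comparison between a recovered region and the stored background can be sketched as follows; the patches are synthetic and illustrative:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross correlation of two equal-size image patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(2)
background = rng.random((8, 8))         # stored background patch
recovered_same = background.copy()      # recovery matches background well
recovered_other = rng.random((8, 8))    # recovery does not match
print(ncc(background, recovered_same), ncc(background, recovered_other) < 0.9)
```

    A recovered semi-transparent region whose NCC with the stored background is high is accepted as smoke, since the underlying scene was successfully reconstructed through it.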

  13. Paper-polymer composite devices with minimal fluorescence background.

    PubMed

    Wang, Chang-Ming; Chen, Chong-You; Liao, Wei-Ssu

    2017-04-22

    Paper-based devices incorporating polymer films offer advantages in simplicity and rugged backing. However, their applications are restricted by the high fluorescence background interference of conventional laminating pouches. Herein, we report a straightforward approach for fabricating devices with minimal fluorescence background, in which filter paper is shaped and laminated between two composite films of biaxially oriented polypropylene (OPP) and polyvinyl butyral (PVB). This composite film provides mechanical strength for enhanced device durability, protects against environmental contamination, and prevents reagent degradation. The approach was tested by the determination of copper ions with a fluorescent probe, while the detection of glucose was used to illustrate the improved device durability. Our results show that lamination with the polymer composite lengthens device lifetime while allowing fluorescence detection methods to be combined with a greatly reduced fluorescent background relative to that widely present in commercially available lamination pouches. By combining rapid device prototyping with low-cost materials, we believe this composite design will further expand the potential of paper-based devices. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Infrared image segmentation method based on spatial coherence histogram and maximum entropy

    NASA Astrophysics Data System (ADS)

    Liu, Songtao; Shen, Tongsheng; Dai, Yao

    2014-11-01

    In order to segment the target well and suppress background noise effectively, an infrared image segmentation method based on a spatial coherence histogram and maximum entropy is proposed. First, the spatial coherence histogram is constructed by weighting the importance of the different positions of pixels with the same gray level, which is obtained by computing their local density. Then, after enhancing the image with the spatial coherence histogram, the 1D maximum entropy method is used to segment the image. The novel method not only obtains better segmentation results but also has a faster computation time than traditional 2D histogram-based segmentation methods.
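
    The 1D maximum entropy method (Kapur's thresholding) selects the gray level that maximizes the summed entropies of the foreground and background distributions. A minimal sketch over a plain histogram (the paper applies it to the spatially enhanced image; the function name is illustrative):

```python
import numpy as np

def max_entropy_threshold(hist):
    """Kapur's 1D maximum-entropy threshold over a gray-level histogram.
    Returns the gray level t that maximizes H(background) + H(foreground)."""
    p = np.asarray(hist, float)
    p = p / p.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p)):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue  # degenerate split: one class is empty
        p0, p1 = p[:t] / w0, p[t:] / w1
        h = (-(p0[p0 > 0] * np.log(p0[p0 > 0])).sum()
             - (p1[p1 > 0] * np.log(p1[p1 > 0])).sum())
        if h > best_h:
            best_h, best_t = h, t
    return best_t
```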

  15. Accurate quantification of fluorescent targets within turbid media based on a decoupled fluorescence Monte Carlo model.

    PubMed

    Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming

    2015-07-01

    We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.

  16. The EPIC-MOS Particle-Induced Background Spectrum

    NASA Technical Reports Server (NTRS)

    Kuntz, K. D.; Snowden, S. L.

    2006-01-01

    We have developed a method for constructing a spectrum of the particle-induced instrumental background of the XMM-Newton EPIC MOS detectors that can be used for observations of the diffuse background and extended sources that fill a significant fraction of the instrument field of view. The strength and spectrum of the particle-induced background, that is, the background due to the interaction of particles with the detector and the detector surroundings, is temporally variable as well as spatially variable over individual chips. Our method uses a combination of the filter-wheel-closed data and a database of unexposed-region data to construct a spectrum of the "quiescent" background. We show that, using this method of background subtraction, the differences between independent observations of the same region of "blank sky" are consistent with the statistical uncertainties except when there is clear evidence of solar wind charge exchange emission. We use the blank sky observations to show that contamination by SWCX emission is a strong function of the solar wind proton flux, and that observations through the flanks of the magnetosheath appear to be contaminated only at much higher solar wind fluxes. We have also developed a spectral model of the residual soft proton flares, which allows their effects to be removed to a substantial degree during spectral fitting.

  17. Real-time airborne gamma-ray background estimation using NASVD with MLE and radiation transport for calibration

    NASA Astrophysics Data System (ADS)

    Kulisek, J. A.; Schweppe, J. E.; Stave, S. C.; Bernacki, B. E.; Jordan, D. V.; Stewart, T. N.; Seifert, C. E.; Kernan, W. J.

    2015-06-01

    Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this challenge, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements without the need for human analyst intervention. The method can be calibrated using radiation transport simulations along with data from previous flights over areas for which the isotopic composition need not be known. Over the examined measured and simulated data sets, the method generated accurate background estimates even in the presence of a strong 60Co source. The potential to track large and abrupt changes in background spectral shape and magnitude was demonstrated. The method can be implemented fairly easily in most modern computing languages and environments.
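
    NASVD (noise-adjusted singular value decomposition), named in the title, scales spectra so that Poisson counting noise is roughly uniform, truncates the SVD to the leading components, and rescales. A minimal sketch under those assumptions, not the paper's calibrated MLE pipeline (the function name is illustrative):

```python
import numpy as np

def nasvd_smooth(spectra, n_components=3):
    """Noise-adjusted SVD smoothing of gamma-ray spectra (rows = spectra).
    Channels are scaled by sqrt(mean counts) to equalize Poisson noise,
    the leading singular components are kept, and the result is rescaled."""
    X = np.asarray(spectra, float)
    scale = np.sqrt(np.maximum(X.mean(axis=0), 1e-12))  # per-channel noise scale
    U, s, Vt = np.linalg.svd(X / scale, full_matrices=False)
    s[n_components:] = 0.0  # discard noise-dominated components
    return (U * s) @ Vt * scale
```

    Keeping only a few components suppresses channel-to-channel statistical noise while preserving the dominant spectral shapes, which is what makes real-time background estimation from short acquisitions tractable.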

  18. Effects of Systematic Cue Exposure Through Virtual Reality on Cigarette Craving

    PubMed Central

    Pericot-Valverde, Irene; Secades-Villa, Roberto; Gutiérrez-Maldonado, José

    2014-01-01

    Introduction: Cigarette cravings have been associated with less successful attempts to quit smoking and a greater likelihood of relapse after smoking cessation. Background craving refers to a relatively steady and continuous experience of craving, while cue-induced craving refers to phases of intense craving triggered by cues associated with smoking. Cue exposure treatment (CET) involves repeated exposure to stimuli associated with substance use in order to reduce craving responses. However, mixed results have been found regarding the effect of CET on both types of craving. The aim of this study was to assess the effect of systematic virtual reality cue exposure treatment (VR-CET) on background and cue-induced cravings. Methods: Participants were 48 treatment-seeking smokers. The VR-CET consisted of prolonged exposure sessions to several interactive virtual environments. The VR-CET was applied once a week over 5 weeks. An individualized hierarchy of exposure was drawn up for each patient starting from the easiest virtual environment. Background and cue-induced cravings were recorded in each session. Results: Cue-induced craving decreased over each session as a result of prolonged exposure. VR-CET also reduced cue-induced and background cravings across the 5 sessions, showing a cumulative effect across the exposure sessions. Conclusions: Our results demonstrate the utility of VR-CET in reducing both types of cigarette craving. A combination of CET through VR with psychological treatments may improve current treatments for smoking cessation. PMID:24962558

  19. One-dimensional backreacting holographic superconductors with exponential nonlinear electrodynamics

    NASA Astrophysics Data System (ADS)

    Ghotbabadi, B. Binaei; Zangeneh, M. Kord; Sheykhi, A.

    2018-05-01

    In this paper, we investigate the effects of exponential nonlinear electrodynamics as well as backreaction on the properties of one-dimensional s-wave holographic superconductors. We carry out our study both analytically and numerically. In the analytical study we employ the Sturm-Liouville method, while in the numerical approach we use the shooting method. We obtain a relation between the critical temperature and the chemical potential analytically. Our results show good agreement between the analytical and numerical methods. We observe that increasing the strength of either the nonlinearity or the backreaction parameter makes the formation of condensation in the black hole background harder and lowers the critical temperature. These results are consistent with those obtained for two-dimensional s-wave holographic superconductors.

  20. Cat-eye effect target recognition with single-pixel detectors

    NASA Astrophysics Data System (ADS)

    Jian, Weijian; Li, Li; Zhang, Xiaoyue

    2015-12-01

    A prototype of cat-eye effect target recognition with single-pixel detectors is proposed. Within the framework of compressive sensing, it is possible to recognize cat-eye effect targets by projecting a series of known random patterns and measuring the backscattered light with three single-pixel detectors in different locations. The prototype requires only simpler, less expensive detectors and extends well beyond the visible spectrum. Simulations were performed to evaluate the feasibility of the proposed prototype. We compared our results to those obtained from conventional cat-eye effect target recognition methods using an area array sensor. The experimental results show that this method is feasible and superior to the conventional method in dynamic and complicated backgrounds.

  1. Standardization and validation of a cytometric bead assay to assess antibodies to multiple Plasmodium falciparum recombinant antigens

    PubMed Central

    2012-01-01

    Background Multiplex cytometric bead assays (CBA) have a number of advantages over ELISA for antibody testing, but little information is available on the standardization and validation of antibody CBAs to multiple Plasmodium falciparum antigens. The present study set out to determine optimal parameters for multiplex testing of antibodies to P. falciparum antigens, and to compare results of multiplex CBA to ELISA. Methods Antibodies to ten recombinant P. falciparum antigens were measured by CBA and ELISA in samples from 30 individuals from a malaria-endemic area of Kenya and compared to known positive and negative control plasma samples. Optimal antigen amounts, monoplex vs. multiplex testing, plasma dilution, optimal buffer, and the number of beads required were assessed for CBA testing, and results from CBA and ELISA testing were compared. Results Optimal antigen amounts for CBA antibody testing differed according to antigen. Results for monoplex CBA testing correlated strongly with multiplex testing for all antigens (r = 0.88-0.99, P values from <0.0001 to 0.004), and antibodies to variants of the same antigen were accurately distinguished within a multiplex reaction. Plasma dilutions of 1:100 or 1:200 were optimal for all antigens in CBA testing. Plasma diluted in a buffer containing 0.05% sodium azide, 0.5% polyvinyl alcohol, and 0.8% polyvinylpyrrolidone had the lowest background activity. CBA median fluorescence intensity (MFI) values with 1,000 antigen-conjugated beads/well did not differ significantly from MFI with 5,000 beads/well. CBA and ELISA results correlated well for all antigens except apical membrane antigen-1 (AMA-1). CBA testing produced a greater range of values in samples from malaria-endemic areas and less background reactivity for blank samples than ELISA. Conclusion With optimization, CBA may be the preferred method of testing for antibodies to P. falciparum antigens, as CBA can test for antibodies to multiple recombinant antigens from a single plasma sample and produces a greater range of values in positive samples and lower background readings for blank samples than ELISA. PMID:23259607

  2. Evaluation of phage assay for rapid phenotypic detection of rifampicin resistance in Mycobacterium tuberculosis

    PubMed Central

    Yzquierdo, Sergio Luis; Lemus, Dihadenys; Echemendia, Miguel; Montoro, Ernesto; McNerney, Ruth; Martin, Anandi; Palomino, Juan Carlos

    2006-01-01

    Background Conventional methods for susceptibility testing require several months before results can be reported. However, rapid methods to determine drug susceptibility have been developed recently, and phage assays have been reported as rapid, useful tools for antimicrobial susceptibility testing. The aim of this study was to apply the phage assay for rapid detection of rifampicin resistance in Mycobacterium tuberculosis strains in Cuba. Methods The phage D29 assay was performed on 102 M. tuberculosis strains to detect rifampicin resistance. The results were compared with the proportion method (gold standard) to evaluate the sensitivity and specificity of the phage assay. Results Phage assay results were available in 2 days, whereas proportion method results were obtained in 42 days. A total of 44 strains were detected as rifampicin resistant by both methods. However, one strain deemed resistant by the proportion method was susceptible by the phage assay. The sensitivity and specificity of the phage assay were 97.8% and 100%, respectively. Conclusion The phage assay provides rapid and reliable results for susceptibility testing; it is easy to perform, requires no specialized equipment, and is applicable to drug susceptibility testing in low-income countries where tuberculosis is a major public health problem. PMID:16630356

  3. Reduced background autofluorescence for cell imaging using nanodiamonds and lanthanide chelates.

    PubMed

    Cordina, Nicole M; Sayyadi, Nima; Parker, Lindsay M; Everest-Dass, Arun; Brown, Louise J; Packer, Nicolle H

    2018-03-14

    Bio-imaging is a key technique for tracking and monitoring important biological processes and fundamental biomolecular interactions; however, the interference of background autofluorescence with targeted fluorophores is problematic for many bio-imaging applications. This study reports on two novel methods for reducing interference from cellular autofluorescence in bio-imaging. The first method uses fluorescent nanodiamonds (FNDs) containing nitrogen vacancy centers. FNDs emit at near-infrared wavelengths typically higher than most cellular autofluorescence and, when appropriately functionalized, can be used for background-free imaging of targeted biomolecules. The second method uses europium-chelating tags with long fluorescence lifetimes, which enable background-free imaging because cellular autofluorescence lifetimes are short. In this study, we used both methods to target E-selectin, a transmembrane glycoprotein that is activated by inflammation, to demonstrate background-free fluorescent staining in fixed endothelial cells. Our findings indicate that both FND- and europium-based staining can improve fluorescent bio-imaging capabilities by reducing competition with cellular autofluorescence. 30 nm nanodiamonds coated with the E-selectin antibody were found to enable the most sensitive detection of E-selectin in inflamed cells, with a 40-fold increase in detected intensity.

  4. Mapping gravitational-wave backgrounds using methods from CMB analysis: Application to pulsar timing arrays

    NASA Astrophysics Data System (ADS)

    Gair, Jonathan; Romano, Joseph D.; Taylor, Stephen; Mingarelli, Chiara M. F.

    2014-10-01

    We describe an alternative approach to the analysis of gravitational-wave backgrounds, based on the formalism used to characterize the polarization of the cosmic microwave background. In contrast to standard analyses, this approach makes no assumptions about the nature of the background and so has the potential to reveal much more about the physical processes that generated it. An arbitrary background can be decomposed into modes whose angular dependence on the sky is given by gradients and curls of spherical harmonics. We derive the pulsar timing overlap reduction functions for the individual modes, which are given by simple combinations of spherical harmonics evaluated at the pulsar locations. We show how these can be used to recover the components of an arbitrary background, giving explicit results for both isotropic and anisotropic uncorrelated backgrounds. We also find that the response of a pulsar timing array to curl modes is identically zero, so half of the gravitational-wave sky will never be observed using pulsar timing, no matter how many pulsars are included in the array. An isotropic, unpolarized and uncorrelated background can be accurately represented using only three modes, and so a search of this type will be only slightly more complicated than the standard cross-correlation search using the Hellings and Downs overlap reduction function. However, by measuring the components of individual modes of the background and checking for consistency with isotropy, this approach has the potential to reveal much more information. Each individual mode on its own describes a background that is correlated between different points on the sky. A measurement of the components that indicates the presence of correlations in the background on large angular scales would suggest startling new physics.

  5. SU-G-IeP2-08: Investigation On Signal Detectability in Volumetric Cone Beam CT Images with Anatomical Background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Han, M; Baek, J

    2016-06-15

    Purpose: To investigate the slice-direction dependent detectability in cone beam CT images with anatomical background. Methods: We generated 3D anatomical background images using a breast anatomy model. To generate the 3D breast anatomy, we filtered 3D Gaussian noise with a square root of 1/f^3 power spectrum, and then assigned the attenuation coefficients of glandular (0.8 cm^-1) and adipose (0.46 cm^-1) tissues based on voxel values. Projections were acquired by forward projection, and quantum noise was added to the projection data. The projection data were reconstructed by the FDK algorithm. We compared the detectability of a 3 mm spherical signal in the images reconstructed with four different backprojection methods: Hanning-weighted ramp filter with linear interpolation (RECON1), Hanning-weighted ramp filter with Fourier interpolation (RECON2), ramp filter with linear interpolation (RECON3), and ramp filter with Fourier interpolation (RECON4). We computed the task SNR of the spherical signal in transverse and longitudinal planes using a channelized Hotelling observer with Laguerre-Gauss channels. Results: The transverse plane has similar task SNR values for the different backprojection methods, while the longitudinal plane has its maximum task SNR with RECON1. For all backprojection methods, the longitudinal plane has higher task SNR than the transverse plane. Conclusion: In this work, we investigated detectability for different slice directions in cone beam CT images with anatomical background. The longitudinal plane has a higher task SNR than the transverse plane, and backprojection with the Hanning-weighted ramp filter and linear interpolation (i.e., RECON1) produced the highest task SNR among the four backprojection methods.
This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the IT Consilience Creative Programs (IITP-2015-R0346-15-1008) supervised by the IITP (Institute for Information & Communications Technology Promotion), the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the MSIP (2015R1C1A1A01052268), and the framework of the international cooperation program managed by the NRF (NRF-2015K2A1A2067635).

  6. Vehicle license plate recognition in dense fog based on improved atmospheric scattering model

    NASA Astrophysics Data System (ADS)

    Tang, Chunming; Lin, Jun; Chen, Chunkai; Dong, Yancheng

    2018-04-01

    An effective method based on an improved atmospheric scattering model is proposed in this paper to handle the problem of vehicle license plate location and recognition in dense fog. Dense fog detection is performed first by top-hat transformation and vertical edge detection, and the moving vehicle image is separated from the traffic video image. After the vehicle image is decomposed into structure and texture layers, the glow layer is separated from the structure layer to obtain the background layer. By applying mean-pooling and bicubic interpolation, the atmospheric light map of the background layer is predicted, while the transmission of the background layer is estimated from the grayed glow layer, whose gray values are adjusted by linear mapping. Then, according to the improved atmospheric scattering model, the final restored image is obtained by fusing the restored background layer with the optimized texture layer. License plate location is performed next by a series of morphological operations, connected-domain analysis, and various validations. Character extraction is achieved using projection analysis. Finally, an offline-trained pattern classifier of hybrid discriminative restricted Boltzmann machines (HDRBM) is applied to recognize the characters. Experimental results on thorough data sets demonstrate that the proposed method achieves high recognition accuracy and works robustly in dense fog traffic environments around the clock.
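
    The standard atmospheric scattering model underlying such restoration is I = J*t + A*(1 - t), where I is the observed intensity, J the scene radiance, A the atmospheric light, and t the transmission; once A and t are estimated, the model is inverted for J. A per-pixel sketch (the clamp on t is a common illustrative choice, not a parameter from the paper):

```python
def dehaze_pixel(I, A, t, t_min=0.1):
    """Invert the atmospheric scattering model I = J*t + A*(1 - t)
    for one pixel. A is atmospheric light, t is transmission."""
    t = max(t, t_min)  # clamp tiny transmissions to avoid amplifying noise
    return (I - A) / t + A
```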

  7. Toward real-time quantification of fluorescence molecular probes using target/background ratio for guiding biopsy and endoscopic therapy of esophageal neoplasia.

    PubMed

    Jiang, Yang; Gong, Yuanzheng; Rubenstein, Joel H; Wang, Thomas D; Seibel, Eric J

    2017-04-01

    Multimodal endoscopy using fluorescence molecular probes is a promising method of surveying the entire esophagus to detect cancer progression. Using the fluorescence ratio of a target compared to its surrounding background, a quantitative value is diagnostic for progression from Barrett's esophagus to high-grade dysplasia (HGD) and esophageal adenocarcinoma (EAC). However, current quantification of fluorescence images is done only after the endoscopic procedure. We developed a Chan-Vese-based algorithm to segment fluorescence targets, with subsequent morphological operations to generate the background region, thus calculating target/background (T/B) ratios, potentially providing real-time guidance for biopsy and endoscopic therapy. With an initial processing speed of 2 fps and by calculating the T/B ratio for each frame, our method provides quasi-real-time quantification of the molecular probe labeling to the endoscopist. Furthermore, an automatic computer-aided diagnosis algorithm can be applied to the recorded endoscopic video, and the overall T/B ratio is calculated for each patient. The receiver operating characteristic curve was employed to determine the threshold for classification of HGD/EAC using leave-one-out cross-validation. With 92% sensitivity and 75% specificity in classifying HGD/EAC, our automatic algorithm shows promising results for a surveillance procedure to help manage esophageal cancer and other cancers inspected by endoscopy.
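
    The T/B ratio itself is the mean fluorescence inside the segmented target divided by the mean in a surrounding background region. A sketch using a crude shift-based dilation to build the background ring, standing in for the paper's morphological operations (the function name and ring width are illustrative):

```python
import numpy as np

def tb_ratio(frame, mask, ring=2):
    """Target/background ratio: mean intensity inside the target mask over
    the mean in a ring around it, grown by shifting the mask (crude dilation)."""
    mask = mask.astype(bool)
    dilated = mask.copy()
    for _ in range(ring):  # grow the mask by one 4-connected pixel per pass
        grown = dilated.copy()
        grown[1:, :] |= dilated[:-1, :]
        grown[:-1, :] |= dilated[1:, :]
        grown[:, 1:] |= dilated[:, :-1]
        grown[:, :-1] |= dilated[:, 1:]
        dilated = grown
    ring_mask = dilated & ~mask  # background ring just outside the target
    return frame[mask].mean() / frame[ring_mask].mean()
```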

  8. A ligation-triggered DNAzyme cascade for amplified fluorescence detection of biological small molecules with zero-background signal.

    PubMed

    Lu, Li-Min; Zhang, Xiao-Bing; Kong, Rong-Mei; Yang, Bin; Tan, Weihong

    2011-08-03

    Many types of fluorescent sensing systems have been reported for biological small molecules. Particularly, several methods have been developed for the recognition of ATP or NAD(+), but they only show moderate sensitivity, and they cannot discriminate either ATP or NAD(+) from their respective analogues. We have addressed these limitations and report here a dual strategy which combines split DNAzyme-based background reduction with catalytic and molecular beacon (CAMB)-based amplified detection to develop a ligation-triggered DNAzyme cascade, resulting in ultrahigh sensitivity. First, the 8-17 DNAzyme is split into two separate oligonucleotide fragments as the building blocks for the DNA ligation reaction, thereby providing a zero-background signal to improve overall sensitivity. Next, a CAMB strategy is further employed for amplified signal detection achieved through cycling and regenerating the DNAzyme to realize the true enzymatic multiple turnover (one enzyme catalyzes the cleavage of several substrates) of catalytic beacons. This combination of zero-background signal and signal amplification significantly improves the sensitivity of the sensing systems, resulting in detection limits of 100 and 50 pM for ATP and NAD(+), respectively, much lower than those of previously reported biosensors. Moreover, by taking advantage of the highly specific biomolecule-dependence of the DNA ligation reaction, the developed DNAzyme cascades show significantly high selectivity toward the target cofactor (ATP or NAD(+)), and the target biological small molecule can be distinguished from its analogues. Therefore, as a new and universal platform for the design of DNA ligation reaction-based sensing systems, this novel ligation-triggered DNAzyme cascade method may find a broad spectrum of applications in both environmental and biomedical fields.

  9. Subspace-based optimization method for inverse scattering problems with an inhomogeneous background medium

    NASA Astrophysics Data System (ADS)

    Chen, Xudong

    2010-07-01

    This paper proposes a version of the subspace-based optimization method to solve the inverse scattering problem with an inhomogeneous background medium where the known inhomogeneities are bounded in a finite domain. Although the background Green's function at each discrete point in the computational domain is not directly available in an inhomogeneous background scenario, the paper uses the finite element method to simultaneously obtain the Green's function at all discrete points. The essence of the subspace-based optimization method is that part of the contrast source is determined from the spectrum analysis without using any optimization, whereas the orthogonally complementary part is determined by solving a lower dimension optimization problem. This feature significantly speeds up the convergence of the algorithm and at the same time makes it robust against noise. Numerical simulations illustrate the efficacy of the proposed algorithm. The algorithm presented in this paper finds wide applications in nondestructive evaluation, such as through-wall imaging.

  10. An improved algorithm of laser spot center detection in strong noise background

    NASA Astrophysics Data System (ADS)

    Zhang, Le; Wang, Qianqian; Cui, Xutai; Zhao, Yu; Peng, Zhong

    2018-01-01

    Laser spot center detection is demanded in many applications. Common algorithms for laser spot center detection, such as the centroid and Hough transform methods, have poor anti-interference ability and low detection accuracy under strong background noise. In this paper, median filtering is first used to remove noise while preserving the edge details of the image. Second, the laser facula image is binarized to extract the target from the background. Then, morphological filtering is performed to eliminate noise points inside and outside the spot. Finally, the edge of the pretreated facula image is extracted and the laser spot center is obtained using the circle fitting method. Building on the circle fitting algorithm, the improved algorithm adds median filtering, morphological filtering, and other processing steps. Theoretical analysis and experimental verification show that this method effectively filters background noise, which enhances the anti-interference ability of laser spot center detection and also improves the detection accuracy.
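
    The final circle-fitting step can be done with the standard algebraic (Kasa) least-squares fit, which linearizes the circle equation as x^2 + y^2 = c*x + d*y + e with c = 2cx, d = 2cy, e = r^2 - cx^2 - cy^2. A sketch (the abstract does not specify which circle-fit variant was used):

```python
import numpy as np

def fit_circle(x, y):
    """Algebraic (Kasa) least-squares circle fit to edge points.
    Returns (cx, cy, r)."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    c, d, e = np.linalg.lstsq(A, b, rcond=None)[0]
    cx, cy = c / 2, d / 2
    r = np.sqrt(e + cx ** 2 + cy ** 2)
    return cx, cy, r
```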

  11. Accurate spectroscopic redshift of the multiply lensed quasar PSOJ0147 from the Pan-STARRS survey

    NASA Astrophysics Data System (ADS)

    Lee, C.-H.

    2017-09-01

    Context. The gravitational lensing time delay method provides a one-step determination of the Hubble constant (H0) with an uncertainty level on par with the cosmic distance ladder method. However, to further investigate the nature of the dark energy, an H0 estimate down to the 1% level is greatly needed. This requires dozens of strongly lensed quasars that are yet to be delivered by ongoing and forthcoming all-sky surveys. Aims: In this work we aim to determine the spectroscopic redshift of PSOJ0147, the first strongly lensed quasar candidate found in the Pan-STARRS survey. The main goal of our work is to derive an accurate redshift estimate of the background quasar for cosmography. Methods: To obtain timely spectroscopic follow-up, we took advantage of the fast-track service programme carried out by the Nordic Optical Telescope. Using a grism covering 3200-9600 Å, we identified prominent emission line features, such as Lyα, N V, O I, C II, Si IV, C IV, and [C III], in the spectra of the background quasar of the PSOJ0147 lens system. This enables us to accurately determine the redshift of the background quasar. Results: The spectrum of the background quasar exhibits prominent absorption features bluewards of the strong emission lines, such as Lyα, N V, and C IV. These blue absorption lines indicate that the background source is a broad absorption line (BAL) quasar. Unfortunately, the BAL features hamper an accurate determination of redshift using the above-mentioned strong emission lines. Nevertheless, we are able to determine a redshift of 2.341 ± 0.001 from three of the four lensed quasar images with the clean forbidden line [C III]. In addition, we also derive a maximum outflow velocity of 9800 km s-1 from the broad absorption features bluewards of the C IV emission line. This value of maximum outflow velocity is in good agreement with other BAL quasars.
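
    Determining the redshift from a clean emission line reduces to z = λ_obs/λ_rest - 1. A sketch using the standard [C III] rest wavelength of 1908.7 Å; the observed wavelength below is back-computed from the quoted z = 2.341 for illustration, not taken from the paper:

```python
def redshift(lambda_obs, lambda_rest):
    """Spectroscopic redshift from an observed emission-line wavelength (Å)."""
    return lambda_obs / lambda_rest - 1.0

# [C III] λ1908.7 observed near 6377 Å corresponds to z ≈ 2.341
print(round(redshift(6377.0, 1908.7), 3))
```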

  12. Calculation of the compounded uncertainty of 14C AMS measurements

    NASA Astrophysics Data System (ADS)

    Nadeau, Marie-Josée; Grootes, Pieter M.

    2013-01-01

    The correct method to calculate conventional 14C ages from the carbon isotopic ratios was summarised 35 years ago by Stuiver and Polach (1977) and is now accepted as the only method to calculate 14C ages. There is, however, no consensus regarding the treatment of AMS data, mainly of the uncertainty of the final result. The estimation and treatment of machine background, process blank, and/or in situ contamination is not uniform between laboratories, leading to differences in 14C results, mainly for older ages. As Donahue (1987) and Currie (1994), among others, mentioned, some laboratories find it important to use the scatter of several measurements as the uncertainty while others prefer to use Poisson statistics. The contribution of the scatter of the standards, machine background, process blank, and in situ contamination to the uncertainty of the final 14C result is also treated in different ways. In the early years of AMS, several laboratories found it important to describe their calculation process in detail. In recent years, this practice has declined. We present an overview of the calculation process for 14C AMS measurements, looking at calculation practices published from the beginning of AMS until the present.
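
    The Stuiver and Polach (1977) convention computes the age from the normalized fraction modern using the Libby mean life of 8033 yr, t = -8033 * ln(F14C), with the age uncertainty following by error propagation. A sketch assuming background and blank corrections have already been applied to F14C (the function names are illustrative):

```python
import math

def conventional_c14_age(f14c):
    """Conventional radiocarbon age (yr BP) from the normalized fraction
    modern, t = -8033 * ln(F14C), per Stuiver and Polach (1977)."""
    return -8033.0 * math.log(f14c)

def age_uncertainty(f14c, sigma_f):
    """Propagated 1-sigma age uncertainty: |dt/dF| * sigma_F = 8033 * sigma_F / F."""
    return 8033.0 * sigma_f / f14c
```

    How sigma_F itself is assembled from counting statistics, standard scatter, machine background, and blank corrections is exactly the point on which laboratories differ, as the abstract notes.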

  13. Background oriented schlieren measurement of the refractive index field of air induced by a hot, cylindrical measurement object.

    PubMed

    Beermann, Rüdiger; Quentin, Lorenz; Pösch, Andreas; Reithmeier, Eduard; Kästner, Markus

    2017-05-10

    To optically capture the topography of a hot measurement object with high precision, the light deflection by the inhomogeneous refractive index field, induced by heat transfer from the measurement object to the ambient medium, has to be considered. We used the 2D background oriented schlieren method with an illuminated wavelet background, an optical flow algorithm, and Ciddor's equation to quantify the refractive index field located directly above a red-glowing, hot measurement object. A heat transfer simulation was implemented to verify the magnitude and the shape of the measured refractive index field. Provided that no forced external flow disturbs the shape of the convective flow originating from the hot object, a laminar flow can be observed directly above the object, resulting in a sharply bounded, inhomogeneous refractive index field.
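
    As a rough illustration of why the hot boundary layer bends light: the refractivity (n - 1) of air scales with density, and at constant pressure the ideal gas law gives (n - 1) proportional to T0/T. This is a simplification standing in for the full Ciddor equation the authors actually used, and the reference refractivity below is an approximate textbook value.

```python
# Simplified density-scaling model, NOT the Ciddor equation:
# (n - 1) ~ rho ~ T0/T at constant pressure (ideal gas).
N0_MINUS_1 = 2.72e-4     # approximate refractivity of air at T0, 1 atm
T0 = 288.15              # reference temperature, K

def n_air(T_kelvin):
    """Approximate refractive index of air at temperature T and 1 atm."""
    return 1.0 + N0_MINUS_1 * (T0 / T_kelvin)

print(n_air(288.15))     # ambient air
print(n_air(1000.0))     # hot boundary layer: refractivity drops sharply
```

    The sharp drop in refractivity across the thin laminar layer is what produces the apparent background displacement that the optical flow algorithm measures.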

  14. ViBe: a universal background subtraction algorithm for video sequences.

    PubMed

    Barnich, Olivier; Van Droogenbroeck, Marc

    2011-06-01

    This paper presents a technique for motion detection that incorporates several innovative mechanisms. For example, our proposed technique stores, for each pixel, a set of values taken in the past at the same location or in the neighborhood. It then compares this set to the current pixel value in order to determine whether that pixel belongs to the background, and adapts the model by choosing randomly which values to substitute from the background model. This approach differs from those based upon the classical belief that the oldest values should be replaced first. Finally, when the pixel is found to be part of the background, its value is propagated into the background model of a neighboring pixel. We describe our method in full detail (including pseudo-code and the parameter values used) and compare it to other background subtraction techniques. Efficiency figures show that our method outperforms recent and proven state-of-the-art methods in terms of both computation speed and detection rate. We also analyze the performance of a downscaled version of our algorithm, reduced to the absolute minimum of one comparison and one byte of memory per pixel. It appears that even such a simplified version of our algorithm performs better than mainstream techniques.
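
    The mechanisms described above (per-pixel sample sets, random-in-time replacement, spatial propagation into a neighbor's model) can be sketched for a single grayscale frame as follows. The parameter values are the defaults commonly quoted for ViBe (N = 20, R = 20, #min = 2, subsampling 16), but the toy frame handling is a simplification, not the authors' optimized implementation.

```python
import random

N, R, MIN_MATCHES, SUBSAMPLING = 20, 20, 2, 16

def init_model(frame):
    """Seed each pixel's sample set from its first observed value."""
    h, w = len(frame), len(frame[0])
    return [[[frame[y][x]] * N for x in range(w)] for y in range(h)]

def segment_and_update(model, frame, rng=random):
    h, w = len(frame), len(frame[0])
    mask = [[1] * w for _ in range(h)]          # 1 = foreground
    for y in range(h):
        for x in range(w):
            v = frame[y][x]
            matches = sum(abs(v - s) <= R for s in model[y][x])
            if matches >= MIN_MATCHES:
                mask[y][x] = 0                  # background
                # conservative, random-in-time update of the pixel's model
                if rng.randrange(SUBSAMPLING) == 0:
                    model[y][x][rng.randrange(N)] = v
                # spatial propagation into a random neighbor's model
                if rng.randrange(SUBSAMPLING) == 0:
                    ny = min(max(y + rng.choice((-1, 0, 1)), 0), h - 1)
                    nx = min(max(x + rng.choice((-1, 0, 1)), 0), w - 1)
                    model[ny][nx][rng.randrange(N)] = v
    return mask

# Toy usage: a static background with one bright object pixel.
bg = [[10] * 8 for _ in range(8)]
model = init_model(bg)
frame = [row[:] for row in bg]
frame[4][4] = 200                               # the moving object
mask = segment_and_update(model, frame)
print(mask[4][4], mask[0][0])                   # → 1 0
```

    The random replacement (rather than first-in-first-out) gives samples an exponentially decaying lifetime, and the neighbor propagation is what lets the model absorb newly uncovered background, the two departures from classical approaches the abstract highlights.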

  15. Using Intervention Mapping to develop a programme to prevent sexually transmittable infections, including HIV, among heterosexual migrant men

    PubMed Central

    Wolfers, Mireille EG; van den Hoek, Caty; Brug, Johannes; de Zwart, Onno

    2007-01-01

    Background There is little experience with carefully developed interventions in the HIV/STI prevention field aimed at adult heterosexual target groups in the Netherlands. The ability to apply intervention development protocols, like Intervention Mapping, in daily practice outside of academia is a matter of concern. An urgent need also exists for interventions aimed at the prevention of STI in migrant populations in the Netherlands. This article describes the theory- and evidence-based development of HIV/STI prevention interventions by the Municipal Public Health Service Rotterdam Area (MPHS), the Netherlands, for heterosexual migrant men with Surinamese, Dutch-Caribbean, Cape Verdean, Turkish and Moroccan backgrounds. Methods First, a needs assessment was carried out. Then a literature review was done, key figures were interviewed and seven group discussions were held. Subsequently, the results were translated into specific objectives ("change objectives") and used in intervention development for two subgroups: men with an Afro-Caribbean background and unmarried men with a Turkish or Moroccan background. A matrix of change objectives was made for each subgroup, and suitable theoretical methods and practical strategies were selected. Culturally tailored interventions were designed and pre-tested among the target groups. Results This development process resulted in two interventions for specific subgroups that were appreciated by both the target groups and the migrant prevention workers. The project took place in collaboration with a university center, which provided an opportunity to get expert advice at every step of the Intervention Mapping process. At relevant points of the development process, migrant health educators and target group members provided advice and feedback on the draft intervention materials.
Conclusion This intervention development project indicates that careful, well-informed intervention development using Intervention Mapping is feasible in the daily practice of the MPHS, provided that sufficient time and expertise on this approach are available. Further research should test the effectiveness of these interventions. PMID:17615052

  16. The Effects of Steel Profile and Cleanliness on Coating Performance

    DTIC Science & Technology

    1986-01-01

    describes the results of this extensive 5-year study. BACKGROUND: Abrasive blasting of steel is generally the preferred method of preparing steel...environment (SSPC used procedure 6061 of Federal Test Method Standard No. 141), and two specimens after 15 and 57 months of exposure at Kwajalein. In the...

  17. THE NEW YORK CITY URBAN DISPERSION PROGRAM MARCH 2005 FIELD STUDY: TRACER METHODS AND RESULTS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    WATSON, T.B.; HEISER, J.; KALB, P.

    The Urban Dispersion Program March 2005 Field Study tracer releases, sampling, and analytical methods are described in detail. There were two days on which tracer releases and sampling were conducted. A total of 16.0 g of six tracers was released during the first test day, or Intensive Observation Period (IOP) 1, and 15.7 g during IOP 2. Three types of sampling instruments were used in this study. Sequential air samplers (SAS) collected six-minute samples, while Brookhaven atmospheric tracer samplers (BATS) and personal air samplers (PAS) collected thirty-minute samples. There were a total of 1300 samples resulting from the two IOPs. Confidence limits in the sampling and analysis method were 20%, as determined from 100 duplicate samples. The sample recovery rate was 84%. The integrally averaged 6-minute samples were compared to the 30-minute samples. The agreement was found to be good in most cases. The validity of using a background tracer to calculate sample volumes was examined and also found to have a confidence level of 20%. Methods for improving sampling and analysis are discussed. The data described in this report are available as Excel files. An additional Excel file of quality-assured tracer data for use in model validation efforts is also available. The file consists of extensively quality-assured BATS tracer data with background concentrations subtracted.
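
    The duplicate-sample precision estimate mentioned above can be pictured as a relative-percent-difference (RPD) calculation over sample pairs. The pair values below are invented for illustration; the study itself derived its 20% figure from about 100 duplicate pairs.

```python
# Hypothetical duplicate-pair tracer concentrations (arbitrary units).
pairs = [(10.2, 9.6), (4.1, 4.4), (25.0, 22.8), (7.7, 8.4), (13.3, 12.1)]

def rpd(a, b):
    """Relative percent difference of a duplicate pair."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

rpds = [rpd(a, b) for a, b in pairs]
mean_rpd = sum(rpds) / len(rpds)
print(round(mean_rpd, 1), all(r < 20 for r in rpds))
```

    A duplicate set whose RPDs stay within the quoted 20% bound supports treating that figure as the confidence limit of the combined sampling-and-analysis method.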

  18. pKWmEB: integration of Kruskal-Wallis test with empirical Bayes under polygenic background control for multi-locus genome-wide association study.

    PubMed

    Ren, Wen-Long; Wen, Yang-Jun; Dunwell, Jim M; Zhang, Yuan-Ming

    2018-03-01

    Although nonparametric methods in genome-wide association studies (GWAS) are robust in quantitative trait nucleotide (QTN) detection, the absence of polygenic background control in single-marker association genome-wide scans results in a high false positive rate. To overcome this issue, we proposed an integrated nonparametric method for multi-locus GWAS. First, a new model transformation was used to whiten the covariance matrix of the polygenic matrix K and the environmental noise. Using the transformed model, the Kruskal-Wallis test along with least angle regression was then used to select all the markers potentially associated with the trait. Finally, all the selected markers were placed into a multi-locus model, their effects were estimated by empirical Bayes, and all the nonzero effects were further identified by a likelihood ratio test for true QTN detection. This method, named pKWmEB, was validated by a series of Monte Carlo simulation studies. As a result, pKWmEB effectively controlled the false positive rate, even though a less stringent significance criterion was adopted. More importantly, pKWmEB retained the high power of the Kruskal-Wallis test and provided QTN effect estimates. To further validate pKWmEB, we re-analyzed four flowering-time-related traits in Arabidopsis thaliana and detected some previously reported genes that were not identified by the other methods.
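
    A minimal sketch of the Kruskal-Wallis screening stage on synthetic data: phenotypes are grouped by genotype class at each marker, and markers with large H statistics are retained. The polygenic whitening, least angle regression, and empirical Bayes steps of pKWmEB are omitted here, and the H statistic below ignores rank ties for simplicity.

```python
def kruskal_h(groups):
    """Kruskal-Wallis H statistic (ties ignored for simplicity)."""
    pooled = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
    ranks = {}
    for r, (v, gi) in enumerate(pooled, start=1):
        ranks.setdefault(gi, []).append(r)     # ranks per genotype class
    n = len(pooled)
    return 12.0 / (n * (n + 1)) * sum(
        sum(rs) ** 2 / len(rs) for rs in ranks.values()
    ) - 3 * (n + 1)

# Synthetic phenotypes split by genotype class (0/1/2) at two markers:
associated   = [[1.0, 1.2, 0.9], [2.1, 2.3, 2.0], [3.2, 3.0, 3.3]]
unassociated = [[1.0, 3.2, 2.1], [2.3, 0.9, 3.0], [1.2, 3.3, 2.0]]

h_assoc, h_null = kruskal_h(associated), kruskal_h(unassociated)
print(h_assoc > h_null)
```

    Because the screen is rank-based it stays robust to non-normal phenotypes, which is the property pKWmEB is designed to retain while the later empirical-Bayes stage supplies the effect estimates.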

  19. [Establishment of background color to discriminate among tablets: sharper and more feasible with color-weak simulation as access to safe medication].

    PubMed

    Ishizaki, Makiko; Maeda, Hatsuo; Okamoto, Ikuko

    2014-01-01

    Color-weak persons, who in Japan represent approximately 5% of the male and 0.2% of the female population, may not be able to discriminate among the colors of tablets. Thus, using color-weak simulation with Variantor™, we evaluated the effects of background colors (light, medium, and dark gray, purple, blue, and blue green) on discrimination among yellow, yellow-red, red, and mixed-group tablets by our established method. In addition, the influence of white 10-mm ruled squares on the background sheets was examined, and the change in color of the tablets and background sheets through the simulation was measured. Variance analysis of the data obtained from 42 volunteers demonstrated that with color-weak vision, the best discrimination among yellow, yellow-red, or mixed-group tablets was achieved on a dark gray background sheet, and a blue background sheet was useful for discriminating among the tablet groups in all colors, including red. These results were compared with those previously obtained with healthy and cataractous vision, suggesting that gaps in hue and chroma, as well as in value, between background sheets and tablets affect discrimination with color-weak vision. The observed positive effects of the white ruled squares, in contrast to those observed with healthy and cataractous vision, demonstrate that a background sheet arranged in two colors allows color-weak persons to discriminate among all sets of tablets in a sharp and feasible manner.

  20. [Immigrated Physicians: Chances and Challenges].

    PubMed

    Hohmann, Isabel; Glaesmer, Heide; Nesterko, Yuriy

    2018-01-19

    In Germany's health care infrastructure there is a demand for physicians with an immigrant background, yet the situation of immigrated physicians remains largely unexplored. In a pilot study, the stressors and resources of physicians with an immigrant background were explored with regard to their migration-related experiences at German hospitals and within the medical team. As part of a qualitative analysis, 8 physicians with an immigrant background were interviewed (problem-centered interviews) from July to September 2014. The respondents came from both European Union and non-EU countries and had worked for 1-4.5 years in different German hospitals. Stressors and challenges derived from gaps in German language skills, differences in medical skills, cooperation within the team, and dealing with a new health care system. Perceived discrimination by colleagues and patients represented a particular burden. At the same time, physicians with an immigrant background drew on resources at the communicative, medical, social, and organizational levels. The results highlight the particular demands that physicians with an immigrant background face. Future research should explore the potential of these stressors and resources using quantitative methods; in a multi-perspective approach, German colleagues and patients should also be included. © Georg Thieme Verlag KG Stuttgart · New York.

  1. Preparations of Meiotic Pachytene Chromosomes and Extended DNA Fibers from Cotton Suitable for Fluorescence In Situ Hybridization

    PubMed Central

    Liu, Fang; Ling, Jian; Wang, Chunying; Li, Shaohui; Zhang, Xiangdi; Wang, Yuhong; Wang, Kunbo

    2012-01-01

    Fluorescence in situ hybridization (FISH) has become one of the most important techniques applied in plant molecular cytogenetics. However, the application of this technique in cotton has lagged behind because of difficulties in chromosome preparation. The focus of this article is FISH performed not only on cotton pachytene chromosomes but also on cotton extended DNA fibers. Cotton pollen mother cells (PMCs), instead of buds or anthers, were directly digested in enzyme to completely break down the cell wall. Before the routine acetic acid treatment, PMCs were incubated in an acetic acid and enzyme mixture to remove the cytoplasm and clear the background. Spreading chromosomes with ice-cold Carnoy's solution was adopted instead of the nitrogen-removal method, to avoid chromosome loss and to fully stretch the chromosomes. With the above improved steps, high-quality, well-differentiated pachytene chromosomes with a clear background were obtained, and the FISH results demonstrated that a mature protocol for cotton pachytene chromosome preparation was established. Intact, debris-free cotton nuclei were obtained by chopping etiolated cotyledons instead of the conventional liquid-nitrogen grinding method. After incubating the nuclei with nucleus lysis buffer on the slide, parallel DNA fibers with a clear background were acquired along the slide. This method overcomes the twisting, accumulation, and fracture of DNA fibers seen with other methods. The entire process of DNA fiber preparation requires only 30 min; in contrast, it takes 3 h with the routine nitrogen-grinding method. The poisonous mercaptoethanol in the nucleus lysis buffer is replaced by nonpoisonous dithiothreitol, and PVP40 in the nucleus isolation buffer is used to prevent oxidation. The probability of success in isolating nuclei for DNA fiber preparation is almost 100% with this method in cotton.
Thus, a rapid, safe, and efficient method for the preparation of cotton extended DNA fibers suitable for FISH was established. PMID:22442728

  2. RELATIVE CONTRIBUTIONS OF THREE DESCRIPTIVE METHODS: IMPLICATIONS FOR BEHAVIORAL ASSESSMENT

    PubMed Central

    Pence, Sacha T; Roscoe, Eileen M; Bourret, Jason C; Ahearn, William H

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods—the ABC method, the conditional probability method, and the conditional and background probability method—to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior participated. Functional analyses indicated that participants' problem behavior was maintained by social positive reinforcement (n  =  2), social negative reinforcement (n  =  2), or automatic reinforcement (n  =  2). Results showed that for all but 1 participant, descriptive analysis outcomes were similar across methods. In addition, for all but 1 participant, the descriptive analysis outcome differed substantially from the functional analysis outcome. This supports the general finding that descriptive analysis is a poor means of determining functional relations. PMID:19949536

  3. SRS 2010 Vegetation Inventory GeoStatistical Mapping Results for Custom Reaction Intensity and Total Dead Fuels.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Lloyd A.; Paresol, Bernard

    This report of the geostatistical analysis results for the fire-fuels response variables, custom reaction intensity and total dead fuels, is part of the SRS 2010 vegetation inventory project. For a detailed description of the project, theory, and background, including sample design, methods, and results, please refer to the USDA Forest Service Savannah River Site internal report "SRS 2010 Vegetation Inventory GeoStatistical Mapping Report" (Edwards & Parresol 2013).

  4. Sensitive enumeration of Listeria monocytogenes and other Listeria species in various naturally contaminated matrices using a membrane filtration method.

    PubMed

    Barre, Léna; Brasseur, Emilie; Doux, Camille; Lombard, Bertrand; Besse, Nathalie Gnanou

    2015-06-01

    For the enumeration of Listeria monocytogenes (L. monocytogenes) in food, a sensitive enumeration method has recently been developed. This method is based on membrane filtration of the food suspension followed by transfer of the filter onto a selective medium to enumerate L. monocytogenes. An evaluation of this method was performed with several categories of foods naturally contaminated with L. monocytogenes. The results obtained with this technique were compared with those obtained from the modified reference EN ISO 11290-2 method for the enumeration of L. monocytogenes in food, and were found to be more precise. In most cases, the filtration method enabled examination of a greater quantity of food, thus greatly improving the sensitivity of the enumeration. However, it was hardly applicable to some food categories because of filtration problems and background microbiota interference. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Reprint of "Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency".

    PubMed

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-08-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on the Riemannian manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary drawn from the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, the initial result is improved by calculating the reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to that in the first stage, but is able to effectively highlight the salient objects uniformly against the background. Finally, three post-processing methods (a highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms state-of-the-art methods in terms of precision, recall, and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
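
    The first-stage idea (score each region by how poorly the background set from the image borders explains it) can be caricatured as follows. A plain nearest-background distance stands in for the paper's kernelized sparse-coding reconstruction error, and short feature vectors stand in for region covariance matrices; all data are illustrative.

```python
def dist(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def saliency(features, border_idx):
    """Score each region by its distance to the closest border (background)
    region: poorly explained by the background means salient."""
    bg = [features[i] for i in border_idx]
    return [min(dist(f, b) for b in bg) for f in features]

# Ten "superpixels": border regions 0-3 define the background appearance,
# region 7 is the salient object.
features = [(1.0, 1.1, 0.9), (0.9, 1.0, 1.0), (1.1, 0.9, 1.0), (1.0, 1.0, 1.1),
            (1.0, 1.2, 0.8), (0.8, 1.0, 1.1), (1.1, 1.1, 0.9),
            (5.0, 0.0, 3.0),                  # the salient region
            (0.9, 1.1, 1.0), (1.0, 0.9, 0.9)]
sal = saliency(features, border_idx=[0, 1, 2, 3])
print(sal.index(max(sal)))                    # → region 7
```

    The second stage of the paper repeats the same scoring against a foreground dictionary built from the first-stage map, which sharpens the separation between object and background.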

  6. Methods to Identify Changes in Background Water-Quality Conditions Using Dissolved-Solids Concentrations and Loads as Indicators, Arkansas River and Fountain Creek, in the Vicinity of Pueblo, Colorado

    USGS Publications Warehouse

    Ortiz, Roderick F.

    2004-01-01

    Effective management of existing water-storage capacity in the Arkansas River Basin is anticipated to help satisfy the need for water in southeastern Colorado. A strategy to meet these needs has been developed, but its implementation could affect the water quality of the Arkansas River and Fountain Creek in the vicinity of Pueblo, Colorado. Because no known methods are available to determine what effects future changes in operations will have on water quality, the U.S. Geological Survey, in cooperation with the Southeastern Colorado Water Activity Enterprise, began a study in 2002 to develop methods that could identify whether future water-quality conditions have changed significantly from background (preexisting) conditions. A method was developed to identify when significant departures from background conditions occur in the lower Arkansas River and Fountain Creek in the vicinity of Pueblo, Colorado. Additionally, the methods described in this report provide information that can be used by various water-resource agencies for an internet-based decision-support tool. Estimated dissolved-solids concentrations at five sites in the study area were evaluated to designate historical background conditions and to calculate tolerance limits used to identify statistical departures from background conditions. This method provided a tool that could be applied with defined statistical probabilities associated with specific tolerance limits. Drought data from 2002 were used to test the method. Dissolved-solids concentrations exceeded the tolerance limits at all four sites on the Arkansas River at some point during 2002. The number of exceedances was particularly evident when streamflow from Pueblo Reservoir was reduced, and return flows and ground-water influences on the river were more prevalent. No exceedances were observed at the site on Fountain Creek.
These comparisons illustrated the need to adjust the concentration data to account for varying streamflow. As such, similar comparisons between flow-adjusted data were made. At the site Arkansas River near Avondale, nearly all the 2002 flow-adjusted concentration data were less than the flow-adjusted tolerance limit, which illustrated the effects of using flow-adjusted concentrations. Numerous exceedances of the flow-adjusted tolerance limits, however, were observed at the sites Arkansas River above Pueblo and Arkansas River at Pueblo. These results indicated that the method was able to identify a change in the ratio of source waters under drought conditions. Additionally, tolerance limits were calculated for daily dissolved-solids loads and evaluated in a similar manner. Several other mass-load approaches were presented to help identify long-term changes in water quality. These included comparisons of cumulative mass load at selected sites and comparisons of the mass load contributed at the Arkansas River near Avondale site by measured and unmeasured sources.
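
    A one-sided normal tolerance limit of the kind this method relies on can be sketched as follows: a limit below which, with 95% confidence, 99% of background values should fall, so that an observation above it flags a statistical departure. The k-factor uses the standard large-sample (Natrella) approximation rather than whatever exact method the USGS report used, and the background concentrations are invented, not USGS data.

```python
import math

def upper_tolerance_limit(xs, z_conf=1.6449, z_prop=2.3263):
    """mean + k*s bounding 99% of the population (z_prop) with 95%
    confidence (z_conf), via the Natrella k-factor approximation."""
    n = len(xs)
    mean = sum(xs) / n
    s = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    a = 1 - z_conf ** 2 / (2 * (n - 1))
    b = z_prop ** 2 - z_conf ** 2 / n
    k = (z_prop + math.sqrt(z_prop ** 2 - a * b)) / a
    return mean + k * s

# Hypothetical background dissolved-solids record (mg/L) and a new value:
background = [512, 498, 530, 505, 521, 489, 517, 508, 495, 526]
limit = upper_tolerance_limit(background)
print(round(limit, 1), 640 > limit)   # a 640 mg/L reading would exceed it
```

    Flow adjustment, as the report stresses, would be applied to the concentration series before computing such limits, so that an exceedance reflects a change in source-water mix rather than in streamflow alone.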

  7. Enhancing School Asthma Action Plans: Qualitative Results from Southeast Minnesota Beacon Stakeholder Groups

    ERIC Educational Resources Information Center

    Egginton, Jason S.; Textor, Lauren; Knoebel, Erin; McWilliams, Deborah; Aleman, Marty; Yawn, Barbara

    2013-01-01

    Background: This study explores ways southeast Minnesota schools currently address asthma problems, identifies areas for improvement, and assesses the potential value of asthma action plans (AAPs) in schools. Methods: Focus groups were used to query stakeholder groups on asthma care in schools. Groups were held separately for elementary school…

  8. Relationships among Cyberbullying, School Bullying, and Mental Health in Taiwanese Adolescents

    ERIC Educational Resources Information Center

    Chang, Fong-Ching; Lee, Ching Mei; Chiu, Chiung-Hui; Hsi, Wen-Yun; Huang, Tzu-Fu; Pan, Yun-Chieh

    2013-01-01

    Background: This study examined the relationships among cyberbullying, school bullying, and mental health in adolescents. Methods: In 2010, a total of 2992 10th grade students recruited from 26 high schools in Taipei, Taiwan completed questionnaires. Results: More than one third of students had either engaged in cyberbullying or had been the…

  9. Health Services: Results from the School Health Policies and Programs Study 2006

    ERIC Educational Resources Information Center

    Brener, Nancy D.; Wheeler, Lani; Wolfe, Linda C.; Vernon-Smiley, Mary; Caldart-Olson, Linda

    2007-01-01

    Background: The specific health services provided to students at school and the model for delivering these services vary across districts and schools. This article describes the characteristics of school health services in the United States, including state- and district-level policies and school practices. Methods: The Centers for Disease Control…

  10. Teachers' and Parental Attribution for School Performance of Ethnic Majority and Minority Children

    ERIC Educational Resources Information Center

    Wissink, Inge B.; de Haan, Mariette

    2013-01-01

    This study examines whether teachers' and parental attributions for children's school performance differ depending on the ethnic background of the child. Using both quantitative and qualitative methods, real-life attributions within 54 teacher-parent conversations (15 ethnic majority; 39 minority) were examined. The results indicated that,…

  11. Bullying and Symptoms of Depression in Chilean Middle School Students

    ERIC Educational Resources Information Center

    Fleming, Lila C.; Jacobsen, Kathryn H.

    2009-01-01

    Background: The goal of this study was to assess the association between bullying and symptoms of depression among middle school students in Chile. Methods: Secondary data analysis of Chile's 2004 Global School-Based Health Survey. Results: A total of 8131 middle school students participated in the study. Forty-seven percent of students reported…

  12. Ring Chromosome 7 in an Indian Woman

    ERIC Educational Resources Information Center

    Kaur, Anupam; Dhillon, Sumit; Garg, P. D.; Singh, Jai Rup

    2008-01-01

    Background: Ring chromosome 7 [r(7)] is a rare cytogenetic aberration, with only 16 cases (including 3 females) reported in the literature to date. This is the first reported case of r(7) from India. Method: Clinical and cytogenetic investigations were carried out in an adult female with microcephaly and intellectual disability. Results: Ring…

  13. Sex Attractants of the Banana Moth, Opogona sacchari Bojer (Lepidoptera: Tineidae): Provisional Identification and Field Evaluation

    USDA-ARS?s Scientific Manuscript database

    BACKGROUND: The banana moth, Opogona sacchari Bojer, is a polyphagous agricultural pest in many tropical areas of the world. The identification of an attractant for male O. sacchari could offer new methods for detection, study, and control. RESULTS: A male electroantennographically active compound w...

  14. Health Issues and Quality of Life in Women with Intellectual Disability

    ERIC Educational Resources Information Center

    Kyrkou, M.

    2005-01-01

    Background: Although there is anecdotal evidence of an increase in both period pain and premenstrual syndrome (PMS) in women with intellectual disabilities (ID), there are only brief mentions of it in the literature. Methods: Questionnaires were distributed to parents of women with Down syndrome (DS) or Autism Spectrum Disorder (ASD), resulting in…

  15. Meanings and Experiences of Menstruation: Perceptions of Institutionalized Women with an Intellectual Disability

    ERIC Educational Resources Information Center

    Chou, Yueh-Ching; Lu, Zxy-Yann Jane; Wang, Frank T. Y.; Lan, Chang-Fu; Lin, Li-Chan

    2008-01-01

    Background: No studies have ever been conducted concerning menstrual experiences among women with an intellectual disability in Taiwan. Materials and Methods: An in-depth interview was conducted at three public institutions and perceptions and experiences regarding menstruation were elicited from 55 women aged 21-65 years. Results: The…

  16. The Association of Physical Activity and Academic Behavior: A Systematic Review

    ERIC Educational Resources Information Center

    Sullivan, Rachel A.; Kuzel, AnnMarie H.; Vaandering, Michael E.; Chen, Weiyun

    2017-01-01

    Background: In this systematic review, we assessed the existing research describing the effects of physical activity (PA) on academic behavior, with a special focus on the effectiveness of the treatments applied, study designs, outcome measures, and results. Methods: We obtained data from various journal search engines and 218 journal articles…

  17. "I Carry Her in My Heart": An Exploration of the Experience of Bereavement for People with Learning Disability

    ERIC Educational Resources Information Center

    Thorp, Nicki; Stedmon, Jacqui; Lloyd, Helen

    2018-01-01

    Background: Bereavement is a universal experience, yet little research has explored the lived experience of bereavement for people with learning disability (PWLD). Materials and methods: Four PWLD were interviewed about their experience of bereavement. Data were analysed using interpretative phenomenological analysis. Results: Four themes were…

  18. A Novel Two-Step Hierarchical Quantitative Structure-Activity Relationship Modeling Workflow for Predicting Acute Toxicity of Chemicals in Rodents

    EPA Science Inventory

    Background: Accurate prediction of in vivo toxicity from in vitro testing is a challenging problem. Large public–private consortia have been formed with the goal of improving chemical safety assessment by means of high-throughput screening. Methods and results: A database co...

  19. Worldwide Behavioral Research on Major Global Causes of Mortality

    ERIC Educational Resources Information Center

    Dal-Re, Rafael

    2011-01-01

    Background: Researchers willing to publish their interventional studies' results must register their studies before starting enrollment. This study aimed to describe all "open" (i.e., recruiting or not yet recruiting) behavioral studies in 16 of 20 top worldwide leading causes of death. Method: Search on Clinicaltrials.gov database (March 2010).…

  20. Receipt of Cancer Screening Services: Surprising Results for Some Rural Minorities

    ERIC Educational Resources Information Center

    Bennett, Kevin J.; Probst, Janice C.; Bellinger, Jessica D.

    2012-01-01

    Background: Evidence suggests that rural minority populations experience disparities in cancer screening, treatment, and outcomes. It is unknown how race/ethnicity and rurality intersect in these disparities. The purpose of this analysis is to examine the cancer screening rates among minorities in rural areas. Methods: We utilized the 2008…

  1. The Strength of School Wellness Policies: One State's Experience

    ERIC Educational Resources Information Center

    Metos, Julie; Nanney, Marilyn S.

    2007-01-01

    Background: This study examines the results of federal legislation on the content and quality of policies written in 2005-2006 by Utah school districts (n = 30). Methods: Policies were gathered by phone call requests to school districts or obtained on district Web pages. Content was compared to requirements outlined in the Child Nutrition…

  2. Challenges in Providing End-of-Life Care for People with Intellectual Disability: Health Services Access

    ERIC Educational Resources Information Center

    Wark, Stuart; Hussain, Rafat; Müller, Arne; Ryan, Peta; Parmenter, Trevor

    2017-01-01

    Background: Increasing life expectancy for people with intellectual disability is resulting in greater need for end-of-life care services. However, limited knowledge is available regarding what barriers to accessing end-of-life care support are evident, particularly comparatively across rural and metropolitan locations. Methods: Focus group…

  3. Perceptions and Discourses Relating to Genetic Testing: Interviews with People with Down Syndrome

    ERIC Educational Resources Information Center

    Barter, Barbara; Hastings, Richard Patrick; Williams, Rebecca; Huws, Jaci C.

    2017-01-01

    Background: The perceptions of individuals with Down syndrome are conspicuously absent in discussions about the use of prenatal testing. Method: Eight individuals with Down syndrome were interviewed about their views and experience of the topic of prenatal testing. Results: Interpretative phenomenological analysis revealed two major themes with…

  4. Open-Label Trial of Atomoxetine Hydrochloride in Adults with ADHD

    ERIC Educational Resources Information Center

    Johnson, Mats; Cederlund, Mats; Rastam, Maria; Areskoug, Bjorn; Gillberg, Christopher

    2010-01-01

    Background: While atomoxetine is an established treatment for attention-deficit/hyperactivity disorder in children, few studies have examined its efficacy for adults. Methods: Open-label trial of atomoxetine in 20 individuals with ADHD, aged 19-47 years, for 10 weeks, and a total of one year for responders. Results: Ten patients met primary…

  5. Health and Social Care Practitioners' Experiences of Assessing Mental Capacity in a Community Learning Disability Team

    ERIC Educational Resources Information Center

    Ratcliff, Daniel; Chapman, Melanie

    2016-01-01

    Background: The study explored experiences of health and social care practitioners within a community learning disability team in undertaking mental capacity assessments with people with learning disabilities. Materials and Methods: Eight practitioners were interviewed using a semi-structured interview schedule. Results: The information gained was…

  6. Melatonin Treatment in Individuals with Intellectual Disability and Chronic Insomnia: A Randomized Placebo-Controlled Study

    ERIC Educational Resources Information Center

    Braam, W.; Didden, R.; Smits, M.; Curfs, L.

    2008-01-01

    Background: While several small-sample or open-label studies suggest that melatonin improves sleep in individuals with intellectual disabilities (ID) with chronic sleep disturbance, a larger randomized controlled trial is necessary to validate these promising results. Methods: The effectiveness of melatonin for the treatment of chronic sleep…

  7. Operationalizing Culturally Responsive Instruction: Preliminary Findings of CRIOP Research

    ERIC Educational Resources Information Center

    Powell, Rebecca; Cantrell, Susan Chambers; Malo-Juvera, Victor; Correll, Pamela

    2016-01-01

    Background: Many scholars have espoused the use of culturally responsive instruction (CRI) for closing achievement gaps, yet there is a paucity of research supporting its effectiveness. In this article, we share results of a mixed methods study that examined the use of the Culturally Responsive Instruction Observation Protocol (CRIOP) as a…

  8. A Mindfulness-Based Group for Young People with Learning Disabilities: A Pilot Study

    ERIC Educational Resources Information Center

    Thornton, Victoria; Williamson, Rachel; Cooke, Bronwen

    2017-01-01

    Background: Mindfulness is becoming increasingly reported as an effective way to support well-being and reduce mental health difficulties. Materials and Methods: This study reports on the development and pilot of a mindfulness-based group for young people with learning disabilities and their carers. Results: Group participants reported that the…

  9. Parent Adaptation to and Parenting Satisfaction with Children with Intellectual Disability in the United Arab Emirates

    ERIC Educational Resources Information Center

    Dukmak, Samir

    2009-01-01

    Background: This research investigated the impact that children with intellectual disability in the United Arab Emirates (UAE) may have on their families. Method: Sixty-three parents completed three scales related to parent stress, ways of coping, and parenting satisfaction. Results: There were significant relationships between emotional-focused…

  10. Sociodemographic Differences in Depressed Mood: Results from a Nationally Representative Sample of High School Adolescents

    ERIC Educational Resources Information Center

    Paxton, Raheem J.; Valois, Robert F.; Watkins, Ken W.; Huebner, E. Scott; Drane, J. Wanzer

    2007-01-01

    Background: Research on adolescent mental health suggests that prevalence rates for depressed mood are not uniformly distributed across all populations. This study examined demographic differences in depressed mood among a nationally representative sample of high school adolescents. Methods: The 2003 National Youth Risk Behavior Survey was utilized…

  11. Parent Reactions to a School-Based Body Mass Index Screening Program

    ERIC Educational Resources Information Center

    Johnson, Suzanne Bennett; Pilkington, Lorri L.; Lamp, Camilla; He, Jianghua; Deeb, Larry C.

    2009-01-01

    Background: This study assessed parent reactions to school-based body mass index (BMI) screening. Methods: After a K-8 BMI screening program, parents were sent a letter detailing their child's BMI results. Approximately 50 parents were randomly selected for interview from each of 4 child weight-classification groups (overweight, at risk of…

  12. The Effect of Brief Digital Interventions on Attitudes to Intellectual Disability: Results from a Pilot Study

    ERIC Educational Resources Information Center

    Lindau, Natalie; Amin, Tara; Zambon, Amy; Scior, Katrina

    2018-01-01

    Background: Evidence on the effects of contact- and education-based interventions on attitudes is limited in the intellectual disability field. This study compared the effects of brief interventions with different education, indirect contact, and imagined contact components on lay people's attitudes. Materials and Methods: 401 adult participants were…

  13. Implicit Procedural Learning in Fragile X and Down Syndrome

    ERIC Educational Resources Information Center

    Bussy, G.; Charrin, E.; Brun, A.; Curie, A.; des Portes, V.

    2011-01-01

    Background: Procedural learning refers to rule-based motor skill learning and storage. It involves the cerebellum, striatum and motor areas of the frontal lobe network. Fragile X syndrome, which has been linked with anatomical abnormalities within the striatum, may result in implicit procedural learning deficit. Methods: To address this issue, a…

  14. Adaptive Force Control in Grasping as a Function of Level of Developmental Disability

    ERIC Educational Resources Information Center

    Sprague, R. L.; Deutsch, K. M.; Newell, K. M.

    2009-01-01

    Background: The adaptation to the task demands of grasping (grip mode and object mass) was investigated as a function of level of developmental disability. Methods: Subjects grasped objects of different grip widths and masses that were instrumented to record grip forces. Results: Proportionally, fewer participants from the profound compared with…

  15. Staff Recommendations Concerning the Delivery of Hepatitis-Related Services in County Health Departments

    ERIC Educational Resources Information Center

    Rainey, Jacquie

    2007-01-01

    Background: This paper describes a portion of a larger evaluation project of a state hepatitis prevention program. Purpose: The study explored the suggestions of key informants related to the delivery of hepatitis services in the state. Methods: Researchers conducted key informant interviews lasting 30 to 45 minutes. Results: Important findings…

  16. The Oxidation of Secondary Alcohols with Cr(VI): A Spectrophotometric Method.

    ERIC Educational Resources Information Center

    Mason, Timothy J.; And Others

    1985-01-01

    Background information, procedures, and typical results are provided for an experiment in which spectrophotometry is used to examine the oxidation of secondary alcohols using chromium (VI). The overall change in oxidation state of chromium during the reaction is VI to III, corresponding to the familiar color change from orange to green. (JN)

  17. End-of-Life Care for People with Intellectual Disabilities: Paid Carer Perspectives

    ERIC Educational Resources Information Center

    Ryan, Karen; Guerin, Suzanne; Dodd, Philip; McEvoy, John

    2011-01-01

    Background: Little is known of paid carers' perspectives on caring for people with intellectual disabilities at the end of life. Materials and methods: Sixty-four individuals from intellectual disability services took part in 12 focus groups. Interviews were analysed using framework analysis. Results: Participants wanted to provide palliative…

  18. Genetic and Environmental Influences on Extreme Personality Dispositions in Adolescent Female Twins

    ERIC Educational Resources Information Center

    Pergadia, Michele L.; Madden, Pamela A. F.; Lessov, Christina N.; Todorov, Alexandre A.; Bucholz, Kathleen K.; Martin, Nicholas G.; Heath, Andrew C.

    2006-01-01

    Background: The objective was to determine whether the pattern of environmental and genetic influences on deviant personality scores differs from that observed for the normative range of personality, comparing results in adolescent and adult female twins. Methods: A sample of 2,796 female adolescent twins ascertained from birth records provided…

  19. How School Healthy Is Your State? A State-by-State Comparison of School Health Practices Related to a Healthy School Environment and Health Education

    ERIC Educational Resources Information Center

    Brener, Nancy D.; Wechsler, Howell; McManus, Tim

    2013-01-01

    Background: School Health Profiles (Profiles) results help states understand how they compare to each other on specific school health policies and practices. The purpose of this study was to develop composite measures of critical Profiles results and use them to rate each state on their overall performance. Methods: Using data from state Profiles…

  20. Development of Yellow Sand Image Products Using Infrared Brightness Temperature Difference Method

    NASA Astrophysics Data System (ADS)

    Ha, J.; Kim, J.; Kwak, M.; Ha, K.

    2007-12-01

    A technique for detecting airborne yellow sand dust with meteorological satellites has been developed using various bands from the ultraviolet to the infrared. Among these, infrared (IR) channels have the advantage of detecting aerosols over highly reflective surfaces as well as during nighttime. The brightness temperature difference (BTD) between the 11 and 12 μm channels has previously been suggested for this purpose. We have found that this technique depends strongly on surface temperature, emissivity, and zenith angle, which shifts the appropriate BTD threshold. To overcome these problems, we constructed a background brightness temperature threshold of BTD, and an aerosol index (AI) was then determined by subtracting this background threshold from the BTD of the scene of interest. In addition, we exploited the high temporal coverage of the geostationary satellite MTSAT to improve the reliability of the derived AI signal. The products were evaluated by comparing forecasted wind fields with the movement field of the AI. Statistical score tests illustrate that this newly developed algorithm produces promising results for detecting mineral dust, reducing errors with respect to the current BTD method.
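    The core of the algorithm described in this abstract (scene BTD minus a precomputed background BTD threshold) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the array names, the per-pixel background threshold, and the toy values are assumptions for demonstration only.

    ```python
    import numpy as np

    def aerosol_index(bt11, bt12, background_btd):
        """Aerosol index per the abstract's recipe: the 11-12 um brightness
        temperature difference of the scene, minus a background BTD threshold
        constructed from dust-free conditions (hypothetical inputs)."""
        btd = bt11 - bt12          # scene brightness temperature difference
        return btd - background_btd  # negative values suggest mineral dust

    # Toy 2x2 scene (brightness temperatures in kelvin); dust suppresses the
    # 11 um channel relative to 12 um, pushing BTD below the background.
    bt11 = np.array([[290.0, 285.0], [288.0, 282.0]])
    bt12 = np.array([[289.0, 286.5], [287.2, 284.0]])
    background = np.array([[1.2, 1.2], [1.0, 1.0]])
    ai = aerosol_index(bt11, bt12, background)
    ```

    Subtracting a per-pixel background rather than using one global BTD cutoff is what lets the method absorb the surface-temperature, emissivity, and zenith-angle dependence the abstract notes.
    
    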
