Sample records for existing binning methods

  1. MBMC: An Effective Markov Chain Approach for Binning Metagenomic Reads from Environmental Shotgun Sequencing Projects.

    PubMed

    Wang, Ying; Hu, Haiyan; Li, Xiaoman

    2016-08-01

    Metagenomics is a next-generation omics field currently impacting postgenomic life sciences and medicine. Binning metagenomic reads is essential for the understanding of microbial function, compositions, and interactions in given environments. Despite the existence of dozens of computational methods for metagenomic read binning, it is still very challenging to bin reads. This is especially true for reads from unknown species, from species with similar abundance, and/or from low-abundance species in environmental samples. In this study, we developed a novel taxonomy-dependent and alignment-free approach called MBMC (Metagenomic Binning by Markov Chains). Different from all existing methods, MBMC bins reads by measuring the similarity of reads to the trained Markov chains for different taxa instead of directly comparing reads with known genomic sequences. By testing on more than 24 simulated and experimental datasets with species of similar abundance, species of low abundance, and/or unknown species, we report here that MBMC reliably grouped reads from different species into separate bins. Compared with four existing approaches, we demonstrated that the performance of MBMC was comparable with existing approaches when binning reads from sequenced species, and superior to existing approaches when binning reads from unknown species. MBMC is a pivotal tool for binning metagenomic reads in the current era of Big Data and postgenomic integrative biology. The MBMC software can be freely downloaded at http://hulab.ucf.edu/research/projects/metagenomics/MBMC.html.
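The core idea of record 1 can be sketched as follows: train one fixed-order Markov chain per taxon, then assign each read to the taxon whose chain gives it the highest log-likelihood. This is an illustrative reconstruction, not the MBMC code; the function names, the order-2 chains, and the add-one smoothing are all assumptions.

```python
from collections import defaultdict
import math

def train_markov_chain(seq, order=2):
    """Estimate P(next base | preceding k-mer) with add-one smoothing."""
    counts = defaultdict(lambda: defaultdict(int))
    for i in range(len(seq) - order):
        counts[seq[i:i + order]][seq[i + order]] += 1
    model = {}
    for ctx, nxt in counts.items():
        total = sum(nxt.values())
        model[ctx] = {b: (nxt[b] + 1) / (total + 4) for b in "ACGT"}
    return model

def log_likelihood(read, model, order=2):
    """Log-probability of the read under one taxon's chain; unseen
    contexts fall back to a uniform 1/4 per base."""
    ll = 0.0
    for i in range(len(read) - order):
        probs = model.get(read[i:i + order])
        ll += math.log(probs[read[i + order]]) if probs else math.log(0.25)
    return ll

def bin_read(read, models):
    """Assign the read to the taxon whose chain scores it highest."""
    return max(models, key=lambda taxon: log_likelihood(read, models[taxon]))
```

For example, chains trained on AT-rich versus GC-rich reference sequences will pull an AT-rich read into the first bin.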

  2. CoMet: a workflow using contig coverage and composition for binning a metagenomic sample with high precision.

    PubMed

    Herath, Damayanthi; Tang, Sen-Lin; Tandon, Kshitij; Ackland, David; Halgamuge, Saman Kumara

    2017-12-28

    In metagenomics, the separation of nucleotide sequences belonging to an individual or closely matched populations is termed binning. Binning helps the evaluation of underlying microbial population structure as well as the recovery of individual genomes from a sample of uncultivable microbial organisms. Both supervised and unsupervised learning methods have been employed in binning; however, characterizing a metagenomic sample containing multiple strains remains a significant challenge. In this study, we designed and implemented a new workflow, Coverage and composition based binning of Metagenomes (CoMet), for binning contigs in a single metagenomic sample. CoMet utilizes coverage values and the compositional features of metagenomic contigs. The binning strategy in CoMet includes the initial grouping of contigs in guanine-cytosine (GC) content-coverage space and refinement of bins in tetranucleotide frequency space in a purely unsupervised manner. With CoMet, the clustering algorithm DBSCAN is employed for binning contigs. The performance of CoMet was compared against four existing approaches for binning a single metagenomic sample, including MaxBin, Metawatt, MyCC (default) and MyCC (coverage), using multiple datasets including a sample comprised of multiple strains. Binning methods based on both compositional features and coverages of contigs performed better than the method based only on compositional features of contigs. CoMet yielded higher or comparable precision in comparison to the existing binning methods on benchmark datasets of varying complexities. MyCC (coverage) had the highest ranking score in F1-score. However, CoMet outperformed MyCC (coverage) on the dataset containing multiple strains. Furthermore, CoMet recovered contigs of more species and was 18-39% higher in precision than the compared existing methods in discriminating species from the sample of multiple strains. CoMet resulted in higher precision than MyCC (default) and MyCC (coverage) on a real metagenome. The approach proposed with CoMet for binning contigs improves the precision of binning while characterizing more species in a single metagenomic sample and in a sample containing multiple strains. The F1-scores obtained from different binning strategies vary with different datasets; however, CoMet yields the highest F1-score with a sample comprised of multiple strains.
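Record 2's first stage, grouping contigs in GC-content-coverage space, can be illustrated with a toy grid clustering. CoMet itself uses density-based clustering (DBSCAN) for this step; the grid step sizes and function names below are invented for the sketch.

```python
def gc_content(seq):
    """Fraction of G and C bases in a contig sequence."""
    return (seq.count("G") + seq.count("C")) / len(seq)

def initial_bins(contigs, coverages, gc_step=0.05, cov_step=10.0):
    """Group contigs on a grid in GC-coverage space -- a crude stand-in
    for the DBSCAN clustering CoMet actually applies at this stage."""
    bins = {}
    for name, seq in contigs.items():
        key = (round(gc_content(seq) / gc_step),
               round(coverages[name] / cov_step))
        bins.setdefault(key, []).append(name)
    return bins
```

Contigs with similar GC content and similar coverage land in the same cell; CoMet would then refine each group in tetranucleotide-frequency space.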

  3. Dereplication, Aggregation and Scoring Tool (DAS Tool) v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SIEBER, CHRISTIAN

    Communities of uncultivated microbes are critical to ecosystem function and microorganism health, and a key objective of metagenomic studies is to analyze organism-specific metabolic pathways and reconstruct community interaction networks. This requires accurate assignment of genes to genomes, yet existing binning methods often fail to predict a reasonable number of genomes and report many bins of low quality and completeness. Furthermore, the performance of existing algorithms varies between samples and biotypes. Here, we present a dereplication, aggregation and scoring strategy, DAS Tool, that combines the strengths of a flexible set of established binning algorithms. DAS Tool applied to a constructed community generated more accurate bins than any automated method. Further, when applied to samples of different complexity, including soil, natural oil seeps, and the human gut, DAS Tool recovered substantially more near-complete genomes than any single binning method alone. Included were three genomes from a novel lineage. The ability to reconstruct many near-complete genomes from metagenomic data will greatly advance genome-centric analyses of ecosystems.
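The dereplication-and-aggregation idea in record 3 can be sketched as a greedy selection loop: repeatedly keep the highest-scoring candidate bin, remove its contigs from the remaining candidates, and drop candidates that collapse. This is illustrative only; DAS Tool's actual quality score is derived from single-copy genes, which the `scores` argument merely stands in for, and `min_size` is an invented parameter.

```python
def das_select(candidate_bins, scores, min_size=2):
    """Greedy sketch of an aggregation step: pick the best-scoring bin,
    claim its contigs, and shrink or discard the competing candidates."""
    remaining = {name: set(contigs) for name, contigs in candidate_bins.items()}
    selected = []
    while remaining:
        best = max(remaining, key=lambda name: scores[name])
        chosen = remaining.pop(best)
        selected.append((best, chosen))
        for name in list(remaining):
            remaining[name] -= chosen          # dereplicate shared contigs
            if len(remaining[name]) < min_size:
                del remaining[name]            # candidate collapsed
    return selected
```

With overlapping candidates from several binners, the loop returns a non-redundant set of bins ordered by score.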

  4. Effects of empty bins on image upscaling in capsule endoscopy

    NASA Astrophysics Data System (ADS)

    Rukundo, Olivier

    2017-07-01

    This paper presents a preliminary study of the effect of empty bins on image upscaling in capsule endoscopy. The presented study was conducted based on the results of existing contrast enhancement and interpolation methods. A low-contrast enhancement method based on pixel consecutiveness and a modified bilinear weighting scheme was developed to distinguish between necessary and unnecessary empty bins, in an effort to minimize the number of empty bins in the input image before further processing. Linear interpolation methods were used for upscaling input images with stretched histograms. Upscaling error differences and similarity indices between pairs of interpolation methods were quantified using the mean squared error and feature similarity index techniques. Simulation results demonstrated more promising effects with the developed method than with the other contrast enhancement methods mentioned.
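To make the notion of "empty bins" in record 4 concrete: a linear contrast stretch maps a narrow intensity range onto the full grey-level range, leaving many histogram bins unoccupied. The helper below is hypothetical (not the paper's enhancement method) and assumes a non-constant image.

```python
def stretch_and_count_empty(pixels, levels=256):
    """Linear contrast stretch to the full grey-level range, then count
    the empty histogram bins the stretch introduces. Assumes hi > lo."""
    lo, hi = min(pixels), max(pixels)
    stretched = [round((p - lo) / (hi - lo) * (levels - 1)) for p in pixels]
    empty = levels - len(set(stretched))
    return stretched, empty
```

Stretching three adjacent grey levels across 256 levels occupies only three bins and leaves 253 empty, which is exactly the artifact the record studies.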

  5. BinQuasi: a peak detection method for ChIP-sequencing data with biological replicates.

    PubMed

    Goren, Emily; Liu, Peng; Wang, Chao; Wang, Chong

    2018-04-19

    ChIP-seq experiments that are aimed at detecting DNA-protein interactions require biological replication to draw inferential conclusions; however, there is no current consensus on how to analyze ChIP-seq data with biological replicates. Very few methodologies exist for the joint analysis of replicated ChIP-seq data, with approaches ranging from combining the results of analyzing replicates individually to joint modeling of all replicates. Combining the results of individual replicates analyzed separately can lead to reduced peak classification performance compared to joint modeling. Currently available methods for joint analysis may fail to control the false discovery rate at the nominal level. We propose BinQuasi, a peak caller for replicated ChIP-seq data that jointly models biological replicates using a generalized linear model framework and employs a one-sided quasi-likelihood ratio test to detect peaks. When applied to simulated data and real datasets, BinQuasi performs favorably compared to existing methods, including better control of the false discovery rate than existing joint modeling approaches. BinQuasi offers a flexible approach to joint modeling of replicated ChIP-seq data which is preferable to combining the results of replicates analyzed individually. Source code is freely available for download at https://cran.r-project.org/package=BinQuasi, implemented in R. pliu@iastate.edu or egoren@iastate.edu. Supplementary material is available at Bioinformatics online.
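A drastically simplified stand-in for record 5's joint test: pool window counts across replicates and apply a one-sided test against an expected background rate using a normal approximation to the Poisson. BinQuasi actually fits a quasi-likelihood GLM per window; the names, the pooling, and the approximation here are all assumptions.

```python
import math

def peak_pvalues(counts_by_replicate, background_rate):
    """One-sided upper-tail p-value per genomic window: pooled count
    across replicates versus the pooled expected background."""
    n_rep = len(counts_by_replicate)
    n_win = len(counts_by_replicate[0])
    pvals = []
    for w in range(n_win):
        pooled = sum(rep[w] for rep in counts_by_replicate)
        expected = background_rate * n_rep
        z = (pooled - expected) / math.sqrt(expected)
        # upper-tail normal probability via the complementary error function
        pvals.append(0.5 * math.erfc(z / math.sqrt(2)))
    return pvals
```

A window enriched in both replicates yields a tiny p-value, while a background-level window does not; joint pooling is what gives replication its power here.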

  6. A Nonlinearity Minimization-Oriented Resource-Saving Time-to-Digital Converter Implemented in a 28 nm Xilinx FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Liu, Chong

    2015-10-01

    Because large nonlinearity errors exist in current tapped-delay line (TDL) style field programmable gate array (FPGA)-based time-to-digital converters (TDCs), bin-by-bin calibration techniques have to be resorted to in order to gain a high measurement resolution. If the TDL in the selected FPGA is significantly affected by changes in ambient temperature, the bin-by-bin calibration table has to be updated as frequently as possible. The on-line calibration and calibration table updating increase the TDC design complexity and limit the system performance to some extent. This paper proposes a method to minimize the nonlinearity errors of TDC bins, so that bin-by-bin calibration may not be needed while maintaining a reasonably high time resolution. The method is a two-pass approach: by a bin realignment, the large number of wasted zero-width bins in the original TDL is reused and the granularity of the bins is improved; by a bin decimation, the bin size and its uniformity are traded off, and the time interpolation by the delay line becomes more precise, so that bin-by-bin calibration is not necessary. Using Xilinx 28 nm FPGAs, in which the TDL property is not very sensitive to ambient temperature, the proposed TDC achieves approximately 15 ps root-mean-square (RMS) time resolution by dual-channel measurements of time intervals over the range of operating temperatures. Because calibration is removed and fewer logic resources are required for data post-processing, the method offers greater multi-channel capability.
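The bin-by-bin calibration that record 6 aims to make unnecessary is usually done with a code-density test: under uniformly distributed input events, each TDL bin's share of hits is proportional to its width. A minimal sketch (illustrative names, picosecond units assumed):

```python
def bin_widths_from_code_density(hits, clock_period_ps):
    """Code-density calibration: estimate each bin's width from its hit
    share, then derive bin centers by accumulating the widths."""
    total = sum(hits)
    widths = [h / total * clock_period_ps for h in hits]
    centers, edge = [], 0.0
    for w in widths:               # center = left edge + half the width
        centers.append(edge + w / 2)
        edge += w
    return widths, centers
```

A zero-hit bin gets zero estimated width, which is exactly the "wasted zero-width bins" the paper's realignment pass reclaims.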

  7. Native conflict awared layout decomposition in triple patterning lithography using bin-based library matching method

    NASA Astrophysics Data System (ADS)

    Ke, Xianhua; Jiang, Hao; Lv, Wen; Liu, Shiyuan

    2016-03-01

    Triple patterning (TP) lithography has become a feasible technology for manufacturing as the feature size scales down to sub-14/10 nm. In TP, a layout is decomposed into three masks, followed by exposure and etch/freeze processes, respectively. Previous works mostly focus on layout decomposition with minimal conflicts and stitches simultaneously. However, since any native conflict will force layout re-design/modification and a rerun of the time-consuming decomposition, an effective method for detecting native conflicts (NCs) in a layout is desirable. In this paper, a bin-based library matching method is proposed for NC detection and layout decomposition. First, a layout is divided into bins and the corresponding conflict graph in each bin is constructed. Then, we match the conflict graph against a prebuilt colored library, and as a result the NCs can be located and highlighted quickly.
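Record 7's native-conflict check reduces to asking whether a bin's conflict graph is 3-colorable, one color per mask. A brute-force backtracking sketch (not the paper's library-matching method, which avoids this search by precomputing colorings):

```python
def three_colorable(n, edges):
    """Backtracking check whether a conflict graph on nodes 0..n-1 admits
    a 3-coloring (three masks). An uncolorable graph is a native conflict."""
    adj = {i: set() for i in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    colors = {}
    def solve(v):
        if v == n:
            return True
        for c in range(3):
            if all(colors.get(u) != c for u in adj[v]):
                colors[v] = c
                if solve(v + 1):
                    return True
                del colors[v]
        return False
    return solve(0)
```

A triangle of mutually conflicting features still decomposes onto three masks, but a K4 (four pairwise-conflicting features) does not: that is a native conflict requiring layout modification.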

  8. Bin-Hash Indexing: A Parallel Method for Fast Query Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Edward W; Gosink, Luke J.; Wu, Kesheng

    2008-06-27

    This paper presents a new parallel indexing data structure for answering queries. The index, called Bin-Hash, offers extremely high levels of concurrency, and is therefore well-suited for emerging commodity parallel processors, such as multi-cores, cell processors, and general purpose graphics processing units (GPUs). The Bin-Hash approach first bins the base data, and then partitions and separately stores the values in each bin as a perfect spatial hash table. To answer a query, we first determine whether or not a record satisfies the query conditions based on the bin boundaries. For the bins with records that cannot be resolved, we examine the spatial hash tables. The procedures for examining the bin numbers and the spatial hash tables offer the maximum possible level of concurrency; all records are able to be evaluated by our procedure independently in parallel. Additionally, our Bin-Hash procedures access much smaller amounts of data than similar parallel methods, such as the projection index. This smaller data footprint is critical for certain parallel processors, like GPUs, where memory resources are limited. To demonstrate the effectiveness of Bin-Hash, we implement it on a GPU using the data-parallel programming language CUDA. The concurrency offered by the Bin-Hash index allows us to fully utilize the GPU's massive parallelism in our work; over 12,000 records can be simultaneously evaluated at any one time. We show that our new query processing method is an order of magnitude faster than current state-of-the-art CPU-based indexing technologies. Additionally, we compare our performance to existing GPU-based projection index strategies.
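Record 8's query strategy can be sketched with plain Python lists standing in for the perfect spatial hash tables: bins fully covered by the query range are accepted wholesale, and only boundary bins require a look at the stored values. The names and list-based storage are assumptions.

```python
def build_bins(values, bin_edges):
    """Assign each record to a bin; store (index, value) pairs per bin
    in place of the paper's perfect spatial hash tables."""
    bins = {i: [] for i in range(len(bin_edges) - 1)}
    for idx, v in enumerate(values):
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= v < bin_edges[i + 1]:
                bins[i].append((idx, v))
                break
    return bins

def range_query(bins, bin_edges, lo, hi):
    """Return indices of records with lo <= value < hi. Fully covered
    bins are accepted without touching the values; boundary bins are
    resolved by checking the stored values."""
    hits = []
    for i, members in bins.items():
        left, right = bin_edges[i], bin_edges[i + 1]
        if lo <= left and right <= hi:          # bin fully inside the range
            hits.extend(idx for idx, _ in members)
        elif right > lo and left < hi:          # boundary bin: check values
            hits.extend(idx for idx, v in members if lo <= v < hi)
    return sorted(hits)
```

Because each bin (and each record within a boundary bin) is decided independently, the same structure parallelizes naturally, which is the property the paper exploits on GPUs.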

  9. MetaBAT, an efficient tool for accurately reconstructing single genomes from complex microbial communities

    DOE PAGES

    Kang, Dongwan D.; Froula, Jeff; Egan, Rob; ...

    2015-01-01

    Grouping large genomic fragments assembled from shotgun metagenomic sequences to deconvolute complex microbial communities, or metagenome binning, enables the study of individual organisms and their interactions. Because of the complex nature of these communities, existing metagenome binning methods often miss a large number of microbial species. In addition, most of the tools are not scalable to large datasets. Here we introduce automated software called MetaBAT that integrates empirical probabilistic distances of genome abundance and tetranucleotide frequency for accurate metagenome binning. MetaBAT outperforms alternative methods in accuracy and computational efficiency on both synthetic and real metagenome datasets. Lastly, it automatically forms hundreds of high-quality genome bins on a very large assembly consisting of millions of contigs in a matter of hours on a single node. MetaBAT is open source software and available at https://bitbucket.org/berkeleylab/metabat.
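One of the two signals record 9 combines, tetranucleotide frequency (TNF), is easy to sketch. Illustrative only: MetaBAT's distance is an empirically calibrated probability combined with abundance distances, not the plain Euclidean distance used here.

```python
from itertools import product

def tetranucleotide_freqs(seq):
    """Normalised tetranucleotide frequency vector (256 entries) for a
    contig; non-ACGT 4-mers are ignored."""
    kmers = ["".join(p) for p in product("ACGT", repeat=4)]
    counts = {k: 0 for k in kmers}
    for i in range(len(seq) - 3):
        window = seq[i:i + 4]
        if window in counts:
            counts[window] += 1
    total = max(1, len(seq) - 3)
    return [counts[k] / total for k in kmers]

def tnf_distance(a, b):
    """Euclidean distance between TNF vectors -- a toy stand-in for
    MetaBAT's probabilistic distance."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
```

Contigs from the same genome tend to have nearly identical TNF vectors, so small TNF distance is evidence for placing two contigs in the same bin.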

  10. Stored grain pack factors for wheat: comparison of three methods to field measurements

    USDA-ARS's Scientific Manuscript database

    Storing grain in bulk storage units results in grain packing from overbearing pressure, which increases grain bulk density and storage-unit capacity. This study compared pack factors of hard red winter (HRW) wheat in vertical storage bins using different methods: the existing packing model (WPACKING...

  11. Single-Cell-Genomics-Facilitated Read Binning of Candidate Phylum EM19 Genomes from Geothermal Spring Metagenomes

    PubMed Central

    Becraft, Eric D.; Dodsworth, Jeremy A.; Murugapiran, Senthil K.; Ohlsson, J. Ingemar; Briggs, Brandon R.; Kanbar, Jad; De Vlaminck, Iwijn; Quake, Stephen R.; Dong, Hailiang; Hedlund, Brian P.

    2015-01-01

    The vast majority of microbial life remains uncatalogued due to the inability to cultivate these organisms in the laboratory. This “microbial dark matter” represents a substantial portion of the tree of life and of the populations that contribute to chemical cycling in many ecosystems. In this work, we leveraged an existing single-cell genomic data set representing the candidate bacterial phylum “Calescamantes” (EM19) to calibrate machine learning algorithms and define metagenomic bins directly from pyrosequencing reads derived from Great Boiling Spring in the U.S. Great Basin. Compared to other assembly-based methods, taxonomic binning with a read-based machine learning approach yielded final assemblies with the highest predicted genome completeness of any method tested. Read-first binning subsequently was used to extract Calescamantes bins from all metagenomes with abundant Calescamantes populations, including metagenomes from Octopus Spring and Bison Pool in Yellowstone National Park and Gongxiaoshe Spring in Yunnan Province, China. Metabolic reconstruction suggests that Calescamantes are heterotrophic, facultative anaerobes, which can utilize oxidized nitrogen sources as terminal electron acceptors for respiration in the absence of oxygen and use proteins as their primary carbon source. Despite their phylogenetic divergence, the geographically separate Calescamantes populations were highly similar in their predicted metabolic capabilities and core gene content, respiring O2, or oxidized nitrogen species for energy conservation in distant but chemically similar hot springs. PMID:26637598
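A toy version of record 11's read-based classification: build a k-mer "centroid" profile per reference taxon from labelled reads (e.g. reads recruited by a single-cell genome) and assign each metagenomic read to the nearest centroid. The paper calibrates proper machine learning algorithms; everything here, including the 3-mer features and nearest-centroid rule, is a simplified stand-in.

```python
def kmer_profile(seq, k=3):
    """Normalised k-mer frequency dictionary for one read."""
    counts = {}
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        counts[kmer] = counts.get(kmer, 0) + 1
    total = max(1, len(seq) - k + 1)
    return {kmer: c / total for kmer, c in counts.items()}

def train_centroid(reads, k=3):
    """Average k-mer profile of a set of labelled reads."""
    profiles = [kmer_profile(r, k) for r in reads]
    keys = set().union(*profiles)
    return {key: sum(p.get(key, 0.0) for p in profiles) / len(profiles)
            for key in keys}

def classify_read(read, centroids, k=3):
    """Assign the read to the taxon with the nearest centroid profile."""
    p = kmer_profile(read, k)
    def dist(c):
        keys = set(c) | set(p)
        return sum((c.get(x, 0.0) - p.get(x, 0.0)) ** 2 for x in keys)
    return min(centroids, key=lambda name: dist(centroids[name]))
```

Binning at the read level, before assembly, is the "read-first" strategy the record credits with the highest predicted genome completeness.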

  12. Evaluation of intrinsic respiratory signal determination methods for 4D CBCT adapted for mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Rachael; Pan, Tinsu, E-mail: tpan@mdanderson.org; Rubinstein, Ashley

    Purpose: 4D CT imaging in mice is important in a variety of areas including studies of lung function and tumor motion. A necessary step in 4D imaging is obtaining a respiratory signal, which can be done through an external system or intrinsically through the projection images. A number of methods have been developed that can successfully determine the respiratory signal from cone-beam projection images of humans; however, only a few have been utilized in a preclinical setting, and most of these rely on step-and-shoot style imaging. The purpose of this work is to assess and adapt several successful methods developed for humans to an image-guided preclinical radiation therapy system. Methods: Respiratory signals were determined from the projection images of free-breathing mice scanned on the X-RAD system using four methods: the so-called Amsterdam shroud method, a method based on the phase of the Fourier transform, a pixel intensity method, and a center of mass method. The Amsterdam shroud method was modified so the sharp inspiration peaks associated with anesthetized mouse breathing could be detected. Respiratory signals were used to sort projections into phase bins and 4D images were reconstructed. Error and standard deviation in the assignment of phase bins for the four methods compared to a manual method considered to be ground truth were calculated for a range of region of interest (ROI) sizes. Qualitative comparisons were additionally made between the 4D images obtained using each of the methods and the manual method. Results: 4D images were successfully created for all mice with each of the respiratory signal extraction methods. Only minimal qualitative differences were noted between each of the methods and the manual method.
The average error (and standard deviation) in phase bin assignment was 0.24 ± 0.08 (0.49 ± 0.11) phase bins for the Fourier transform method, 0.09 ± 0.03 (0.31 ± 0.08) phase bins for the modified Amsterdam shroud method, 0.09 ± 0.02 (0.33 ± 0.07) phase bins for the intensity method, and 0.37 ± 0.10 (0.57 ± 0.08) phase bins for the center of mass method. Little dependence on ROI size was noted for the modified Amsterdam shroud and intensity methods, while the Fourier transform and center of mass methods showed a noticeable dependence on ROI size. Conclusions: The modified Amsterdam shroud, Fourier transform, and intensity respiratory signal methods are sufficiently accurate to be used for 4D imaging on the X-RAD system and show improvement over the existing center of mass method. The intensity and modified Amsterdam shroud methods are recommended due to their high accuracy and low dependence on ROI size.
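The "intensity method" and the phase-binning step of record 12 can be sketched as follows. This is a simplification with invented names: real projections are full-size images, and the inspiration peaks would first have to be detected from the signal rather than supplied directly.

```python
def respiratory_signal(projections, roi):
    """Mean pixel intensity inside an ROI for each projection image
    (projections are 2-D lists of pixel values)."""
    (r0, r1), (c0, c1) = roi
    signal = []
    for img in projections:
        vals = [img[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        signal.append(sum(vals) / len(vals))
    return signal

def phase_bins(peaks, n_proj, n_bins=4):
    """Divide each peak-to-peak breathing cycle into n_bins equal phases
    and label every projection with its phase bin."""
    bins = [0] * n_proj
    for p0, p1 in zip(peaks, peaks[1:]):
        for i in range(p0, p1):
            bins[i] = int((i - p0) / (p1 - p0) * n_bins)
    return bins
```

Projections sharing a phase bin are then reconstructed together, yielding one 3D volume per breathing phase.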

  13. Single-Cell-Genomics-Facilitated Read Binning of Candidate Phylum EM19 Genomes from Geothermal Spring Metagenomes.

    PubMed

    Becraft, Eric D; Dodsworth, Jeremy A; Murugapiran, Senthil K; Ohlsson, J Ingemar; Briggs, Brandon R; Kanbar, Jad; De Vlaminck, Iwijn; Quake, Stephen R; Dong, Hailiang; Hedlund, Brian P; Swingley, Wesley D

    2016-02-15

    The vast majority of microbial life remains uncatalogued due to the inability to cultivate these organisms in the laboratory. This "microbial dark matter" represents a substantial portion of the tree of life and of the populations that contribute to chemical cycling in many ecosystems. In this work, we leveraged an existing single-cell genomic data set representing the candidate bacterial phylum "Calescamantes" (EM19) to calibrate machine learning algorithms and define metagenomic bins directly from pyrosequencing reads derived from Great Boiling Spring in the U.S. Great Basin. Compared to other assembly-based methods, taxonomic binning with a read-based machine learning approach yielded final assemblies with the highest predicted genome completeness of any method tested. Read-first binning subsequently was used to extract Calescamantes bins from all metagenomes with abundant Calescamantes populations, including metagenomes from Octopus Spring and Bison Pool in Yellowstone National Park and Gongxiaoshe Spring in Yunnan Province, China. Metabolic reconstruction suggests that Calescamantes are heterotrophic, facultative anaerobes, which can utilize oxidized nitrogen sources as terminal electron acceptors for respiration in the absence of oxygen and use proteins as their primary carbon source. Despite their phylogenetic divergence, the geographically separate Calescamantes populations were highly similar in their predicted metabolic capabilities and core gene content, respiring O2, or oxidized nitrogen species for energy conservation in distant but chemically similar hot springs. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  14. Development of a new bin filler for apple harvesting and infield sorting with a review of existing technologies

    USDA-ARS's Scientific Manuscript database

    The bin filler, which receives apples from the sorting system and then places them in the bin evenly without causing bruise damage, plays a critical role for the self-propelled apple harvest and infield sorting (HIS) machine that is being developed in our laboratory. Two major technical challenges ...

  15. Effect of various binning methods and ROI sizes on the accuracy of the automatic classification system for differentiation between diffuse infiltrative lung diseases on the basis of texture features at HRCT

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Joon Beom; Sung, Yu Sub; Park, Bum-Woo; Lee, Youngjoo; Park, Seong Hoon; Lee, Young Kyung; Kang, Suk-Ho

    2008-03-01

    To find the optimal binning method and ROI size of the automatic classification system for differentiation between diffuse infiltrative lung diseases on the basis of textural analysis at HRCT, six hundred circular regions of interest (ROIs) with 10-, 20-, and 30-pixel diameters, 100 for each of six regional disease patterns (normal, NL; ground-glass opacity, GGO; reticular opacity, RO; honeycombing, HC; emphysema, EMPH; and consolidation, CONS), were marked by an experienced radiologist on HRCT images. Histogram (mean) and co-occurrence matrix (mean and SD of angular second moment, contrast, correlation, entropy, and inverse difference moment) features were employed to test binning and ROI effects. To find optimal binning, variable-bin-size linear binning (LB) (bin size Q: 4~30, 32, 64, 128, 144, 196, 256, 384) and non-linear binning (NLB) (Q: 4~30) methods (K-means and Fuzzy C-means clustering) were tested. For automated classification, an SVM classifier was implemented. To assess cross-validation of the system, a five-fold method was used. Each test was repeated twenty times. Overall accuracies with every combination of ROI and binning sizes were statistically compared. For small binning sizes (Q <= 10), NLB showed significantly better accuracy than LB. K-means NLB (Q = 26) was statistically significantly better than every LB. For the 30x30 ROI size and most binning sizes, the K-means method performed better than the other NLB and LB methods. With optimal binning and other parameters set, the overall sensitivity of the classifier was 92.85%.
The sensitivity and specificity of the system for each class were as follows: NL, 95%, 97.9%; GGO, 80%, 98.9%; RO 85%, 96.9%; HC, 94.7%, 97%; EMPH, 100%, 100%; and CONS, 100%, 100%, respectively. We determined the optimal binning method and ROI size of the automatic classification system for differentiation between diffuse infiltrative lung diseases on the basis of texture features at HRCT.
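The two binning schemes record 15 compares can be sketched side by side: equal-width linear binning versus 1-D K-means non-linear binning, whose bin boundaries adapt to data density. This is a minimal Lloyd iteration, not the paper's implementation, and Fuzzy C-means is not covered.

```python
def linear_binning(values, q, lo, hi):
    """Map each value to one of q equal-width bins spanning [lo, hi]."""
    width = (hi - lo) / q
    return [min(q - 1, int((v - lo) / width)) for v in values]

def kmeans_binning(values, q, iters=20):
    """1-D K-means quantisation: centroids (and hence bin boundaries)
    adapt to where the data actually lies."""
    vs = sorted(values)
    # initialise centroids at evenly spaced order statistics
    centers = [vs[int(i * (len(vs) - 1) / (q - 1))] for i in range(q)]
    for _ in range(iters):
        groups = [[] for _ in range(q)]
        for v in values:
            j = min(range(q), key=lambda k: abs(v - centers[k]))
            groups[j].append(v)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return [min(range(q), key=lambda k: abs(v - centers[k])) for v in values]
```

For clustered intensities both schemes may agree, but with few bins (small Q) the adaptive boundaries are what give NLB its accuracy edge in the record.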

  16. Binarized cross-approximate entropy in crowdsensing environment.

    PubMed

    Skoric, Tamara; Mohamoud, Omer; Milovanovic, Branislav; Japundzic-Zigon, Nina; Bajic, Dragana

    2017-01-01

    Personalised monitoring in health applications has been recognised as part of the mobile crowdsensing concept, where subjects equipped with sensors extract information and share it for personal or common benefit. Limited transmission resources impose the use of local analysis methodologies, but this approach is incompatible with analytical tools that require stationary and artefact-free data. This paper proposes a computationally efficient binarised cross-approximate entropy, referred to as (X)BinEn, for unsupervised cardiovascular signal processing in environments where energy and processor resources are limited. The proposed method is a descendant of cross-approximate entropy ((X)ApEn). It operates on binary, differentially encoded data series split into m-sized vectors. The Hamming distance is used as a distance measure, while a search for similarities is performed on the vector sets. The procedure was tested on rats under shaker and restraint stress, and compared to existing (X)ApEn results. The number of processing operations is reduced. (X)BinEn captures entropy changes in a similar manner to (X)ApEn. The coding coarseness yields an adverse effect of reduced sensitivity, but it attenuates parameter inconsistency and binary bias. A special case of (X)BinEn is equivalent to Shannon's entropy. A binary conditional entropy for m = 1 vectors is embedded into the (X)BinEn procedure. (X)BinEn can be applied to a single time series as an auto-entropy method, or to a pair of time series as a cross-entropy method. Its low processing requirements make it suitable for mobile, battery-operated, self-attached sensing devices with limited power and processor resources. Copyright © 2016 Elsevier Ltd. All rights reserved.
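The structure of record 16's method can be sketched as an ApEn-style statistic computed on a differentially binarised series, with Hamming distance as the similarity measure. This is a simplified auto-entropy variant; the published method's normalisation and cross-series details are omitted.

```python
import math

def binarise(x):
    """Differential binary encoding: 1 where the series increases, else 0."""
    return [1 if b > a else 0 for a, b in zip(x, x[1:])]

def _phi(u, m, r):
    """Average log-fraction of m-vectors within Hamming distance r."""
    n = len(u) - m + 1
    vecs = [tuple(u[i:i + m]) for i in range(n)]
    total = 0.0
    for v in vecs:
        close = sum(1 for w in vecs
                    if sum(a != b for a, b in zip(v, w)) <= r)
        total += math.log(close / n)
    return total / n

def bin_en(x, m=2, r=0):
    """ApEn-style statistic on the binarised series: regular signals
    score near zero, irregular signals score higher."""
    u = binarise(x)
    return _phi(u, m, r) - _phi(u, m + 1, r)
```

Because the vectors are binary, every comparison is a cheap Hamming count, which is the source of the method's low processing cost.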

  17. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration.
In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.

  18. VarBin, a novel method for classifying true and false positive variants in NGS data

    PubMed Central

    2013-01-01

    Background Variant discovery for rare genetic diseases using Illumina genome or exome sequencing involves screening of up to millions of variants to find only the one or few causative variant(s). Sequencing or alignment errors create "false positive" variants, which are often retained in the variant screening process. Methods to remove false positive variants often retain many false positive variants. This report presents VarBin, a method to prioritize variants based on a false positive variant likelihood prediction. Methods VarBin uses the Genome Analysis Toolkit variant calling software to calculate the variant-to-wild type genotype likelihood ratio at each variant change and position divided by read depth. The resulting Phred-scaled, likelihood-ratio by depth (PLRD) was used to segregate variants into 4 Bins with Bin 1 variants most likely true and Bin 4 most likely false positive. PLRD values were calculated for a proband of interest and 41 additional Illumina HiSeq, exome and whole genome samples (proband's family or unrelated samples). At variant sites without apparent sequencing or alignment error, wild type/non-variant calls cluster near -3 PLRD and variant calls typically cluster above 10 PLRD. Sites with systematic variant calling problems (evident by variant quality scores and biases as well as displayed in the IGV viewer) tend to have higher and more variable wild type/non-variant PLRD values. Depending on the separation of a proband's variant PLRD value from the cluster of wild type/non-variant PLRD values for background samples at the same variant change and position, the VarBin method's classification is assigned to each proband variant (Bin 1 to Bin 4). Results To assess VarBin performance, Sanger sequencing was performed on 98 variants in the proband and background samples. True variants were confirmed in 97% of Bin 1 variants, 30% of Bin 2, and 0% of Bin 3/Bin 4.
Conclusions These data indicate that VarBin correctly classifies the majority of true variants as Bin 1 and Bin 3/4 contained only false positive variants. The "uncertain" Bin 2 contained both true and false positive variants. Future work will further differentiate the variants in Bin 2. PMID:24266885
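Record 18's bin assignment can be caricatured as a threshold rule on the separation between a proband's PLRD value and the background wild-type cluster. The thresholds below are invented for illustration; VarBin's actual cutoffs and clustering differ.

```python
import statistics

def varbin_class(proband_plrd, background_plrds, sep=5.0):
    """Toy threshold rule assigning Bin 1 (likely true variant) through
    Bin 4 (likely false positive)."""
    bg_hi = max(background_plrds)
    bg_mean = statistics.mean(background_plrds)
    if proband_plrd >= bg_hi + sep and proband_plrd > 10:
        return 1  # well separated from the wild-type cluster
    if proband_plrd > bg_hi:
        return 2  # above background but not clearly separated: uncertain
    if proband_plrd > bg_mean:
        return 3
    return 4      # inside the background cluster: likely false positive
```

A proband PLRD of 20 against a background cluster near -3 lands in Bin 1, matching the record's observation that true variant calls cluster above 10 PLRD.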

  19. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    DOE PAGES

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; ...

    2016-11-25

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration.
In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.« less

  20. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    NASA Astrophysics Data System (ADS)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; Docherty, Kenneth S.; Jimenez, Jose L.

    2016-11-01

    We present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography-mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. 
In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
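
The binning-plus-factorization workflow described in the two records above can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' code: mass spectra are summed into evenly spaced retention-time bins, samples are stacked into a samples × (bin × m/z) matrix, and the matrix is factorized. Plain nonnegative matrix factorization with multiplicative updates stands in for PMF here; real PMF additionally weights residuals by per-point measurement uncertainties.

```python
import numpy as np

def bin_chromatogram(ret_times, spectra, n_bins, t_min, t_max):
    """Sum mass spectra within evenly spaced retention-time bins.
    spectra: (n_scans, n_mz) array; returns an (n_bins, n_mz) array."""
    edges = np.linspace(t_min, t_max, n_bins + 1)
    idx = np.clip(np.digitize(ret_times, edges) - 1, 0, n_bins - 1)
    binned = np.zeros((n_bins, spectra.shape[1]))
    np.add.at(binned, idx, spectra)  # unbuffered add handles repeated bin indices
    return binned

def nmf(X, k, n_iter=500, seed=0):
    """Nonnegative factorization X ~= W @ H via multiplicative updates,
    a simple stand-in for PMF (which also weights residuals by uncertainty)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + 1e-6
    H = rng.random((k, m)) + 1e-6
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Toy data: 6 samples mixing two source profiles across 40 binned channels.
rng = np.random.default_rng(1)
sources = rng.random((2, 40))   # "true" source profiles (bins x m/z, flattened)
amounts = rng.random((6, 2))    # per-sample source strengths
X = amounts @ sources           # samples x channels input matrix
W, H = nmf(X, k=2)              # recovered strengths and profiles
```

On exactly low-rank toy data like this, the two recovered factors reconstruct the input closely; real TAG data additionally require the retention-time-shift correction discussed in the abstract before binning.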

  1. The volatile compound BinBase mass spectral database.

    PubMed

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). 
The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement.

  2. The volatile compound BinBase mass spectral database

    PubMed Central

    2011-01-01

    Background Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. Description The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). 
Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). Conclusions The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement. PMID:21816034
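
The multi-tiered filtering the two BinBase records describe can be sketched as a two-stage match: reject peaks below a signal-to-noise floor, restrict candidate database bins to a retention-index window, then accept the best spectral match above a similarity threshold. All thresholds, the dict-based spectrum representation, and the example retention indices and spectra below are assumptions for illustration, not BinBase's actual parameters or library values.

```python
import math

def cosine_similarity(a, b):
    """Similarity of two mass spectra given as {m/z: intensity} dicts."""
    dot = sum(v * b.get(k, 0.0) for k, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def match_bin(peak, bins, ri_window=20, sim_threshold=0.8, min_snr=5.0):
    """Two-tier filter: retention-index window first, then best spectral
    match; peaks below the signal-to-noise floor are rejected outright."""
    if peak["snr"] < min_snr:
        return None
    candidates = [b for b in bins if abs(b["ri"] - peak["ri"]) <= ri_window]
    if not candidates:
        return None
    best = max(candidates,
               key=lambda b: cosine_similarity(b["spectrum"], peak["spectrum"]))
    if cosine_similarity(best["spectrum"], peak["spectrum"]) >= sim_threshold:
        return best["name"]
    return None  # a real system would create a new bin here under strict criteria

db = [{"name": "limonene", "ri": 1030, "spectrum": {68: 100, 93: 80, 136: 20}},
      {"name": "pinene",   "ri": 937,  "spectrum": {93: 100, 77: 25, 136: 10}}]
peak = {"ri": 1031, "snr": 12.0, "spectrum": {68: 90, 93: 85, 136: 18}}
```

With these inputs, `match_bin(peak, db)` annotates the peak as the in-window, spectrally similar bin; a peak outside every retention-index window, or below the S/N floor, returns no annotation.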

  3. Cosmological constraints with clustering-based redshifts

    NASA Astrophysics Data System (ADS)

    Kovetz, Ely D.; Raccanelli, Alvise; Rahman, Mubdi

    2017-07-01

    We demonstrate that observations lacking reliable redshift information, such as photometric and radio continuum surveys, can produce robust measurements of cosmological parameters when empowered by clustering-based redshift estimation. This method infers the redshift distribution based on the spatial clustering of sources, using cross-correlation with a reference data set with known redshifts. Applying this method to the existing Sloan Digital Sky Survey (SDSS) photometric galaxies, and projecting to future radio continuum surveys, we show that sources can be efficiently divided into several redshift bins, increasing their ability to constrain cosmological parameters. We forecast constraints on the dark-energy equation of state and on local non-Gaussianity parameters. We explore several pertinent issues, including the trade-off between including more sources and minimizing the overlap between bins, the shot-noise limitations on binning and the predicted performance of the method at high redshifts, and most importantly pay special attention to possible degeneracies with the galaxy bias. Remarkably, we find that once this technique is implemented, constraints on dynamical dark energy from the SDSS imaging catalogue can be competitive with, or better than, those from the spectroscopic BOSS survey and even future planned experiments. Further, constraints on primordial non-Gaussianity from future large-sky radio-continuum surveys can outperform those from the Planck cosmic microwave background experiment and rival those from future spectroscopic galaxy surveys. The application of this method thus holds tremendous promise for cosmology.

  4. Improving Electronic Sensor Reliability by Robust Outlier Screening

    PubMed Central

    Moreno-Lizaranzu, Manuel J.; Cuesta, Federico

    2013-01-01

    Electronic sensors are widely used in different application areas, and in some of them, such as automotive or medical equipment, they must perform with an extremely low defect rate. Increasing reliability is paramount. Outlier detection algorithms are a key component in screening latent defects and decreasing the number of customer quality incidents (CQIs). This paper focuses on new spatial algorithms (Good Die in a Bad Cluster with Statistical Bins (GDBC SB) and Bad Bin in a Bad Cluster (BBBC)) and an advanced outlier screening method, called Robust Dynamic Part Averaging Testing (RDPAT), as well as two practical improvements, which significantly enhance existing algorithms. Those methods have been used in production in Freescale® Semiconductor probe factories around the world for several years. Moreover, a study was conducted with production data of 289,080 dice with 26 CQIs to determine and compare the efficiency and effectiveness of all these algorithms in identifying CQIs. PMID:24113682

  5. Improving electronic sensor reliability by robust outlier screening.

    PubMed

    Moreno-Lizaranzu, Manuel J; Cuesta, Federico

    2013-10-09

    Electronic sensors are widely used in different application areas, and in some of them, such as automotive or medical equipment, they must perform with an extremely low defect rate. Increasing reliability is paramount. Outlier detection algorithms are a key component in screening latent defects and decreasing the number of customer quality incidents (CQIs). This paper focuses on new spatial algorithms (Good Die in a Bad Cluster with Statistical Bins (GDBC SB) and Bad Bin in a Bad Cluster (BBBC)) and an advanced outlier screening method, called Robust Dynamic Part Averaging Testing (RDPAT), as well as two practical improvements, which significantly enhance existing algorithms. Those methods have been used in production in Freescale® Semiconductor probe factories around the world for several years. Moreover, a study was conducted with production data of 289,080 dice with 26 CQIs to determine and compare the efficiency and effectiveness of all these algorithms in identifying CQIs.
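
The specific algorithms named in the two records above (GDBC SB, BBBC, RDPAT) are not fully specified in the abstracts. The following is only a generic sketch of the core idea behind robust dynamic part averaging: derive screening limits from robust statistics (median and interquartile range) of a population of parts, so the limits are not distorted by the very outliers being screened. The k = 3 multiplier and the crude quartile indices are illustrative choices.

```python
import statistics

def robust_limits(measurements, k=3.0):
    """Robust screening limits from a population of parts: median +/- k * IQR
    is insensitive to the outliers it is meant to catch."""
    s = sorted(measurements)
    n = len(s)
    median = statistics.median(s)
    iqr = s[(3 * n) // 4] - s[n // 4]   # crude quartile estimate for the sketch
    return median - k * iqr, median + k * iqr

def screen(measurements, k=3.0):
    """Indices of parts whose value falls outside the robust dynamic limits."""
    lo, hi = robust_limits(measurements, k)
    return [i for i, v in enumerate(measurements) if not lo <= v <= hi]

readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 25.0, 10.1]
flagged = screen(readings)   # the latent-defect candidate at index 6
```

A mean/standard-deviation version of the same screen would be pulled toward the 25.0 reading and could miss it; that robustness is the point of the "robust" variants discussed in the paper.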

  6. Potential reduction of non-residential solid waste in Sukomanunggal district West Surabaya

    NASA Astrophysics Data System (ADS)

    Warmadewanthi, I. D. A. A.; Reswari, S. A.

    2018-01-01

    Sukomanunggal district is development unit 8, designated as a region for trade and services, industry, education, healthcare, offices, and shopping centers. The development of this region will increase solid waste generation, especially waste from non-residential facilities. The aim of this research is to determine the potential for waste reduction at the source. The method used is a Likert-scale questionnaire to determine the knowledge, attitudes, and behavior of non-residential facility managers. The results of this research are that the existing reduction of non-residential solid waste is 5.34%; the potential reduction at the waste source lies in optimizing plastic and paper waste, with a reduction rate of up to 19.52%. The existing level of public participation is 46.79%, with a willingness to increase recycling efforts of 72.87%. Efforts that can be developed to increase public awareness of 3R include providing three types of bins, modifying the solid waste collection schedule according to the type of waste that has been sorted, and providing communal bins.

  7. BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation

    PubMed Central

    Kiefer, Christina; Fehlmann, Tobias; Backes, Christina

    2017-01-01

    Abstract Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate comparably high performance to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498

  8. The generation and cost of litter resulting from the curbside collection of recycling.

    PubMed

    Wagner, Travis P; Broaddus, Nathan

    2016-04-01

    This study examined the generation of litter, defined as spillage and uncollected residue, from a curbside collection system for residential recycling. The primary recycling containers used in the study were 18-gal (68 L), open-top bins. The study, conducted over a seven-week period, covered both an urban and a suburban area. Six litter characterizations were conducted in which all new litter larger than 1 in.² was collected, segregated, counted, and weighed. We found that each week the open-top recycling bins contributed approximately 20,590 pieces of litter over 1 in.² in size per 1000 households, which resulted in the generation of 3.74 tons of litter per 1000 households per year. In addition to the bins having no top, the primary root causes of the litter were constantly overflowing recycling bins, the method of collection, and material scavenging. Based on an estimated cost of litter cleanup ranging from $0.17 to $0.79 per piece of litter, the direct economic costs from the collection of litter and loss in recycling revenues were estimated at US$3920 to US$19,250 per 1000 households per year. Other notable impacts from the litter, such as increased risk of flood damage from storm drain impairment and marine ecosystem damage, exist but were not monetized. The results strongly suggest that replacing the existing curbside collection containers with larger, covered containers, and modifying the task-based incentive system to emphasize litter prevention rather than the current aim of completing the task most quickly, would decrease the amount and associated cost of litter. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. DNA barcoding a nightmare taxon: assessing barcode index numbers and barcode gaps for sweat bees.

    PubMed

    Gibbs, Jason

    2018-01-01

    There is an ongoing campaign to DNA barcode the world's >20 000 bee species. Recent revisions of Lasioglossum (Dialictus) (Hymenoptera: Halictidae) for Canada and the eastern United States were completed using integrative taxonomy. DNA barcode data from 110 species of L. (Dialictus) are examined for their value in identification and discovering additional taxonomic diversity. Specimen identification success was estimated using the best close match method. Error rates were 20% relative to current taxonomic understanding. Barcode Index Numbers (BINs) assigned using Refined Single Linkage Analysis (RESL) and barcode gaps using the Automatic Barcode Gap Discovery (ABGD) method were also assessed. RESL was incongruent for 44.5% of species, although some cryptic diversity may exist. Forty-three of 110 species were part of merged BINs with multiple species. The barcode gap is non-existent for the data set as a whole and ABGD showed levels of discordance similar to the RESL. The viridatum species-group is particularly problematic, so that DNA barcodes alone would be misleading for species delimitation and specimen identification. Character-based methods using fixed nucleotide substitutions could improve specimen identification success in some cases. The use of DNA barcoding for species discovery for standard taxonomic practice in the absence of a well-defined barcode gap is discussed.

  10. A lifelong learning hyper-heuristic method for bin packing.

    PubMed

    Sim, Kevin; Hart, Emma; Paechter, Ben

    2015-01-01

    We describe a novel hyper-heuristic system that continuously learns over time to solve a combinatorial optimisation problem. The system continuously generates new heuristics and samples problems from its environment; and representative problems and heuristics are incorporated into a self-sustaining network of interacting entities inspired by methods in artificial immune systems. The network is plastic in both its structure and content, leading to the following properties: it exploits existing knowledge captured in the network to rapidly produce solutions; it can adapt to new problems with widely differing characteristics; and it is capable of generalising over the problem space. The system is tested on a large corpus of 3,968 new instances of 1D bin-packing problems as well as on 1,370 existing problems from the literature; it shows excellent performance in terms of the quality of solutions obtained across the datasets and in adapting to dynamically changing sets of problem instances compared to previous approaches. As the network self-adapts to sustain a minimal repertoire of both problems and heuristics that form a representative map of the problem space, the system is further shown to be computationally efficient and therefore scalable.
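
For readers unfamiliar with the underlying problem, a minimal baseline for 1D bin packing is the classic first-fit decreasing heuristic, one of the kinds of heuristics such a hyper-heuristic system generates and competes against. This is an illustration of the problem, not the paper's system:

```python
def first_fit_decreasing(items, capacity):
    """Classic 1D bin-packing heuristic: sort items largest-first and place
    each into the first bin with enough remaining capacity."""
    bins = []  # each bin is a list of item sizes
    for item in sorted(items, reverse=True):
        for b in bins:
            if sum(b) + item <= capacity:
                b.append(item)
                break
        else:
            bins.append([item])  # no bin fits: open a new one
    return bins

packed = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
```

On this instance the heuristic finds the optimal packing (two bins summing to 20 with capacity 10 each); a hyper-heuristic system searches the space of such placement rules rather than committing to one.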

  11. Compressible cavitation with stochastic field method

    NASA Astrophysics Data System (ADS)

    Class, Andreas; Dumond, Julien

    2012-11-01

    Non-linear phenomena can often be well described using probability density functions (pdf) and pdf transport models. Traditionally the simulation of pdf transport requires Monte-Carlo codes based on Lagrange particles or prescribed pdf assumptions including binning techniques. Recently, in the field of combustion, a novel formulation called the stochastic field method, which solves pdf transport using Euler fields, has been proposed; it eliminates the necessity to mix Euler and Lagrange techniques or to prescribe pdf assumptions. In the present work, part of the PhD project "Design and analysis of a Passive Outflow Reducer relying on cavitation", a first application of the stochastic field method to multi-phase flow, and in particular to cavitating flow, is presented. The application considered is a nozzle subjected to high velocity flow so that sheet cavitation is observed near the nozzle surface in the divergent section. It is demonstrated that the stochastic field formulation captures the wide range of pdf shapes present at different locations. The method is compatible with finite-volume codes, where all existing physical models available for Lagrange techniques, presumed pdf, or binning methods can be easily extended to the stochastic field formulation.

  12. The retrospective binning method improves the consistency of phase binning in respiratory-gated PET/CT

    NASA Astrophysics Data System (ADS)

    Didierlaurent, D.; Ribes, S.; Batatia, H.; Jaudet, C.; Dierickx, L. O.; Zerdoud, S.; Brillouet, S.; Caselles, O.; Courbon, F.

    2012-12-01

    This study assesses the accuracy of prospective phase-gated PET/CT data binning and presents a retrospective data binning method that improves image quality and consistency. Respiratory signals from 17 patients who underwent 4D PET/CT were analysed to evaluate the reproducibility of temporal triggers used for the standard phase-based gating method. Breathing signals were reprocessed to implement retrospective PET data binning. The mean and standard deviation of time lags between automatic triggers provided by the Real-time Position Management (RPM, Varian) gating device and inhalation peaks derived from respiratory curves were computed for each patient. The total number of respiratory cycles available for 4D PET/CT according to the binning mode (prospective versus retrospective) was compared. The maximum standardized uptake value (SUVmax), biological tumour volume (BTV) and tumour trajectory measures were determined from the PET/CT images of five patients. Compared to retrospective binning (RB), the prospective gating approach led to (i) a significant loss in breathing cycles (15%) and (ii) inconsistent data binning due to temporal dispersion of triggers (average 396 ms). Consequently, tumour characterization could be impacted. In retrospective mode, SUVmax was up to 27% higher, while no significant difference appeared in BTV. In addition, prospective mode gave an inconsistent spatial location of the tumour throughout the bins. Improved consistency with breathing patterns and greater motion amplitude of the tumour centroid were observed with retrospective mode. The detection of the tumour motion and trajectory was also improved, even for small temporal dispersion of triggers. This study shows that the binning mode can have a significant impact on 4D PET images. The consistency of triggers with breathing signals should be checked before clinical use of gated PET/CT images, and our RB method improves 4D PET/CT image quantification.
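
The retrospective binning idea, deriving phase from the complete recorded respiratory trace after acquisition rather than from real-time triggers, can be sketched as follows. The peak detection and phase definition here are simplifications assumed for illustration; the study's handling of RPM triggers and trigger dispersion is more involved.

```python
import math

def detect_peaks(signal):
    """Indices of local maxima in a sampled respiratory trace."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i - 1] < signal[i] >= signal[i + 1]]

def phase_bin(t, peaks, n_bins):
    """Retrospective phase binning: phase is the elapsed fraction of the
    breathing cycle between the surrounding inhalation peaks."""
    for p0, p1 in zip(peaks, peaks[1:]):
        if p0 <= t < p1:
            phase = (t - p0) / (p1 - p0)
            return min(int(phase * n_bins), n_bins - 1)
    return None  # outside a complete cycle: discarded, like partial cycles

# Simulated breathing: 4 s period sampled at 10 Hz, 12 s of signal.
trace = [math.sin(2 * math.pi * i / 40) for i in range(120)]
peaks = detect_peaks(trace)  # sample indices of inhalation peaks
```

Because each event's phase is computed from the actual surrounding peaks, a cycle of unusual length still fills its bins consistently, whereas prospectively dispersed triggers would smear events across neighbouring bins.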

  13. BinSanity: unsupervised clustering of environmental microbial assemblies using coverage and affinity propagation

    PubMed Central

    Heidelberg, John F.; Tully, Benjamin J.

    2017-01-01

    Metagenomics has become an integral part of defining microbial diversity in various environments. Many ecosystems have characteristically low biomass and few cultured representatives. Linking potential metabolisms to phylogeny in environmental microorganisms is important for interpreting microbial community functions and the impacts these communities have on geochemical cycles. However, with metagenomic studies there is the computational hurdle of ‘binning’ contigs into phylogenetically related units or putative genomes. Binning methods have been implemented with varying approaches such as k-means clustering, Gaussian mixture models, hierarchical clustering, neural networks, and two-way clustering; however, many of these suffer from biases against low coverage/abundance organisms and closely related taxa/strains. We are introducing a new binning method, BinSanity, that utilizes the clustering algorithm affinity propagation (AP), to cluster assemblies using coverage with compositional based refinement (tetranucleotide frequency and percent GC content) to optimize bins containing multiple source organisms. This separation of composition and coverage based clustering reduces bias for closely related taxa. BinSanity was developed and tested on artificial metagenomes varying in size and complexity. Results indicate that BinSanity has a higher precision, recall, and Adjusted Rand Index compared to five commonly implemented methods. When tested on a previously published environmental metagenome, BinSanity generated high completion and low redundancy bins corresponding with the published metagenome-assembled genomes. PMID:28289564
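
A minimal sketch of the compositional signal used in BinSanity's refinement step is the tetranucleotide frequency vector of a contig. This simple version counts 4-mers on one strand only and skips ambiguity codes; whether and how BinSanity canonicalizes k-mers is not stated in the abstract.

```python
from itertools import product

def tetranucleotide_freq(seq):
    """Normalized 4-mer frequency vector over a fixed ordering of all 256
    tetranucleotides; a standard composition signature for binning contigs."""
    kmers = ["".join(p) for p in product("ACGT", repeat=4)]
    counts = dict.fromkeys(kmers, 0)
    seq = seq.upper()
    total = 0
    for i in range(len(seq) - 3):
        k = seq[i:i + 4]
        if k in counts:        # skip windows containing N or other ambiguity codes
            counts[k] += 1
            total += 1
    return [counts[k] / total if total else 0.0 for k in kmers]

v = tetranucleotide_freq("ACGTACGTACGT")
```

Vectors like `v` (and GC content derived from the same counts) give clustering algorithms such as affinity propagation a coverage-independent signal for separating contigs from different source organisms.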

  14. Method of multiplexed analysis using ion mobility spectrometer

    DOEpatents

    Belov, Mikhail E [Richland, WA; Smith, Richard D [Richland, WA

    2009-06-02

    A method for analyzing analytes from a sample introduced into a Spectrometer by generating a pseudo random sequence of modulation bins, organizing each modulation bin as a series of submodulation bins, thereby forming an extended pseudo random sequence of submodulation bins, releasing the analytes in a series of analyte packets into a Spectrometer, thereby generating an unknown original ion signal vector, detecting the analytes at a detector, and characterizing the sample using the plurality of analyte signal subvectors. The method is advantageously applied to an Ion Mobility Spectrometer, and an Ion Mobility Spectrometer interfaced with a Time of Flight Mass Spectrometer.
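
A pseudo random sequence of modulation bins of the kind the patent describes is commonly produced by a maximal-length linear-feedback shift register, whose bits gate whether an analyte packet is released in each bin. The sketch below uses a 4-bit register with taps for x⁴ + x³ + 1 as an illustrative choice, not the patent's actual sequence:

```python
def mls(n_bits, taps):
    """Maximal-length sequence from a linear-feedback shift register.
    Returns 2**n_bits - 1 bits used to open/close modulation bins."""
    state = 1
    out = []
    for _ in range(2 ** n_bits - 1):
        out.append(state & 1)
        fb = 0
        for t in taps:
            fb ^= (state >> (n_bits - t)) & 1   # XOR the tapped register bits
        state = (state >> 1) | (fb << (n_bits - 1))
    return out

seq = mls(4, taps=(4, 3))   # taps for x^4 + x^3 + 1, a maximal 4-bit polynomial
```

A maximal-length sequence of period 2ⁿ − 1 is nearly balanced (2ⁿ⁻¹ ones), which is what lets the detector signal be demultiplexed back into per-packet contributions with good signal-to-noise.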

  15. The ESCRT-III pathway facilitates cardiomyocyte release of cBIN1-containing microparticles

    PubMed Central

    Xu, Bing; Fu, Ying; Liu, Yan; Agvanian, Sosse; Wirka, Robert C.; Baum, Rachel; Zhou, Kang; Shaw, Robin M.

    2017-01-01

    Microparticles (MPs) are cell–cell communication vesicles derived from the cell surface plasma membrane, although they are not known to originate from cardiac ventricular muscle. In ventricular cardiomyocytes, the membrane deformation protein cardiac bridging integrator 1 (cBIN1 or BIN1+13+17) creates transverse-tubule (t-tubule) membrane microfolds, which facilitate ion channel trafficking and modulate local ionic concentrations. The microfold-generated microdomains continuously reorganize, adapting in response to stress to modulate the calcium signaling apparatus. We explored the possibility that cBIN1-microfolds are externally released from cardiomyocytes. Using electron microscopy imaging with immunogold labeling, we found in mouse plasma that cBIN1 exists in membrane vesicles about 200 nm in size, which is consistent with the size of MPs. In mice with cardiac-specific heterozygous Bin1 deletion, flow cytometry identified 47% less cBIN1-MPs in plasma, supporting cardiac origin. Cardiac release was also evidenced by the detection of cBIN1-MPs in medium bathing a pure population of isolated adult mouse cardiomyocytes. In human plasma, osmotic shock increased cBIN1 detection by enzyme-linked immunosorbent assay (ELISA), and cBIN1 level decreased in humans with heart failure, a condition with reduced cardiac muscle cBIN1, both of which support cBIN1 release in MPs from human hearts. Exploring putative mechanisms of MP release, we found that the membrane fission complex endosomal sorting complexes required for transport (ESCRT)-III subunit charged multivesicular body protein 4B (CHMP4B) colocalizes and coimmunoprecipitates with cBIN1, an interaction enhanced by actin stabilization. In HeLa cells with cBIN1 overexpression, knockdown of CHMP4B reduced the release of cBIN1-MPs. Using truncation mutants, we identified that the N-terminal BAR (N-BAR) domain in cBIN1 is required for CHMP4B binding and MP release. 
This study links the BAR protein superfamily to the ESCRT pathway for MP biogenesis in mammalian cardiac ventricular cells, identifying elements of a pathway by which cytoplasmic cBIN1 is released into blood. PMID:28806752

  16. The ESCRT-III pathway facilitates cardiomyocyte release of cBIN1-containing microparticles.

    PubMed

    Xu, Bing; Fu, Ying; Liu, Yan; Agvanian, Sosse; Wirka, Robert C; Baum, Rachel; Zhou, Kang; Shaw, Robin M; Hong, TingTing

    2017-08-01

    Microparticles (MPs) are cell-cell communication vesicles derived from the cell surface plasma membrane, although they are not known to originate from cardiac ventricular muscle. In ventricular cardiomyocytes, the membrane deformation protein cardiac bridging integrator 1 (cBIN1 or BIN1+13+17) creates transverse-tubule (t-tubule) membrane microfolds, which facilitate ion channel trafficking and modulate local ionic concentrations. The microfold-generated microdomains continuously reorganize, adapting in response to stress to modulate the calcium signaling apparatus. We explored the possibility that cBIN1-microfolds are externally released from cardiomyocytes. Using electron microscopy imaging with immunogold labeling, we found in mouse plasma that cBIN1 exists in membrane vesicles about 200 nm in size, which is consistent with the size of MPs. In mice with cardiac-specific heterozygous Bin1 deletion, flow cytometry identified 47% less cBIN1-MPs in plasma, supporting cardiac origin. Cardiac release was also evidenced by the detection of cBIN1-MPs in medium bathing a pure population of isolated adult mouse cardiomyocytes. In human plasma, osmotic shock increased cBIN1 detection by enzyme-linked immunosorbent assay (ELISA), and cBIN1 level decreased in humans with heart failure, a condition with reduced cardiac muscle cBIN1, both of which support cBIN1 release in MPs from human hearts. Exploring putative mechanisms of MP release, we found that the membrane fission complex endosomal sorting complexes required for transport (ESCRT)-III subunit charged multivesicular body protein 4B (CHMP4B) colocalizes and coimmunoprecipitates with cBIN1, an interaction enhanced by actin stabilization. In HeLa cells with cBIN1 overexpression, knockdown of CHMP4B reduced the release of cBIN1-MPs. Using truncation mutants, we identified that the N-terminal BAR (N-BAR) domain in cBIN1 is required for CHMP4B binding and MP release. 
This study links the BAR protein superfamily to the ESCRT pathway for MP biogenesis in mammalian cardiac ventricular cells, identifying elements of a pathway by which cytoplasmic cBIN1 is released into blood.

  17. Optimizing 4DCBCT projection allocation to respiratory bins.

    PubMed

    O'Brien, Ricky T; Kipritidis, John; Shieh, Chun-Chien; Keall, Paul J

    2014-10-07

    4D cone beam computed tomography (4DCBCT) is an emerging image guidance strategy used in radiotherapy where projections acquired during a scan are sorted into respiratory bins based on the respiratory phase or displacement. 4DCBCT reduces the motion blur caused by respiratory motion but increases streaking artefacts due to projection under-sampling, a result of the irregular nature of patient breathing and the binning algorithms used. For displacement binning, the streak artefacts are so severe that displacement binning is rarely used clinically. The purpose of this study is to investigate whether sharing projections between respiratory bins and adjusting the location of respiratory bins in an optimal manner can reduce or eliminate streak artefacts in 4DCBCT images. We introduce a mathematical optimization framework and a heuristic solution method, which we will call the optimized projection allocation algorithm, to determine where to position the respiratory bins and which projections to source from neighbouring respiratory bins. Five 4DCBCT datasets from three patients were used to reconstruct 4DCBCT images. Projections were sorted into respiratory bins using equispaced, equal-density and optimized projection allocation. The standard deviation of the angular separation between projections was used to assess streaking, and the consistency of the segmented volume of a fiducial gold marker was used to assess motion blur. The standard deviation of the angular separation between projections using displacement binning and optimized projection allocation was 30%-50% smaller than with conventional phase-based binning and 59%-76% smaller than with conventional displacement binning, indicating more uniformly spaced projections and fewer streaking artefacts. The standard deviation in the marker volume was 20%-90% smaller when using optimized projection allocation than when using conventional phase-based binning, suggesting more uniform marker segmentation and less motion blur.
Images reconstructed using displacement binning and the optimized projection allocation algorithm were clearer, contained visibly fewer streak artefacts and produced more consistent marker segmentation than those reconstructed with either equispaced or equal-density binning. The optimized projection allocation algorithm significantly improves image quality in 4DCBCT images and provides, for the first time, a method to consistently generate high-quality displacement-binned 4DCBCT images in clinical applications.
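
    The streaking metric above can be sketched directly: the standard deviation of the angular gaps between sorted projection angles within a respiratory bin is small when projections are uniformly spaced. A minimal illustration with toy angles, not patient data:

```python
import numpy as np

def angular_gap_std(angles_deg):
    """Standard deviation of the gaps between sorted projection
    angles: smaller means more uniformly spaced projections and,
    per the study, fewer streak artefacts."""
    gaps = np.diff(np.sort(np.asarray(angles_deg, dtype=float)))
    return float(np.std(gaps))

# 90 perfectly equispaced projections over 180 degrees
uniform = np.linspace(0.0, 180.0, 90, endpoint=False)
# Projections clustered by irregular breathing (toy example)
clustered = np.concatenate([np.linspace(0.0, 60.0, 60),
                            np.linspace(120.0, 180.0, 30)])

print(angular_gap_std(uniform))    # 0.0
print(angular_gap_std(clustered))
```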

  18. Weighted Statistical Binning: Enabling Statistically Consistent Genome-Scale Phylogenetic Analyses

    PubMed Central

    Bayzid, Md Shamsuzzoha; Mirarab, Siavash; Boussau, Bastien; Warnow, Tandy

    2015-01-01

    Because biological processes can result in different loci having different evolutionary histories, species tree estimation requires multiple loci from across multiple genomes. While many processes can result in discord between gene trees and species trees, incomplete lineage sorting (ILS), modeled by the multi-species coalescent, is considered to be a dominant cause for gene tree heterogeneity. Coalescent-based methods have been developed to estimate species trees, many of which operate by combining estimated gene trees, and so are called "summary methods". Because summary methods are generally fast (and much faster than more complicated coalescent-based methods that co-estimate gene trees and species trees), they have become very popular techniques for estimating species trees from multiple loci. However, recent studies have established that summary methods can have reduced accuracy in the presence of gene tree estimation error, and also that many biological datasets have substantial gene tree estimation error, so that summary methods may not be highly accurate in biologically realistic conditions. Mirarab et al. (Science 2014) presented the "statistical binning" technique to improve gene tree estimation in multi-locus analyses, and showed that it improved the accuracy of MP-EST, one of the most popular coalescent-based summary methods. Statistical binning, which uses a simple heuristic to evaluate "combinability" and then uses the larger sets of genes to re-calculate gene trees, has good empirical performance, but using statistical binning within a phylogenomic pipeline does not have the desirable property of being statistically consistent. We show that weighting the re-calculated gene trees by the bin sizes makes statistical binning statistically consistent under the multispecies coalescent, and maintains the good empirical performance. 
Thus, "weighted statistical binning" enables highly accurate genome-scale species tree estimation, and is also statistically consistent under the multi-species coalescent model. New data used in this study are available at DOI: http://dx.doi.org/10.6084/m9.figshare.1411146, and the software is available at https://github.com/smirarab/binning. PMID:26086579
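
    The weighting idea can be sketched in a few lines: each bin's re-estimated tree is tallied once per gene in the bin, restoring the gene-level frequencies a summary method expects. The topology strings and bin sizes below are a hypothetical toy example:

```python
from collections import Counter

def weighted_tree_tally(bins):
    """bins: list of (estimated_tree_topology, bin_size).
    Weighted statistical binning counts each bin's re-estimated
    tree once per gene in the bin, rather than once per bin."""
    tally = Counter()
    for topology, size in bins:
        tally[topology] += size
    return tally

# Hypothetical example: three bins supporting two topologies
bins = [("((A,B),C)", 5), ("((A,C),B)", 2), ("((A,B),C)", 3)]
print(weighted_tree_tally(bins))  # ((A,B),C) carries weight 8
```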

  19. Potential fitting biases resulting from grouping data into variable width bins

    NASA Astrophysics Data System (ADS)

    Towers, S.

    2014-07-01

    When reading peer-reviewed scientific literature describing any analysis of empirical data, it is natural and correct to proceed with the underlying assumption that the experimenters have made good-faith efforts to ensure that their analyses yield unbiased results. However, particle physics experiments are expensive and time consuming to carry out; thus if an analysis has inherent bias (even if unintentional), much money and effort can be wasted trying to replicate or understand the results, particularly if the analysis is fundamental to our understanding of the universe. In this note we discuss the significant biases that can result from data binning schemes. As we will show, if data are binned such that they provide the best comparison to a particular (but incorrect) model, the resulting model parameter estimates when fitting to the binned data can be significantly biased, leading us too often to accept the model hypothesis when it is not in fact true. When using binned likelihood or least squares methods there is of course no a priori requirement that data bin sizes be constant, but we show that fitting to data grouped into variable-width bins is particularly prone to producing biased results if the bin boundaries are chosen to optimize the comparison of the binned data to a wrong model. The degree of bias that can be achieved simply with variable binning can be surprisingly large. Fitting the data with an unbinned likelihood method, when possible, is the best way for researchers to show that their analyses are not biased by binning effects. Failing that, equal bin widths should be employed as a cross-check of the fitting analysis whenever possible.
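
    A small sketch of the effect being warned about: the same sample produces a different binned test statistic against a (wrong) uniform model depending on whether equal-width or data-driven variable-width bins are used. The flat model and the edge choices here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, 1000)  # truth: standard normal

def binned_chi2_flat(data, edges):
    """Chi-square of binned counts against a flat (uniform) model,
    standing in for the 'wrong model' discussed in the note."""
    counts, edges = np.histogram(data, bins=edges)
    widths = np.diff(edges)
    expected = len(data) * widths / (edges[-1] - edges[0])  # uniform density
    return float(np.sum((counts - expected) ** 2 / expected))

equal = np.linspace(-3.0, 3.0, 11)                 # 10 equal-width bins
# Variable-width bins chosen from the data's own quantiles:
variable = np.quantile(data, np.linspace(0.0, 1.0, 11))

chi2_equal = binned_chi2_flat(data, equal)
chi2_variable = binned_chi2_flat(data, variable)
print(chi2_equal, chi2_variable)  # same data, different test statistic
```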

  20. Evaluation of five sampling methods for Liposcelis entomophila (Enderlein) and L. decolor (Pearman) (Psocoptera: Liposcelididae) in steel bins containing wheat

    USDA-ARS's Scientific Manuscript database

    An evaluation of five sampling methods for studying psocid population levels was conducted in two steel bins containing 32.6 metric tonnes of wheat in Manhattan, KS. Psocids were sampled using a 1.2-m open-ended trier, corrugated cardboard refuges placed on the underside of the bin hatch or the surf...

  1. The oligonucleotide frequency derived error gradient and its application to the binning of metagenome fragments

    PubMed Central

    2009-01-01

    Background The characterisation, or binning, of metagenome fragments is an important first step to further downstream analysis of microbial consortia. Here, we propose a one-dimensional signature, OFDEG, derived from the oligonucleotide frequency profile of a DNA sequence, and show that it is possible to obtain a meaningful phylogenetic signal for relatively short DNA sequences. The one-dimensional signal is essentially a compact representation of higher-dimensional feature spaces of greater complexity and is intended to improve on the tetranucleotide frequency feature space preferred by current compositional binning methods. Results We compare the fidelity of OFDEG against tetranucleotide frequency in both an unsupervised and a semi-supervised setting on simulated metagenome benchmark data. Four tests were conducted using assembler output of Arachne and phrap, and for each, performance was evaluated on contigs which are greater than or equal to 8 kbp in length and contigs which are composed of at least 10 reads. Using G-C content in conjunction with OFDEG gave an average accuracy of 96.75% (semi-supervised) and 95.19% (unsupervised), versus 94.25% (semi-supervised) and 82.35% (unsupervised) for tetranucleotide frequency. Conclusion We have presented an observation of an alternative characteristic of DNA sequences. The proposed feature representation has proven to be more beneficial than the existing tetranucleotide frequency space for the metagenome binning problem. We do note, however, that our observation of OFDEG deserves further analysis and investigation. Unsupervised clustering revealed that OFDEG-related features performed better than standard tetranucleotide frequency in representing a relevant organism-specific signal. Further improvement in binning accuracy is given by semi-supervised classification using OFDEG.
The emphasis on a feature-driven, bottom-up approach to the problem of binning reveals promising avenues for future development of techniques to characterise short environmental sequences without bias toward cultivable organisms. PMID:19958473
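
    For readers unfamiliar with the tetranucleotide feature space OFDEG is compared against, a minimal sketch of computing a normalised 4-mer profile (the 256-dimensional compositional space referred to above):

```python
from itertools import product

def tetranucleotide_freq(seq):
    """Normalised 4-mer frequency profile of a DNA sequence,
    the compositional feature space used by many binning methods."""
    seq = seq.upper()
    kmers = ["".join(p) for p in product("ACGT", repeat=4)]
    counts = dict.fromkeys(kmers, 0)
    total = 0
    for i in range(len(seq) - 3):
        k = seq[i:i + 4]
        if k in counts:  # skip windows containing N or other symbols
            counts[k] += 1
            total += 1
    return {k: c / total for k, c in counts.items()} if total else counts

profile = tetranucleotide_freq("ACGTACGTACGTACGT")
print(len(profile), profile["ACGT"])  # 256 bins; ACGT occurs 4 of 13 times
```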

  2. FLIMX: A Software Package to Determine and Analyze the Fluorescence Lifetime in Time-Resolved Fluorescence Data from the Human Eye.

    PubMed

    Klemm, Matthias; Schweitzer, Dietrich; Peters, Sven; Sauer, Lydia; Hammer, Martin; Haueisen, Jens

    2015-01-01

    Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique for measuring the in vivo autofluorescence intensity decays generated by endogenous fluorophores in the ocular fundus. Here, we present a software package called FLIM eXplorer (FLIMX) for analyzing FLIO data. Specifically, we introduce a new adaptive binning approach as an optimal tradeoff between the spatial resolution and the number of photons required per pixel. We also expand existing decay models (multi-exponential, stretched exponential, spectral global analysis, incomplete decay) to account for the layered structure of the eye and present a method to correct for the influence of the crystalline lens fluorescence on the retina fluorescence. Subsequently, the Holm-Bonferroni method is applied to FLIO measurements to allow for group comparisons between patients and controls on the basis of fluorescence lifetime parameters. The performance of the new approaches was evaluated in five experiments. Specifically, we evaluated static and adaptive binning in a diabetes mellitus patient, we compared the different decay models in a healthy volunteer and performed a group comparison between diabetes patients and controls. An overview of the visualization capabilities and a comparison of static and adaptive binning is shown for a patient with macular hole. FLIMX's applicability to fluorescence lifetime imaging microscopy is shown in the ganglion cell layer of a porcine retina sample, obtained by a laser scanning microscope using two-photon excitation.
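
    The adaptive-binning tradeoff described above can be sketched as growing a neighbourhood around each pixel until enough photons are collected; the square-window rule and target count below are illustrative assumptions, not the published algorithm:

```python
import numpy as np

def adaptive_bin(photons, y, x, target):
    """Grow a square neighbourhood around pixel (y, x) until it
    holds at least `target` photons -- the spatial-resolution vs.
    photon-count tradeoff behind adaptive binning (sketch only)."""
    h, w = photons.shape
    r = 0
    while True:
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        x0, x1 = max(0, x - r), min(w, x + r + 1)
        total = photons[y0:y1, x0:x1].sum()
        if total >= target or (y0, y1, x0, x1) == (0, h, 0, w):
            return r, int(total)
        r += 1

img = np.full((9, 9), 10)  # toy image: 10 photons per pixel
r, n = adaptive_bin(img, 4, 4, target=200)
print(r, n)  # radius 2 -> 5x5 patch -> 250 photons
```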

  3. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi -LAT data

    DOE PAGES

    Lott, B.; Escande, L.; Larsson, S.; ...

    2012-07-19

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The reported mean flux and spectral index (assuming the spectrum is a power-law distribution) in the interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
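
    The constant-uncertainty idea can be sketched with raw counts: a time bin is closed once its Poisson relative uncertainty reaches a target, so quiet periods get long bins and bright periods short ones. This sketch works on event times rather than LAT fluxes, which is a simplification:

```python
import numpy as np

def adaptive_light_curve_bins(event_times, rel_unc=0.2):
    """Close a time bin as soon as its Poisson relative uncertainty
    sqrt(N)/N drops to `rel_unc` -- the constant-uncertainty idea
    behind adaptive binning (simplified sketch on raw counts)."""
    n_min = int(np.ceil(1.0 / rel_unc ** 2))  # sqrt(N)/N <= rel_unc
    edges, count, start = [], 0, event_times[0]
    for t in event_times:
        count += 1
        if count >= n_min:
            edges.append((start, t, count))
            count, start = 0, t
    return edges

# Toy event stream: 200 arrival times from a unit-rate process
times = np.cumsum(np.random.default_rng(1).exponential(1.0, 200))
bins = adaptive_light_curve_bins(times, rel_unc=0.2)
print(len(bins))  # 200 events at 25 per bin -> 8 bins
```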

  4. An 18-ps TDC using timing adjustment and bin realignment methods in a Cyclone-IV FPGA

    NASA Astrophysics Data System (ADS)

    Cao, Guiping; Xia, Haojie; Dong, Ning

    2018-05-01

    The method commonly used to produce a field-programmable gate array (FPGA)-based time-to-digital converter (TDC) creates a tapped delay line (TDL) for time interpolation to yield high time precision. We conduct timing adjustment and bin realignment to implement a TDC in the Altera Cyclone-IV FPGA. The former tunes the carry look-up table (LUT) cell delay by changing the LUT's function through low-level primitives according to timing analysis results, while the latter realigns bins according to the timing result obtained by timing adjustment so as to create a uniform TDL with bins of equivalent width. The differential nonlinearity and time resolution can be improved by realigning the bins. After calibration, the TDC has an 18 ps root-mean-square timing resolution and a 45 ps least-significant bit resolution.
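
    The bin-width measurement underlying such realignment is typically done with a code-density test: uniformly distributed hits populate each TDL bin in proportion to its width, from which the differential nonlinearity (DNL) follows. A minimal sketch with invented hit counts:

```python
import numpy as np

def dnl_from_code_density(hits, ideal_bins):
    """Differential nonlinearity from a code-density test: feed the
    TDC uniformly distributed hits; each bin's measured width is
    proportional to its hit count, and DNL is its deviation from
    the ideal uniform width, in LSB units."""
    hits = np.asarray(hits, dtype=float)
    measured_width = hits / hits.sum() * ideal_bins  # widths in LSB
    return measured_width - 1.0

# Hypothetical 4-bin delay line: one wide bin, one narrow bin
hits = [1000, 1500, 500, 1000]
dnl = dnl_from_code_density(hits, ideal_bins=4)
print(dnl)  # [0.0, 0.5, -0.5, 0.0]
```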

  5. A Comprehensive Expedient Methods Field Manual.

    DTIC Science & Technology

    1984-09-01

    structures. "Revetments may be constructed of sandbags, sod blocks, and other expedients [17:933]." Bunkers are emplacements with overhead protective... Table-of-contents fragment: Lapland Fence (p. 75); Hardening: Dimensional Timber (Soil Bin) Revetment (p. 76); Hardening: Log Bulkhead (Soil Bin) Revetment (p. 77); Hardening: Landing Mat Bulkhead (Soil Bin) Revetment.

  6. Sensitivity of a Cloud-Resolving Model to the Bulk and Explicit Bin Microphysical Schemes. Part 1; Validations with a PRE-STORM Case

    NASA Technical Reports Server (NTRS)

    Li, Xiao-Wen; Tao, Wei-Kuo; Khain, Alexander P.; Simpson, Joanne; Johnson, Daniel E.

    2004-01-01

    A cloud-resolving model is used to study the sensitivities of two different microphysical schemes, one a bulk type and the other an explicit bin scheme, in simulating a mid-latitude squall line case (PRE-STORM, June 10-11, 1985). Simulations using the different microphysical schemes are compared with each other and with the observations. Both the bulk and bin models reproduce the general features of the developing and mature stages of the system. The leading convective zone, the trailing stratiform region, the horizontal wind flow patterns, the pressure perturbation associated with the storm dynamics, and the cool pool in front of the system all agree well with the observations. Both the observations and the bulk scheme simulation serve as validations for the newly incorporated bin scheme. However, it is also shown that the bulk and bin simulations have distinct differences, most notably in the stratiform region. Weak convective cells exist in the stratiform region in the bulk simulation, but not in the bin simulation. These weak convective cells in the stratiform region are remnants of the previous stronger convection at the leading edge of the system. The bin simulation, on the other hand, has a horizontally homogeneous stratiform cloud structure, which agrees better with the observations. Preliminary examinations of the downdraft core strength, the potential temperature perturbation, and the evaporative cooling rate show that the differences between the bulk and bin models are due mainly to the stronger low-level evaporative cooling in the convective zone simulated in the bulk model. Further quantitative analysis and sensitivity tests for this case using both the bulk and bin models will be presented in a companion paper.

  7. Increasing donations to supermarket food-bank bins using proximal prompts.

    PubMed

    Farrimond, Samantha J; Leland, Louis S

    2006-01-01

    There has been little research into interventions to increase participation in donating items to food-bank bins. In New Zealand, there has been an increased demand from food banks (Stewart, 2002). This study demonstrated that point-of-sale prompts can be an effective method of increasing donations to a supermarket food-bank bin.

  8. Shapes on a plane: Evaluating the impact of projection distortion on spatial binning

    USGS Publications Warehouse

    Battersby, Sarah E.; Strebe, Daniel “daan”; Finn, Michael P.

    2017-01-01

    One method for working with large, dense sets of spatial point data is to aggregate the measure of the data into polygonal containers, such as political boundaries, or into regular spatial bins such as triangles, squares, or hexagons. When mapping these aggregations, the map projection must inevitably distort relationships. This distortion can impact the reader’s ability to compare count and density measures across the map. Spatial binning, particularly via hexagons, is becoming a popular technique for displaying aggregate measures of point data sets. Increasingly, we see questionable use of the technique without attendant discussion of its hazards. In this work, we discuss when and why spatial binning works and how mapmakers can better understand the limitations caused by distortion from projecting to the plane. We introduce equations for evaluating distortion’s impact on one common projection (Web Mercator) and discuss how the methods used generalize to other projections. While we focus on hexagonal binning, these same considerations affect spatial bins of any shape, and more generally, any analysis of geographic data performed in planar space.
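
    For Web Mercator specifically, the point scale factor is sec(latitude), so areas are inflated by its square; equal screen-space hexagons therefore cover ever smaller ground areas toward the poles. A one-function sketch of that inflation factor:

```python
import math

def mercator_area_inflation(lat_deg):
    """Web Mercator inflates areas by sec^2(latitude), so equal-size
    hexagons drawn on the map cover smaller ground areas at high
    latitudes -- the density-comparison hazard discussed above."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2

print(mercator_area_inflation(0.0))   # 1.0 at the equator
print(mercator_area_inflation(60.0))  # 4.0 at 60 degrees latitude
```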

  9. Exploiting jet binning to identify the initial state of high-mass resonances

    NASA Astrophysics Data System (ADS)

    Ebert, Markus A.; Liebler, Stefan; Moult, Ian; Stewart, Iain W.; Tackmann, Frank J.; Tackmann, Kerstin; Zeune, Lisa

    2016-09-01

    If a new high-mass resonance is discovered at the Large Hadron Collider, model-independent techniques to identify the production mechanism will be crucial to understand its nature and effective couplings to Standard Model particles. We present a powerful and model-independent method to infer the initial state in the production of any high-mass color-singlet system by using a tight veto on accompanying hadronic jets to divide the data into two mutually exclusive event samples (jet bins). For a resonance of several hundred GeV, the jet binning cut needed to discriminate quark and gluon initial states is in the experimentally accessible range of several tens of GeV. It also yields comparable cross sections for both bins, making this method viable already with the small event samples available shortly after a discovery. Theoretically, the method is made feasible by utilizing an effective field theory setup to compute the jet cut dependence precisely and model independently and to systematically control all sources of theoretical uncertainties in the jet binning, as well as their correlations. We use a 750 GeV scalar resonance as an example to demonstrate the viability of our method.
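
    The jet-binning division itself is simple to sketch: events are split into mutually exclusive 0-jet and >=1-jet samples by whether any jet exceeds the veto. The toy events and veto value below are invented for illustration:

```python
def jet_bins(events, pt_veto):
    """Divide events into two mutually exclusive samples by a jet
    veto: a 0-jet bin (no jet above pt_veto) and a >=1-jet bin.
    `events` is a list of per-event jet-pT lists (toy data, GeV)."""
    zero_jet = [e for e in events if not any(pt > pt_veto for pt in e)]
    ge1_jet = [e for e in events if any(pt > pt_veto for pt in e)]
    return zero_jet, ge1_jet

events = [[12.0], [45.0, 20.0], [], [28.0], [90.0]]
b0, b1 = jet_bins(events, pt_veto=30.0)
print(len(b0), len(b1))  # 3 events in the 0-jet bin, 2 in the >=1-jet bin
```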

  10. PhyloPythiaS+: a self-training method for the rapid reconstruction of low-ranking taxonomic bins from metagenomes.

    PubMed

    Gregor, Ivan; Dröge, Johannes; Schirmer, Melanie; Quince, Christopher; McHardy, Alice C

    2016-01-01

    Background. Metagenomics is an approach for characterizing environmental microbial communities in situ; it allows their functional and taxonomic characterization and the recovery of sequences from uncultured taxa. This is often achieved by a combination of sequence assembly and binning, where sequences are grouped into 'bins' representing taxa of the underlying microbial community. Assignment to low-ranking taxonomic bins is an important challenge for binning methods, as is scalability to Gb-sized datasets generated with deep sequencing techniques. One of the best available methods for recovering species-level bins from deep-branching phyla is the expert-trained PhyloPythiaS package, where a human expert decides on the taxa to incorporate in the model and identifies 'training' sequences based on marker genes directly from the sample. Due to the manual effort involved, this approach does not scale to multiple metagenome samples and requires substantial expertise, which researchers who are new to the area do not have. Results. We have developed PhyloPythiaS+, a successor to our PhyloPythiaS software. The new (+) component performs the work previously done by the human expert. PhyloPythiaS+ also includes a new k-mer counting algorithm, which accelerated the simultaneous counting of 4-6-mers used for taxonomic binning 100-fold and reduced the overall execution time of the software by a factor of three. Our software makes it possible to analyze Gb-sized metagenomes with inexpensive hardware and to recover species- or genus-level bins with low error rates in a fully automated fashion. PhyloPythiaS+ was compared to MEGAN, taxator-tk, Kraken and the generic PhyloPythiaS model. The results showed that PhyloPythiaS+ performs especially well for samples originating from novel environments in comparison to the other methods. Availability. PhyloPythiaS+ in a virtual machine is available for installation under Windows, Unix systems or OS X at: https://github.com/algbioi/ppsp/wiki.
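
    The simultaneous 4-6-mer counting can be sketched as a single pass that extends each sequence position to all three lengths at once; this shows the idea only, not the accelerated algorithm in the paper:

```python
from collections import Counter

def count_kmers(seq, kmin=4, kmax=6):
    """Count all k-mers for k = kmin..kmax in one pass over the
    sequence -- the simultaneous 4-6-mer counting used for
    taxonomic binning (sketch of the idea only)."""
    counts = Counter()
    for i in range(len(seq)):
        for k in range(kmin, kmax + 1):
            if i + k <= len(seq):
                counts[seq[i:i + k]] += 1
    return counts

c = count_kmers("ACGTACGT")
print(c["ACGT"], c["ACGTA"], c["ACGTAC"])  # 2 1 1
```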

  11. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115
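
    A toy contrast between the two approaches: an unbinned maximum-likelihood estimate uses every inter-event gap directly, while a binned count of gaps inside a fixed window discards where each gap falls. The 14-day window echoes the one criticized above; the gap values are invented:

```python
def unbinned_exponential_mle(gaps):
    """Unbinned maximum-likelihood estimate of an exponential rate
    from inter-event gaps (e.g. days between events). Every value
    contributes directly to the estimate."""
    return len(gaps) / sum(gaps)

def binned_fraction(gaps, window):
    """The binned alternative: the fraction of gaps inside a fixed
    window, which discards the information within the window."""
    return sum(1 for g in gaps if g <= window) / len(gaps)

gaps = [2.0, 5.0, 1.0, 9.0, 3.0]  # invented inter-event gaps, in days
print(unbinned_exponential_mle(gaps))   # 0.25 events per day
print(binned_fraction(gaps, 14.0))      # 1.0 -- the window is too coarse
```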

  12. FLIMX: A Software Package to Determine and Analyze the Fluorescence Lifetime in Time-Resolved Fluorescence Data from the Human Eye

    PubMed Central

    Klemm, Matthias; Schweitzer, Dietrich; Peters, Sven; Sauer, Lydia; Hammer, Martin; Haueisen, Jens

    2015-01-01

    Fluorescence lifetime imaging ophthalmoscopy (FLIO) is a new technique for measuring the in vivo autofluorescence intensity decays generated by endogenous fluorophores in the ocular fundus. Here, we present a software package called FLIM eXplorer (FLIMX) for analyzing FLIO data. Specifically, we introduce a new adaptive binning approach as an optimal tradeoff between the spatial resolution and the number of photons required per pixel. We also expand existing decay models (multi-exponential, stretched exponential, spectral global analysis, incomplete decay) to account for the layered structure of the eye and present a method to correct for the influence of the crystalline lens fluorescence on the retina fluorescence. Subsequently, the Holm-Bonferroni method is applied to FLIO measurements to allow for group comparisons between patients and controls on the basis of fluorescence lifetime parameters. The performance of the new approaches was evaluated in five experiments. Specifically, we evaluated static and adaptive binning in a diabetes mellitus patient, we compared the different decay models in a healthy volunteer and performed a group comparison between diabetes patients and controls. An overview of the visualization capabilities and a comparison of static and adaptive binning is shown for a patient with macular hole. FLIMX’s applicability to fluorescence lifetime imaging microscopy is shown in the ganglion cell layer of a porcine retina sample, obtained by a laser scanning microscope using two-photon excitation. PMID:26192624

  13. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.

  14. Adjacent bin stability evaluating for feature description

    NASA Astrophysics Data System (ADS)

    Nie, Dongdong; Ma, Qinyong

    2018-04-01

    A recent study improves descriptor performance by accumulating stability votes over all scale pairs to compose the local descriptor. We argue that the stability of a bin depends on the differences across adjacent scale pairs more than on the differences across all scale pairs, and we compose a new local descriptor based on this hypothesis. A series of SIFT descriptors is first extracted at multiple scales. The difference of each bin across adjacent scales is then calculated, and the stability value derived from it is accumulated to compose the final descriptor. The performance of the proposed method is evaluated on two popular matching datasets and compared with other state-of-the-art works. Experimental results show that the proposed method performs satisfactorily.
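
    The adjacent-pair hypothesis can be sketched as accumulating, per bin, one vote for each adjacent scale pair whose values agree closely; the threshold and vote rule below are assumptions for illustration:

```python
import numpy as np

def adjacent_stability(descriptors, thresh=0.1):
    """Accumulate, per bin, a stability vote for each ADJACENT scale
    pair whose bin values differ by less than `thresh` (illustrative
    sketch of the adjacent-pair idea)."""
    d = np.asarray(descriptors, dtype=float)  # shape: (scales, bins)
    diffs = np.abs(np.diff(d, axis=0))        # adjacent-scale differences
    return (diffs < thresh).sum(axis=0)       # votes per bin

# Toy descriptor values at three scales, three bins
descs = [[0.50, 0.2, 0.90],
         [0.52, 0.6, 0.88],
         [0.49, 0.1, 0.91]]
print(adjacent_stability(descs))  # bins 0 and 2 stable across both pairs
```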

  15. Increasing Donations to Supermarket Food-Bank Bins Using Proximal Prompts

    ERIC Educational Resources Information Center

    Farrimond, Samantha J.; Leland, Louis S., Jr.

    2006-01-01

    There has been little research into interventions to increase participation in donating items to food-bank bins. In New Zealand, there has been an increased demand from food banks (Stewart, 2002). This study demonstrated that point-of-sale prompts can be an effective method of increasing donations to a supermarket food-bank bin.

  1. Surgical Technical Evidence Review of Hip Fracture Surgery Conducted for the AHRQ Safety Program for Improving Surgical Care and Recovery.

    PubMed

    Siletz, Anaar; Childers, Christopher P; Faltermeier, Claire; Singer, Emily S; Hu, Q Lina; Ko, Clifford Y; Kates, Stephen L; Maggard-Gibbons, Melinda; Wick, Elizabeth

    2018-01-01

    Enhanced recovery pathways (ERPs) have been shown to improve patient outcomes in a variety of contexts. This review summarizes the evidence and defines a protocol for perioperative care of patients with hip fracture and was conducted for the Agency for Healthcare Research and Quality safety program for improving surgical care and recovery. Perioperative care was divided into components or "bins." For each bin, a semisystematic review of the literature was conducted using MEDLINE with priority given to systematic reviews, meta-analyses, and randomized controlled trials. Observational studies were included when higher levels of evidence were not available. Existing guidelines for perioperative care were also incorporated. For convenience, the components of care that are under the auspices of anesthesia providers will be reported separately. Recommendations for an evidence-based protocol were synthesized based on review of this evidence. Eleven bins were identified. Preoperative risk factor bins included nutrition, diabetes mellitus, tobacco use, and anemia. Perioperative management bins included thromboprophylaxis, timing of surgery, fluid management, drain placement, early mobilization, early alimentation, and discharge criteria/planning. This review provides the evidence basis for an ERP for perioperative care of patients with hip fracture.

  2. Nonlocal low-rank and sparse matrix decomposition for spectral CT reconstruction

    NASA Astrophysics Data System (ADS)

    Niu, Shanzhou; Yu, Gaohang; Ma, Jianhua; Wang, Jing

    2018-02-01

    Spectral computed tomography (CT) has been a promising technique in research and clinics because of its ability to produce improved energy resolution images with narrow energy bins. However, the narrow energy bin image is often affected by serious quantum noise because of the limited number of photons used in the corresponding energy bin. To address this problem, we present an iterative reconstruction method for spectral CT using nonlocal low-rank and sparse matrix decomposition (NLSMD), which exploits the self-similarity of patches that are collected in multi-energy images. Specifically, each set of patches can be decomposed into a low-rank component and a sparse component, and the low-rank component represents the stationary background over different energy bins, while the sparse component represents the rest of the different spectral features in individual energy bins. Subsequently, an effective alternating optimization algorithm was developed to minimize the associated objective function. To validate and evaluate the NLSMD method, qualitative and quantitative studies were conducted by using simulated and real spectral CT data. Experimental results show that the NLSMD method improves spectral CT images in terms of noise reduction, artifact suppression and resolution preservation.
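
    One shrinkage step of such a decomposition can be sketched with singular-value thresholding for the low-rank part and soft thresholding for the sparse part; the full method alternates such steps, so this is a single-iteration sketch with invented data:

```python
import numpy as np

def low_rank_sparse_step(M, sv_thresh, sp_thresh):
    """One shrinkage step of a low-rank + sparse split: singular-value
    thresholding yields the low-rank background L, and soft
    thresholding of the residual yields the sparse component S."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - sv_thresh, 0.0)       # shrink singular values
    L = (U * s) @ Vt
    R = M - L
    S = np.sign(R) * np.maximum(np.abs(R) - sp_thresh, 0.0)
    return L, S

rng = np.random.default_rng(0)
base = np.outer(rng.random(8), rng.random(6))  # rank-1 "background"
M = base.copy()
M[2, 3] += 5.0                                 # one sparse outlier
L, S = low_rank_sparse_step(M, sv_thresh=0.5, sp_thresh=0.5)
print(L.shape, S.shape)
```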

  3. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    PubMed

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

    Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than the differences between bin values used in traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
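
    The distances the abstract combines can be sketched as follows. The ℓ1 and χ² distances are standard; the pairwise-ratio measure is only an illustrative O(n²) stand-in for the BRD (whose exact, linear-complexity formula is not reproduced here).

```python
import numpy as np

def l1_dist(h, g):
    return np.sum(np.abs(h - g))

def chi2_dist(h, g, eps=1e-12):
    return 0.5 * np.sum((h - g) ** 2 / (h + g + eps))

def ratio_dist(h, g, eps=1e-12):
    """Illustrative intra-cross-bin measure: compare all pairwise bin
    ratios of the two histograms. O(n^2) toy version -- the published
    BRD is computed with linear complexity."""
    Rh = (h[:, None] + eps) / (h[None, :] + eps)
    Rg = (g[:, None] + eps) / (g[None, :] + eps)
    return np.mean(np.abs(Rh - Rg) / (Rh + Rg))

h = np.array([0.1, 0.4, 0.3, 0.2])   # normalized histograms
g = np.array([0.2, 0.3, 0.3, 0.2])
```

    Because ratios are invariant to rescaling a histogram, such a measure is unaffected by normalization, which is the robustness property the paper exploits.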

  4. Disrupted Membrane Structure and Intracellular Ca2+ Signaling in Adult Skeletal Muscle with Acute Knockdown of Bin1

    PubMed Central

    Tjondrokoesoemo, Andoria; Park, Ki Ho; Ferrante, Christopher; Komazaki, Shinji; Lesniak, Sebastian; Brotto, Marco; Ko, Jae-Kyun; Zhou, Jingsong; Weisleder, Noah; Ma, Jianjie

    2011-01-01

    Efficient intracellular Ca2+ ([Ca2+]i) homeostasis in skeletal muscle requires intact triad junctional complexes comprised of t-tubule invaginations of plasma membrane and terminal cisternae of sarcoplasmic reticulum. Bin1 consists of a specialized BAR domain that is associated with t-tubule development in skeletal muscle and involved in tethering the dihydropyridine receptors (DHPR) to the t-tubule. Here, we show that Bin1 is important for Ca2+ homeostasis in adult skeletal muscle. Since systemic ablation of Bin1 in mice results in postnatal lethality, an in vivo electroporation-mediated transfection method was used to deliver an RFP-tagged plasmid producing short-hairpin (sh)RNA targeting Bin1 (shRNA-Bin1), to study the effect of Bin1 knockdown in adult mouse FDB skeletal muscle. Upon confirming the reduction of endogenous Bin1 expression, we showed that shRNA-Bin1 muscle displayed swollen t-tubule structures, indicating that Bin1 is required for the maintenance of intact membrane structure in adult skeletal muscle. Reduced Bin1 expression led to disruption of t-tubule structure that was linked with alterations to intracellular Ca2+ release. Voltage-induced Ca2+ release in isolated single muscle fibers of shRNA-Bin1 showed that both the mean amplitude of the Ca2+ current and the SR Ca2+ transient were reduced when compared to the shRNA-control, indicating compromised coupling between DHPR and ryanodine receptor 1. The mean frequency of osmotic stress-induced Ca2+ sparks was reduced in shRNA-Bin1, indicating compromised DHPR activation. ShRNA-Bin1 fibers also displayed reduced Ca2+ spark amplitude, which was attributed to decreased total Ca2+ stores in the shRNA-Bin1 fibers. Human mutations of Bin1 are associated with centronuclear myopathy, and the SH3 domain of Bin1 is important for sarcomeric protein organization in skeletal muscle.
Our study, showing the importance of Bin1 in the maintenance of intact t-tubule structure and [Ca2+]i homeostasis in adult skeletal muscle, could provide mechanistic insight into the potential role of Bin1 in skeletal muscle contractility and the pathology of myopathy. PMID:21984944

  5. A novel algorithm for fast and efficient multifocus wavefront shaping

    NASA Astrophysics Data System (ADS)

    Fayyaz, Zahra; Nasiriavanaki, Mohammadreza

    2018-02-01

    Wavefront shaping using a spatial light modulator (SLM) is a popular method for focusing light through turbid media, such as biological tissue. In iterative optimization methods, because of the very large number of pixels in the SLM, pixels are usually grouped into larger super-pixels, or bins, and the phase values of the bins are adjusted to obtain an optimum phase map and hence a focus. In this study an efficient optimization algorithm is proposed to obtain an arbitrary map of foci utilizing all the SLM pixels or small bin sizes. The application of such a methodology in dermatology, hair removal in particular, is explored and discussed.
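
    The bin-wise phase search described above can be sketched as sequential coordinate ascent against a random transmission vector. The transmission model, the bin count, and the 8-level phase grid are assumptions for illustration, not the authors' proposed algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bins = 32   # SLM pixels grouped into bins
# random complex transmission from each bin to one target focus
t = (rng.normal(size=n_bins) + 1j * rng.normal(size=n_bins)) / np.sqrt(2 * n_bins)

def intensity(phases):
    """Intensity at the target for a given phase map."""
    return np.abs(np.sum(t * np.exp(1j * phases))) ** 2

phases = np.zeros(n_bins)
trial = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)  # 8 phase levels
before = intensity(phases)
for k in range(n_bins):
    # coordinate ascent: keep the trial phase that maximizes the focus
    def with_phase(p, k=k):
        q = phases.copy()
        q[k] = p
        return q
    phases[k] = max(trial, key=lambda p: intensity(with_phase(p)))
after = intensity(phases)
```

    Because the current phase (zero) is always among the trial values, each step can only maintain or increase the focal intensity.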

  6. GenoGAM: genome-wide generalized additive models for ChIP-Seq analysis.

    PubMed

    Stricker, Georg; Engelhardt, Alexander; Schulz, Daniel; Schmid, Matthias; Tresch, Achim; Gagneur, Julien

    2017-08-01

    Chromatin immunoprecipitation followed by deep sequencing (ChIP-Seq) is a widely used approach to study protein-DNA interactions. Often, the quantities of interest are the differential occupancies relative to controls, between genetic backgrounds, treatments, or combinations thereof. Current methods for differential occupancy of ChIP-Seq data, however, rely on binning or sliding window techniques, for which the choice of window and bin sizes is subjective. Here, we present GenoGAM (Genome-wide Generalized Additive Model), which brings the well-established and flexible generalized additive models framework to genomic applications using a data parallelism strategy. We model ChIP-Seq read count frequencies as products of smooth functions along chromosomes. Smoothing parameters are objectively estimated from the data by cross-validation, eliminating the ad hoc binning and windowing needed by current approaches. GenoGAM provides base-level and region-level significance testing for full factorial designs. Application to a ChIP-Seq dataset in yeast showed increased sensitivity over existing differential occupancy methods while controlling the type I error rate. By analyzing a set of DNA methylation data and illustrating an extension to a peak caller, we further demonstrate the potential of GenoGAM as a generic statistical modeling tool for genome-wide assays. Software is available from Bioconductor: https://www.bioconductor.org/packages/release/bioc/html/GenoGAM.html . gagneur@in.tum.de. Supplementary information is available at Bioinformatics online.

  7. Markov chain Monte Carlo linkage analysis: effect of bin width on the probability of linkage.

    PubMed

    Slager, S L; Juo, S H; Durner, M; Hodge, S E

    2001-01-01

    We analyzed part of the Genetic Analysis Workshop (GAW) 12 simulated data using Markov chain Monte Carlo (MCMC) methods that are implemented in the computer program Loki. The MCMC method reports the "probability of linkage" (PL) across the chromosomal regions of interest. The point of maximum PL can then be taken as a "location estimate" for the location of the quantitative trait locus (QTL). However, Loki does not provide a formal statistical test of linkage. In this paper, we explore how the bin width used in the calculations affects the maximum PL and the location estimate. We analyzed age at onset (AO) and quantitative trait number 5, Q5, from 26 replicates of the general simulated data in one region where we knew a major gene, MG5, is located. For each trait, we found the maximum PL and the corresponding location estimate, using four different bin widths. We found that bin width, as expected, does affect the maximum PL and the location estimate, and we recommend that users of Loki explore how their results vary with different bin widths.
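
    The sensitivity of a location estimate to bin width can be illustrated with synthetic data: binning the same samples at different widths moves the midpoint of the peak bin. The locus position, spread, and bin widths below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
# synthetic linkage signal: samples clustered around a true locus at 57.3 cM
positions = rng.normal(57.3, 4.0, size=2000)

def location_estimate(samples, bin_width, lo=0.0, hi=100.0):
    """Midpoint of the bin with the highest count (a "max PL" analogue)."""
    edges = np.arange(lo, hi + bin_width, bin_width)
    counts, _ = np.histogram(samples, bins=edges)
    peak = int(np.argmax(counts))
    return 0.5 * (edges[peak] + edges[peak + 1])

estimates = {w: location_estimate(positions, w) for w in (1.0, 2.0, 5.0, 10.0)}
```

    Wider bins coarsen the location estimate toward bin midpoints, which is why exploring several bin widths, as the authors recommend, is prudent.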

  8. Surgical Technical Evidence Review of Hip Fracture Surgery Conducted for the AHRQ Safety Program for Improving Surgical Care and Recovery

    PubMed Central

    Siletz, Anaar; Faltermeier, Claire; Singer, Emily S.; Hu, Q. Lina; Ko, Clifford Y.; Kates, Stephen L.; Maggard-Gibbons, Melinda; Wick, Elizabeth

    2018-01-01

    Background: Enhanced recovery pathways (ERPs) have been shown to improve patient outcomes in a variety of contexts. This review, conducted for the Agency for Healthcare Research and Quality safety program for improving surgical care and recovery, summarizes the evidence and defines a protocol for perioperative care of patients with hip fracture. Study Design: Perioperative care was divided into components or “bins.” For each bin, a semisystematic review of the literature was conducted using MEDLINE, with priority given to systematic reviews, meta-analyses, and randomized controlled trials. Observational studies were included when higher levels of evidence were not available. Existing guidelines for perioperative care were also incorporated. For convenience, the components of care that are under the auspices of anesthesia providers will be reported separately. Recommendations for an evidence-based protocol were synthesized based on review of this evidence. Results: Eleven bins were identified. Preoperative risk factor bins included nutrition, diabetes mellitus, tobacco use, and anemia. Perioperative management bins included thromboprophylaxis, timing of surgery, fluid management, drain placement, early mobilization, early alimentation, and discharge criteria/planning. Conclusions: This review provides the evidence basis for an ERP for perioperative care of patients with hip fracture. PMID:29844947

  9. Interference Cognizant Network Scheduling

    NASA Technical Reports Server (NTRS)

    Hall, Brendan (Inventor); Bonk, Ted (Inventor); DeLay, Benjamin F. (Inventor); Varadarajan, Srivatsan (Inventor); Smithgall, William Todd (Inventor)

    2017-01-01

    Systems and methods for interference cognizant network scheduling are provided. In certain embodiments, a method of scheduling communications in a network comprises identifying a bin of a global timeline for scheduling an unscheduled virtual link, wherein a bin is a segment of the timeline; identifying a pre-scheduled virtual link in the bin; and determining if the pre-scheduled and unscheduled virtual links share a port. In certain embodiments, if the unscheduled and pre-scheduled virtual links don't share a port, scheduling transmission of the unscheduled virtual link to overlap with the scheduled transmission of the pre-scheduled virtual link; and if the unscheduled and pre-scheduled virtual links share a port: determining a start time delay for the unscheduled virtual link based on the port; and scheduling transmission of the unscheduled virtual link in the bin based on the start time delay to overlap part of the scheduled transmission of the pre-scheduled virtual link.
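
    The port-conflict check described in the claims can be sketched as follows; the class names and the fixed per-conflict delay are hypothetical illustration choices, not the patented scheduling policy.

```python
from dataclasses import dataclass
from typing import List, Set

@dataclass
class VirtualLink:
    name: str
    ports: Set[str]
    start: float = 0.0   # transmission offset within the bin

def schedule_in_bin(scheduled: List[VirtualLink], new: VirtualLink,
                    per_conflict_delay: float = 0.125) -> VirtualLink:
    """If no port is shared, transmit concurrently (start = 0);
    otherwise delay the start time past the conflicting links."""
    conflicts = [vl for vl in scheduled if vl.ports & new.ports]
    new.start = per_conflict_delay * len(conflicts)
    scheduled.append(new)
    return new

bin_links: List[VirtualLink] = []
schedule_in_bin(bin_links, VirtualLink("VL1", {"p1"}))
schedule_in_bin(bin_links, VirtualLink("VL2", {"p2"}))        # no shared port
schedule_in_bin(bin_links, VirtualLink("VL3", {"p1", "p3"}))  # shares p1 with VL1
```

    Links without shared ports fully overlap in the bin; a shared port triggers a start-time delay, as in the claimed method.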

  10. BIN1 is Reduced and Cav1.2 Trafficking is Impaired in Human Failing Cardiomyocytes

    PubMed Central

    Hong, Ting-Ting; Smyth, James W.; Chu, Kevin Y.; Vogan, Jacob M.; Fong, Tina S.; Jensen, Brian C.; Fang, Kun; Halushka, Marc K.; Russell, Stuart D.; Colecraft, Henry; Hoopes, Charles W.; Ocorr, Karen; Chi, Neil C.; Shaw, Robin M.

    2011-01-01

    Background Heart failure is a growing epidemic, and a typical aspect of heart failure pathophysiology is altered calcium transients. Normal cardiac calcium transients are initiated by Cav1.2 channels at cardiac T-tubules. BIN1 is a membrane scaffolding protein that causes Cav1.2 to traffic to T-tubules in healthy hearts. The mechanisms of Cav1.2 trafficking in heart failure are not known. Objective To study BIN1 expression and its effect on Cav1.2 trafficking in failing hearts. Methods Intact myocardium and freshly isolated cardiomyocytes from non-failing and end-stage failing human hearts were used to study BIN1 expression and Cav1.2 localization. To confirm the dependence of Cav1.2 surface expression on BIN1, patch clamp recordings of Cav1.2 current were performed in cell lines with and without trafficking-competent BIN1. Also, in adult mouse cardiomyocytes, surface Cav1.2 and calcium transients were studied after shRNA-mediated knockdown of BIN1. For a functional readout in intact heart, calcium transients and cardiac contractility were analyzed in a zebrafish model with morpholino-mediated knockdown of BIN1. Results BIN1 expression is significantly decreased in failing cardiomyocytes at both the mRNA (30% down) and protein (36% down) levels. Peripheral Cav1.2 is reduced by 42% by imaging, and the biochemical T-tubule fraction of Cav1.2 is reduced by 68%. Total calcium current is reduced by 41% in a cell line expressing a non-trafficking BIN1 mutant. In mouse cardiomyocytes, BIN1 knockdown decreases surface Cav1.2 and impairs calcium transients. In zebrafish hearts, BIN1 knockdown causes a 75% reduction in calcium transients and severe ventricular contractile dysfunction. Conclusions The data indicate that BIN1 is significantly reduced in human heart failure, and this reduction impairs Cav1.2 trafficking, calcium transients, and contractility. PMID:22138472

  11. EMUDRA: Ensemble of Multiple Drug Repositioning Approaches to Improve Prediction Accuracy.

    PubMed

    Zhou, Xianxiao; Wang, Minghui; Katsyv, Igor; Irie, Hanna; Zhang, Bin

    2018-04-24

    Availability of large-scale genomic, epigenetic and proteomic data in complex diseases makes it possible to objectively and comprehensively identify therapeutic targets that can lead to new therapies. The Connectivity Map has been widely used to explore novel indications of existing drugs. However, the prediction accuracy of the existing methods, such as the Kolmogorov-Smirnov statistic, remains low. Here we present a novel high-performance drug repositioning approach that improves over the state-of-the-art methods. We first designed an expression weighted cosine method (EWCos) to minimize the influence of the uninformative expression changes and then developed an ensemble approach termed EMUDRA (Ensemble of Multiple Drug Repositioning Approaches) to integrate EWCos and three existing state-of-the-art methods. EMUDRA significantly outperformed individual drug repositioning methods when applied to simulated and independent evaluation datasets. Using EMUDRA, we predicted and experimentally validated the antibiotic rifabutin as an inhibitor of cell growth in triple-negative breast cancer. EMUDRA can identify drugs that more effectively target disease gene signatures and will thus be a useful tool for identifying novel therapies for complex diseases and predicting new indications for existing drugs. The EMUDRA R package is available at doi:10.7303/syn11510888. bin.zhang@mssm.edu or zhangb@hotmail.com. Supplementary data are available at Bioinformatics online.
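
    The idea of an expression-weighted cosine can be sketched as below. The magnitude weighting shown is an assumed, illustrative choice, not the published EWCos definition, and the signatures are synthetic.

```python
import numpy as np

def ewcos(query, drug, eps=1e-12):
    """Cosine similarity with genes weighted by the magnitude of the
    query signature, so near-zero (uninformative) changes contribute
    little. Illustrative weighting, not the published EWCos formula."""
    w = np.abs(query)
    num = np.sum(w * query * drug)
    den = np.sqrt(np.sum(w * query ** 2) * np.sum(w * drug ** 2)) + eps
    return num / den

disease = np.array([2.0, -1.5, 0.1, 0.05, 1.0])   # disease signature (logFC)
reverser = -disease + 0.05                         # drug reversing the signature
unrelated = np.array([0.1, 0.2, -0.1, 0.05, -0.2])
```

    A strongly negative score flags a signature-reversing drug, which is the desired connectivity-map behavior; near-zero scores mark unrelated perturbations.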

  12. Symmetric autocompensating quantum key distribution

    NASA Astrophysics Data System (ADS)

    Walton, Zachary D.; Sergienko, Alexander V.; Levitin, Lev B.; Saleh, Bahaa E. A.; Teich, Malvin C.

    2004-08-01

    We present quantum key distribution schemes which are autocompensating (require no alignment) and symmetric (Alice and Bob receive photons from a central source) for both polarization and time-bin qubits. The primary benefit of the symmetric configuration is that both Alice and Bob may have passive setups (neither Alice nor Bob is required to make active changes for each run of the protocol). We show that both the polarization and the time-bin schemes may be implemented with existing technology. The new schemes are related to previously described schemes by the concept of advanced waves.

  13. Measurement of Plutonium-240 Angular Momentum Dependent Fission Probabilities Using the Alpha-Alpha' Reaction

    NASA Astrophysics Data System (ADS)

    Koglin, Johnathon

    Accurate nuclear reaction data from a few keV to tens of MeV and across the table of nuclides are essential to a number of applications of nuclear physics, including national security, nuclear forensics, nuclear astrophysics, and nuclear energy. Precise determination of (n, f) and neutron capture cross sections for reactions in high-flux environments is particularly important for a proper understanding of nuclear reactor performance and stellar nucleosynthesis. In these extreme environments reactions on short-lived and otherwise difficult-to-produce isotopes play a significant role in system evolution and provide insights into the types of nuclear processes taking place; a detailed understanding of these processes is necessary to properly determine cross sections far from stability. Indirect methods are often attempted to measure cross sections on isotopes that are difficult to separate in a laboratory setting. Using the surrogate approach, the same compound nucleus as in the reaction of interest is created through a "surrogate" reaction on a different isotope and the resulting decay is measured. This result is combined with appropriate reaction theory for compound nucleus population, from which the desired cross sections can be inferred. This method has shown promise, but the theoretical framework often lacks the experimental data necessary to constrain models. In this work, dual arrays of silicon telescope particle identification detectors and photovoltaic (solar) cell fission fragment detectors were used to measure the fission probability of the 240Pu(alpha, alpha'f) reaction - a surrogate for 239Pu(n, f) - at a beam energy of 35.9(2) MeV, at eleven scattering angles from 40° to 140° in 10° intervals and at nuclear excitation energies up to 16 MeV. Within experimental uncertainty, the maximum fission probability was observed at the neutron separation energy for each alpha scattering angle.
Fission probabilities were separated into five 500 keV bins from 5.5 MeV to 8.0 MeV and one bin from 4.5 MeV to 5.5 MeV. Across energy bins the fission probability increases approximately linearly with excitation energy: at 90° the fission probability increases from 0.069(6) in the lowest energy bin to 0.59(2) in the highest. Likewise, within a single energy bin the fission probability increases with alpha' scattering angle: within the 6.5-7.0 MeV energy bin, the fission probability increased from 0.41(1) at 60° to 0.81(10) at 140°. Fission fragment angular distributions were also measured, integrated over each energy bin. These distributions were fit to theoretical distributions based on combinations of transitional nuclear vibrational and rotational excitations at the saddle point. Contributions from specific K vibrational states were extracted and combined with fission probability measurements to determine the relative fission probability of each state as a function of nuclear excitation energy. Within a given excitation energy bin, contributions from K states greater than the minimum K = 0 state tend to increase with increasing alpha' scattering angle. This is attributed to an increase in the transferred angular momentum associated with larger scattering angles. The 90° alpha' scattering angle produced the highest quality results. The relative contributions of K states do not show a discernible trend across the energy spectrum. The energy-binned results confirm existing measurements that place a K = 2 state in the first energy bin, with the opening of K = 1 and K = 4 states at energies above 5.5 MeV. This experiment represents the first of its kind in which fission probabilities and angular distributions are simultaneously measured at a large number of scattering angles.
The acquired fission probabilities, angular distributions, and K state contributions provide a diverse dataset against which microscopic fission models can be constrained, furthering the understanding of the properties of 240Pu fission.

  14. Validating the operational bias and hypothesis of universal exponent in landslide frequency-area distribution.

    PubMed

    Huang, Jr-Chuan; Lee, Tsung-Yu; Teng, Tse-Yang; Chen, Yi-Chin; Huang, Cho-Ying; Lee, Cheing-Tung

    2014-01-01

    The exponent decay in landslide frequency-area distribution is widely used for assessing the consequences of landslides, with some studies arguing that the slope of the exponent decay is universal and independent of mechanisms and environmental settings. However, the documented exponent slopes are diverse, and data processing is hypothesized to be responsible for this inconsistency. An elaborated statistical experiment and two actual landslide inventories were used here to demonstrate the influence of data processing on the determination of the exponent. Seven categories with different landslide numbers were generated from the predefined inverse-gamma distribution and then analyzed by three data processing procedures (logarithmic binning, LB; normalized logarithmic binning, NLB; and cumulative distribution function, CDF). Five different bin widths were also considered while applying LB and NLB. Following that, maximum likelihood estimation was used to estimate the exponent slopes. The results showed that the exponents estimated by CDF were unbiased, while LB and NLB performed poorly. The two binning-based methods led to considerable biases that increased with landslide number and bin width. The standard deviations of the estimated exponents depended not just on the landslide number but also on the binning method and bin width. Both extremely few and extremely plentiful landslide numbers reduced the confidence of the estimated exponents, attributable to limited landslide numbers and considerable operational bias, respectively. The diverse exponents documented in the literature should therefore be adjusted accordingly. Our study strongly suggests that the considerable bias due to data processing and the data quality should be constrained in order to advance the understanding of landslide processes.
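
    The contrast between binning-based estimators and maximum likelihood can be reproduced in miniature. A Pareto sample stands in here for the paper's inverse-gamma distribution, and the naive log-binned least-squares fit is just one of many possible binning procedures:

```python
import numpy as np

rng = np.random.default_rng(3)
alpha_true = 2.4        # decay exponent of p(x) ~ x**(-alpha_true)
xmin, n = 1.0, 20000
# heavy-tailed sample (Pareto stand-in for the paper's inverse-gamma)
x = xmin * (1.0 - rng.random(n)) ** (-1.0 / (alpha_true - 1.0))

# 1) maximum likelihood (Hill estimator), analogous to the CDF route
alpha_mle = 1.0 + n / np.sum(np.log(x / xmin))

# 2) naive logarithmic binning + least-squares slope on raw counts
edges = xmin * 2.0 ** np.arange(0, 15)             # factor-2 bin widths
counts, _ = np.histogram(x, bins=edges)
mids = np.sqrt(edges[:-1] * edges[1:])             # geometric bin centers
keep = counts > 0                                  # empty bins cannot be logged
slope = np.polyfit(np.log(mids[keep]), np.log(counts[keep]), 1)[0]
alpha_lb = 1.0 - slope   # counts per factor-2 bin scale as x**(1 - alpha)
```

    The MLE typically lands within a fraction of a percent of the true exponent, while the log-binned slope drifts with bin width and with the sparse, noisy high-area bins, mirroring the operational bias the authors report.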

  15. Image quality in thoracic 4D cone-beam CT: A sensitivity analysis of respiratory signal, binning method, reconstruction algorithm, and projection angular spacing

    PubMed Central

    Shieh, Chun-Chien; Kipritidis, John; O’Brien, Ricky T.; Kuncic, Zdenka; Keall, Paul J.

    2014-01-01

    Purpose: Respiratory signal, binning method, and reconstruction algorithm are three major controllable factors affecting image quality in thoracic 4D cone-beam CT (4D-CBCT), which is widely used in image guided radiotherapy (IGRT). Previous studies have investigated each of these factors individually, but no integrated sensitivity analysis has been performed. In addition, projection angular spacing is also a key factor in reconstruction, but how it affects image quality is not obvious. An investigation of the impacts of these four factors on image quality can help determine the most effective strategy in improving 4D-CBCT for IGRT. Methods: Fourteen 4D-CBCT patient projection datasets with various respiratory motion features were reconstructed with the following controllable factors: (i) respiratory signal (real-time position management, projection image intensity analysis, or fiducial marker tracking), (ii) binning method (phase, displacement, or equal-projection-density displacement binning), and (iii) reconstruction algorithm [Feldkamp–Davis–Kress (FDK), McKinnon–Bates (MKB), or adaptive-steepest-descent projection-onto-convex-sets (ASD-POCS)]. The image quality was quantified using signal-to-noise ratio (SNR), contrast-to-noise ratio, and edge-response width in order to assess noise/streaking and blur. The SNR values were also analyzed with respect to the maximum, mean, and root-mean-squared-error (RMSE) projection angular spacing to investigate how projection angular spacing affects image quality. Results: The choice of respiratory signals was found to have no significant impact on image quality. Displacement-based binning was found to be less prone to motion artifacts compared to phase binning in more than half of the cases, but was shown to suffer from large interbin image quality variation and large projection angular gaps. Both MKB and ASD-POCS resulted in noticeably improved image quality almost 100% of the time relative to FDK. 
In addition, SNR values were found to increase with decreasing RMSE values of projection angular gaps with strong correlations (r ≈ −0.7) regardless of the reconstruction algorithm used. Conclusions: Based on the authors’ results, displacement-based binning methods, better reconstruction algorithms, and the acquisition of even projection angular views are the most important factors to consider for improving thoracic 4D-CBCT image quality. In view of the practical issues with displacement-based binning and the fact that projection angular spacing is not currently directly controllable, development of better reconstruction algorithms represents the most effective strategy for improving image quality in thoracic 4D-CBCT for IGRT applications at the current stage. PMID:24694143

  16. Diviner lunar radiometer gridded brightness temperatures from geodesic binning of modeled fields of view

    NASA Astrophysics Data System (ADS)

    Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.

    2017-12-01

    An approach is presented to efficiently produce high quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products that is significantly faster than if unprocessed points were used. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures are illustrated.
We identify four binning regimes based on trades between the spatial resolution of the grid, the size of the FOV and the on-target spacing of observations. Our approach may be applicable and beneficial for many existing and future point-based planetary datasets.
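
    The "store only touched cells" idea behind the binning database can be sketched with a sparse dictionary accumulator. An equal-angle grid stands in here for the geodesic icosahedral grid, and the point data are synthetic.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(4)
# toy point dataset: (latitude, longitude, brightness temperature)
lat = rng.uniform(-90.0, 90.0, 100000)
lon = rng.uniform(0.0, 360.0, 100000)
tb = 250.0 + 30.0 * np.cos(np.radians(lat)) + rng.normal(0.0, 2.0, lat.size)

# sparse binning: only cells actually touched by data are stored,
# avoiding the memory cost of a dense global high-resolution grid
res = 0.5  # grid resolution in degrees
acc = defaultdict(lambda: [0.0, 0])   # cell -> [sum, count]
for la, lo, t in zip(lat, lon, tb):
    key = (int((la + 90.0) // res), int(lo // res))
    cell = acc[key]
    cell[0] += t
    cell[1] += 1

# gridded record: mean brightness temperature per touched cell
gridded = {k: s / c for k, (s, c) in acc.items()}
```

    Because each cell key is independent, the accumulation parallelizes naturally, which is the property the authors exploit at much larger scale.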

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Islam, Md. Shafiqul, E-mail: shafique@eng.ukm.my; Hannan, M.A., E-mail: hannan@eng.ukm.my; Basri, Hassan

    Highlights: • Solid waste bin level detection using Dynamic Time Warping (DTW). • Gabor wavelet filter is used to extract the solid waste image features. • Multi-Layer Perceptron classifier network is used for bin image classification. • The classification performance is evaluated by ROC curve analysis. - Abstract: The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge. Many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-frequency Identification (RFID), or sensor-based intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, when capturing the bin image it is challenging to position the camera so that the bin area is centered in the image. As yet, there is no ideal system which can correctly estimate the amount of SW. This paper briefly discusses an efficient image processing solution to overcome these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area and a Gabor wavelet (GW) filter was introduced for feature extraction of the waste bin image. Image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of this system are comparable to previous image processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level.

  18. SU-F-J-135: Tumor Displacement-Based Binning for Respiratory-Gated Time-Independent 5DCT Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, L; O’Connell, D; Lee, P

    2016-06-15

    Purpose: A published 5DCT breathing motion model enables image reconstruction at any user-selected breathing phase, defined by the model as a specific amplitude (v) and rate (f). Generation of reconstructed phase-specific CT scans will be required for time-independent radiation dose distribution simulations. This work answers the question: how many amplitude and rate bins are required to describe the tumor motion with a specific spatial resolution? Methods: 19 lung-cancer patients with 21 tumors were scanned using a free-breathing 5DCT protocol, employing an abdominally positioned pneumatic-bellows breathing surrogate and yielding voxel-specific motion model parameters α and β corresponding to motion as a function of amplitude and rate, respectively. Tumor GTVs were contoured on the first (reference) of 25 successive free-breathing fast helical CT image sets. The tumor displacements were binned into widths of 1mm to 5mm in 1mm steps and the total required number of bins recorded. The simulation evaluated the number of bins needed to encompass 100% of the breathing amplitude and between the 5th and 95th percentile amplitudes to exclude breathing outliers. Results: The mean respiration-induced tumor motion was 9.90mm ± 7.86mm with a maximum of 25mm. The number of bins required was a strong function of the spatial resolution and varied widely between patients. For example, for 2mm bins, between 1–13 amplitude bins and 1–9 rate bins were required to encompass 100% of the breathing amplitude, while 1–6 amplitude bins and 1–3 rate bins were required to encompass 90% of the breathing amplitude. Conclusion: The strong relationship between the number of bins and spatial resolution, as well as the large variation between patients, implies that time-independent radiation dose distribution simulations should be conducted using patient-specific data and that the breathing conditions will have to be carefully considered. This work will lead to the assessment of the dosimetric impact of binning resolution. This study is supported by Siemens Healthcare.
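As a sketch of the bin-counting step described above, the fragment below counts how many fixed-width amplitude bins are needed to span a set of tumor displacements, either over the full range or between the 5th and 95th percentiles to exclude outliers. The displacement values, the nearest-rank percentile rule, and the function name are illustrative assumptions, not the study's implementation.

```python
import math

def bins_needed(displacements_mm, bin_width_mm, lo_pct=0.0, hi_pct=100.0):
    """Bins of the given width needed to cover the chosen percentile span."""
    xs = sorted(displacements_mm)
    n = len(xs)
    def pct(p):  # nearest-rank percentile (illustrative choice)
        return xs[min(n - 1, max(0, round(p / 100.0 * (n - 1))))]
    span = pct(hi_pct) - pct(lo_pct)
    return max(1, math.ceil(span / bin_width_mm))

# hypothetical displacements (mm): a slow ramp plus one 25 mm outlier
amplitudes = list(range(20)) + [25]
full = bins_needed(amplitudes, 2.0)            # cover 100% of the motion
trimmed = bins_needed(amplitudes, 2.0, 5, 95)  # exclude breathing outliers
```

Trimming the outliers shrinks the required bin count, mirroring the study's comparison between covering 100% of the amplitude and the 5th-95th percentile window.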

  19. Validating the Operational Bias and Hypothesis of Universal Exponent in Landslide Frequency-Area Distribution

    PubMed Central

    Huang, Jr-Chuan; Lee, Tsung-Yu; Teng, Tse-Yang; Chen, Yi-Chin; Huang, Cho-Ying; Lee, Cheing-Tung

    2014-01-01

    The exponent decay in the landslide frequency-area distribution is widely used for assessing the consequences of landslides, with some studies arguing that the slope of the exponent decay is universal and independent of mechanisms and environmental settings. However, the documented exponent slopes are diverse, and data processing has been hypothesized as the cause of this inconsistency. An elaborated statistical experiment and two actual landslide inventories were used here to demonstrate the influence of data processing on the determination of the exponent. Seven categories with different landslide numbers were generated from a predefined inverse-gamma distribution and then analyzed by three data processing procedures: logarithmic binning (LB), normalized logarithmic binning (NLB), and the cumulative distribution function (CDF). Five different bin widths were also considered while applying LB and NLB. Maximum likelihood estimation was then used to estimate the exponent slopes. The results showed that the exponents estimated by CDF were unbiased, while LB and NLB performed poorly. The two binning-based methods led to considerable biases that increased with landslide number and bin width. The standard deviations of the estimated exponents depended not only on the landslide number but also on the binning method and bin width. Both extremely few and extremely plentiful landslide numbers reduced the confidence of the estimated exponents, attributable to limited landslide numbers and considerable operational bias, respectively. The diverse exponents documented in the literature should therefore be adjusted accordingly. Our study strongly suggests that the considerable bias due to data processing and the data quality should be constrained in order to advance the understanding of landslide processes. PMID:24852019
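To make the operational contrast concrete, the sketch below estimates a power-law exponent two ways: by maximum likelihood on the raw values (the CDF-equivalent route) and by a least-squares fit to logarithmically binned counts. A Pareto sample stands in for the paper's inverse-gamma distribution, and the sample size, bin width, and count threshold are our own illustrative choices.

```python
import math, random

random.seed(1)
alpha, xmin, n = 2.4, 1.0, 50000
# Pareto draws via inverse CDF: survival function P(X > x) = (x/xmin)^(1-alpha)
xs = [xmin * (1.0 - random.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

# Maximum-likelihood estimate on the raw data (unbinned, CDF-equivalent)
alpha_mle = 1.0 + n / sum(math.log(x / xmin) for x in xs)

# Logarithmic binning: counts per log10 bin scale as x^(1-alpha),
# so the fitted log-log slope s gives alpha_lb = 1 - s
width = 0.5  # log10 bin width; the paper shows wider bins worsen the bias
counts = {}
for x in xs:
    b = int(math.log10(x / xmin) // width)
    counts[b] = counts.get(b, 0) + 1
pts = [((b + 0.5) * width, math.log10(c))
       for b, c in sorted(counts.items()) if c >= 5]
mx = sum(p for p, _ in pts) / len(pts)
my = sum(q for _, q in pts) / len(pts)
slope = (sum((p - mx) * (q - my) for p, q in pts)
         / sum((p - mx) ** 2 for p, _ in pts))
alpha_lb = 1.0 - slope
```

The MLE recovers the exponent tightly, while the binned estimate depends on the bin width and the count threshold chosen, which is the operational sensitivity the study quantifies.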

  20. An optimal FFT-based anisotropic power spectrum estimator

    NASA Astrophysics Data System (ADS)

    Hand, Nick; Li, Yin; Slepian, Zachary; Seljak, Uroš

    2017-07-01

    Measurements of line-of-sight dependent clustering via the galaxy power spectrum's multipole moments constitute a powerful tool for testing theoretical models in large-scale structure. Recent work shows that this measurement, including a moving line-of-sight, can be accelerated using Fast Fourier Transforms (FFTs) by decomposing the Legendre polynomials into products of Cartesian vectors. Here, we present a faster, optimal means of using FFTs for this measurement. We avoid redundancy present in the Cartesian decomposition by using a spherical harmonic decomposition of the Legendre polynomials. With this method, a given multipole of order l requires only 2l+1 FFTs rather than the (l+1)(l+2)/2 FFTs of the Cartesian approach. For the hexadecapole (l = 4), this translates to 40% fewer FFTs, with increased savings for higher l. The reduction in wall-clock time enables the calculation of finely-binned wedges in P(k,μ), obtained by computing multipoles up to a large lmax and combining them. This transformation has a number of advantages. We demonstrate that by using non-uniform bins in μ, we can isolate plane-of-sky (angular) systematics to a narrow bin at μ ≃ 0 while eliminating the contamination from all other bins. We also show that the covariance matrix of clustering wedges binned uniformly in μ becomes ill-conditioned when combining multipoles up to large values of lmax, but that the problem can be avoided with non-uniform binning. As an example, we present results using lmax=16, for which our procedure requires a factor of 3.4 fewer FFTs than the Cartesian method, while removing the first μ bin leads only to a 7% increase in statistical error on f σ8, as compared to a 54% increase with lmax=4.
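The FFT counts quoted above are easy to check. The snippet below reproduces them and, assuming the factor of 3.4 arises from summing the even multipoles up to lmax = 16 (an assumption on our part, not stated explicitly in the abstract), the combined-count ratio as well.

```python
def ffts_spherical(l):
    return 2 * l + 1               # spherical-harmonic decomposition

def ffts_cartesian(l):
    return (l + 1) * (l + 2) // 2  # Cartesian decomposition

# hexadecapole: 9 vs 15 FFTs, i.e. 40% fewer
hexadecapole_savings = 1 - ffts_spherical(4) / ffts_cartesian(4)

# even multipoles l = 0, 2, ..., 16 combined, as for the binned wedges
sph = sum(ffts_spherical(l) for l in range(0, 17, 2))
cart = sum(ffts_cartesian(l) for l in range(0, 17, 2))
ratio = cart / sph
```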

  2. Visual analytics of large multidimensional data using variable binned scatter plots

    NASA Astrophysics Data System (ADS)

    Hao, Ming C.; Dayal, Umeshwar; Sharma, Ratnesh K.; Keim, Daniel A.; Janetzko, Halldór

    2010-01-01

    The scatter plot is a well-known method of visualizing pairs of two-dimensional continuous variables, and multidimensional data can be depicted in a scatter plot matrix. Scatter plots are intuitive and easy to use, but often have a high degree of overlap that may occlude a significant portion of the data. In this paper, we propose variable binned scatter plots to allow the visualization of large amounts of data without overlapping. The basic idea is to use a non-uniform (variable) binning of the x and y dimensions and to plot all the data points that fall within each bin into corresponding squares. Further, we map a third attribute to color for visualizing clusters. Analysts are able to interact with individual data points for record-level information. We have applied these techniques to real-world problems in credit card fraud and data center energy consumption to visualize data distributions and cause-effect relationships among multiple attributes. A comparison of our methods with two recent well-known variants of the scatter plot is included.
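A minimal sketch of the idea, assuming quantile-based edges (one plausible non-uniform rule; the paper does not specify this exact scheme): the bin edges follow the data, so dense regions get narrow bins, and each cell's count could then be rendered as a square.

```python
import random

random.seed(0)
pts = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(1000)]

def quantile_edges(values, k):
    """Edges of k variable-width bins placed at data quantiles."""
    xs = sorted(values)
    return [xs[i * (len(xs) - 1) // k] for i in range(k)] + [xs[-1]]

def bin_index(edges, v):
    for i in range(len(edges) - 2):
        if v < edges[i + 1]:
            return i
    return len(edges) - 2

kx = ky = 5
ex = quantile_edges([x for x, _ in pts], kx)
ey = quantile_edges([y for _, y in pts], ky)
grid = [[0] * kx for _ in range(ky)]
for x, y in pts:
    grid[bin_index(ey, y)][bin_index(ex, x)] += 1
```

Because the edges are quantiles, each row and column holds roughly 1000/5 = 200 points, so no region of the plot is left empty or hopelessly overplotted.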

  3. Fast and flexible 3D object recognition solutions for machine vision applications

    NASA Astrophysics Data System (ADS)

    Effenberger, Ira; Kühnle, Jens; Verl, Alexander

    2013-03-01

    In automation and handling engineering, supplying work pieces between different stages along the production process chain is of special interest. Often the parts are stored unordered in bins or lattice boxes and hence have to be separated and ordered for feeding purposes. An alternative to complex and spacious mechanical systems such as bowl feeders or conveyor belts, which are typically adapted to the parts' geometry, is using a robot to grip the work pieces out of a bin or from a belt. Such applications are in need of reliable and precise computer-aided object detection and localization systems. For a restricted range of parts, there exists a variety of 2D image processing algorithms that solve the recognition problem. However, these methods are often not well suited for the localization of randomly stored parts. In this paper we present a fast and flexible 3D object recognizer that localizes objects by identifying primitive features within the objects. Since technical work pieces typically consist to a substantial degree of geometric primitives such as planes, cylinders and cones, such features usually carry enough information in order to determine the position of the entire object. Our algorithms use 3D best-fitting combined with an intelligent data pre-processing step. The capability and performance of this approach is shown by applying the algorithms to real data sets of different industrial test parts in a prototypical bin picking demonstration system.

  4. Surface contamination of hazardous drug pharmacy storage bins and pharmacy distributor shipping containers.

    PubMed

    Redic, Kimberly A; Fang, Kayleen; Christen, Catherine; Chaffee, Bruce W

    2018-03-01

    Purpose This study was conducted to determine whether there is contamination on exterior drug packaging, using shipping totes from the distributor and carousel storage bins as surrogate markers of external packaging contamination. Methods A two-part study was conducted to measure the presence of 5-fluorouracil, ifosfamide, cyclophosphamide, docetaxel and paclitaxel using surrogate markers for external drug packaging. In Part I, 10 drug distributor shipping totes designated for transport of hazardous drugs provided a snapshot view of contamination from regular use and transit in and out of the pharmacy. An additional two totes designated for transport of non-hazardous drugs served as controls. In Part II, old carousel storage bins (i.e. those in use pre-study) were wiped for a snapshot view of hazardous drug contamination on storage bins. New carousel storage bins were then put into use for storage of the five tested drugs and used for routine storage and inventory maintenance activities. Carousel bins were wiped at 0, 8, 16 and 52 weeks to measure surface contamination. Results Two of the 10 hazardous shipping totes were contaminated. Three of the five old carousel bins were contaminated with cyclophosphamide. One of the old carousel bins was also contaminated with ifosfamide. There were no detectable levels of hazardous drugs on any of the new storage bins at 0, 8 or 16 weeks. However, at week 52, there was a detectable level of 5-FU present in the 5-FU carousel bin. Conclusions Contamination of the surrogate markers suggests that external packaging for hazardous drugs is contaminated, either during the manufacturing process or during routine chain-of-custody activities. These results demonstrate that occupational exposure may occur due to contamination from shipping totes and storage bins, and that handling practices, including the use of personal protective equipment, are warranted.

  5. Technical note: Data loggers are a valid method for assessing the feeding behavior of dairy cows using the Calan Broadbent Feeding System.

    PubMed

    Krawczel, P D; Klaiber, L M; Thibeau, S S; Dann, H M

    2012-08-01

    Assessing feeding behavior is important in understanding the effects of nutrition and management on the well-being of dairy cows. Historically, collection of these data from cows fed with a Calan Broadbent Feeding System (American Calan Inc., Northwood, NH) required the labor-intensive practices of direct observation or video review. The objective of this study was to evaluate the agreement between the output of a HOBO change-of-state data logger (Onset Computer Corp., Bourne, MA), mounted to the door shell and latch plate, and video data summarized with continuous sampling. Data (number of feed bin visits per day and feeding time in minutes per day) were recorded with both methods from 26 lactating cows and 10 nonlactating cows for 3 d per cow (n=108). The agreement of the data logger and video methods was evaluated using the REG procedure of SAS to compare the mean response of the methods against the difference between the methods. The maximum allowable difference (MAD) was set at ±3 for bin visits and ±20 min for feeding time. Ranges for feed bin visits (2 to 140 per d) and feeding time (28 to 267 min/d) were established from video data. Using the complete data set, agreement was partially established between the data logger and video methods for feed bin visits, but not established for feeding time. The complete data set generated by the data logger was screened to remove visits of a duration ≤3 s, reflecting a cow unable to enter a feed bin (representing 7% of all data), or ≥5,400 s, reflecting a failure of the data logger to align properly with its corresponding magnetic field (representing <1% of all data). Using the resulting screened data set, agreement was established for feed bin visits and feeding time. For bin visits, 4% of the data was beyond the MAD. For feeding time, 3% of the data was beyond the MAD and 74% of the data was within ±1 min. The insignificant P-value, low coefficient of determination, and concentration of the data within the MAD indicate agreement between the change-of-state data logger and the video data. This suggests that using a change-of-state data logger to assess the feeding behavior of cows feeding from a Calan Broadbent Feeding System is appropriate. Use of the screening criteria for data analysis is recommended. Copyright © 2012 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
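The screening rule lends itself to a one-line filter. The visit durations below are hypothetical; the thresholds are the ones stated above (visits ≤3 s and ≥5,400 s are discarded).

```python
# hypothetical visit durations for one cow-day, in seconds
visits_s = [2, 45, 300, 1200, 1, 600, 5400, 90]

# keep only plausible visits: >3 s (real entry) and <5,400 s (aligned logger)
screened = [d for d in visits_s if 3 < d < 5400]
n_visits = len(screened)
feeding_min = sum(screened) / 60.0
```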

  6. Characteristic features of a high-energy x-ray spectra estimation method based on the Waggener iterative perturbation principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwasaki, Akira; Kubota, Mamoru; Hirota, Junichi

    2006-11-15

    We have redeveloped a high-energy x-ray spectra estimation method reported by Iwasaki et al. [A. Iwasaki, H. Matsutani, M. Kubota, A. Fujimori, K. Suzaki, and Y. Abe, Radiat. Phys. Chem. 67, 81-91 (2003)]. The method is based on the iterative perturbation principle to minimize differences between measured and calculated transmission curves, originally proposed by Waggener et al. [R. G. Waggener, M. M. Blough, J. A. Terry, D. Chen, N. E. Lee, S. Zhang, and W. D. McDavid, Med. Phys. 26, 1269-1278 (1999)]. The method can estimate spectra applicable for media at least from water to lead using only about ten energy bins. Estimating spectra of 4-15 MV x-ray beams from a linear accelerator, we describe characteristic features of the method with regard to parameters including the prespectrum, number of transmission measurements, number of energy bins, energy bin widths, and artifactual bipeaked spectrum production.
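A toy version of the Waggener-style iterative perturbation, with synthetic attenuation coefficients, absorber thicknesses, and a "measured" curve generated from a known spectrum. The coordinate-wise perturbation schedule is our own simplification for illustration, not the published algorithm.

```python
import math

mu = [0.5, 0.2, 0.08, 0.04]   # per-bin attenuation coefficients (1/cm), synthetic
thick = [0, 1, 2, 4, 8, 16]   # absorber thicknesses (cm), synthetic

def transmission(w):
    """Normalized transmission of a spectrum with bin weights w."""
    tot = sum(w)
    return [sum(wi * math.exp(-m * t) for wi, m in zip(w, mu)) / tot
            for t in thick]

measured = transmission([0.1, 0.3, 0.4, 0.2])   # known "true" spectrum

def sse(w):
    return sum((a - b) ** 2 for a, b in zip(transmission(w), measured))

w = [0.25] * len(mu)          # flat pre-spectrum
step = 0.05
while step > 1e-4:            # shrink the perturbation until converged
    improved = False
    for i in range(len(w)):
        for d in (step, -step):
            trial = list(w)
            trial[i] = max(1e-6, trial[i] + d)
            if sse(trial) < sse(w):
                w, improved = trial, True
    if not improved:
        step /= 2
w = [x / sum(w) for x in w]   # normalize the recovered spectrum
```

Even this toy fit illustrates the paper's theme: several weight vectors reproduce the transmission curve almost equally well, which is why the prespectrum and bin layout matter.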

  7. Image quality in thoracic 4D cone-beam CT: A sensitivity analysis of respiratory signal, binning method, reconstruction algorithm, and projection angular spacing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shieh, Chun-Chien; Kipritidis, John; O’Brien, Ricky T.

    Purpose: Respiratory signal, binning method, and reconstruction algorithm are three major controllable factors affecting image quality in thoracic 4D cone-beam CT (4D-CBCT), which is widely used in image guided radiotherapy (IGRT). Previous studies have investigated each of these factors individually, but no integrated sensitivity analysis has been performed. In addition, projection angular spacing is also a key factor in reconstruction, but how it affects image quality is not obvious. An investigation of the impacts of these four factors on image quality can help determine the most effective strategy in improving 4D-CBCT for IGRT. Methods: Fourteen 4D-CBCT patient projection datasets with various respiratory motion features were reconstructed with the following controllable factors: (i) respiratory signal (real-time position management, projection image intensity analysis, or fiducial marker tracking), (ii) binning method (phase, displacement, or equal-projection-density displacement binning), and (iii) reconstruction algorithm [Feldkamp–Davis–Kress (FDK), McKinnon–Bates (MKB), or adaptive-steepest-descent projection-onto-convex-sets (ASD-POCS)]. The image quality was quantified using signal-to-noise ratio (SNR), contrast-to-noise ratio, and edge-response width in order to assess noise/streaking and blur. The SNR values were also analyzed with respect to the maximum, mean, and root-mean-squared-error (RMSE) projection angular spacing to investigate how projection angular spacing affects image quality. Results: The choice of respiratory signals was found to have no significant impact on image quality. Displacement-based binning was found to be less prone to motion artifacts compared to phase binning in more than half of the cases, but was shown to suffer from large interbin image quality variation and large projection angular gaps. Both MKB and ASD-POCS resulted in noticeably improved image quality almost 100% of the time relative to FDK.
    In addition, SNR values were found to increase with decreasing RMSE values of projection angular gaps with strong correlations (r ≈ −0.7) regardless of the reconstruction algorithm used. Conclusions: Based on the authors’ results, displacement-based binning methods, better reconstruction algorithms, and the acquisition of even projection angular views are the most important factors to consider for improving thoracic 4D-CBCT image quality. In view of the practical issues with displacement-based binning and the fact that projection angular spacing is not currently directly controllable, development of better reconstruction algorithms represents the most effective strategy for improving image quality in thoracic 4D-CBCT for IGRT applications at the current stage.
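To make the phase/displacement distinction concrete, here is a toy respiratory trace binned both ways. The signal, the assumption of a known fixed breathing period, and the bin count are all synthetic simplifications, not the study's pipeline.

```python
import math

n, cycles, k = 200, 4, 4
signal = [math.sin(2 * math.pi * cycles * t / n) for t in range(n)]

def phase_bins(sig, k, period):
    """k equal time-phase bins, assuming a known fixed breathing period."""
    return [(t % period) * k // period for t in range(len(sig))]

def displacement_bins(sig, k):
    """k equal amplitude bins between the signal extremes."""
    lo, hi = min(sig), max(sig)
    w = (hi - lo) / k
    return [min(k - 1, int((v - lo) / w)) for v in sig]

pb = phase_bins(signal, k, n // cycles)
db = displacement_bins(signal, k)
```

Displacement bins gather samples at the same amplitude regardless of inhale/exhale direction, which reduces intra-bin motion but can leave some bins with few projections, i.e. the large angular gaps the study measures.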

  8. A Transcontinental Challenge — A Test of DNA Barcode Performance for 1,541 Species of Canadian Noctuoidea (Lepidoptera)

    PubMed Central

    Zahiri, Reza; Lafontaine, J. Donald; Schmidt, B. Christian; deWaard, Jeremy R.; Zakharov, Evgeny V.; Hebert, Paul D. N.

    2014-01-01

    This study provides a first, comprehensive, diagnostic use of DNA barcodes for the Canadian fauna of noctuoids or “owlet” moths (Lepidoptera: Noctuoidea) based on vouchered records for 1,541 species (99.1% species coverage), and more than 30,000 sequences. When viewed from a Canada-wide perspective, DNA barcodes unambiguously discriminate 90% of the noctuoid species recognized through prior taxonomic study, and resolution reaches 95.6% when considered at a provincial scale. Barcode sharing is concentrated in certain lineages with 54% of the cases involving 1.8% of the genera. Deep intraspecific divergence exists in 7.7% of the species, but further studies are required to clarify whether these cases reflect an overlooked species complex or phylogeographic variation in a single species. Non-native species possess higher Nearest-Neighbour (NN) distances than native taxa, whereas generalist feeders have lower NN distances than those with more specialized feeding habits. We found high concordance between taxonomic names and sequence clusters delineated by the Barcode Index Number (BIN) system with 1,082 species (70%) assigned to a unique BIN. The cases of discordance involve both BIN mergers and BIN splits with 38 species falling into both categories, most likely reflecting bidirectional introgression. One fifth of the species are involved in a BIN merger reflecting the presence of 158 species sharing their barcode sequence with at least one other taxon, and 189 species with low, but diagnostic COI divergence. A very few cases (13) involved species whose members fell into both categories. Most of the remaining 140 species show a split into two or three BINs per species, while Virbia ferruginosa was divided into 16. The overall results confirm that DNA barcodes are effective for the identification of Canadian noctuoids. This study also affirms that BINs are a strong proxy for species, providing a pathway for a rapid, accurate estimation of animal diversity. 
PMID:24667847

  9. Data-driven optimal binning for respiratory motion management in PET.

    PubMed

    Kesner, Adam L; Meier, Joseph G; Burckhardt, Darrell D; Schwartz, Jazmin; Lynch, David A

    2018-01-01

    Respiratory gating has been used in PET imaging to reduce the amount of image blurring caused by patient motion. Optimal binning is an approach for using the motion-characterized data by binning it into a single, easy-to-understand and easy-to-use optimal bin. To date, optimal binning protocols have utilized externally driven motion characterization strategies that have been tuned with population-derived assumptions and parameters. In this work, we propose a new strategy for characterizing motion directly from a patient's gated scan, and for using that signal to create a patient/instance-specific optimal bin image. Two hundred and nineteen phase-gated FDG PET scans, acquired using data-driven gating as described previously, were used as the input for this study. For each scan, a phase-amplitude motion characterization was generated and normalized using principal component analysis. A patient-specific "optimal bin" window was derived using this characterization, via methods that mirror traditional optimal window binning strategies. The resulting optimal bin images were validated by correlating quantitative and qualitative measurements in the population of PET scans. In 53% (n = 115) of the image population, the optimal bin was determined to include 100% of the image statistics. In the remaining images, the optimal binning windows averaged 60% of the statistics and ranged between 20% and 90%. Tuning the algorithm, through a single acceptance window parameter, allowed its performance in the population to be adjusted toward conservation of motion or reduced noise, enabling users to incorporate their own definition of optimal. In the population of images that were deemed appropriate for segregation, the average lesion SUVmax was 7.9, 8.5, and 9.0 for nongated, optimal bin, and gated images, respectively.
    The Pearson correlation of FWHM measurements between optimal bin images and gated images was better than that with nongated images (0.89 versus 0.85). Generally, optimal bin images had better resolution than the nongated images and better noise characteristics than the gated images. We extended the concept of optimal binning to a data-driven form, updating a traditionally one-size-fits-all approach to a conformal one that supports adaptive imaging. This automated strategy was implemented easily within a large population and encapsulated motion information in an easy-to-use 3D image. Its simplicity and practicality may make this or similar approaches ideal for use in clinical settings. © 2017 American Association of Physicists in Medicine.
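As a sketch of the "single acceptance window parameter" idea, the helper below picks the narrowest contiguous window of an amplitude histogram that retains a chosen fraction of counts. The histogram and the 50% setting are hypothetical, and this is our reading of optimal-window binning, not the authors' code.

```python
def optimal_window(hist, accept=0.5):
    """Indices (start, end) of the shortest contiguous window of bins
    holding at least `accept` of all counts in the amplitude histogram."""
    total = sum(hist)
    best = (0, len(hist))
    for i in range(len(hist)):
        running = 0
        for j in range(i, len(hist)):
            running += hist[j]
            if running >= accept * total:
                if j + 1 - i < best[1] - best[0]:
                    best = (i, j + 1)
                break
    return best

# hypothetical counts per breathing-amplitude bin for one scan
hist = [5, 10, 40, 80, 40, 15, 5, 5]
window = optimal_window(hist, accept=0.5)
```

Raising the acceptance fraction widens the window toward the full, noise-free-but-blurred data; lowering it narrows the window toward the gated, motion-frozen-but-noisy extreme.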

  10. Assessment of laboratory logistics management information system practice for HIV/AIDS and tuberculosis laboratory commodities in selected public health facilities in Addis Ababa, Ethiopia

    PubMed Central

    Desale, Adino; Taye, Bineyam; Belay, Getachew; Nigatu, Alemayehu

    2013-01-01

    Introduction Logistics management information systems for health commodities remain poorly implemented in most developing countries. This study assessed the status of the laboratory logistics management information system (LMIS) for HIV/AIDS and tuberculosis laboratory commodities in public health facilities in Addis Ababa. Methods A cross-sectional descriptive study was conducted from September 2010 to January 2011 at selected public health facilities. A stratified random sampling method was used to include a total of 43 facilities, which were investigated through quantitative methods using structured questionnaire interviews. Focus group discussions with the designated supply chain managers and key informant interviews were conducted for the qualitative component. Results There exists a well-designed logistics system for laboratory commodities, with trained pharmacy personnel, distributed standard LMIS formats, and established inventory control procedures. However, the majority of laboratory professionals were not trained in LMIS. The majority of the facilities (60.5%) had a stock-out of at least one ART monitoring or TB laboratory reagent, and the highest stock-out rate was for chemistry reagents. Expired ART monitoring laboratory commodities were found in 25 (73.5%) of the facilities. Fifty percent (50%) of the assessed hospitals and 54% of the health centers were currently using stock/bin cards for all HIV/AIDS and TB laboratory commodities in the main pharmacy store; among these, only 25% of hospitals and 20.8% of health centers had cards updated with accurate information matching the physical count done at the time of the visit. Conclusion Even though a well-designed laboratory LMIS exists, the keeping of quality stock/bin cards and LMIS reports was very low. Key ART monitoring laboratory commodities were stocked out at many facilities on the day of the visit and during the past six months. Based on these findings, training of laboratory personnel managing laboratory commodities and adherence to accurate inventory control procedures are recommended. PMID:24106574

  11. SU-F-303-11: Implementation and Applications of Rapid, SIFT-Based Cine MR Image Binning and Region Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mazur, T; Wang, Y; Fischer-Valuck, B

    2015-06-15

    Purpose: To develop a novel and rapid, SIFT-based algorithm for assessing feature motion on cine MR images acquired during MRI-guided radiotherapy treatments. In particular, we apply SIFT descriptors toward both partitioning cine images into respiratory states and tracking regions across frames. Methods: Among a training set of images acquired during a fraction, we densely assign SIFT descriptors to pixels within the images. We cluster these descriptors across all frames in order to produce a dictionary of trackable features. Associating the best-matching descriptors at every frame among the training images to these features, we construct motion traces for the features. We use these traces to define respiratory bins for sorting images in order to facilitate robust pixel-by-pixel tracking. Instead of applying conventional methods for identifying pixel correspondences across frames we utilize a recently-developed algorithm that derives correspondences via a matching objective for SIFT descriptors. Results: We apply these methods to a collection of lung, abdominal, and breast patients. We evaluate the procedure for respiratory binning using target sites exhibiting high-amplitude motion among 20 lung and abdominal patients. In particular, we investigate whether these methods yield minimal variation between images within a bin by perturbing the resulting image distributions among bins. Moreover, we compare the motion between averaged images across respiratory states to 4DCT data for these patients. We evaluate the algorithm for obtaining pixel correspondences between frames by tracking contours among a set of breast patients. As an initial case, we track easily-identifiable edges of lumpectomy cavities that show minimal motion over treatment. Conclusions: These SIFT-based methods reliably extract motion information from cine MR images acquired during patient treatments.
    While we performed our analysis retrospectively, the algorithm lends itself to prospective motion assessment. Applications of these methods include motion assessment, identifying treatment windows for gating, and determining optimal margins for treatment.

  12. An ultra-high-density bin map facilitates high-throughput QTL mapping of horticultural traits in pepper (Capsicum annuum).

    PubMed

    Han, Koeun; Jeong, Hee-Jin; Yang, Hee-Bum; Kang, Sung-Min; Kwon, Jin-Kyung; Kim, Seungill; Choi, Doil; Kang, Byoung-Cheorl

    2016-04-01

    Most agricultural traits are controlled by quantitative trait loci (QTLs); however, there are few studies on QTL mapping of horticultural traits in pepper (Capsicum spp.) due to the lack of high-density molecular maps and sequence information. In this study, an ultra-high-density map and 120 recombinant inbred lines (RILs) derived from a cross between C. annuum 'Perennial' and C. annuum 'Dempsey' were used for QTL mapping of horticultural traits. Parental lines and RILs were resequenced at 18× and 1× coverage, respectively. Using a sliding-window approach, an ultra-high-density bin map containing 2,578 bins was constructed. The total length of the map was 1,372 cM, and the average interval between bins was 0.53 cM. A total of 86 significant QTLs controlling 17 horticultural traits were detected. Among these, 32 QTLs controlling 13 traits were major QTLs. Our research shows that the construction of bin maps from low-coverage sequencing is a powerful method for QTL mapping, and that the short intervals between bins are helpful for fine-mapping of QTLs. Furthermore, bin maps can be used to improve the quality of reference genomes by elucidating the genetic order of unordered regions and anchoring unassigned scaffolds to linkage groups. © The Author 2016. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.

  13. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume observable by users during interactive volume rendering. Manipulating this visibility improves volume rendering processes, for instance by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs for medical images that have large intensity ranges and volume dimensions and therefore require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), which enables efficient computation of the histogram.
    We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus improved the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual or numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying K (the number of clusters) and found that higher values of K yielded better accuracy at a smaller computational gain. The AB-VH also performed better than the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
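The adaptive-binning step can be sketched with a tiny 1-D k-means over intensities. This is a CPU-only stand-in for the paper's GPU implementation, and the intensity populations, K = 3, and the deterministic quantile initialisation are our own assumptions for a reproducible illustration.

```python
import random

def kmeans_1d(values, k, iters=50):
    """Tiny 1-D k-means with deterministic quantile initialisation."""
    xs = sorted(values)
    n = len(xs)
    centers = [xs[(2 * i + 1) * n // (2 * k)] for i in range(k)]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for v in values:
            i = min(range(k), key=lambda c: abs(v - centers[c]))
            groups[i].append(v)
        new = [sum(g) / len(g) if g else centers[i]
               for i, g in enumerate(groups)]
        if new == centers:
            break
        centers = new
    return centers, groups

random.seed(1)
# synthetic "voxel intensities": three well-separated tissue populations
vox = ([random.gauss(100, 5) for _ in range(300)] +
       [random.gauss(400, 10) for _ in range(300)] +
       [random.gauss(900, 20) for _ in range(300)])
centers, groups = kmeans_1d(vox, 3)
adaptive_hist = [len(g) for g in groups]   # one count per adaptive bin
```

Three adaptive bins capture the three intensity populations that a full-resolution histogram would spread over hundreds of bins, which is the source of the computational saving.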

  14. Generalized index for spatial data sets as a measure of complete spatial randomness

    NASA Astrophysics Data System (ADS)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a bin configuration of unequal sizes, whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal-sized bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether or not a spatial data set is at the CSR state by comparing the values of the generalized index for different bin configurations: the values will be approximately the same if the data set is at the CSR state, while they will differ if it is not. In general, the generalized index is lower than its limiting value, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
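
    The paper defines its generalized index through a variance between bin counts; the exact definition for unequal or overlapping bins is not reproduced above, so the sketch below uses the classical variance-to-mean dispersion index for equal-sized bins, which is likewise close to 1 at the CSR state.

```python
import numpy as np

def dispersion_index(points, edges_x, edges_y):
    """Variance-to-mean ratio of 2-D bin counts.

    Under complete spatial randomness (a homogeneous Poisson process)
    the counts in equal-area bins are Poisson, so the ratio is close
    to 1. This is the classical index of dispersion, a stand-in for
    the paper's generalized index (which also handles unequal and
    overlapping bin configurations).
    """
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=[edges_x, edges_y])
    c = counts.ravel()
    return c.var() / c.mean()

rng = np.random.default_rng(1)
pts = rng.random((20000, 2))          # CSR: uniform points in the unit square
edges = np.linspace(0.0, 1.0, 11)     # 10 x 10 equal bins
idx = dispersion_index(pts, edges, edges)
```

    Clustered data would push the index above 1 and regular (exclusion-process) data below it, which is the qualitative behaviour the generalized index also exhibits.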

  15. Discretising the velocity distribution for directional dark matter experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kavanagh, Bradley J., E-mail: bradley.kavanagh@cea.fr

    2015-07-01

    Dark matter (DM) direct detection experiments which are directionally sensitive may be the only method of probing the full velocity distribution function (VDF) of the Galactic DM halo. We present an angular basis for the DM VDF which can be used to parametrise the distribution in order to mitigate astrophysical uncertainties in future directional experiments and extract information about the DM halo. This basis consists of discretising the VDF in a series of angular bins, with the VDF being only a function of the DM speed v within each bin. In contrast to other methods, such as spherical harmonic expansions, the use of this basis allows us to guarantee that the resulting VDF is everywhere positive and therefore physical. We present a recipe for calculating the event rates corresponding to the discretised VDF for an arbitrary number of angular bins N and investigate the discretisation error which is introduced in this way. For smooth, Standard Halo Model-like distribution functions, only N=3 angular bins are required to achieve an accuracy of around 10–30% in the number of events in each bin. Shortly after confirmation of the DM origin of the signal with around 50 events, this accuracy should be sufficient to allow the discretised velocity distribution to be employed reliably. For more extreme VDFs (such as streams), the discretisation error is typically much larger, but can be improved with increasing N. This method paves the way towards an astrophysics-independent analysis framework for the directional detection of dark matter.
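
    The angular discretisation described above (bin k covering polar angles θ in [kπ/N, (k+1)π/N), with the VDF a function of speed only within each bin) can be illustrated with a short helper; taking the +z axis as the reference direction is an assumption of this sketch.

```python
import numpy as np

def angular_bin(v, N):
    """Assign each 3-D velocity vector to one of N angular bins.

    Bin k collects directions whose polar angle theta (measured here
    from the +z axis) lies in [k*pi/N, (k+1)*pi/N). Within a bin the
    discretised VDF is treated as a function of speed |v| only.
    """
    v = np.atleast_2d(np.asarray(v, dtype=float))
    speed = np.linalg.norm(v, axis=1)
    theta = np.arccos(np.clip(v[:, 2] / speed, -1.0, 1.0))
    k = np.minimum((N * theta / np.pi).astype(int), N - 1)  # theta == pi -> last bin
    return k, speed

# Three example velocities: forward, transverse, and backward
k, s = angular_bin([[0.0, 0.0, 300.0], [300.0, 0.0, 0.0], [0.0, 0.0, -300.0]], N=3)
```

    For N=3 this reproduces the forward/transverse/backward split used in the paper's smooth-halo example.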

  16. Discretising the velocity distribution for directional dark matter experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kavanagh, Bradley J.; School of Physics & Astronomy, University of Nottingham,University Park, Nottingham, NG7 2RD

    2015-07-13

    Dark matter (DM) direct detection experiments which are directionally sensitive may be the only method of probing the full velocity distribution function (VDF) of the Galactic DM halo. We present an angular basis for the DM VDF which can be used to parametrise the distribution in order to mitigate astrophysical uncertainties in future directional experiments and extract information about the DM halo. This basis consists of discretising the VDF in a series of angular bins, with the VDF being only a function of the DM speed v within each bin. In contrast to other methods, such as spherical harmonic expansions, the use of this basis allows us to guarantee that the resulting VDF is everywhere positive and therefore physical. We present a recipe for calculating the event rates corresponding to the discrete VDF for an arbitrary number of angular bins N and investigate the discretisation error which is introduced in this way. For smooth, Standard Halo Model-like distribution functions, only N=3 angular bins are required to achieve an accuracy of around 10–30% in the number of events in each bin. Shortly after confirmation of the DM origin of the signal with around 50 events, this accuracy should be sufficient to allow the discretised velocity distribution to be employed reliably. For more extreme VDFs (such as streams), the discretisation error is typically much larger, but can be improved with increasing N. This method paves the way towards an astrophysics-independent analysis framework for the directional detection of dark matter.

  17. Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen

    Here, we propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator–coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.

  18. Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits

    DOE PAGES

    Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen; ...

    2018-03-12

    Here, we propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator–coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.

  19. Reconfigurable generation and measurement of mutually unbiased bases for time-bin qudits

    NASA Astrophysics Data System (ADS)

    Lukens, Joseph M.; Islam, Nurul T.; Lim, Charles Ci Wen; Gauthier, Daniel J.

    2018-03-01

    We propose a method for implementing mutually unbiased generation and measurement of time-bin qudits using a cascade of electro-optic phase modulator-coded fiber Bragg grating pairs. Our approach requires only a single spatial mode and can switch rapidly between basis choices. We obtain explicit solutions for dimensions d = 2, 3, and 4 that realize all d + 1 possible mutually unbiased bases and analyze the performance of our approach in quantum key distribution. Given its practicality and compatibility with current technology, our approach provides a promising springboard for scalable processing of high-dimensional time-bin states.

  20. 4D CT amplitude binning for the generation of a time-averaged 3D mid-position CT scan

    NASA Astrophysics Data System (ADS)

    Kruis, Matthijs F.; van de Kamer, Jeroen B.; Belderbos, José S. A.; Sonke, Jan-Jakob; van Herk, Marcel

    2014-09-01

    The purpose of this study was to develop a method to use amplitude-binned 4D-CT (A-4D-CT) data for the construction of mid-position CT data and to compare the results with data created from phase-binned 4D-CT (P-4D-CT) data. For the latter purpose we developed two measures which describe the regularity of the 4D data, and we correlated these measures with the regularity of the external respiration signal. 4D-CT data were acquired for 27 patients on a combined PET-CT scanner. The 4D data were reconstructed twice, using phase and amplitude binning. The 4D frames of each dataset were registered using a quadrature-based optical flow method. After registration, the deformation vector field was repositioned to the mid-position. Since amplitude-binned 4D data provide no temporal information, we corrected the mid-position for the occupancy of the bins. We quantified the differences between the two mid-position datasets in terms of tumour offset and amplitude differences. Furthermore, we measured the standard deviation of the image intensity over the respiration after registration (σ_registration) and the regularity of the deformation vector field (Δ|J|) to quantify the quality of the 4D-CT data. These measures were correlated to the regularity of the external respiration signal (σ_signal). The two irregularity measures, Δ|J| and σ_registration, were dependent on each other (p < 0.0001; R² = 0.80 for P-4D-CT, R² = 0.74 for A-4D-CT). For all datasets amplitude binning resulted in lower Δ|J| and σ_registration, and large decreases led to visible quality improvements in the mid-position data. The magnitude of the artefact decrease was correlated to the irregularity of the external respiratory signal. The average tumour offset between the phase- and amplitude-binned mid-position without occupancy correction was 0.42 mm in the caudal direction (10.6% of the amplitude).
    After correction this was reduced to 0.16 mm in the caudal direction (4.1% of the amplitude). Similar relative offsets were found at the diaphragm. We have devised a method that uses amplitude-binned 4D-CT to construct a motion model and generate a mid-position planning CT for radiotherapy treatment purposes. We compared the systematic offset of this mid-position model with a motion model derived from P-4D-CT. We found that A-4D-CT led to a decrease of local artefacts and that this decrease was correlated to the irregularity of the external respiration signal.
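
    Because amplitude binning discards timing information, each amplitude bin must be weighted by its occupancy, i.e. the fraction of breathing time spent in it. A minimal sketch of this binning and occupancy computation (equal-width amplitude bins assumed; the registration and mid-position steps of the paper are not shown):

```python
import numpy as np

def amplitude_bins(signal, n_bins):
    """Bin a respiratory amplitude signal into equal-width amplitude bins.

    Returns (bin index per sample, occupancy per bin). The occupancy
    (fraction of samples in each bin) is the weight the occupancy
    correction described above applies to each bin.
    """
    s = np.asarray(signal, dtype=float)
    edges = np.linspace(s.min(), s.max(), n_bins + 1)
    idx = np.clip(np.digitize(s, edges) - 1, 0, n_bins - 1)
    occupancy = np.bincount(idx, minlength=n_bins) / s.size
    return idx, occupancy

t = np.linspace(0.0, 2 * np.pi, 1000, endpoint=False)
sig = np.cos(t)                      # idealised breathing trace
idx, occ = amplitude_bins(sig, n_bins=4)
```

    For a sinusoidal trace, the extreme bins (end-inhale and end-exhale) are occupied longest, which is why ignoring occupancy biases the mid-position towards the extremes.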

  1. Deeper insight into the structure of the anaerobic digestion microbial community; the biogas microbiome database is expanded with 157 new genomes.

    PubMed

    Treu, Laura; Kougias, Panagiotis G; Campanaro, Stefano; Bassani, Ilaria; Angelidaki, Irini

    2016-09-01

    This research aimed to better characterize the biogas microbiome by means of high-throughput metagenomic sequencing and to elucidate the core microbial consortium existing in biogas reactors independently of the operational conditions. Assembly of shotgun reads followed by an established binning strategy resulted in the largest extraction to date of microbial genomes involved in biogas-producing systems. Remarkably, of the 236 extracted genome bins, the vast majority could be characterized only at high taxonomic levels. This result confirms that the biogas microbiome comprises a consortium of unknown species. A comparative analysis between the genome bins of the current study and those extracted from a previous metagenomic assembly demonstrated a similar phylogenetic distribution of the main taxa. Finally, this analysis led to the identification of a subset of common microbes that could be considered the core essential group in biogas production. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Multifractal diffusion entropy analysis: Optimal bin width of probability histograms

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Korbel, Jan

    2014-11-01

    In the framework of Multifractal Diffusion Entropy Analysis we propose a method for choosing an optimal bin width in histograms generated from underlying probability distributions of interest. The method uses Rényi entropy and mean-squared-error analysis to establish the conditions under which the error in the multifractal spectrum estimation is minimal. We illustrate the utility of our approach by focusing on the scaling behavior of financial time series. In particular, we analyze the S&P500 stock index as sampled at a daily rate in the time period 1950-2013. To demonstrate the strength of the proposed method, we compare the multifractal δ-spectrum for various bin widths and show the robustness of the method, especially for large values of q. For such values, other methods in use, e.g., those based on moment estimation, tend to fail for heavy-tailed data or data with long correlations. The connection between the δ-spectrum and Rényi's q parameter is also discussed and elucidated with a simple example of a multiscale time series.
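
    For a given bin width, the Rényi entropy S_q = ln(Σ p_i^q)/(1 − q) can be estimated directly from the histogram probabilities p_i; this is the basic quantity entering the optimal-bin-width analysis. A sketch under that assumption (the mean-squared-error minimisation over bin widths itself is not reproduced):

```python
import numpy as np

def renyi_entropy(sample, bin_width, q):
    """Estimate the Renyi entropy S_q = ln(sum p_i^q) / (1 - q) from a
    histogram with the given bin width; q = 1 gives the Shannon limit."""
    x = np.asarray(sample, dtype=float)
    edges = np.arange(x.min(), x.max() + bin_width, bin_width)
    counts, _ = np.histogram(x, bins=edges)
    p = counts[counts > 0] / counts.sum()
    if q == 1:
        return float(-(p * np.log(p)).sum())
    return float(np.log((p ** q).sum()) / (1.0 - q))

rng = np.random.default_rng(0)
x = rng.normal(size=50000)
s1 = renyi_entropy(x, bin_width=0.1, q=1.0)
s2 = renyi_entropy(x, bin_width=0.1, q=2.0)
```

    Too narrow a bin width inflates the estimate's variance, too wide a width biases it; the paper's criterion balances the two contributions to the spectrum error.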

  3. The Application of DNA Barcodes for the Identification of Marine Crustaceans from the North Sea and Adjacent Regions.

    PubMed

    Raupach, Michael J; Barco, Andrea; Steinke, Dirk; Beermann, Jan; Laakmann, Silke; Mohrbeck, Inga; Neumann, Hermann; Kihara, Terue C; Pointner, Karin; Radulovici, Adriana; Segelken-Voigt, Alexandra; Wesse, Christina; Knebelsberger, Thomas

    2015-01-01

    In recent years DNA barcoding has become a popular method of choice for molecular specimen identification. Here we present a comprehensive DNA barcode library of various crustacean taxa found in the North Sea, one of the most extensively studied marine regions of the world. Our data set includes 1,332 barcodes covering 205 species, including taxa of the Amphipoda, Copepoda, Decapoda, Isopoda, Thecostraca, and others. This dataset represents the most extensive DNA barcode library of the Crustacea in terms of species number to date. By using the Barcode of Life Data Systems (BOLD), unique BINs were identified for 198 (96.6%) of the analyzed species. Six species were characterized by two BINs (2.9%), and three BINs were found for the amphipod species Gammarus salinus Spooner, 1947 (0.4%). Intraspecific distances with values higher than 2.2% were revealed for 13 species (6.3%). Exceptionally high distances of up to 14.87% between two distinct but monophyletic clusters were found for the parasitic copepod Caligus elongatus Nordmann, 1832, supporting the results of previous studies that indicated the existence of an overlooked sea louse species. In contrast to these high distances, haplotype sharing was observed for two decapod spider crab species, Macropodia parva Van Noort & Adema, 1985 and Macropodia rostrata (Linnaeus, 1761), underlining the need for a taxonomic revision of both species. Summarizing the results, our study confirms the applicability of DNA barcodes as a highly effective identification system for the analyzed marine crustaceans of the North Sea and represents an important milestone for modern biodiversity assessment studies using barcode sequences.

  4. Research and Development of a New Waste Collection Bin to Facilitate Education in Plastic Recycling

    ERIC Educational Resources Information Center

    Chow, Cheuk-fai; So, Wing-Mui Winnie; Cheung, Tsz-Yan

    2016-01-01

    Plastic recycling has been an alternative method for solid waste management apart from landfill and incineration. However, recycling quality is affected when all plastics are discarded into a single recycling bin that increases cross contaminations and operation cost to the recycling industry. Following the engineering design process, a new…

  5. Decision Analysis in the U.S. Army’s Capabilties Needs Analysis: Applications of Decision Analysis Methods to Capabilities Resource Allocation and Capabilities Development Decisions

    DTIC Science & Technology

    2015-10-01

    capability to meet the task to the standard under the condition, nothing more or less, else the funding is wasted. Also, that funding for the...bin to segregate gaps qualitatively before the gap value model determined preference among gaps within the bins. Computation of a gap's...for communication, interpretation, or processing by humans or by automatic means (as it pertains to modeling and simulation). Delphi Method -- a

  6. Bin Packing, Number Balancing, and Rescaling Linear Programs

    NASA Astrophysics Data System (ADS)

    Hoberg, Rebecca

    This thesis deals with several important algorithmic questions using techniques from diverse areas including discrepancy theory, machine learning and lattice theory. In Chapter 2, we construct an improved approximation algorithm for a classical NP-complete problem, the bin packing problem. In this problem, the goal is to pack items of sizes s_i ∈ [0,1] into as few bins as possible, where a set of items fits into a bin provided the sum of the item sizes is at most one. We give a polynomial-time rounding scheme for a standard linear programming relaxation of the problem, yielding a packing that uses at most OPT + O(log OPT) bins. This makes progress towards one of the "10 open problems in approximation algorithms" stated in the book of Shmoys and Williamson. In fact, based on related combinatorial lower bounds, Rothvoss conjectures that Θ(log OPT) may be a tight bound on the additive integrality gap of this LP relaxation. In Chapter 3, we give a new polynomial-time algorithm for linear programming. Our algorithm is based on the multiplicative weights update (MWU) method, which is a general framework that is currently of great interest in theoretical computer science. An algorithm for linear programming based on MWU was known previously, but was not polynomial time--we remedy this by alternating between a MWU phase and a rescaling phase. The rescaling methods we introduce improve upon previous methods by reducing the number of iterations needed until one can rescale, and they can be used for any algorithm with a similar rescaling structure. Finally, we note that the MWU phase of the algorithm has a simple interpretation as gradient descent of a particular potential function, and we show we can speed up this phase by walking in a direction that decreases both the potential function and its gradient. In Chapter 4, we show that an approximate oracle for Minkowski's Theorem gives an approximate oracle for the number balancing problem (NBP), and conversely.
    Number balancing is the problem of minimizing |⟨a,x⟩| over x ∈ {-1,0,1}^n \ {0}, given a ∈ [0,1]^n. While an application of the pigeonhole principle shows that there always exists an x with |⟨a,x⟩| ≤ O(√n/2^n), the best known algorithm only guarantees |⟨a,x⟩| ≤ 2^(-Θ(log² n)). We show that an oracle for Minkowski's Theorem with approximation factor ρ would give an algorithm for NBP that guarantees |⟨a,x⟩| ≤ 2^(-n^Θ(1/ρ)). In particular, this would beat the bound of Karmarkar and Karp provided ρ ≤ O(log n/log log n). In the other direction, we prove that any polynomial-time algorithm for NBP that guarantees a solution of difference at most 2^(√n)/2^n would give a polynomial approximation for Minkowski's Theorem as well as a polynomial-factor approximation algorithm for the Shortest Vector Problem.
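
    For context, the classical first-fit-decreasing heuristic below is the standard baseline for bin packing, guaranteeing roughly (11/9)·OPT + 1 bins; it is not the thesis's LP-rounding scheme, which achieves the much stronger OPT + O(log OPT).

```python
def first_fit_decreasing(sizes):
    """Classical first-fit-decreasing heuristic for bin packing:
    sort items by decreasing size, then place each item into the first
    open bin it fits in, opening a new bin when none fits.
    (A baseline for comparison, not the LP-rounding scheme above.)"""
    capacity = []                           # remaining capacity per open bin
    packing = []                            # items placed in each bin
    for s in sorted(sizes, reverse=True):
        for i, cap in enumerate(capacity):
            if s <= cap + 1e-12:            # tolerance for float round-off
                capacity[i] = cap - s
                packing[i].append(s)
                break
        else:                               # no open bin fits: open a new one
            capacity.append(1.0 - s)
            packing.append([s])
    return packing

packing = first_fit_decreasing([0.5, 0.7, 0.5, 0.2, 0.4, 0.2, 0.5])
```

    On this instance the heuristic opens four bins; closing the remaining gap between such additive-constant-times-OPT guarantees and OPT + O(log OPT) is exactly what the LP-rounding analysis addresses.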

  7. Assessment of laboratory logistics management information system practice for HIV/AIDS and tuberculosis laboratory commodities in selected public health facilities in Addis Ababa, Ethiopia.

    PubMed

    Desale, Adino; Taye, Bineyam; Belay, Getachew; Nigatu, Alemayehu

    2013-01-01

    Logistics management information systems (LMIS) for health commodities remain poorly implemented in most developing countries. The aim of this study was to assess the status of the laboratory logistics management information system for HIV/AIDS and tuberculosis laboratory commodities in public health facilities in Addis Ababa. A cross-sectional descriptive study was conducted from September 2010 to January 2011 at selected public health facilities. A stratified random sampling method was used to include a total of 43 facilities, which were investigated through quantitative methods using structured questionnaire interviews. Focus group discussions with the designated supply chain managers and key informant interviews were conducted for the qualitative method. There exists a well-designed logistics system for laboratory commodities, with trained pharmacy personnel, distributed standard LMIS formats and established inventory control procedures. However, the majority of laboratory professionals were not trained in LMIS. The majority of the facilities (60.5%) were stocked out of at least one ART monitoring or TB laboratory reagent, and the highest stock-out rate was for chemistry reagents. Expired ART monitoring laboratory commodities were found in 25 (73.5%) of the facilities. Fifty percent (50%) of the assessed hospitals and 54% of health centers were currently using stock/bin cards for all HIV/AIDS and TB laboratory commodities in the main pharmacy store; among these, only 25% and 20.8%, respectively, were updated with accurate information matching the physical count done at the time of the visit. Even though a well-designed laboratory LMIS exists, the keeping of quality stock/bin cards and LMIS reports was very low. Key ART monitoring laboratory commodities were stocked out at many facilities on the day of the visit and during the past six months.
    Based on these findings, training of the laboratory personnel managing laboratory commodities and keeping accurate inventory control procedures are recommended.

  8. Bayesian modelling of uncertainties of Monte Carlo radiative-transfer simulations

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Eggers, Hans C.; Kerzendorf, Wolfgang E.

    2018-07-01

    One of the big challenges in astrophysics is the comparison of complex simulations to observations. As many codes do not directly generate observables (e.g. hydrodynamic simulations), the last step in the modelling process is often a radiative-transfer treatment. For this step, the community relies increasingly on Monte Carlo radiative transfer due to the ease of implementation and scalability with computing power. We consider simulations in which the number of photon packets is Poisson distributed, while the weight assigned to a single photon packet follows any distribution of choice. We show how to estimate the statistical uncertainty of the sum of weights in each bin from the output of a single radiative-transfer simulation. Our Bayesian approach produces a posterior distribution that is valid for any number of packets in a bin, even zero packets, and is easy to implement in practice. Our analytic results for a large number of packets show that we generalize existing methods that are valid only in limiting cases. The statistical problem considered here appears in identical form in a wide range of Monte Carlo simulations, including particle physics and importance sampling. It is particularly powerful for extracting information when the available data are sparse or quantities are small.
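
    The conventional estimator that the Bayesian treatment generalizes sums the packet weights per bin and quotes √(Σ w_i²) as the uncertainty. The sketch below shows that estimator; it degenerates for bins with zero or one packets, which is exactly the sparse regime the posterior approach handles. Variable names are illustrative.

```python
import numpy as np

def binned_weight_sums(energies, weights, edges):
    """Per-bin sum of packet weights with the usual frequentist error
    estimate sqrt(sum w_i^2). For an empty bin both numbers are zero,
    giving no uncertainty information -- the failure mode the Bayesian
    posterior described above avoids."""
    idx = np.digitize(energies, edges) - 1
    n_bins = len(edges) - 1
    sums = np.zeros(n_bins)
    errs = np.zeros(n_bins)
    for b in range(n_bins):
        w = weights[idx == b]
        sums[b] = w.sum()
        errs[b] = np.sqrt((w ** 2).sum())
    return sums, errs

rng = np.random.default_rng(2)
e = rng.random(1000)                            # packet "energies" in [0, 1)
w = rng.exponential(scale=1.0, size=1000)       # arbitrary weight distribution
sums, errs = binned_weight_sums(e, w, np.linspace(0.0, 1.0, 11))
```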

  9. Engagement Assessment Using EEG Signals

    NASA Technical Reports Server (NTRS)

    Li, Feng; Li, Jiang; McKenzie, Frederic; Zhang, Guangfan; Wang, Wei; Pepe, Aaron; Xu, Roger; Schnell, Thomas; Anderson, Nick; Heitkamp, Dean

    2012-01-01

    In this paper, we present methods to analyze and improve an EEG-based engagement assessment approach, consisting of data preprocessing, feature extraction and engagement state classification. During data preprocessing, spikes, baseline drift and saturation caused by recording devices in EEG signals are identified and eliminated, and a wavelet-based method is utilized to remove ocular and muscular artifacts in the EEG recordings. In feature extraction, power spectral densities with 1 Hz bins are calculated as features, and these features are analyzed using the Fisher score and one-way ANOVA. In the classification step, a committee classifier is trained based on the extracted features to assess engagement status. Finally, experimental results showed that significant differences exist in the extracted features among different subjects; we therefore implemented a feature normalization procedure that mitigates these differences and significantly improves engagement assessment performance.
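
    The 1 Hz-binned spectral features can be sketched by aggregating a raw periodogram into 1 Hz bands. The exact windowing and artifact-removal pipeline of the paper is not reproduced, and the sampling rate below is an assumption.

```python
import numpy as np

def one_hz_band_power(x, fs, f_max):
    """Aggregate a raw periodogram into 1 Hz bins, a simple stand-in
    for the power-spectral-density features described above."""
    x = np.asarray(x, dtype=float)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    power = np.abs(np.fft.rfft(x)) ** 2 / (fs * x.size)
    edges = np.arange(0, f_max + 1)                # 1 Hz edges: 0, 1, ..., f_max
    idx = np.digitize(freqs, edges) - 1
    return np.array([power[idx == b].sum() for b in range(f_max)])

fs = 256                                           # assumed EEG sampling rate (Hz)
t = np.arange(0.0, 4.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10.0 * t)                 # pure 10 Hz "alpha" tone
feat = one_hz_band_power(eeg, fs, f_max=30)
```

    A pure 10 Hz tone concentrates its power in the [10, 11) Hz feature bin, which is the behaviour a band-power feature vector relies on.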

  10. Palm-Vein Classification Based on Principal Orientation Features

    PubMed Central

    Zhou, Yujia; Liu, Yaqin; Feng, Qianjin; Yang, Feng; Huang, Jing; Nie, Yixiao

    2014-01-01

    Personal recognition using palm-vein patterns has emerged as a promising alternative for human recognition because of its uniqueness, stability, live-body identification, flexibility, and difficulty to counterfeit. With the expanding application of palm-vein pattern recognition, the corresponding growth of the database has resulted in a long response time. To shorten the response time of identification, this paper proposes a simple and useful classification scheme for palm-vein identification based on principal direction features. In the registration process, the Gaussian-Radon transform is adopted to extract the orientation matrix and then compute the principal direction of a palm-vein image based on the orientation matrix. The database can be classified into six bins based on the value of the principal direction. In the identification process, the principal direction of the test sample is first extracted to ascertain the corresponding bin. One-by-one matching with the training samples is then performed within the bin. To improve recognition efficiency while maintaining recognition accuracy, the two neighborhood bins of the corresponding bin are additionally searched to identify the input palm-vein image. Evaluation experiments are conducted on three different databases, namely, PolyU, CASIA, and the database of this study. Experimental results show that the searching range of one test sample in the PolyU, CASIA and our databases by the proposed method for palm-vein identification can be reduced to 14.29%, 14.50%, and 14.28%, with retrieval accuracies of 96.67%, 96.00%, and 97.71%, respectively. With 10,000 training samples in the database, the execution time of the identification process by the traditional method is 18.56 s, while that by the proposed approach is 3.16 s. The experimental results confirm that the proposed approach is more efficient than the traditional method, especially for a large database. PMID:25383715
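
    The bin assignment and neighbourhood search described above can be sketched as follows; treating the principal direction as an orientation in [0, 180) degrees split into six equal bins is our assumption about the paper's convention.

```python
import numpy as np

def direction_bin(angle_deg, n_bins=6):
    """Map a principal-direction angle to one of n_bins equal bins.

    Orientation is head-less, so the angle is reduced modulo 180
    degrees (an assumed convention for this sketch)."""
    return int(angle_deg % 180.0 // (180.0 / n_bins))

def search_bins(angle_deg, n_bins=6):
    """Bin of the test sample plus its two neighbouring bins -- the
    range actually searched during identification."""
    b = direction_bin(angle_deg, n_bins)
    return [(b - 1) % n_bins, b, (b + 1) % n_bins]

bins_to_search = search_bins(95.0)   # test sample with principal direction 95 deg
```

    Searching three of six bins halves the candidate set at worst; in the paper's experiments the realised search range is about 14% because the database is unevenly distributed over the bins.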

  11. Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm

    PubMed Central

    Hashimoto, Koichi

    2017-01-01

    Bin picking refers to picking randomly piled objects from a bin for industrial production purposes, and robotic bin picking is widely used in automated assembly lines. In order to achieve higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly piled parts. This paper proposes a pose estimation algorithm for bin-picking tasks using point cloud data. A novel descriptor, the Curve Set Feature (CSF), is proposed to describe a point by the surface fluctuation around that point; it is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSFs efficiently. The matching process combines the idea of matching in the 2D space of the original Point Pair Feature (PPF) algorithm with nearest-neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and proved to be more than 30 times faster than the kd-tree-based verification method. Our algorithm is evaluated against a large number of synthetic and real scenes and proven to be robust to noise, able to detect metal parts, and both more accurate and more than 10 times faster than PPF and Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216

  12. Identification of Intensity Ratio Break Points from Photon Arrival Trajectories in Ratiometric Single Molecule Spectroscopy

    PubMed Central

    Bingemann, Dieter; Allen, Rachel M.

    2012-01-01

    We describe a model-free statistical method for analyzing dual-channel photon arrival trajectories from single-molecule spectroscopy to identify break points in the intensity ratio. Photons are binned with a short bin size to calculate the logarithm of the intensity ratio for each bin. Stochastic photon-counting noise leads to a near-normal distribution of this logarithm, and the standard Student's t-test is used to find statistically significant changes in this quantity. In stochastic simulations we determine the significance threshold for the t-test's p-value at a given level of confidence. We test the method's sensitivity and accuracy, showing that the analysis reliably locates break points with significant changes in the intensity ratio, with little or no error, in realistic trajectories containing large numbers of change points, while still identifying a large fraction of the frequent break points with small intensity changes. Based on these results we present an approach to estimating confidence intervals for the identified break-point locations and recommend a bin size to choose for the analysis. The method proves powerful and reliable in the analysis of simulated and actual data of single-molecule reorientation in a glassy matrix. PMID:22837704
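
    A minimal sketch of the scan: compute the log intensity ratio per bin, then pick the candidate break point maximising Welch's t statistic between the two sides. The significance threshold is calibrated by stochastic simulation in the paper and is not reproduced here; function names and the synthetic data are our own.

```python
import numpy as np

def welch_t(a, b):
    """Welch's two-sample t statistic (no p-value lookup here; the
    paper calibrates the significance threshold by simulation)."""
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

def best_break(log_ratio, min_side=5):
    """Candidate break point maximising |t| between the two sides."""
    n = log_ratio.size
    t = [abs(welch_t(log_ratio[:i], log_ratio[i:]))
         for i in range(min_side, n - min_side)]
    return min_side + int(np.argmax(t))

rng = np.random.default_rng(3)
# Synthetic per-bin log intensity ratios with a jump at bin 60
lr = np.concatenate([rng.normal(0.0, 0.1, 60), rng.normal(0.5, 0.1, 40)])
i_hat = best_break(lr)
```

    In practice the scan is applied recursively to each segment until no further candidate exceeds the simulated threshold.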

  13. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
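
    The quantile-binning approach can be sketched as follows: discretize the exposure into deciles and form stabilized weights P(bin)/P(bin | covariates). Here the conditional probabilities are empirical frequencies within a single binary stratum, a toy stand-in for the regression model one would fit in practice; all names are illustrative.

```python
import numpy as np

def quantile_bin_weights(exposure, strata, n_bins=10):
    """Stabilized inverse probability weights from quantile-binned
    exposure: P(bin) / P(bin | stratum). Conditional probabilities are
    empirical within-stratum frequencies (a toy version of the model
    one would fit in practice)."""
    edges = np.quantile(exposure, np.linspace(0.0, 1.0, n_bins + 1))
    idx = np.clip(np.digitize(exposure, edges[1:-1]), 0, n_bins - 1)
    p_marg = np.bincount(idx, minlength=n_bins) / idx.size
    w = np.empty(exposure.size)
    for z in np.unique(strata):
        m = strata == z
        p_cond = np.bincount(idx[m], minlength=n_bins) / m.sum()
        w[m] = p_marg[idx[m]] / p_cond[idx[m]]
    return w

rng = np.random.default_rng(4)
z = rng.integers(0, 2, 5000)                     # binary covariate (stratum)
x = rng.normal(loc=z.astype(float), scale=1.0)   # exposure depends on stratum
w = quantile_bin_weights(x, z)
```

    Stabilized weights of this form average to 1, and the binning sidesteps both the parametric-form and the nonconstant-variance issues mentioned above, which is why the approach is robust for heteroscedastic exposures.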

  14. Accelerated simulation of stochastic particle removal processes in particle-resolved aerosol models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis, J.H.; Michelotti, M.D.; Riemer, N.

    2016-10-01

    Stochastic particle-resolved methods have proven useful for simulating multi-dimensional systems such as composition-resolved aerosol size distributions. While particle-resolved methods have substantial benefits for highly detailed simulations, these techniques suffer from high computational cost, motivating efforts to improve their algorithmic efficiency. Here we formulate an algorithm for accelerating particle removal processes by aggregating particles of similar size into bins. We present the Binned Algorithm for particle removal processes and analyze its performance with application to the atmospherically relevant process of aerosol dry deposition. We show that the Binned Algorithm can dramatically improve the efficiency of particle removals, particularly for low removal rates, and that computational cost is reduced without introducing additional error. In simulations of aerosol particle removal by dry deposition under atmospherically relevant conditions, we demonstrate an approximately 50-fold increase in algorithm efficiency.
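
    The idea can be sketched for one removal step: within each size bin, draw the number of removal candidates from the bin's maximum rate and then thin by the true per-particle rate. This preserves the exact per-particle removal probabilities while avoiding a test of every particle in bins where removal is rare. The rate law and bin count below are illustrative assumptions, not the paper's dry-deposition parameterisation.

```python
import numpy as np

def binned_removal(diam, rate, dt, n_bins=20, rng=None):
    """One step of stochastic removal using size bins.

    Per bin: number of candidates ~ Binomial(n, p_max) with the bin's
    maximum removal probability, then accept-reject against the true
    per-particle probability. Returns a mask of surviving particles."""
    rng = np.random.default_rng() if rng is None else rng
    keep = np.ones(diam.size, dtype=bool)
    edges = np.geomspace(diam.min(), diam.max() * (1 + 1e-12), n_bins + 1)
    idx = np.clip(np.digitize(diam, edges) - 1, 0, n_bins - 1)
    for b in range(n_bins):
        members = np.flatnonzero(idx == b)
        if members.size == 0:
            continue
        r = rate(diam[members])
        p_max = 1.0 - np.exp(-r.max() * dt)          # majorant removal prob.
        n_cand = rng.binomial(members.size, p_max)
        cand = rng.choice(members, size=n_cand, replace=False)
        p_true = 1.0 - np.exp(-rate(diam[cand]) * dt)
        keep[cand[rng.random(n_cand) < p_true / p_max]] = False
    return keep

rng = np.random.default_rng(5)
d = rng.uniform(0.1, 10.0, 100000)                    # particle diameters
keep = binned_removal(d, rate=lambda x: 0.01 * x, dt=1.0, n_bins=30, rng=rng)
frac_removed = 1.0 - keep.mean()
```

    The two-stage draw (candidate with probability p_max, accept with p_true/p_max) removes each particle with exactly p_true, so no additional error is introduced, matching the paper's claim.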

  15. SU-F-I-08: CT Image Ring Artifact Reduction Based On Prior Image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, C; Qi, H; Chen, Z

    Purpose: In a computed tomography (CT) system, CT images with ring artifacts are reconstructed when some adjacent bins of the detector do not work. The ring artifacts severely degrade CT image quality. We present a CT ring artifact reduction method based on projection data correction, aiming at estimating the missing projection data accurately and thus removing the ring artifacts from CT images. Methods: The method consists of ten steps: 1) identify the abnormal pixel line in the projection sinogram; 2) linearly interpolate within the pixel line of the projection sinogram; 3) reconstruct an FBP image using the interpolated projection data; 4) filter the FBP image using a mean filter; 5) forward-project the filtered FBP image; 6) subtract the forwarded projection from the original projection; 7) linearly interpolate the abnormal pixel line area in the subtraction projection; 8) add the interpolated subtraction projection to the forwarded projection; 9) reconstruct an FBP image using the corrected projection data; 10) return to step 4 until the pre-set iteration number is reached. The method is validated on simulated and real data to restore missing projection data and reconstruct ring artifact-free CT images. Results: We studied the impact of the number of dead detector bins on the accuracy of missing data estimation in the projection sinogram. For the simulated case with a 256 by 256 Shepp-Logan phantom, three iterations are sufficient to restore the projection data and reconstruct ring artifact-free images when the fraction of dead bins is under 30%. The dead-bin-induced artifacts are substantially reduced. More iterations are needed to reconstruct satisfactory images as the fraction of dead bins increases. Similar results were found for a real head phantom case. Conclusion: A practical CT image ring artifact correction scheme based on projection data correction is developed. This method can produce ring artifact-free CT images feasibly and effectively.
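Steps 1-2 of the scheme (dead-bin detection and linear interpolation in the sinogram) can be sketched as below; the FBP reconstruction and forward projection of steps 3-10 would come from a CT toolbox and are omitted. The toy sinogram and the zero-column dead-bin criterion are assumptions for illustration.

```python
import numpy as np

# Flag dead detector bins as anomalous (here: all-zero) sinogram columns
# and fill them by linear interpolation from their working neighbours.
def interpolate_dead_bins(sino):
    dead = sino.mean(axis=0) == 0.0        # simple dead-bin criterion
    cols = np.arange(sino.shape[1])
    for row in sino:                       # one projection angle per row
        row[dead] = np.interp(cols[dead], cols[~dead], row[~dead])
    return sino, dead

angles, n_bins = 180, 64
true = np.tile(0.5 + np.hanning(n_bins), (angles, 1))  # smooth toy sinogram
sino = true.copy()
sino[:, [20, 21, 40]] = 0.0                # simulate dead detector bins
fixed, dead = interpolate_dead_bins(sino)
print(np.abs(fixed - true).max() < 0.01)   # -> True: near-perfect fill
```

On a smooth sinogram the linear fill is already close; the paper's iterative correction refines exactly this kind of estimate using the reprojected image.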

  16. How to improve the comfort of Kesawan Heritage Corridor, Medan City

    NASA Astrophysics Data System (ADS)

    Tegar; Ginting, Nurlisa; Suwantoro, H.

    2018-03-01

    Comfort is indispensable to making a neighborhood or city friendly, especially the comfort provided by the infrastructure of a corridor. People must be able to feel comfortable in order to act naturally in their physical environment, and the existing infrastructure must be able to support Kesawan as a historic district. Kesawan is an area filled with many unique buildings; without comfort, however good the existing architecture is, it cannot be enjoyed, and this also affects the identity of a region or city. The aim of this research is to re-design the public facilities of the Kesawan corridor from the comfort aspect: orientation, traffic calming, vegetation, signage, public facilities (toilets, seating, bus stops, bins), information center, parking and pedestrian paths. The design concept is translated into design criteria. This research uses qualitative methods. Some facilities in this corridor are unsuitable and some are not available at all, so improvements and additions to the existing facilities are needed. It is expected that by upgrading the existing facilities, visitors who come to Kesawan will be able to enjoy the area more, making Medan a friendlier city.

  17. Development of a two-dimensional binning model for N{sub 2}–N relaxation in hypersonic shock conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, Tong, E-mail: tongzhu2@illinois.edu; Levin, Deborah A., E-mail: deblevin@illinois.edu; Li, Zheng, E-mail: zul107@psu.edu

    2016-08-14

    A high fidelity internal energy relaxation model for N{sub 2}–N suitable for use in direct simulation Monte Carlo (DSMC) modeling of chemically reacting flows is proposed. A novel two-dimensional binning approach with variable bin energy resolutions in the rotational and vibrational modes is developed for treating the internal mode of N{sub 2}. Both bin-to-bin and state-specific relaxation cross sections are obtained using the molecular dynamics/quasi-classical trajectory (MD/QCT) method with two potential energy surfaces as well as the state-specific database of Jaffe et al. The MD/QCT simulations of inelastic energy exchange between N{sub 2} and N show that there is a strong forward-preferential scattering behavior at high collision velocities. The 99 bin model is used in homogeneous DSMC relaxation simulations and is found to be able to recover the state-specific master equation results of Panesi et al. when the Jaffe state-specific cross sections are used. Rotational relaxation energy profiles and relaxation times obtained using the ReaxFF and Jaffe potential energy surfaces (PESs) are in general agreement but there are larger differences between the vibrational relaxation times. These differences become smaller as the translational temperature increases because the difference in the PES energy barrier becomes less important.
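A toy sketch of two-dimensional internal-energy binning with a different resolution per mode, using made-up energies and bin counts rather than the paper's 99-bin model:

```python
import numpy as np

# Each molecule's (rotational, vibrational) energy pair is assigned to a
# 2D bin, with coarser resolution in the vibrational mode as an example
# of variable bin resolution. All energies and counts are invented.
rng = np.random.default_rng(4)
e_rot = rng.exponential(1.0, 10_000)   # eV, toy rotational energies
e_vib = rng.exponential(2.0, 10_000)   # eV, toy vibrational energies

rot_edges = np.linspace(0, 8, 17)      # fine: 16 rotational bins
vib_edges = np.linspace(0, 12, 7)      # coarse: 6 vibrational bins

counts, _, _ = np.histogram2d(e_rot, e_vib, bins=[rot_edges, vib_edges])
print(counts.shape)                    # -> (16, 6)
```

Bin-to-bin relaxation cross sections would then be tabulated between pairs of these 2D bins rather than between individual quantum states.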

  18. Temporal entanglement and quantum communication (Intrication temporelle et communication quantique)

    NASA Astrophysics Data System (ADS)

    Bussieres, Felix

    Quantum communication is the art of transferring a quantum state from one place to another and the study of tasks that can be accomplished with it. This thesis is devoted to the development of tools and tasks for quantum communication in a real-world setting. These were implemented using an underground optical fibre link deployed in an urban environment. The technological and theoretical innovations presented here broaden the range of applications of time-bin entanglement through new methods of manipulating time-bin qubits, a novel model for characterizing sources of photon pairs, new ways of testing non-locality and the design and the first implementation of a new loss-tolerant quantum coin-flipping protocol. Manipulating time-bin qubits. A single photon is an excellent vehicle in which a qubit, the fundamental unit of quantum information, can be encoded. In particular, the time-bin encoding of photonic qubits is well suited for optical fibre transmission. Before this thesis, the applications of quantum communication based on the time-bin encoding were limited due to the lack of methods to implement arbitrary operations and measurements. We have removed this restriction by proposing the first methods to realize arbitrary deterministic operations on time-bin qubits as well as single qubit measurements in an arbitrary basis. We applied these propositions to the specific case of optical measurement-based quantum computing and showed how to implement the feedforward operations, which are essential to this model. This therefore opens new possibilities for creating an optical quantum computer, but also for other quantum communication tasks. Characterizing sources of photon pairs. Experimental quantum communication requires the creation of single photons and entangled photons. These two ingredients can be obtained from a source of photon pairs based on non-linear spontaneous processes. 
Several tasks in quantum communication require a precise knowledge of the properties of the source being used. We developed and implemented a fast and simple method to characterize a source of photon pairs. This method is well suited for a realistic setting where experimental conditions, such as channel transmittance, may fluctuate, and for which the characterization of the source has to be done in real time. Testing the non-locality of time-bin entanglement. Entanglement is a resource needed for the realization of many important tasks in quantum communication. It also allows two physical systems to be correlated in a way that cannot be explained by classical physics; this manifestation of entanglement is called non-locality. We built a source of time-bin entangled photonic qubits and characterized it with the new methods implementing arbitrary single qubit measurements that we developed. This allowed us to reveal the non-local nature of our source of entanglement in ways that were never implemented before. It also opens the door to studying previously untested features of non-locality using this source. These experiments were performed in a realistic setting where quantum (non-local) correlations were observed even after transmission of one of the entangled qubits over 12.4 km of underground optical fibre. Flipping quantum coins. Quantum coin-flipping is a quantum cryptographic primitive, proposed in 1984 when the very first steps of quantum communication were being taken, in which two players alternate in sending classical and quantum information in order to generate a shared random bit. The use of quantum information is such that a potential cheater cannot force the outcome to his choice with certainty. Classically, however, one of the players can always deterministically choose the outcome. 
Unfortunately, the security of all previous quantum coin-flipping protocols is seriously compromised in the presence of losses on the transmission channel, thereby making this task impractical. We found a solution to this problem and obtained the first loss-tolerant quantum coin-flipping protocol whose security is independent of the amount of the losses. We have also experimentally demonstrated our loss-tolerant protocol using our source of time-bin entanglement combined with our arbitrary single qubit measurement methods. This experiment took place in a realistic setting where qubits travelled over an underground optical fibre link. This new task thus joins quantum key distribution as a practical application of quantum communication. Keywords. quantum communication, photonics, time-bin encoding, source of photon pairs, heralded single photon source, entanglement, non-locality, time-bin entanglement, hybrid entanglement, quantum network, quantum cryptography, quantum coin-flipping, measurement-based quantum computation, telecommunication, optical fibre, nonlinear optics.

  19. Consistency Check for the Bin Packing Constraint Revisited

    NASA Astrophysics Data System (ADS)

    Dupuis, Julien; Schaus, Pierre; Deville, Yves

    The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.
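As a concrete illustration of the problem statement (not of the paper's consistency check), the classic first-fit decreasing heuristic gives an upper bound on the minimum number of bins:

```python
# First-fit decreasing: sort items by size, place each into the first
# bin with room, opening a new bin when none fits. A classic upper-bound
# heuristic for the bin packing problem described above.
def first_fit_decreasing(items, capacity):
    free = []                              # remaining capacity per bin
    for size in sorted(items, reverse=True):
        for i, room in enumerate(free):
            if size <= room:
                free[i] = room - size
                break
        else:
            free.append(capacity - size)   # open a new bin
    return len(free)

print(first_fit_decreasing([5, 4, 3, 3, 2, 2, 1], 10))  # -> 2
```

Constraint-programming propagators for the bin packing constraint prune the search space between such upper bounds and lower bounds like the ceiling of total size over capacity.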

  20. 40 CFR 63.9882 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...

  1. 40 CFR 63.9882 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...

  2. 40 CFR 63.9882 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...

  3. 40 CFR 63.9882 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...

  4. 40 CFR 63.9882 - What parts of my plant does this subpart cover?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CATEGORIES (CONTINUED) National Emissions Standards for Hazardous Air Pollutants for Primary Magnesium... affected sources are each new and existing primary magnesium refining facility. (b) This subpart covers emissions from each spray dryer stack, magnesium chloride storage bins scrubber stack, melt/reactor system...

  5. Comparison of recycling outcomes in three types of recycling collection units.

    PubMed

    Andrews, Ashley; Gregoire, Mary; Rasmussen, Heather; Witowich, Gretchen

    2013-03-01

    Commercial institutions have many factors to consider when implementing an effective recycling program. This study examined the effectiveness of three different types of recycling bins on recycling accuracy by determining the percent weight of recyclable material placed in the recycling bins, comparing the percent weight of recyclable material by type of container used, and examining whether a change in signage increased recycling accuracy. Data were collected over 6 weeks, totaling 30 days, from 3 different recycling bin types at a Midwest university medical center. Five bin locations for each bin type were used. Bags from these bins were collected, sorted into recyclable and non-recyclable material, and weighed. The percent recyclable material was calculated using these weights. Common contaminants found in the bins were napkins and paper towels, plastic food wrapping, plastic bags, and coffee cups. The results showed a significant difference in percent recyclable material between bin types and bin locations. One location of bin type 2 was found to be statistically different (p=0.048), which may have been due to the lack of a trash bin next to the recycling bin at that location. Bin type 3 had significantly lower percent recyclable material (p<0.001), which may have been due to the lack of a trash bin next to the recycling bin and increased contamination from combining commingled and paper recyclables into one bag. There was no significant change in percent recyclable material in recycling bins after the signage change. These results suggest that a signage change alone may not be an effective way to increase recycling compliance and accuracy. This study showed that two- or three-compartment bins located next to a trash bin may be the best bin type for recycling accuracy. Copyright © 2012 Elsevier Ltd. All rights reserved.

  6. Monitoring household waste recycling centres performance using mean bin weight analyses.

    PubMed

    Maynard, Sarah; Cherrett, Tom; Waterson, Ben

    2009-02-01

    This paper describes a modelling approach used to investigate the significance of key factors (vehicle type, compaction type, site design, temporal effects) in influencing the variability in observed nett amenity bin weights produced by household waste recycling centres (HWRCs). This new method can help to quickly identify sites that are producing significantly lighter bins, enabling detailed back-end analyses to be efficiently targeted and best practice in HWRC operation identified. Tested on weigh ticket data from nine HWRCs across West Sussex, UK, the model suggests that compaction technique, vehicle type, month and site design explained 76% of the variability in the observed nett amenity weights. For each factor, a weighting coefficient was calculated to generate a predicted nett weight for each bin transaction and three sites were subsequently identified as having similar characteristics but returned significantly different mean nett bin weights. Waste and site audits were then conducted at the three sites to try and determine the possible sources of the remaining variability. Significant differences were identified in the proportions of contained waste (bagged), wood, and dry recyclables entering the amenity waste stream, particularly at one site where significantly less contaminated waste and dry recyclables were observed.
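A minimal sketch of the weighting-coefficient idea described above: dummy-code the categorical factors and fit net bin weight by ordinary least squares, so the fitted coefficients generate a predicted net weight per transaction. The factor set is reduced (compaction plus three sites) and every value is invented.

```python
import numpy as np

# Dummy-coded factors -> least-squares weighting coefficients ->
# predicted net bin weight, whose residuals can flag sites that are
# producing unusually light bins. All numbers are illustrative.
rng = np.random.default_rng(2)
n = 500
compaction = rng.integers(0, 2, n)            # compactor fitted or not
site = rng.integers(0, 3, n)                  # three sites
X = np.column_stack([np.ones(n), compaction,
                     site == 1, site == 2]).astype(float)
true_beta = np.array([4.0, 1.5, -0.3, 0.6])   # tonnes, invented
weights_obs = X @ true_beta + rng.normal(0, 0.2, n)

beta, *_ = np.linalg.lstsq(X, weights_obs, rcond=None)
predicted = X @ beta                   # predicted net weight per bin
residual = weights_obs - predicted     # flags unusually light/heavy bins
print(beta.shape)                      # -> (4,)
```

Sites whose mean residual is strongly negative would be candidates for the detailed waste and site audits the paper describes.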

  7. The Application of DNA Barcodes for the Identification of Marine Crustaceans from the North Sea and Adjacent Regions

    PubMed Central

    Raupach, Michael J.; Barco, Andrea; Steinke, Dirk; Beermann, Jan; Laakmann, Silke; Mohrbeck, Inga; Neumann, Hermann; Kihara, Terue C.; Pointner, Karin; Radulovici, Adriana; Segelken-Voigt, Alexandra; Wesse, Christina; Knebelsberger, Thomas

    2015-01-01

    In recent years DNA barcoding has become a popular method of choice for molecular specimen identification. Here we present a comprehensive DNA barcode library of various crustacean taxa found in the North Sea, one of the most extensively studied marine regions of the world. Our data set includes 1,332 barcodes covering 205 species, including taxa of the Amphipoda, Copepoda, Decapoda, Isopoda, Thecostraca, and others. This dataset represents the most extensive DNA barcode library of the Crustacea in terms of species number to date. By using the Barcode of Life Data Systems (BOLD), unique BINs were identified for 198 (96.6%) of the analyzed species. Six species were characterized by two BINs (2.9%), and three BINs were found for the amphipod species Gammarus salinus Spooner, 1947 (0.4%). Intraspecific distances with values higher than 2.2% were revealed for 13 species (6.3%). Exceptionally high distances of up to 14.87% between two distinct but monophyletic clusters were found for the parasitic copepod Caligus elongatus Nordmann, 1832, supporting the results of previous studies that indicated the existence of an overlooked sea louse species. In contrast to these high distances, haplotype-sharing was observed for two decapod spider crab species, Macropodia parva Van Noort & Adema, 1985 and Macropodia rostrata (Linnaeus, 1761), underlining the need for a taxonomic revision of both species. Summarizing the results, our study confirms the application of DNA barcodes as a highly effective identification system for the analyzed marine crustaceans of the North Sea and represents an important milestone for modern biodiversity assessment studies using barcode sequences. PMID:26417993

  8. Low-Light Image Enhancement Using Adaptive Digital Pixel Binning

    PubMed Central

    Yoo, Yoonjong; Im, Jaehyun; Paik, Joonki

    2015-01-01

    This paper presents an image enhancement algorithm for low-light scenes in an environment with insufficient illumination. Simple amplification of intensity exhibits various undesired artifacts: noise amplification, intensity saturation, and loss of resolution. In order to enhance low-light images without undesired artifacts, a novel digital binning algorithm is proposed that considers brightness, context, noise level, and anti-saturation of a local region in the image. The proposed algorithm does not require any modification of the image sensor or additional frame-memory; it needs only two line-memories in the image signal processor (ISP). Since the proposed algorithm does not use an iterative computation, it can be easily embedded in an existing digital camera ISP pipeline containing a high-resolution image sensor. PMID:26121609
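A uniform (non-adaptive) 2x2 digital binning step illustrates the underlying trade-off the paper's adaptive algorithm exploits: averaging neighbours before amplification suppresses noise at the cost of resolution. Everything here is a toy example, not the proposed ISP algorithm.

```python
import numpy as np

# Compare plain gain against gain-after-binning on a dark, noisy image.
rng = np.random.default_rng(3)
scene = np.full((64, 64), 10.0)                   # dark flat scene
img = scene + rng.normal(0, 2.0, scene.shape)     # sensor noise

gain = 4.0
amplified = gain * img                            # amplifies noise too
# average each 2x2 neighbourhood, then amplify: noise std halves
binned = gain * img.reshape(32, 2, 32, 2).mean(axis=(1, 3))

print(amplified.std() > binned.std())             # -> True
```

The paper's contribution is deciding per region how much of this averaging to apply, based on local brightness, context, noise level, and saturation.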

  9. Emission of greenhouse gases from home aerobic composting, anaerobic digestion and vermicomposting of household wastes in Brisbane (Australia).

    PubMed

    Chan, Yiu C; Sinha, Rajiv K; Weijin Wang

    2011-05-01

    This study investigated greenhouse gas (GHG) emissions from three different home waste treatment methods in Brisbane, Australia. Gas samples were taken monthly from 34 backyard composting bins from January to April 2009. Averaged over the study period, the aerobic composting bins released lower amounts of CH(4) (2.2 mg m(-2) h(-1)) than the anaerobic digestion bins (9.5 mg m(-2) h(-1)) and the vermicomposting bins (4.8 mg m(-2) h(-1)). The vermicomposting bins had lower N(2)O emission rates (1.2 mg m(-2) h(-1)) than the others (1.5-1.6 mg m(-2) h(-1)). Total GHG emissions including both N(2)O and CH(4) were 463, 504 and 694 mg CO(2)-e m(-2) h(-1) for vermicomposting, aerobic composting and anaerobic digestion, respectively, with N(2)O contributing >80% of the total budget. The GHG emissions varied substantially with time and were regulated by temperature, moisture content and the waste properties, indicating the potential to mitigate GHG emission through proper management of the composting systems. In comparison with other mainstream municipal waste management options including centralized composting and anaerobic digestion facilities, landfilling and incineration, home composting has the potential to reduce GHG emissions through both lower on-site emissions and the minimal need for transportation and processing. On account of the lower cost, the present results suggest that home composting provides an effective and feasible supplementary waste management method to a centralized facility, in particular for cities with lower population density such as Australian cities.
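The CO2-equivalent totals above combine the CH4 and N2O fluxes via global warming potentials. The 100-year GWPs used in this sketch (25 for CH4, 298 for N2O) are an assumption, since the paper's exact factors are not stated here.

```python
# Convert CH4 and N2O fluxes to a combined CO2-equivalent flux.
GWP_CH4, GWP_N2O = 25, 298   # assumed 100-year global warming potentials

def co2_equivalent(ch4_flux, n2o_flux):
    """Fluxes in mg m(-2) h(-1) -> CO2-e flux in the same units."""
    return ch4_flux * GWP_CH4 + n2o_flux * GWP_N2O

# aerobic composting bins: CH4 2.2, N2O ~1.5 (values from the abstract)
print(co2_equivalent(2.2, 1.5))   # -> 502.0, close to the reported 504
```

The dominance of N2O in the budget is visible directly: 447 of the 502 mg CO2-e comes from the N2O term.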

  10. Automated mask and wafer defect classification using a novel method for generalized CD variation measurements

    NASA Astrophysics Data System (ADS)

    Verechagin, V.; Kris, R.; Schwarzband, I.; Milstein, A.; Cohen, B.; Shkalim, A.; Levy, S.; Price, D.; Bal, E.

    2018-03-01

    Over the years, mask and wafer defect dispositioning has become an increasingly challenging and time-consuming task. With design rules getting smaller, OPC getting more complex, and scanner illumination taking on free-form shapes, the probability that a user can perform accurate and repeatable classification of defects detected by mask inspection tools into pass/fail bins is decreasing. The critical challenges of mask defect metrology for small nodes (< 30 nm) were reviewed in [1]. While critical dimension (CD) variation measurement is still the method of choice for determining a mask defect's future impact on wafer, the high complexity of OPCs combined with high variability in pattern shapes poses a challenge for any automated CD variation measurement method. In this study, a novel approach for measurement generalization is presented. CD variation assessment performance is evaluated on multiple complex-shaped patterns and benchmarked against an existing qualified measurement methodology.

  11. A 4.2 ps Time-Interval RMS Resolution Time-to-Digital Converter Using a Bin Decimation Method in an UltraScale FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Liu, Chong

    2016-10-01

    The common solution for a field-programmable gate array (FPGA)-based time-to-digital converter (TDC) is to construct a tapped delay line (TDL) for time interpolation to yield sub-clock time resolution. The granularity and uniformity of the delay elements of the TDL determine the TDC time resolution. In this paper, we propose a dual-sampling TDL architecture and a bin decimation method that make the delay elements as small and uniform as possible, so that the implemented TDCs can achieve a time resolution beyond the intrinsic cell delay. Two identical fully hardware-based TDCs were implemented in a Xilinx UltraScale FPGA for performance evaluation. For fixed time intervals in the range from 0 to 440 ns, the average time-interval RMS resolution measured between the two TDCs is 4.2 ps; the timestamp resolution of a single TDC is thus derived as 2.97 ps. The maximum hit rate of the TDC is as high as half the FPGA system clock rate, namely 250 MHz in our demo prototype. Because conventional online bin-by-bin calibration is not needed, the implementation of the proposed TDC is straightforward and relatively resource-saving.
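The single-TDC figure quoted above follows from the two-TDC interval measurement, assuming the two channels are identical and statistically independent:

```python
import math

# For two identical independent channels, var(t1 - t2) = 2 * var(t_single),
# so the single-channel RMS is the measured pair RMS divided by sqrt(2).
sigma_pair = 4.2                         # ps, measured interval RMS
sigma_single = sigma_pair / math.sqrt(2)
print(round(sigma_single, 2))            # -> 2.97 ps
```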

  12. Optimizing and evaluating the reconstruction of Metagenome-assembled microbial genomes.

    PubMed

    Papudeshi, Bhavya; Haggerty, J Matthew; Doane, Michael; Morris, Megan M; Walsh, Kevin; Beattie, Douglas T; Pande, Dnyanada; Zaeri, Parisa; Silva, Genivaldo G Z; Thompson, Fabiano; Edwards, Robert A; Dinsdale, Elizabeth A

    2017-11-28

    Microbiome/host interactions describe characteristics that affect the host's health. Shotgun metagenomics involves sequencing a random subset of the microbiome to analyze its taxonomic and metabolic potential. Reconstruction of DNA fragments into genomes from metagenomes (called metagenome-assembled genomes) assigns unknown fragments to taxa/function and facilitates the discovery of novel organisms. Genome reconstruction incorporates sequence assembly and sorting of assembled sequences into bins characteristic of a genome. However, the microbial community composition, including taxonomic and phylogenetic diversity, may influence genome reconstruction. We determined the optimal reconstruction method for four microbiome projects that had variable sequencing platforms (IonTorrent and Illumina), diversity (high or low), and environment (coral reefs and kelp forests), using a set of parameters to select for optimal assembly and binning tools. We tested the effects of the assembly and binning processes on population genome reconstruction using 105 marine metagenomes from 4 projects. Reconstructed genomes were obtained from each project using 3 assemblers (IDBA, MetaVelvet, and SPAdes) and 2 binning tools (GroopM and MetaBat). We assessed the efficiency of assemblers using statistics including contig continuity and contig chimerism, and the effectiveness of binning tools using genome completeness and taxonomic identification. We concluded that SPAdes assembled more contigs (143,718 ± 124 contigs) of longer length (N50 = 1632 ± 108 bp) and incorporated the most sequences (sequences assembled = 19.65%). The microbial richness and evenness were maintained across the assembly, suggesting few contig chimeras. The SPAdes assembly was responsive to the biological and technological variations within each project, compared with the other assemblers. 
Among binning tools, we conclude that MetaBat produced bins with less variation in GC content (average standard deviation: 1.49), low species richness (4.91 ± 0.66), and higher genome completeness (40.92 ± 1.75) across all projects. MetaBat extracted 115 bins from the 4 projects, of which 66 bins were identified as reconstructed metagenome-assembled genomes with sequences belonging to a specific genus. We identified 13 novel genomes, some of which were 100% complete but show low similarity to genomes within databases. In conclusion, we present a set of biologically relevant parameters for evaluating and selecting optimal assembly and binning tools. For the tools we tested, the SPAdes assembler and the MetaBat binning tool reconstructed quality metagenome-assembled genomes for the four projects. We also conclude that metagenomes from microbial communities with high coverage of phylogenetically distinct taxa and low taxonomic diversity yield the highest-quality metagenome-assembled genomes.
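N50, one of the assembly statistics quoted above, can be computed in a few lines; this helper is illustrative, not from the paper.

```python
# N50: the contig length such that contigs of that length or longer
# account for at least half of the total assembly length.
def n50(lengths):
    total = sum(lengths)
    running = 0
    for length in sorted(lengths, reverse=True):
        running += length
        if running * 2 >= total:
            return length

print(n50([100, 200, 300, 400, 500]))   # -> 400
```

Here the total is 1500 bp; walking down from the longest contig, the cumulative length first reaches 750 bp at the 400 bp contig.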

  13. Does Your Optical Particle Counter Measure What You Think it Does? Calibration and Refractive Index Correction Methods.

    NASA Astrophysics Data System (ADS)

    Rosenberg, Phil; Dean, Angela; Williams, Paul; Dorsey, James; Minikin, Andreas; Pickering, Martyn; Petzold, Andreas

    2013-04-01

    Optical Particle Counters (OPCs) are the de-facto standard for in-situ measurements of airborne aerosol size distributions and small cloud particles over a wide size range. This is particularly the case on airborne platforms where fast response is important. OPCs measure scattered light from individual particles and generally bin particles according to the measured peak amount of light scattered (the OPC's response). Most manufacturers provide a table along with their instrument which indicates the particle diameters which represent the edges of each bin. It is important to correct the particle size reported by OPCs for the refractive index of the particles being measured, which is often not the same as for those used during calibration. However, the OPC's response is not a monotonic function of particle diameter and obvious problems occur when refractive index corrections are attempted, but multiple diameters correspond to the same OPC response. Here we recommend that OPCs are calibrated in terms of particle scattering cross section as this is a monotonic (usually linear) function of an OPC's response. We present a method for converting a bin's boundaries in terms of scattering cross section into a bin centre and bin width in terms of diameter for any aerosol species for which the scattering properties are known. The relationship between diameter and scattering cross section can be arbitrarily complex and does not need to be monotonic; it can be based on Mie-Lorenz theory or any other scattering theory. Software has been provided on the Sourceforge open source repository for scientific users to implement such methods in their own measurement and calibration routines. 
As a case study, data are presented from a Passive Cavity Aerosol Spectrometer Probe (PCASP) and a Cloud Droplet Probe (CDP), calibrated using polystyrene latex spheres and glass beads before being deployed as part of the Fennec project to measure airborne dust in inaccessible regions of the Sahara.
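The recommended conversion can be sketched as follows. The oscillatory cross-section curve below is a toy stand-in for Mie-Lorenz theory (the authors' Sourceforge software implements the real calculation); the point is that a bin defined by scattering cross-section boundaries maps to a diameter range even where the curve is non-monotonic.

```python
import numpy as np

# Map a bin's cross-section boundaries back to a diameter centre and
# width for a given particle type, via a tabulated sigma(d) curve.
d = np.linspace(0.1, 10.0, 5000)               # diameter grid, um
sigma = d**2 * (1.0 + 0.3 * np.sin(5.0 * d))   # toy cross section

def diameter_bounds(sigma_lo, sigma_hi):
    """Diameter range whose cross section falls inside the bin. Works
    even where the curve is non-monotonic, because every grid diameter
    mapping into the bin is collected."""
    ds = d[(sigma >= sigma_lo) & (sigma < sigma_hi)]
    return ds.min(), ds.max()

lo, hi = diameter_bounds(4.0, 9.0)
centre, width = 0.5 * (lo + hi), hi - lo
print(lo < hi)                                  # -> True
```

Swapping in a different refractive index only changes the tabulated sigma(d) curve; the bin boundaries in cross section, fixed at calibration time, stay the same.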

  14. Octree Bin-to-Bin Fractional-NTC Collisions

    DTIC Science & Technology

    2015-09-17

    Briefing charts covering 24 August 2015 – 17 September 2015. Robert Martin, ERC Inc., Spacecraft Propulsion (AFRL/RQRS). Outline: 1) Background; 2) Fractional collisions; 3) Bin… Distribution A: approved for public release.

  15. Experimental preparation and characterization of four-dimensional quantum states using polarization and time-bin modes of a single photon

    NASA Astrophysics Data System (ADS)

    Yoo, Jinwon; Choi, Yujun; Cho, Young-Wook; Han, Sang-Wook; Lee, Sang-Yun; Moon, Sung; Oh, Kyunghwan; Kim, Yong-Su

    2018-07-01

    We present a detailed method to prepare and characterize four-dimensional pure quantum states or ququarts using polarization and time-bin modes of a single-photon. In particular, we provide a simple method to generate an arbitrary pure ququart and fully characterize the state with quantum state tomography. We also verify the reliability of the recipe by showing experimental preparation and characterization of 20 ququart states in mutually unbiased bases. As qudits provide superior properties over qubits in many fundamental tests of quantum physics and applications in quantum information processing, the presented method will be useful for photonic quantum information science.

  16. A frequency domain analysis of respiratory variations in the seismocardiogram signal.

    PubMed

    Pandia, Keya; Inan, Omer T; Kovacs, Gregory T A

    2013-01-01

    The seismocardiogram (SCG) signal traditionally measured using a chest-mounted accelerometer contains low-frequency (0-100 Hz) cardiac vibrations that can be used to derive diagnostically relevant information about cardiovascular and cardiopulmonary health. This work is aimed at investigating the effects of respiration on the frequency domain characteristics of SCG signals measured from 18 healthy subjects. Toward this end, the 0-100 Hz SCG signal bandwidth of interest was sub-divided into 5 Hz and 10 Hz frequency bins to compare the spectral energy in corresponding frequency bins of the SCG signal measured during three key conditions of respiration--inspiration, expiration, and apnea. Statistically significant differences were observed between the power in ensemble averaged inspiratory and expiratory SCG beats and between ensemble averaged inspiratory and apneaic beats across the 18 subjects for multiple frequency bins in the 10-40 Hz frequency range. Accordingly, the spectral analysis methods described in this paper could provide complementary and improved classification of respiratory modulations in the SCG signal over and above time-domain SCG analysis methods.
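The band-energy comparison described above can be sketched as follows; the 500 Hz sampling rate and the single test tone are assumptions for illustration, not the study's data.

```python
import numpy as np

# Split a signal's spectrum into fixed-width frequency bins and sum the
# power in each, here 10 Hz bins over 0-100 Hz.
fs = 500.0
t = np.arange(0, 2.0, 1 / fs)
signal = np.sin(2 * np.pi * 25 * t)        # energy concentrated at 25 Hz

spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

bin_power = [spectrum[(freqs >= lo) & (freqs < lo + 10)].sum()
             for lo in range(0, 100, 10)]
print(int(np.argmax(bin_power)) * 10)      # -> 20 (the 20-30 Hz bin)
```

The study's comparison replaces the single tone with ensemble-averaged SCG beats from the three respiration conditions and tests the per-bin powers statistically.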

  17. The Fifth Bin - Opportunity to Empower the National Four Bin Analysis Discussion

    DTIC Science & Technology

    2012-06-01

    Analysis and Methods for the Exploitation of ELICIT Experimental Data (Martin & McEver, 2008) the authors present illustrative examples of data...and Adm. Mullen from the Pentagon. (Egenhofer, et al., 2003) – Eggenhofer, Petra M., Reiner K. Huber, & Sebastian Richter, “Communication Processes...Environment”, 13th ICCRTS, Bellevue WA, 2008. http://www.dodccrp.org/events/13th_iccrts_2008/CD/html/papers/190.pdf (Martin & McEver, 2008) – Martin

  18. Accelerating Sequences in the Presence of Metal by Exploiting the Spatial Distribution of Off-Resonance

    PubMed Central

    Smith, Matthew R.; Artz, Nathan S.; Koch, Kevin M.; Samsonov, Alexey; Reeder, Scott B.

    2014-01-01

    Purpose: To demonstrate feasibility of exploiting the spatial distribution of off-resonance surrounding metallic implants for accelerating multispectral imaging techniques. Theory: Multispectral imaging (MSI) techniques perform time-consuming independent 3D acquisitions with varying RF frequency offsets to address the extreme off-resonance from metallic implants. Each off-resonance bin provides a unique spatial sensitivity that is analogous to the sensitivity of a receiver coil, and therefore provides a unique opportunity for acceleration. Methods: Fully sampled MSI was performed to demonstrate retrospective acceleration. A uniform sampling pattern across off-resonance bins was compared to several adaptive sampling strategies using a total hip replacement phantom. Monte Carlo simulations were performed to compare noise propagation of two of these strategies. With a total knee replacement phantom, positive and negative off-resonance bins were strategically sampled with respect to the B0 field to minimize aliasing. Reconstructions were performed with a parallel imaging framework to demonstrate retrospective acceleration. Results: An adaptive sampling scheme dramatically improved reconstruction quality, which was supported by the noise propagation analysis. Independent acceleration of negative and positive off-resonance bins demonstrated reduced overlapping of aliased signal to improve the reconstruction. Conclusion: This work presents the feasibility of acceleration in the presence of metal by exploiting the spatial sensitivities of off-resonance bins. PMID:24431210

  19. Limited-angle effect compensation for respiratory binned cardiac SPECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Wenyuan; Yang, Yongyi, E-mail: yy@ece.iit.edu; Wernick, Miles N.

    Purpose: In cardiac single photon emission computed tomography (SPECT), respiratory-binned study is used to combat the motion blur associated with respiratory motion. However, owing to the variability in respiratory patterns during data acquisition, the acquired data counts can vary significantly both among respiratory bins and among projection angles within individual bins. If not properly accounted for, such variation could lead to artifacts similar to limited-angle effect in image reconstruction. In this work, the authors aim to investigate several reconstruction strategies for compensating the limited-angle effect in respiratory binned data for the purpose of reducing the image artifacts. Methods: The authors first consider a model based correction approach, in which the variation in acquisition time is directly incorporated into the imaging model, such that the data statistics are accurately described among both the projection angles and respiratory bins. Afterward, the authors consider an approximation approach, in which the acquired data are rescaled to accommodate the variation in acquisition time among different projection angles while the imaging model is kept unchanged. In addition, the authors also consider the use of a smoothing prior in reconstruction for suppressing the artifacts associated with limited-angle effect. In our evaluation study, the authors first used Monte Carlo simulated imaging with 4D NCAT phantom wherein the ground truth is known for quantitative comparison. The authors evaluated the accuracy of the reconstructed myocardium using a number of metrics, including regional and overall accuracy of the myocardium, uniformity and spatial resolution of the left ventricle (LV) wall, and detectability of perfusion defect using a channelized Hotelling observer. As a preliminary demonstration, the authors also tested the different approaches on five sets of clinical acquisitions.
Results: The quantitative evaluation results show that the three compensation methods could all, but to different extents, reduce the reconstruction artifacts over no compensation. In particular, the model based approach reduced the mean-squared-error of the reconstructed myocardium by as much as 40%. Compared to the approach of data rescaling, the model based approach further improved both the overall and regional accuracy of the myocardium; it also further improved the lesion detectability and the uniformity of the LV wall. When ML reconstruction was used, the model based approach was notably more effective for improving the LV wall; when MAP reconstruction was used, the smoothing prior could reduce the noise level and artifacts with little or no increase in bias, but at the cost of a slight resolution loss of the LV wall. The improvements in image quality by the different compensation methods were also observed in the clinical acquisitions. Conclusions: Compensating for the uneven distribution of acquisition time among both projection angles and respiratory bins can effectively reduce the limited-angle artifacts in respiratory-binned cardiac SPECT reconstruction. Direct incorporation of the time variation into the imaging model together with a smoothing prior in reconstruction can lead to the most improvement in the accuracy of the reconstructed myocardium.

  20. A water quality management strategy for regionally protected water through health risk assessment and spatial distribution of heavy metal pollution in 3 marine reserves.

    PubMed

    Zhang, Yinan; Chu, Chunli; Li, Tong; Xu, Shengguo; Liu, Lei; Ju, Meiting

    2017-12-01

    Severe water pollution and resource scarcity are major problems in China, where it is necessary to establish water quality-oriented monitoring and intelligent watershed management. In this study, an effective watershed management method is explored, in which water quality is first assessed using the heavy metal pollution index and the human health risk index, and the pollution and management grade is then classified based on cluster analysis and GIS visualization. Three marine reserves in Tianjin were selected and analyzed, namely the Tianjin Ancient Coastal Wetland National Nature Reserve (Qilihai Natural Reserve), the Tianjin DaShentang Oyster Reef National Marine Special Reserve (DaShentang Reserve), and the Tianjin Coastal Wetland National Marine Special Reserve (BinHai Wetland Reserve), which is under construction. The water quality and potential human health risks of 5 heavy metals (Pb, As, Cd, Hg, Cr) in the three reserves were assessed using the Nemerow index and USEPA methods. Moreover, ArcGIS 10.2 software was used to visualize the heavy metal indices and display their spatial distribution. Cluster analysis enabled classification of the heavy metals into 4 categories, which allowed identification of the heavy metals whose pollution index and health risks were highest and whose control in the reserves is therefore a priority. Results indicate that heavy metal pollution exists in the Qilihai Natural Reserve and in the north and east of the DaShentang Reserve; furthermore, human health risks exist in the Qilihai Natural Reserve and in the BinHai Wetland Reserve. In each reserve, the main factors influencing pollution and health risk were high concentrations of As and Pb that exceed the corresponding standards. Measures must be adopted to control and remediate the pollutants. Furthermore, to protect the marine reserves, management policies must be implemented to improve water quality, which is an urgent task for both local and national governments.
Copyright © 2017 Elsevier B.V. All rights reserved.

  1. optBINS: Optimal Binning for histograms

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2018-03-01

    optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model, after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
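    The posterior described above can be evaluated directly. Below is a sketch of the relative log posterior for M equal-width bins under a multinomial likelihood and a Jeffreys-type non-informative prior, in the spirit of optBINS; the Gaussian test data and the search range of 50 bins are illustrative assumptions:

```python
import numpy as np
from math import lgamma

def log_posterior(data, M):
    """Relative log posterior for a uniform bin-width histogram with M bins
    (multinomial likelihood, non-informative prior), up to an M-independent constant."""
    N = len(data)
    counts, _ = np.histogram(data, bins=M)
    return (N * np.log(M)
            + lgamma(M / 2.0) - lgamma(N + M / 2.0)
            - M * lgamma(0.5)
            + sum(lgamma(n + 0.5) for n in counts))

def optbins(data, max_bins=50):
    """Bin count that maximizes the posterior: Occam's razor balances fit vs. complexity."""
    return max(range(1, max_bins + 1), key=lambda M: log_posterior(data, M))

rng = np.random.default_rng(0)
data = rng.normal(size=2000)
m_opt = optbins(data)
```

    The first term rewards finer binning (better fit), while the gamma-function terms penalize the extra parameters, so the maximum lands at an intermediate bin count.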

  1. Chi-squared and C statistic minimization for low count per bin data. [sampling in X-ray astronomy]

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Shue, David R.

    1989-01-01

    Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
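    The two fit statistics being compared can be sketched as follows; the low-count spectrum and constant source model are made-up examples, and Cash's C is written up to a model-independent constant:

```python
import numpy as np

def chi_squared(observed, model):
    """Pearson chi-squared with model variance; assumes roughly Gaussian bin errors,
    which breaks down at low counts per bin."""
    return float(np.sum((observed - model) ** 2 / model))

def cash_c(observed, model):
    """Cash's C statistic (up to a model-independent constant); derived from the
    Poisson likelihood, so it remains valid in the low-count regime."""
    return float(2.0 * np.sum(model - observed * np.log(model)))

counts = np.array([1.0, 0.0, 3.0, 2.0, 1.0])   # illustrative low-count spectrum
flat = np.full_like(counts, counts.mean())      # constant-source model
```

    For a constant model, the Poisson maximum-likelihood rate is the mean count, so C is smallest there; chi-squared fits at these count levels are biased, which is the paper's point.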

  2. Novel features and enhancements in BioBin, a tool for the biologically inspired binning and association analysis of rare variants

    PubMed Central

    Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D

    2018-01-01

    Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, as well as incorporating novel analysis features, providing for a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757

  3. Does amplitude scaling of ground motion records result in biased nonlinear structural drift responses?

    USGS Publications Warehouse

    Luco, N.; Bazzurro, P.

    2007-01-01

    Limitations of the existing earthquake ground motion database lead to scaling of records to obtain seismograms consistent with a ground motion target for structural design and evaluation. In the engineering seismology community, acceptable limits for 'legitimate' scaling vary from one (no scaling allowed) to 10 or more. The concerns expressed by detractors of scaling are mostly based on the knowledge of, for example, differences in ground motion characteristics for different earthquake magnitude-distance (Mw-Rclose) scenarios, and much less on their effects on structures. At the other end of the spectrum, proponents have demonstrated that scaling is not only legitimate but also useful for assessing structural response statistics for Mw-Rclose scenarios. Their studies, however, have not investigated more recent purposes of scaling and have not always drawn conclusions for a wide spectrum of structural vibration periods and strengths. This article investigates whether scaling of records randomly selected from an Mw-Rclose bin (or range) to a target fundamental-mode spectral acceleration (Sa) level introduces bias in the expected nonlinear structural drift response of both single-degree-of-freedom oscillators and one multi-degree-of-freedom building. The bias is quantified relative to unscaled records from the target Mw-Rclose bin that are 'naturally' at the target Sa level. We consider scaling of records from the target Mw-Rclose bin and from other Mw-Rclose bins. The results demonstrate that scaling can indeed introduce a bias that, for the most part, can be explained by differences between the elastic response spectra of the scaled versus unscaled records. Copyright © 2007 John Wiley & Sons, Ltd.
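    The scaling operation at issue is a simple amplitude multiplication; a minimal sketch, with a hypothetical record and target Sa (the bias question arises because elastic response scales linearly with the factor while nonlinear drift response need not):

```python
import numpy as np

def scale_record(accel, sa_record, sa_target):
    """Linearly scale an acceleration history so that its spectral acceleration
    at the fundamental period, Sa(T1), matches the target level."""
    factor = sa_target / sa_record
    return factor * np.asarray(accel), factor

# Hypothetical record with Sa(T1) = 0.5 g, scaled to a 1.0 g target
scaled, f = scale_record([0.10, -0.20, 0.15], sa_record=0.5, sa_target=1.0)
```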

  4. A scoring metric for multivariate data for reproducibility analysis using chemometric methods

    PubMed Central

    Sheen, David A.; de Carvalho Rocha, Werickson Fortunato; Lippa, Katrice A.; Bearden, Daniel W.

    2017-01-01

    Process quality control and reproducibility in emerging measurement fields such as metabolomics is normally assured by interlaboratory comparison testing. As a part of this testing process, spectral features from a spectroscopic method such as nuclear magnetic resonance (NMR) spectroscopy are attributed to particular analytes within a mixture, and it is the metabolite concentrations that are returned for comparison between laboratories. However, data quality may also be assessed directly by using binned spectral data before the time-consuming identification and quantification. Use of the binned spectra has some advantages, including preserving information about trace constituents and enabling identification of process difficulties. In this paper, we demonstrate the use of binned NMR spectra to conduct a detailed interlaboratory comparison and composition analysis. Spectra of synthetic and biologically-obtained metabolite mixtures, taken from a previous interlaboratory study, are compared with cluster analysis using a variety of distance and entropy metrics. The individual measurements are then evaluated based on where they fall within their clusters, and a laboratory-level scoring metric is developed, which provides an assessment of each laboratory’s individual performance. PMID:28694553
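    A minimal sketch of the spectral binning step that precedes the chemometric comparison, assuming equal-width chemical-shift bins and total-intensity normalization; the 0.04 ppm bin width and the synthetic single-peak spectrum are illustrative assumptions, not the study's settings:

```python
import numpy as np

def bin_spectrum(ppm, intensity, bin_width=0.04, lo=0.0, hi=10.0):
    """Integrate intensity into equal-width chemical-shift bins and normalize
    to unit total, so spectra from different instruments become comparable."""
    edges = np.arange(lo, hi + bin_width, bin_width)
    binned, _ = np.histogram(ppm, bins=edges, weights=intensity)
    return binned / binned.sum()

def euclidean_distance(a, b):
    """One of several distance metrics a cluster analysis could use on binned spectra."""
    return float(np.linalg.norm(a - b))

ppm = np.linspace(0.0, 10.0, 5000)
peak = np.exp(-((ppm - 3.2) ** 2) / 0.001)   # synthetic single-resonance spectrum
a = bin_spectrum(ppm, peak)
b = bin_spectrum(ppm, 2.0 * peak)            # same composition, different receiver gain
```

    Normalization removes overall gain differences between laboratories, so the two binned spectra above are identical and only genuine compositional differences contribute to the distance.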

  5. Time-bin entangled photons from a quantum dot

    PubMed Central

    Jayakumar, Harishankar; Predojević, Ana; Kauten, Thomas; Huber, Tobias; Solomon, Glenn S.; Weihs, Gregor

    2014-01-01

    Long distance quantum communication is one of the prime goals in the field of quantum information science. With information encoded in the quantum state of photons, existing telecommunication fibre networks can be effectively used as a transport medium. To achieve this goal, a source of robust entangled single photon pairs is required. Here, we report the realization of a source of time-bin entangled photon pairs utilizing the biexciton-exciton cascade in a III/V self-assembled quantum dot. We analyse the generated photon pairs by an inherently phase-stable interferometry technique, facilitating uninterrupted long integration times. We confirm the entanglement by performing quantum state tomography of the emitted photons, which yields a fidelity of 0.69(3) and a concurrence of 0.41(6) for our realization of time-energy entanglement from a single quantum emitter. PMID:24968024

  6. Time-bin entangled photons from a quantum dot.

    PubMed

    Jayakumar, Harishankar; Predojević, Ana; Kauten, Thomas; Huber, Tobias; Solomon, Glenn S; Weihs, Gregor

    2014-06-26

    Long-distance quantum communication is one of the prime goals in the field of quantum information science. With information encoded in the quantum state of photons, existing telecommunication fibre networks can be effectively used as a transport medium. To achieve this goal, a source of robust entangled single-photon pairs is required. Here we report the realization of a source of time-bin entangled photon pairs utilizing the biexciton-exciton cascade in a III/V self-assembled quantum dot. We analyse the generated photon pairs by an inherently phase-stable interferometry technique, facilitating uninterrupted long integration times. We confirm the entanglement by performing quantum state tomography of the emitted photons, which yields a fidelity of 0.69(3) and a concurrence of 0.41(6) for our realization of time-energy entanglement from a single quantum emitter.

  7. The CTIO Acquisition CCD-TV camera design

    NASA Astrophysics Data System (ADS)

    Schmidt, Ricardo E.

    1990-07-01

    A CCD-based Acquisition TV Camera has been developed at CTIO to replace the existing ISIT units. In a 60 second exposure, the new Camera shows a sixfold improvement in sensitivity over an ISIT used with a Leaky Memory. Integration times can be varied over a 0.5 to 64 second range. The CCD, contained in an evacuated enclosure, is operated at -45 C. Only the image section, an area of 8.5 mm x 6.4 mm, gets exposed to light. Pixel size is 22 microns and either no binning or 2 x 2 binning can be selected. The typical readout rates used vary between 3.5 and 9 microseconds/pixel. Images are stored in a PC/XT/AT, which generates RS-170 video. The contrast in the RS-170 frames is automatically enhanced by the software.

  8. A search strategy for SETI - The search for extraterrestrial intelligence

    NASA Technical Reports Server (NTRS)

    Billingham, J.; Wolfe, J.; Edelson, R.; Gulkis, S.; Olsen, E.; Oliver, B.; Tarter, J.; Seeger, C.

    1980-01-01

    A search strategy is proposed for the detection of signals of extraterrestrial intelligent origin. It constitutes an exploration of a well-defined volume of search space in the microwave region of the spectrum and envisages the use of a combination of sky survey and targeted star approaches. It is predicated on the use of existing antennas equipped with sophisticated multichannel spectrum analyzers and signal processing systems operating in the digital mode. The entire sky would be surveyed between 1 and 10 GHz with resolution bin widths down to 32 Hz. More than 700 nearby solar type stars and other selected interesting directions would be searched between 1 GHz and 3 GHz with bin widths down to 1 Hz. Particular emphasis would be placed on those solar type stars that are within 20 light years of Earth.

  9. BInGaN alloys nearly lattice-matched to GaN for high-power high-efficiency visible LEDs

    NASA Astrophysics Data System (ADS)

    Williams, Logan; Kioupakis, Emmanouil

    2017-11-01

    InGaN-based visible light-emitting diodes (LEDs) find commercial applications for solid-state lighting and displays, but lattice mismatch limits the thickness of InGaN quantum wells that can be grown on GaN with high crystalline quality. Since narrower wells operate at a higher carrier density for a given current density, they increase the fraction of carriers lost to Auger recombination and lower the efficiency. The incorporation of boron, a smaller group-III element, into InGaN alloys is a promising method to eliminate the lattice mismatch and realize high-power, high-efficiency visible LEDs with thick active regions. In this work, we apply predictive calculations based on hybrid density functional theory to investigate the thermodynamic, structural, and electronic properties of BInGaN alloys. Our results show that BInGaN alloys with a B:In ratio of 2:3 are better lattice matched to GaN compared to InGaN and, for indium fractions less than 0.2, nearly lattice matched. Deviations from Vegard's law appear as bowing of the in-plane lattice constant with respect to composition. Our thermodynamics calculations demonstrate that the solubility of boron is higher in InGaN than in pure GaN. Varying the Ga mole fraction while keeping the B:In ratio constant enables the adjustment of the (direct) gap in the 1.75-3.39 eV range, which covers the entire visible spectrum. Holes are strongly localized in non-bonded N 2p states caused by local bond planarization near boron atoms. Our results indicate that BInGaN alloys are promising for fabricating nitride heterostructures with thick active regions for high-power, high-efficiency LEDs.

  10. A Bayesian Modeling Approach for Estimation of a Shape-Free Groundwater Age Distribution using Multiple Tracers

    DOE PAGES

    Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh; ...

    2013-10-15

    Due to the mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution rather than a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution, and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated. A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited based on the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa Rica; SF6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four-bin histogram performs better in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers.
The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for the cases where a larger amount of observed data is available and when the real groundwater distribution is more complex than can be represented by simple mathematical forms.
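    With a histogram parameterization, the forward model reduces to a weighted sum over age bins; below is a minimal sketch for a single radioactively decaying tracer, where the bin mean ages, input concentration, and tritium-like decay constant are hypothetical, not the study's values:

```python
import numpy as np

def predicted_concentration(bin_fractions, bin_mean_ages, c_input, decay_const):
    """Tracer concentration of a sample mixed from discrete age bins: each bin
    contributes its water fraction times the radioactively decayed input."""
    f = np.asarray(bin_fractions, float)
    assert np.isclose(f.sum(), 1.0), "bin fractions must sum to 1"
    ages = np.asarray(bin_mean_ages, float)
    return float(np.sum(f * c_input * np.exp(-decay_const * ages)))

# Four-bin example dominated by young water; decay constant ~ ln(2)/12.3 y (tritium-like)
c = predicted_concentration([0.7, 0.2, 0.1, 0.0], [5.0, 15.0, 30.0, 60.0],
                            c_input=10.0, decay_const=0.056)
```

    An MCMC sampler would propose bin fractions, evaluate this forward model for each available tracer, and accept or reject based on the misfit; with few tracers, many fraction combinations fit equally well, which is the correlation problem noted above.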

  11. A Bayesian Modeling Approach for Estimation of a Shape-Free Groundwater Age Distribution using Multiple Tracers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massoudieh, Arash; Visser, Ate; Sharifi, Soroosh

    The mixing of groundwaters with different ages in aquifers, groundwater age is more appropriately represented by a distribution rather than a scalar number. To infer a groundwater age distribution from environmental tracers, a mathematical form is often assumed for the shape of the distribution and the parameters of the mathematical distribution are estimated using deterministic or stochastic inverse methods. We found that the prescription of the mathematical form limits the exploration of the age distribution to the shapes that can be described by the selected distribution. In this paper, the use of freeform histograms as groundwater age distributions is evaluated.more » A Bayesian Markov Chain Monte Carlo approach is used to estimate the fraction of groundwater in each histogram bin. This method was able to capture the shape of a hypothetical gamma distribution from the concentrations of four age tracers. The number of bins that can be considered in this approach is limited based on the number of tracers available. The histogram method was also tested on tracer data sets from Holten (The Netherlands; 3H, 3He, 85Kr, 39Ar) and the La Selva Biological Station (Costa-Rica; SF 6, CFCs, 3H, 4He and 14C), and compared to a number of mathematical forms. According to standard Bayesian measures of model goodness, the best mathematical distribution performs better than the histogram distributions in terms of the ability to capture the observed tracer data relative to their complexity. Among the histogram distributions, the four bin histogram performs better in most of the cases. The Monte Carlo simulations showed strong correlations in the posterior estimates of bin contributions, indicating that these bins cannot be well constrained using the available age tracers. 
The fact that mathematical forms overall perform better than the freeform histogram does not undermine the benefit of the freeform approach, especially for the cases where a larger amount of observed data is available and when the real groundwater distribution is more complex than can be represented by simple mathematical forms.« less

  12. Shuttle car loading system

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr. (Inventor)

    1985-01-01

    A system is described for loading newly mined material, such as coal, into a shuttle car at a location near the mine face where only a limited height is available for a loading system. The system includes a storage bin having several telescoping bin sections and a shuttle car having a bottom wall that can move under the bin. With the bin in an extended position and filled with coal, the bin sections can be telescoped to allow the coal to drop out of the bin sections and into the shuttle car, to quickly load the car. The bin sections can then be extended so they can be slowly filled with more coal while awaiting another shuttle car.

  13. Green material selection for sustainability: A hybrid MCDM approach.

    PubMed

    Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng

    2017-01-01

    Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in the process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern among the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines the decision-making trial and evaluation laboratory (DEMATEL), analytic network process (ANP), grey relational analysis (GRA), and the technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application to rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection.
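    Of the combined techniques, the TOPSIS step is the most mechanical and can be sketched on its own. The decision matrix, weights, and criterion directions below are illustrative, and the DEMATEL/ANP weighting, GRA step, and the paper's nonlinear programming model are omitted:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness of each alternative to the ideal solution (basic TOPSIS).
    `benefit[j]` is True if criterion j is better when larger."""
    X = np.asarray(matrix, float)
    V = np.asarray(weights, float) * X / np.linalg.norm(X, axis=0)  # weighted, normalized
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))         # best per criterion
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))          # worst per criterion
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)   # closeness index in [0, 1]; larger is better

# Three candidate materials, two criteria: recyclability (benefit) and cost (to minimize)
scores = topsis([[0.8, 30.0], [0.6, 20.0], [0.9, 45.0]],
                weights=[0.5, 0.5], benefit=[True, False])
```

    Here the second material wins: its cost advantage outweighs its lower recyclability under equal weights, which is exactly the kind of trade-off the closeness index resolves.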

  14. Green material selection for sustainability: A hybrid MCDM approach

    PubMed Central

    Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng

    2017-01-01

    Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in the process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern among the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines the decision-making trial and evaluation laboratory (DEMATEL), analytic network process (ANP), grey relational analysis (GRA), and the technique for order performance by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application to rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection. PMID:28498864

  15. Noise reduction in spectral CT: reducing dose and breaking the trade-off between image noise and energy bin selection.

    PubMed

    Leng, Shuai; Yu, Lifeng; Wang, Jia; Fletcher, Joel G; Mistretta, Charles A; McCollough, Cynthia H

    2011-09-01

    Our purpose was to reduce image noise in spectral CT by exploiting data redundancies in the energy domain to allow flexible selection of the number, width, and location of the energy bins. Using a variety of spectral CT imaging methods, conventional filtered backprojection (FBP) reconstructions were performed and the resulting images were compared to those processed using a Local HighlY constrained backPRojection Reconstruction (HYPR-LR) algorithm. The mean and standard deviation of CT numbers were measured within regions of interest (ROIs), and results were compared between FBP and HYPR-LR. For these comparisons, the following spectral CT imaging methods were used: (i) numerical simulations based on a photon-counting, detector-based CT system; (ii) a photon-counting, detector-based micro-CT system using rubidium and potassium chloride solutions; (iii) a commercial CT system equipped with integrating detectors utilizing tube potentials of 80, 100, 120, and 140 kV; and (iv) a clinical dual-energy CT examination. The effects of tube energy and energy bin width were evaluated as appropriate to each CT system. The mean CT number in each ROI was unchanged between FBP and HYPR-LR images for each of the spectral CT imaging scenarios, irrespective of bin width or tube potential. However, image noise, as represented by the standard deviation of CT numbers in each ROI, was reduced by 36%-76%. In all scenarios, image noise after the HYPR-LR algorithm was similar to that of composite images, which used all available photons. No difference in spatial resolution was observed between HYPR-LR processing and FBP. Dual-energy patient data processed using HYPR-LR demonstrated reduced noise in the individual low- and high-energy images, as well as in the material-specific basis images. Noise reduction can be accomplished for spectral CT by exploiting data redundancies in the energy domain.
HYPR-LR is a robust method for reducing image noise in a variety of spectral CT imaging systems without losing spatial resolution or CT number accuracy. This method improves the flexibility to select energy bins in the manner that optimizes material identification and separation without paying the penalty of increased image noise or its corollary, increased patient dose.
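    The HYPR-LR idea, reweighting a low-noise composite image (all photons) by spatially smoothed bin data, can be sketched in one dimension; the uniform smoothing kernel and flat test profile below are deliberate simplifications of the published algorithm:

```python
import numpy as np

def hypr_lr(energy_bins, kernel_size=5):
    """HYPR-LR-style processing: each energy-bin image is rebuilt as the composite
    times a low-pass-filtered bin-to-composite ratio, so bin images inherit the
    composite's lower noise while keeping their bin-specific contrast."""
    kernel = np.ones(kernel_size) / kernel_size
    composite = energy_bins.sum(axis=0)                         # uses all photons
    smooth_comp = np.convolve(composite, kernel, mode='same') + 1e-12
    return np.array([composite * np.convolve(b, kernel, mode='same') / smooth_comp
                     for b in energy_bins])

bins_in = np.ones((2, 11))    # two flat, identical 1D "energy bin images"
bins_out = hypr_lr(bins_in)
```

    Because the smoothing acts only on the ratio, mean values are preserved (here, a flat bin stays flat away from the edges) while uncorrelated per-bin noise is suppressed.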

  16. A genetic fingerprint of Amphipoda from Icelandic waters – the baseline for further biodiversity and biogeography studies

    PubMed Central

    Jażdżewska, Anna M.; Corbari, Laure; Driskell, Amy; Frutos, Inmaculada; Havermans, Charlotte; Hendrycks, Ed; Hughes, Lauren; Lörz, Anne-Nina; Stransky, Bente; Tandberg, Anne Helene S.; Vader, Wim; Brix, Saskia

    2018-01-01

    Abstract: Amphipods constitute an abundant part of Icelandic deep-sea zoobenthos, yet knowledge of the diversity of this fauna, particularly at the molecular level, is scarce. The present work uses molecular methods to investigate genetic variation of the Amphipoda sampled during two IceAGE collecting expeditions. The mitochondrial cytochrome oxidase subunit 1 (COI) of 167 individuals originally assigned to 75 morphospecies was analysed. These targeted morphospecies were readily identifiable by experts using light microscopy and are representative of families where there is ongoing taxonomic research. The study resulted in 81 Barcode Identity Numbers (BINs) (of which >90% were published for the first time), while Automatic Barcode Gap Discovery revealed the existence of 78 to 83 Molecular Operational Taxonomic Units (MOTUs). Six nominal species (Rhachotropis helleri, Arrhis phyllonyx, Deflexilodes tenuirostratus, Paroediceros propinquus, Metopa boeckii, Astyra abyssi) appeared to have molecular variation higher than the 0.03 threshold of both p-distance and K2P usually used for amphipod species delineation. Conversely, two Oedicerotidae regarded as separate morphospecies clustered together, with divergences on the order of intraspecific variation. Incongruence between the BINs associated with presently identified species and the publicly available data of the same taxa was observed in the case of Paramphithoe hystrix and Amphilochus manudens. The findings from this research highlight the necessity of supporting molecular studies with thorough morphological species analyses. PMID:29472762
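    The p-distance and K2P thresholds mentioned above are simple functions of two aligned sequences; a minimal sketch (the example sequences are made up, and gap handling is omitted for brevity):

```python
from math import log, sqrt

PURINES = {'A', 'G'}

def p_distance(s1, s2):
    """Proportion of differing sites between two aligned, equal-length sequences."""
    return sum(a != b for a, b in zip(s1, s2)) / len(s1)

def k2p_distance(s1, s2):
    """Kimura two-parameter distance: corrects the raw difference count for
    multiple hits, with separate transition (P) and transversion (Q) fractions."""
    n = len(s1)
    P = sum(a != b and (a in PURINES) == (b in PURINES) for a, b in zip(s1, s2)) / n
    Q = sum((a in PURINES) != (b in PURINES) for a, b in zip(s1, s2)) / n
    return -0.5 * log((1 - 2 * P - Q) * sqrt(1 - 2 * Q))

d = p_distance("ACGTACGTACGT", "ACGTACGTACGA")   # one transversion in 12 sites
```

    For closely related sequences the two metrics nearly coincide; K2P grows faster as divergence accumulates, which is why the 0.03 threshold is quoted for both.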

  1. The effect of SUV discretization in quantitative FDG-PET Radiomics: the need for standardized methodology in tumor texture analysis

    NASA Astrophysics Data System (ADS)

    Leijenaar, Ralph T. H.; Nalbantov, Georgi; Carvalho, Sara; van Elmpt, Wouter J. C.; Troost, Esther G. C.; Boellaard, Ronald; Aerts, Hugo J. W. L.; Gillies, Robert J.; Lambin, Philippe

    2015-08-01

    FDG-PET-derived textural features describing intra-tumor heterogeneity are increasingly investigated as imaging biomarkers. As part of the process of quantifying heterogeneity, image intensities (SUVs) are typically resampled into a reduced number of discrete bins. We focused on the implications of the manner in which this discretization is implemented. Two methods were evaluated: (1) RD, dividing the SUV range into D equally spaced bins, where the intensity resolution (i.e. bin size) varies per image; and (2) RB, maintaining a constant intensity resolution B. Clinical feasibility was assessed on 35 lung cancer patients, imaged before and in the second week of radiotherapy. Forty-four textural features were determined for different D and B for both imaging time points. Feature values depended on the intensity resolution and, of the two methods assessed, RB was shown to allow for a meaningful inter- and intra-patient comparison of feature values. Overall, patients ranked differently according to feature values (used here as a surrogate for textural feature interpretation) between the two discretization methods. Our study shows that the manner of SUV discretization has a crucial effect on the resulting textural features and the interpretation thereof, emphasizing the importance of standardized methodology in tumor texture analysis.
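
    The two discretization schemes compared in this abstract are simple to state in code. Below is a minimal sketch of the idea (function names are my own; the paper calls the schemes RD and RB):

```python
import numpy as np

def discretize_fixed_bin_number(suv, n_bins):
    """R_D: divide this image's SUV range into n_bins equally spaced bins,
    so the bin size (intensity resolution) varies per image."""
    edges = np.linspace(suv.min(), suv.max(), n_bins + 1)
    return np.digitize(suv, edges[:-1])  # bin indices 1..n_bins

def discretize_fixed_bin_size(suv, bin_size):
    """R_B: keep a constant intensity resolution (bin width) across images,
    so the number of bins varies per image."""
    return (np.floor(suv / bin_size) + 1).astype(int)

suv = np.array([0.5, 1.2, 3.4, 7.9, 12.0])
print(discretize_fixed_bin_number(suv, 4).tolist())  # -> [1, 1, 2, 3, 4]
print(discretize_fixed_bin_size(suv, 0.5).tolist())  # -> [2, 3, 7, 16, 25]
```

    With RD every image maps onto exactly n_bins levels regardless of its SUV range, whereas RB preserves a fixed SUV-to-level mapping across images, which is what makes feature values comparable between patients and time points.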

  2. A spectral X-ray CT simulation study for quantitative determination of iron

    NASA Astrophysics Data System (ADS)

    Su, Ting; Kaftandjian, Valérie; Duvauchelle, Philippe; Zhu, Yuemin

    2018-06-01

    Iron is an essential element in the human body and disorders in iron such as iron deficiency or overload can cause serious diseases. This paper aims to explore the ability of spectral X-ray CT to quantitatively separate iron from calcium and potassium and to investigate the influence of different acquisition parameters on material decomposition performance. We simulated spectral X-ray CT imaging of a PMMA phantom filled with iron, calcium, and potassium solutions at various concentrations (15-200 mg/cc). Different acquisition parameters were considered, such as the number of energy bins (6, 10, 15, 20, 30, 60) and the exposure factor per projection (0.025, 0.1, 1, 10, 100 mA s). Based on the simulation data, we investigated the performance of two regularized material decomposition approaches: a projection domain method and an image domain method. It was found that the former method discriminated iron from calcium, potassium and water in all cases and tended to benefit from a lower number of energy bins for lower exposure factor acquisitions. The latter method succeeded in iron determination only when the number of energy bins equaled 60, and in this case, the contrast-to-noise ratios of the decomposed iron images were higher than those obtained using the projection domain method. The results demonstrate that both methods are able to discriminate and quantify iron from calcium, potassium and water under certain conditions. Their performances vary with the acquisition parameters of spectral CT. One can use one method or the other to obtain better performance according to the data available.

  3. SU-D-218-05: Material Quantification in Spectral X-Ray Imaging: Optimization and Validation.

    PubMed

    Nik, S J; Thing, R S; Watts, R; Meyer, J

    2012-06-01

    To develop and validate a multivariate statistical method to optimize scanning parameters for material quantification in spectral x-ray imaging. An optimization metric was constructed by extensively sampling the thickness space for the expected number of counts for m (two or three) materials. This resulted in an m-dimensional confidence region of material quantities, e.g. thicknesses. Minimization of the ellipsoidal confidence region leads to the optimization of energy bins. For the given spectrum, the minimum counts required for effective material separation can be determined by predicting the signal-to-noise ratio (SNR) of the quantification. A Monte Carlo (MC) simulation framework using BEAM was developed to validate the metric. Projection data of the m materials was generated and material decomposition was performed for combinations of iodine, calcium and water by minimizing the z-score between the expected spectrum and binned measurements. The mean square error (MSE) and variance were calculated to measure the accuracy and precision of this approach, respectively. The minimum MSE corresponds to the optimal energy bins in the BEAM simulations. In the optimization metric, this is equivalent to the smallest confidence region. The SNR of the simulated images was also compared to the predictions from the metric. The MSE was dominated by the variance for the given material combinations, which demonstrates accurate material quantification. The BEAM simulations revealed that the optimization of energy bins was accurate to within 1 keV. The SNRs predicted by the optimization metric yielded satisfactory agreement but were expectedly higher for the BEAM simulations due to the inclusion of scattered radiation. The validation showed that the multivariate statistical method provides accurate material quantification, correct location of optimal energy bins and adequate prediction of image SNR. The BEAM code system is suitable for generating spectral x-ray imaging simulations.
© 2012 American Association of Physicists in Medicine.

  4. Comparison of compostable bags and aerated bins with conventional storage systems to collect the organic fraction of municipal solid waste from homes. a Catalonia case study.

    PubMed

    Puyuelo, Belén; Colón, Joan; Martín, Patrícia; Sánchez, Antoni

    2013-06-01

    The separation of biowaste at home is key to improving, facilitating and reducing the operational costs of the treatment of organic municipal waste. The conventional method of collecting such waste and separating it at home is usually done by using a sealed bin with a plastic bag. The use of modern compostable bags is starting to be implemented in some European countries. These compostable bags are made of biodegradable polymers, often from renewable sources. In addition to compostable bags, a new model of bin is also promoted that has a perforated surface which, together with the compostable bag, makes the so-called "aerated system". In this study, different combinations of home collection systems have been systematically studied in the laboratory and at home. The results obtained quantitatively demonstrate that the aerated bin and compostable bag combination is effective at improving the collection of biowaste without significant gaseous emissions and at preparing the organic waste for further composting, as concluded from the respiration indices. In terms of weight loss, temperature, gas emissions, respiration index and organic matter reduction, the best results were achieved with the aerated system. At the same time, a qualitative study of bin and bag combinations was carried out in 100 homes, in which more than 80% of the participating families preferred the aerated system. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Reducing 4D CT artifacts using optimized sorting based on anatomic similarity.

    PubMed

    Johnston, Eric; Diehn, Maximilian; Murphy, James D; Loo, Billy W; Maxim, Peter G

    2011-05-01

    Four-dimensional (4D) computed tomography (CT) has been widely used as a tool to characterize respiratory motion in radiotherapy. The two most commonly used 4D CT algorithms sort images by the associated respiratory phase or displacement into a predefined number of bins, and are prone to image artifacts at transitions between bed positions. The purpose of this work is to demonstrate a method of reducing motion artifacts in 4D CT by incorporating anatomic similarity into phase or displacement based sorting protocols. Ten patient datasets were retrospectively sorted using both the displacement and phase based sorting algorithms. Conventional sorting methods allow selection of only the nearest-neighbor image in time or displacement within each bin. In our method, for each bed position either the displacement or the phase defines the center of a bin range about which several candidate images are selected. The two-dimensional correlation coefficients between slices bordering the interface between adjacent couch positions are then calculated for all candidate pairings. Two slices have a high correlation if they are anatomically similar. Candidates from each bin are then selected to maximize the slice correlation over the entire data set using Dijkstra's shortest path algorithm. To assess the reduction of artifacts, two thoracic radiation oncologists independently compared the resorted 4D datasets pairwise with conventionally sorted datasets, blinded to the sorting method, to choose which had the least motion artifacts. Agreement between reviewers was evaluated using the weighted kappa score. Anatomically based image selection resulted in 4D CT datasets with significantly reduced motion artifacts with both displacement (P = 0.0063) and phase sorting (P = 0.00022). There was good agreement between the two reviewers, with complete agreement 34 times and complete disagreement 6 times.
Optimized sorting using anatomic similarity significantly reduces 4D CT motion artifacts compared to conventional phase or displacement based sorting. This improved sorting algorithm is a straightforward extension of the two most common 4D CT sorting algorithms.
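
    The candidate-selection step described above, scoring anatomic similarity with a 2D correlation coefficient and picking one candidate per couch position so that the total interface correlation is maximal, can be sketched in code. Because the candidates form a stage-wise DAG, the shortest-path search the authors perform with Dijkstra's algorithm reduces to a simple dynamic program; this is a toy version with my own naming (small arrays stand in for boundary slices), not the authors' implementation:

```python
import numpy as np

def corr2(a, b):
    """2D correlation coefficient between two slices."""
    a, b = a - a.mean(), b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def select_candidates(stages):
    """stages[i] is the list of candidate boundary slices for couch position i.
    Pick one candidate per position so that the summed correlation across all
    adjacent interfaces is maximal (a shortest-path search on a stage-wise
    DAG, done here as a dynamic program)."""
    n = len(stages)
    best = [np.zeros(len(s)) for s in stages]             # best score ending at each candidate
    back = [np.zeros(len(s), dtype=int) for s in stages]  # predecessor indices
    for i in range(1, n):
        for j, cand in enumerate(stages[i]):
            scores = [best[i - 1][k] + corr2(stages[i - 1][k], cand)
                      for k in range(len(stages[i - 1]))]
            back[i][j] = int(np.argmax(scores))
            best[i][j] = max(scores)
    path = [int(np.argmax(best[-1]))]                     # backtrack the best chain
    for i in range(n - 1, 0, -1):
        path.append(int(back[i][path[-1]]))
    return path[::-1]
```

    Each stage holds the candidate images for one bed position; the edge weight between candidates of adjacent stages is the correlation of the slices bordering their shared interface.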

  6. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    NASA Astrophysics Data System (ADS)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
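
    As context for the CLs method mentioned above: in the simplest single-bin counting experiment without uncertainties, CLs reduces to a ratio of Poisson tail probabilities, and the upper limit is the smallest signal strength for which CLs drops below 1 - CL. A minimal, purely frequentist sketch follows (OPTHYLIC itself additionally marginalizes systematic uncertainties in a Bayesian way; the function names here are my own):

```python
import math

def pois_cdf(n, mu):
    """P(N <= n) for N ~ Poisson(mu)."""
    return sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n + 1))

def cls(s, b, n_obs):
    """CLs = CLs+b / CLb for a single-bin counting experiment with
    signal expectation s and background expectation b."""
    return pois_cdf(n_obs, s + b) / pois_cdf(n_obs, b)

def upper_limit(b, n_obs, cl=0.95, ds=0.01):
    """Scan the signal strength for the smallest s with CLs < 1 - cl."""
    s = 0.0
    while cls(s, b, n_obs) > 1.0 - cl:
        s += ds
    return s

print(upper_limit(0.0, 0))  # ≈ 3.0, the textbook zero-background 95% CL limit
```

    With zero background and zero observed events this scan recovers the familiar upper limit of about 3 signal events at 95% CL.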

  7. Galaxy clustering with photometric surveys using PDF redshift information

    DOE PAGES

    Asorey, J.; Carrasco Kind, M.; Sevilla-Noarbe, I.; ...

    2016-03-28

    Here, photometric surveys produce large-area maps of the galaxy distribution, but with less accurate redshift information than is obtained from spectroscopic methods. Modern photometric redshift (photo-z) algorithms use galaxy magnitudes, or colors, that are obtained through multi-band imaging to produce a probability density function (PDF) for each galaxy in the map. We used simulated data to study the effect of using different photo-z estimators to assign galaxies to redshift bins in order to compare their effects on angular clustering and galaxy bias measurements. We found that if we use the entire PDF, rather than a single-point (mean or mode) estimate, the deviations are less biased, especially when using narrow redshift bins. When the redshift bin widths are Δz = 0.1, the use of the entire PDF reduces the typical measurement bias from 5%, when using single point estimates, to 3%.
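
    The difference between single-point and full-PDF bin assignment can be sketched in a few lines. In the full-PDF case each galaxy contributes fractional counts to every redshift bin according to the PDF mass falling in it. A hypothetical minimal implementation (my own naming, assuming each PDF is sampled on a common uniform redshift grid):

```python
import numpy as np

def bin_counts_point(z_point, edges):
    """Assign each galaxy to exactly one bin via a single-point photo-z."""
    counts, _ = np.histogram(z_point, bins=edges)
    return counts

def bin_counts_pdf(z_grid, pdfs, edges):
    """Spread each galaxy over the bins by the PDF mass falling in each bin."""
    dz = z_grid[1] - z_grid[0]              # uniform grid assumed
    counts = np.zeros(len(edges) - 1)
    for pdf in pdfs:
        pdf = pdf / (pdf.sum() * dz)        # normalize to unit area
        for i, (lo, hi) in enumerate(zip(edges[:-1], edges[1:])):
            in_bin = (z_grid >= lo) & (z_grid < hi)
            counts[i] += pdf[in_bin].sum() * dz
    return counts
```

    A galaxy whose PDF straddles a bin edge is then split between the neighboring bins instead of being assigned wholesale to one of them, which is the effect the abstract credits with reducing the clustering measurement bias.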

  8. A fixed mass method for the Kramers-Moyal expansion--application to time series with outliers.

    PubMed

    Petelczyc, M; Żebrowski, J J; Orłowska-Baranowska, E

    2015-03-01

    Extraction of stochastic and deterministic components from empirical data, necessary for the reconstruction of the dynamics of the system, is discussed. We determine both components using the Kramers-Moyal expansion. In our earlier papers, we obtained large fluctuations in the magnitude of both terms for rare or extreme valued events in the data. Calculations for such events are burdened by an unsatisfactory quality of the statistics. In general, the method is sensitive to the binning procedure applied for the construction of histograms. Instead of the commonly used constant width of bins, we use here a constant number of counts for each bin. This approach, the fixed mass method, allows the inclusion in the calculation of events that do not yield satisfactory statistics in the fixed bin width method. The method developed is general. To demonstrate its properties, we present the modified Kramers-Moyal expansion method and discuss its properties by applying the fixed mass method to four representative heart rate variability recordings with different numbers of ectopic beats. These beats may be rare events as well as outlying, i.e., very small or very large heart cycle lengths. The properties of ectopic beats are important not only for medical diagnostic purposes; the occurrence of ectopic beats is also a general example of the kind of variability that occurs in a signal with outliers. To show that the method is general, we also present results for two examples of data from very different areas of science: daily temperatures in a large European city and recordings of traffic on a highway. Using the fixed mass method, we studied the occurrence of higher order terms of the Kramers-Moyal expansion in the recordings to assess the dynamics leading to the outlying events. We found that the higher order terms of the Kramers-Moyal expansion are negligible for heart rate variability.
This finding opens the possibility of applying the Langevin equation to the whole range of empirical signals containing rare or outlying events. Note, however, that the higher order terms are non-negligible for the other data studied here, for which the Langevin equation is not applicable as a model.
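
    The core idea, constant counts per bin instead of constant bin width, amounts to placing the bin edges at quantiles of the data. A minimal sketch of the two binning strategies (my own illustration, not the authors' code):

```python
import numpy as np

def fixed_width_edges(x, n_bins):
    """Conventional histogram: constant bin width over the data range."""
    return np.linspace(x.min(), x.max(), n_bins + 1)

def fixed_mass_edges(x, n_bins):
    """Fixed mass: edges at data quantiles, so every bin holds
    (approximately) the same number of counts."""
    return np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 1000), [8.0, 9.0]])  # bulk + outliers
fm_counts, _ = np.histogram(x, bins=fixed_mass_edges(x, 10))
fw_counts, _ = np.histogram(x, bins=fixed_width_edges(x, 10))
```

    The quantile edges guarantee that every bin, including those covering the outliers, has enough counts to estimate conditional moments, while the fixed-width bins pile almost all the data into the central bins and leave the outlier region nearly empty.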

  9. Conservative bin-to-bin fractional collisions

    NASA Astrophysics Data System (ADS)

    Martin, Robert

    2016-11-01

    Particle methods such as direct simulation Monte Carlo (DSMC) and particle-in-cell (PIC) are commonly used to model rarefied kinetic flows for engineering applications because of their ability to efficiently capture non-equilibrium behavior. The primary drawback of these methods is their poor convergence, owing to their stochastic nature; they typically rely heavily on high degrees of non-equilibrium and time averaging to compensate for poor signal-to-noise ratios. For standard implementations, each computational particle represents many physical particles, which further exacerbates statistical noise problems for flows with large species density variation, such as encountered in flow expansions and chemical reactions. The stochastic weighted particle method (SWPM) introduced by Rjasanow and Wagner overcomes this difficulty by allowing the ratio of real to computational particles to vary on a per-particle basis throughout the flow. The DSMC procedure must also be slightly modified to properly sample the Boltzmann collision integral, accounting for the variable particle weights and avoiding the creation of additional particles with negative weight. In this work, the SWPM, with the necessary modifications to incorporate the variable hard sphere (VHS) collision cross section model commonly used in engineering applications, is first incorporated into an existing engineering code, the Thermophysics Universal Research Framework. The results and computational efficiency are compared on a few simple test cases against a standard validated implementation of the DSMC method, along with the adapted SWPM/VHS collision using an octree-based conservative phase space reconstruction. The SWPM is then further extended to combine the collision and phase space reconstruction into a single step, which avoids the need to create additional computational particles only to destroy them again during the particle merge.
This is particularly helpful when oversampling the collision integral compared to the standard DSMC method. However, the more frequent phase space reconstructions can cause added numerical thermalization at low particle-per-cell counts due to the coarseness of the octree used. Nevertheless, the methods are expected to be of much greater utility in transient expansion flows and chemical reactions in the future.

  10. Subcellular Changes in Bridging Integrator 1 Protein Expression in the Cerebral Cortex During the Progression of Alzheimer Disease Pathology.

    PubMed

    Adams, Stephanie L; Tilton, Kathy; Kozubek, James A; Seshadri, Sudha; Delalle, Ivana

    2016-08-01

    Genome-wide association studies have established BIN1 (Bridging Integrator 1) as the most significant late-onset Alzheimer disease (AD) susceptibility locus after APOE. We analyzed BIN1 protein expression using automated immunohistochemistry on the hippocampal CA1 region in 19 patients with either no, mild, or moderate-to-marked AD pathology, who had been assessed by Clinical Dementia Rating and CERAD scores. We also examined the amygdala, prefrontal, temporal, and occipital regions in a subset of these patients. In non-demented controls without AD pathology, BIN1 protein was expressed in white matter, glia, particularly oligodendrocytes, and in the neuropil in which the BIN1 signal decorated axons. With increasing severity of AD, BIN1 in the CA1 region showed: 1) sustained expression in glial cells, 2) decreased areas of neuropil expression, and 3) increased cytoplasmic neuronal expression that did not correlate with neurofibrillary tangle load. In patients with AD, both the prefrontal cortex and CA1 showed a decrease in BIN1-immunoreactive (BIN1-ir) neuropil areas and increases in numbers of BIN1-ir neurons. The numbers of CA1 BIN1-ir pyramidal neurons correlated with hippocampal CERAD neuritic plaque scores; BIN1 neuropil signal was absent in neuritic plaques. Our data provide novel insight into the relationship between BIN1 protein expression and the progression of AD-associated pathology and its diagnostic hallmarks. © 2016 American Association of Neuropathologists, Inc. All rights reserved.

  11. Studying Filamentary Currents with Thomson Scattering on MST

    NASA Astrophysics Data System (ADS)

    den Hartog, D. J.; Young, W. C.; Kubala, S. Z.

    2016-10-01

    The MST reversed-field pinch plasma generates bursts of toroidally localized magnetic activity associated with m = 0 modes resonant at the reversal surface near the plasma edge. Previously, using data from an array of edge magnetic probes, these bursts were connected to poloidal current filaments. Now the MST Thomson scattering diagnostic is being used to measure the net drift in the electron distribution due to these currents. A long-wavelength spectral bin has been added to several Thomson scattering polychromators, complementing the 5-7 pre-existing short-wavelength spectral bins, to improve discrimination between shifted vs. broadened spectra. The bursts are examined in plasma conditions that display spontaneous periods of low tearing-mode activity, with higher confinement and higher temperatures that improve Thomson scattering measurement performance. This work is supported by the U.S. Department of Energy and the National Science Foundation.

  12. 45. VIEW OF UPPER LEVEL CRUSHER ADDITION FROM CRUSHED OXIDIZED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    45. VIEW OF UPPER LEVEL CRUSHER ADDITION FROM CRUSHED OXIDIZED ORE BIN. 18 INCH BELT CONVEYOR BIN FEED, LOWER CENTER, WITH STEPHENS-ADAMSON 25 TON/HR ELEVATOR SPLIT DISCHARGE (OXIDIZED/UNOXIDIZED) IN CENTER. CRUDE ORE BINS AND MACHINE SHOP BEYOND. NOTE TOP OF CRUSHED OXIDIZED ORE BIN IS BELOW TOP OF CRUDE ORE BINS. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  13. Effects of Mixtures on Liquid and Solid Fragment Size Distributions

    DTIC Science & Technology

    2016-05-01

    …bins, too few size bins, fixed bin widths, or inadequately-varying bin widths. Overpopulated bins, which typically occur for smaller fragments… [truncated indexing excerpt; the remainder of this record consists of reference-list fragments]

  14. BIN1 is reduced and Cav1.2 trafficking is impaired in human failing cardiomyocytes.

    PubMed

    Hong, Ting-Ting; Smyth, James W; Chu, Kevin Y; Vogan, Jacob M; Fong, Tina S; Jensen, Brian C; Fang, Kun; Halushka, Marc K; Russell, Stuart D; Colecraft, Henry; Hoopes, Charles W; Ocorr, Karen; Chi, Neil C; Shaw, Robin M

    2012-05-01

    Heart failure is a growing epidemic, and a typical aspect of heart failure pathophysiology is altered calcium transients. Normal cardiac calcium transients are initiated by Cav1.2 channels at cardiac T tubules. Bridging integrator 1 (BIN1) is a membrane scaffolding protein that causes Cav1.2 to traffic to T tubules in healthy hearts. The mechanisms of Cav1.2 trafficking in heart failure are not known. To study BIN1 expression and its effect on Cav1.2 trafficking in failing hearts. Intact myocardium and freshly isolated cardiomyocytes from nonfailing and end-stage failing human hearts were used to study BIN1 expression and Cav1.2 localization. To confirm Cav1.2 surface expression dependence on BIN1, patch-clamp recordings were performed of Cav1.2 current in cell lines with and without trafficking-competent BIN1. Also, in adult mouse cardiomyocytes, surface Cav1.2 and calcium transients were studied after small hairpin RNA-mediated knockdown of BIN1. For a functional readout in intact heart, calcium transients and cardiac contractility were analyzed in a zebrafish model with morpholino-mediated knockdown of BIN1. BIN1 expression is significantly decreased in failing cardiomyocytes at both mRNA (30% down) and protein (36% down) levels. Peripheral Cav1.2 is reduced to 42% by imaging, and a biochemical T-tubule fraction of Cav1.2 is reduced to 68%. The total calcium current is reduced to 41% in a cell line expressing a nontrafficking BIN1 mutant. In mouse cardiomyocytes, BIN1 knockdown decreases surface Cav1.2 and impairs calcium transients. In zebrafish hearts, BIN1 knockdown causes a 75% reduction in calcium transients and severe ventricular contractile dysfunction. The data indicate that BIN1 is significantly reduced in human heart failure, and this reduction impairs Cav1.2 trafficking, calcium transients, and contractility. Copyright © 2012 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  15. Computation of reliable textural indices from multimodal brain MRI: suggestions based on a study of patients with diffuse intrinsic pontine glioma.

    PubMed

    Goya-Outi, Jessica; Orlhac, Fanny; Calmon, Raphael; Alentorn, Agusti; Nioche, Christophe; Philippe, Cathy; Puget, Stéphanie; Boddaert, Nathalie; Buvat, Irène; Grill, Jacques; Frouin, Vincent; Frouin, Frederique

    2018-05-10

    Few methodological studies regarding the robustness of widely used textural indices in MRI have been reported. In this context, this study aims to propose some rules to compute reliable textural indices from multimodal 3D brain MRI. Diagnosis and post-biopsy MR scans including T1, post-contrast T1, T2 and FLAIR images from thirty children with diffuse intrinsic pontine glioma (DIPG) were considered. The hybrid white stripe method was adapted to standardize MR intensities. Sixty textural indices were then computed for each modality in different regions of interest (ROI), including tumor and white matter (WM). Three types of intensity binning were compared [Formula: see text]: constant bin width and relative bounds; [Formula: see text] constant number of bins and relative bounds; [Formula: see text] constant number of bins and absolute bounds. The impact of the volume of the region was also tested within the WM. First, the mean Hellinger distance between patient-based intensity distributions decreased by a factor greater than 10 in WM and greater than 2.5 in gray matter after standardization. Regarding the binning strategy, the ranking of patients was highly correlated for 188/240 features when comparing [Formula: see text] with [Formula: see text], but for only 20 when comparing [Formula: see text] with [Formula: see text], and nine when comparing [Formula: see text] with [Formula: see text]. Furthermore, when using [Formula: see text] or [Formula: see text], texture indices reflected tumor heterogeneity as assessed visually by experts. Last, 41 features presented statistically significant differences between contralateral WM regions when ROI size slightly varies across patients, and none when using ROI of the same size. For regions with similar size, 224 features were significantly different between WM and tumor. Valuable information from texture indices can be biased by methodological choices.
Recommendations are to standardize intensities in MR brain volumes, to use intensity binning with constant bin width, and to define regions with the same volumes to get reliable textural indices.

  16. Explicit Cloud Nucleation from Arbitrary Mixtures of Aerosol Types and Sizes Using an Ultra-Efficient In-Line Aerosol Bin Model in High-Resolution Simulations of Hurricanes

    NASA Astrophysics Data System (ADS)

    Walko, R. L.; Ashby, T.; Cotton, W. R.

    2017-12-01

    The fundamental role of atmospheric aerosols in the process of cloud droplet nucleation is well known, and there is ample evidence that the concentration, size, and chemistry of aerosols can strongly influence microphysical, thermodynamic, and ultimately dynamic properties and evolution of clouds and convective systems. With the increasing availability of observation- and model-based environmental representations of different types of anthropogenic and natural aerosols, there is increasing need for models to be able to represent which aerosols nucleate and which do not in supersaturated conditions. However, this is a very complex process that involves competition for water vapor between multiple aerosol species (chemistries) and different aerosol sizes within each species. Attempts have been made to parameterize the nucleation properties of mixtures of different aerosol species, but it is very difficult or impossible to represent all possible mixtures that may occur in practice. As part of a modeling study of the impact of anthropogenic and natural aerosols on hurricanes, we developed an ultra-efficient aerosol bin model to represent nucleation in a high-resolution atmospheric model that explicitly represents cloud- and subcloud-scale vertical motion. The bin model is activated at any time and location in a simulation where supersaturation occurs and is potentially capable of activating new cloud droplets. The bins are populated from the aerosol species that are present at the given time and location and by multiple sizes from each aerosol species according to a characteristic size distribution, and the chemistry of each species is represented by its absorption or adsorption characteristics. The bin model is integrated in time increments that are smaller than that of the atmospheric model in order to temporally resolve the peak supersaturation, which determines the total nucleated number. 
Even though on the order of 100 bins are typically utilized, this leads only to a 10 or 20% increase in overall computational cost due to the efficiency of the bin model. This method is highly versatile in that it automatically accommodates any possible number and mixture of different aerosol species. Applications of this model to simulations of Typhoon Nuri will be presented.

  17. 15. NORTH ELEVATION OF UPPER ORE BIN, CHUTE, AND JAW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. NORTH ELEVATION OF UPPER ORE BIN, CHUTE, AND JAW CRUSHER, LOOKING SOUTH FROM END OF CONVEYOR PLATFORM. NOTICE THE THREE ORE BIN CONTROL DOORS, CORRESPONDING TO SEPARATE COMPARTMENTS OF THE BIN. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  18. A fixed mass method for the Kramers-Moyal expansion—Application to time series with outliers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petelczyc, M.; Żebrowski, J. J.; Orłowska-Baranowska, E.

    2015-03-15

    Extraction of stochastic and deterministic components from empirical data, necessary for the reconstruction of the dynamics of the system, is discussed. We determine both components using the Kramers-Moyal expansion. In our earlier papers, we obtained large fluctuations in the magnitude of both terms for rare or extreme valued events in the data. Calculations for such events are burdened by an unsatisfactory quality of the statistics. In general, the method is sensitive to the binning procedure applied for the construction of histograms. Instead of the commonly used constant width of bins, we use here a constant number of counts for each bin. This approach, the fixed mass method, allows the inclusion in the calculation of events that do not yield satisfactory statistics in the fixed bin width method. The method developed is general. To demonstrate its properties, we present the modified Kramers-Moyal expansion method and discuss its properties by applying the fixed mass method to four representative heart rate variability recordings with different numbers of ectopic beats. These beats may be rare events as well as outlying, i.e., very small or very large heart cycle lengths. The properties of ectopic beats are important not only for medical diagnostic purposes; the occurrence of ectopic beats is also a general example of the kind of variability that occurs in a signal with outliers. To show that the method is general, we also present results for two examples of data from very different areas of science: daily temperatures in a large European city and recordings of traffic on a highway. Using the fixed mass method, we studied the occurrence of higher order terms of the Kramers-Moyal expansion in the recordings to assess the dynamics leading to the outlying events. We found that the higher order terms of the Kramers-Moyal expansion are negligible for heart rate variability.
This finding opens the possibility of applying the Langevin equation to the whole range of empirical signals containing rare or outlying events. Note, however, that the higher order terms are non-negligible for the other data studied here, for which the Langevin equation is not applicable as a model.

  19. Lagrangian analysis by clustering. An example in the Nordic Seas.

    NASA Astrophysics Data System (ADS)

    Koszalka, Inga; Lacasce, Joseph H.

    2010-05-01

    We propose a new method for obtaining average velocities and eddy diffusivities from Lagrangian data. Rather than grouping the drifter-derived velocities in uniform geographical bins, as is commonly done, we group a specified number of nearest-neighbor velocities. This is done via a clustering algorithm operating on the instantaneous positions of the drifters. Thus it is the data distribution itself which determines the positions of the averages and the areal extent of the clusters. A major advantage is that because the number of members is essentially the same for all clusters, the statistical accuracy is more uniform than with geographical bins. We illustrate the technique using synthetic data from a stochastic model, employing a realistic mean flow. The latter is an accurate representation of the surface currents in the Nordic Seas and is strongly inhomogeneous in space. We use the clustering algorithm to extract the mean velocities and diffusivities (both of which are known from the stochastic model). We also compare the results to those obtained with fixed geographical bins. Clustering is more successful at capturing spatial variability of the mean flow and also improves convergence in the eddy diffusivity estimates. We discuss both the future prospects and shortcomings of the new method.
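
    A greatly simplified sketch of the idea above: group a fixed number of nearest-neighbor observations instead of using geographical bins, so every average is built from the same number of members. This greedy stand-in for the paper's clustering algorithm uses my own naming and is only illustrative:

```python
import numpy as np

def cluster_velocities(pos, vel, k):
    """Group observations into clusters of k nearest neighbors and return
    cluster-mean positions and velocities. Greedy sketch: repeatedly seed a
    cluster from an unassigned observation and absorb its k - 1 nearest
    unassigned neighbors, so every average has the same number of members."""
    unassigned = set(range(len(pos)))
    centers, means = [], []
    while len(unassigned) >= k:
        idx = list(unassigned)
        d = np.linalg.norm(pos[idx] - pos[idx[0]], axis=1)   # distances to the seed
        members = [idx[j] for j in np.argsort(d)[:k]]
        unassigned -= set(members)
        centers.append(pos[members].mean(axis=0))
        means.append(vel[members].mean(axis=0))
    return np.array(centers), np.array(means)

# Two well-separated groups of drifter observations with distinct mean flows:
blob = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1], [0.1, 0.1], [0.05, 0.05]])
pos = np.vstack([blob, blob + 10.0])
vel = np.vstack([np.tile([1.0, 0.0], (5, 1)), np.tile([0.0, 1.0], (5, 1))])
centers, means = cluster_velocities(pos, vel, 5)
```

    Because every cluster has exactly k members, the statistical accuracy of each mean-velocity estimate is roughly uniform, which is the advantage the abstract highlights over fixed geographical bins with wildly varying occupancy.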

  20. Non-negative Matrix Factorization for Self-calibration of Photometric Redshift Scatter in Weak-lensing Surveys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Le; Yu, Yu; Zhang, Pengjie, E-mail: lezhang@sjtu.edu.cn

    Photo-z error is one of the major sources of systematics degrading the accuracy of weak-lensing cosmological inferences. Zhang et al. proposed a self-calibration method combining galaxy–galaxy correlations and galaxy–shear correlations between different photo-z bins. Fisher matrix analysis shows that it can determine the rate of photo-z outliers at a level of 0.01%–1% using photometric data alone, and does not rely on any prior knowledge. In this paper, we develop a new algorithm to implement this method by solving a constrained nonlinear optimization problem arising in the self-calibration process. Based on the techniques of fixed-point iteration and non-negative matrix factorization, the proposed algorithm can efficiently and robustly reconstruct the scattering probabilities between the true-z and photo-z bins. The algorithm has been tested extensively by applying it to mock data from simulated stage IV weak-lensing projects. We find that the algorithm provides a successful recovery of the scatter rates at the level of 0.01%–1%, and the true mean redshifts of photo-z bins at the level of 0.001, which may satisfy the requirements of future lensing surveys.

  1. Cosmological model-independent test of ΛCDM with two-point diagnostic by the observational Hubble parameter data

    NASA Astrophysics Data System (ADS)

    Cao, Shu-Lei; Duan, Xiao-Wei; Meng, Xiao-Lei; Zhang, Tong-Jie

    2018-04-01

    Aiming at exploring the nature of dark energy (DE), we use forty-three observational Hubble parameter data (OHD) points in the redshift range 0 < z ≤ 2.36 to make a cosmological model-independent test of the ΛCDM model with the two-point Omh^2(z2; z1) diagnostic. In the ΛCDM model, with equation of state (EoS) w = -1, the two-point diagnostic relation Omh^2 ≡ Ωm h^2 holds, where Ωm is the present matter density parameter and h is the Hubble parameter divided by 100 km s^{-1} Mpc^{-1}. We utilize two methods, weighted mean and median statistics, to bin the OHD and increase the signal-to-noise ratio of the measurements. The binning methods turn out to be promising and robust. By applying the two-point diagnostic to the binned data, we find that although the best-fit values of Omh^2 fluctuate as the continuous redshift intervals change, on average they remain consistent with a constant within the 1σ confidence interval. Therefore, we conclude that the ΛCDM model cannot be ruled out.
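For reference, the two-point diagnostic used above is Omh^2(z2; z1) = [h^2(z2) − h^2(z1)] / [(1+z2)^3 − (1+z1)^3], which reduces to Ωm h^2 in flat ΛCDM with w = −1. A quick numerical check (the fiducial Ωm and h0 values below are illustrative assumptions, not values from the paper):

```python
def omh2(h1, z1, h2, z2):
    """Two-point Omh^2 diagnostic: for flat LambdaCDM with w = -1 this
    equals Omega_m * h^2 for any redshift pair (h(z) = H(z)/100 km/s/Mpc)."""
    return (h1 ** 2 - h2 ** 2) / ((1 + z1) ** 3 - (1 + z2) ** 3)

# Sanity check against a fiducial flat LambdaCDM model (assumed parameters):
Om, h0 = 0.3, 0.7
h = lambda z: h0 * (Om * (1 + z) ** 3 + 1 - Om) ** 0.5
value = omh2(h(1.5), 1.5, h(0.5), 0.5)   # equals Om * h0**2 = 0.147
```

Any departure of the measured Omh^2 from a constant across redshift pairs would signal a deviation from w = −1, which is why the diagnostic is model-independent.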

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leng, Shuai; Yu, Lifeng; Wang, Jia

    Purpose: Our purpose was to reduce image noise in spectral CT by exploiting data redundancies in the energy domain, allowing flexible selection of the number, width, and location of the energy bins. Methods: Using a variety of spectral CT imaging methods, conventional filtered backprojection (FBP) reconstructions were performed and the resulting images were compared to those processed using a Local HighlY constrained backPRojection Reconstruction (HYPR-LR) algorithm. The mean and standard deviation of CT numbers were measured within regions of interest (ROIs), and results were compared between FBP and HYPR-LR. For these comparisons, the following spectral CT imaging methods were used: (i) numerical simulations based on a photon-counting, detector-based CT system, (ii) a photon-counting, detector-based micro-CT system using rubidium and potassium chloride solutions, (iii) a commercial CT system equipped with integrating detectors utilizing tube potentials of 80, 100, 120, and 140 kV, and (iv) a clinical dual-energy CT examination. The effects of tube energy and energy bin width were evaluated as appropriate to each CT system. Results: The mean CT number in each ROI was unchanged between FBP and HYPR-LR images for each of the spectral CT imaging scenarios, irrespective of bin width or tube potential. However, image noise, as represented by the standard deviation of CT numbers in each ROI, was reduced by 36%-76%. In all scenarios, image noise after HYPR-LR processing was similar to that of composite images, which used all available photons. No difference in spatial resolution was observed between HYPR-LR processing and FBP. Dual-energy patient data processed using HYPR-LR demonstrated reduced noise in the individual low- and high-energy images, as well as in the material-specific basis images. Conclusions: Noise reduction can be accomplished for spectral CT by exploiting data redundancies in the energy domain.
HYPR-LR is a robust method for reducing image noise in a variety of spectral CT imaging systems without losing spatial resolution or CT number accuracy. This method improves the flexibility to select energy bins in the manner that optimizes material identification and separation without paying the penalty of increased image noise or its corollary, increased patient dose.

  3. Long-range barcode labeling-sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Feng; Zhang, Tao; Singh, Kanwar K.

    Methods for sequencing single large DNA molecules by clonal multiple displacement amplification using barcoded primers. Sequences are binned based on barcode sequences and sequenced using a microdroplet-based method for sequencing large polynucleotide templates to enable assembly of haplotype-resolved complex genomes and metagenomes.

  4. Whole-heart coronary MRA with 3D affine motion correction using 3D image-based navigation.

    PubMed

    Henningsson, Markus; Prieto, Claudia; Chiribiri, Amedeo; Vaillant, Ghislain; Razavi, Reza; Botnar, René M

    2014-01-01

    Robust motion correction is necessary to minimize respiratory motion artefacts in coronary MR angiography (CMRA). The state-of-the-art method uses a 1D feet-head translational motion correction approach, and data acquisition is limited to a small window in the respiratory cycle, which prolongs the scan by a factor of 2-3. The purpose of this work was to implement 3D affine motion correction for Cartesian whole-heart CMRA using a 3D navigator (3D-NAV) to allow for data acquisition throughout the whole respiratory cycle. 3D affine transformations for different respiratory states (bins) were estimated by using 3D-NAV image acquisitions which were acquired during the startup profiles of a steady-state free precession sequence. The calculated 3D affine transformations were applied to the corresponding high-resolution Cartesian image acquisition which had been similarly binned, to correct for respiratory motion between bins. Quantitative and qualitative comparisons showed no statistical difference between images acquired with the proposed method and the reference method using a diaphragmatic navigator with a narrow gating window. We demonstrate that 3D-NAV and 3D affine correction can be used to acquire Cartesian whole-heart 3D coronary artery images with 100% scan efficiency with similar image quality as with the state-of-the-art gated and corrected method with approximately 50% scan efficiency. Copyright © 2013 Wiley Periodicals, Inc.

  5. Calculating Soil Wetness, Evapotranspiration and Carbon Cycle Processes Over Large Grid Areas Using a New Scaling Technique

    NASA Technical Reports Server (NTRS)

    Sellers, Piers

    2012-01-01

    Soil wetness typically shows great spatial variability over the length scales of general circulation model (GCM) grid areas (approx. 100 km), and the functions relating evapotranspiration and photosynthetic rate to local-scale (approx. 1 m) soil wetness are highly non-linear. Soil respiration is also highly dependent on very small-scale variations in soil wetness. We therefore expect significant inaccuracies whenever we insert a single grid-area-average soil wetness value into a function to calculate any of these rates for the grid area. For the particular case of evapotranspiration, this method - use of a grid-averaged soil wetness value - can also provoke severe oscillations in the evapotranspiration rate and soil wetness under some conditions. A method is presented whereby the probability distribution function (pdf) for soil wetness within a grid area is represented by binning, and numerical integration of the binned pdf is performed to provide a spatially integrated wetness stress term for the whole grid area, which then permits calculation of grid-area fluxes in a single operation. The method is very accurate when 10 or more bins are used, can deal realistically with spatially variable precipitation, conserves moisture exactly and allows for precise modification of the soil wetness pdf after every time step. The method could also be applied to other ecological problems where small-scale processes must be area-integrated, or upscaled, to estimate fluxes over large areas, for example in treatments of the terrestrial carbon budget or trace gas generation.
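The bin-and-integrate idea can be sketched as follows; the wetness distribution and stress function below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def area_average_flux(wetness_samples, stress_fn, n_bins=10):
    """Area-integrate a nonlinear stress function over a binned soil-wetness pdf.

    Instead of evaluating stress_fn at the grid-mean wetness (which biases the
    result for nonlinear functions), build a histogram of sub-grid wetness and
    take the pdf-weighted sum of stress_fn over the bin centres. Illustrative
    sketch only; stress_fn stands in for any evapotranspiration or respiration
    response curve.
    """
    counts, edges = np.histogram(wetness_samples, bins=n_bins)
    centres = 0.5 * (edges[:-1] + edges[1:])
    weights = counts / counts.sum()          # discrete pdf over the bins
    return float(np.sum(weights * stress_fn(centres)))

rng = np.random.default_rng(1)
w = rng.beta(2, 5, size=10_000)              # assumed skewed sub-grid wetness field
stress = lambda s: s / (s + 0.2)             # assumed nonlinear stress response
binned = area_average_flux(w, stress)        # pdf-weighted (upscaled) estimate
naive = stress(w.mean())                     # single grid-mean value, for contrast
```

Because the stress response is concave, the naive grid-mean estimate overstates the area-average flux; the binned integral recovers it in a single operation, as the abstract describes.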

  6. 30 CFR 57.16002 - Bins, hoppers, silos, tanks, and surge piles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bins, hoppers, silos, tanks, and surge piles... NONMETAL MINES Materials Storage and Handling § 57.16002 Bins, hoppers, silos, tanks, and surge piles. (a) Bins, hoppers, silos, tanks, and surge piles, where loose unconsolidated materials are stored, handled...

  7. 30 CFR 56.16002 - Bins, hoppers, silos, tanks, and surge piles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bins, hoppers, silos, tanks, and surge piles... MINES Materials Storage and Handling § 56.16002 Bins, hoppers, silos, tanks, and surge piles. (a) Bins, hoppers, silos, tanks, and surge piles, where loose unconsolidated materials are stored, handled or...

  8. Pack Factor Measurements for Corn in Grain Storage Bins

    USDA-ARS?s Scientific Manuscript database

    Grain is commonly stored commercially in tall bins, which often are as deep as 35 m (114.8 ft) for tall, narrow concrete bins and about 32 m (105 ft) in diameter for large corrugated steel bins. Grain can support great pressure without crushing, but it yields somewhat to compaction under its ...

  9. 19. VIEW OF CRUDE ORE BINS FROM EAST. EAST CRUDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. VIEW OF CRUDE ORE BINS FROM EAST. EAST CRUDE ORE BIN IN FOREGROUND WITH DISCHARGE TO GRIZZLY AT BOTTOM OF VIEW. CONCRETE RETAINING WALL TO LEFT (SOUTH) AND BOTTOM (EAST EDGE OF EAST BIN). - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  10. Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring system.

    PubMed

    Hannan, M A; Arebey, Maher; Begum, R A; Basri, Hassan

    2011-12-01

    This paper deals with an integrated system of Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring. RFID, GPS, GPRS, and GIS, along with camera technologies, have been integrated to develop an intelligent bin and truck monitoring system. A new integrated theoretical framework, hardware architecture, and interface algorithm has been introduced between the technologies for the successful implementation of the proposed system. In this system, the bin and truck databases have been developed in such a way that information on bin and truck ID, date and time of waste collection, bin status, amount of waste, and bin and truck GPS coordinates is compiled and stored for monitoring and management activities. The results showed that real-time image processing, histogram analysis, waste estimation, and other bin information were displayed in the GUI of the monitoring system. Real-time tests and experiments showed that the performance of the developed system was stable and that the monitoring system achieved high practicability and validity. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. Toward zero waste: Composting and recycling for sustainable venue based events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hottle, Troy A., E-mail: troy.hottle@asu.edu; Bilec, Melissa M., E-mail: mbilec@pitt.edu; Brown, Nicholas R., E-mail: nick.brown@asu.edu

    Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g. recycling and landfill) and management of the waste stream via composting, including purchasing where only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO₂-equivalent emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO₂ eq. emissions and energy use because of the retention of high-value materials, but are compounded by the difficulty of managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO₂ eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates: the first game served as a baseline, the second game employed staffed bins, and the third game had non-staffed bins to determine the effect of staffing on contamination rates. Contamination rates in both the recycling and compost bins were tracked throughout the series.
Contamination rates fell from 34% at the first game to 11% at the second (with staffed bins) and were 23% at the third game.

  12. Dark Energy Survey Year 1 Results: redshift distributions of the weak-lensing source galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, B.; Gruen, D.; Bernstein, G. M.; Rau, M. M.; De Vicente, J.; Hartley, W. G.; Gaztanaga, E.; DeRose, J.; Troxel, M. A.; Davis, C.; Alarcon, A.; MacCrann, N.; Prat, J.; Sánchez, C.; Sheldon, E.; Wechsler, R. H.; Asorey, J.; Becker, M. R.; Bonnett, C.; Carnero Rosell, A.; Carollo, D.; Carrasco Kind, M.; Castander, F. J.; Cawthon, R.; Chang, C.; Childress, M.; Davis, T. M.; Drlica-Wagner, A.; Gatti, M.; Glazebrook, K.; Gschwend, J.; Hinton, S. R.; Hoormann, J. K.; Kim, A. G.; King, A.; Kuehn, K.; Lewis, G.; Lidman, C.; Lin, H.; Macaulay, E.; Maia, M. A. G.; Martini, P.; Mudd, D.; Möller, A.; Nichol, R. C.; Ogando, R. L. C.; Rollins, R. P.; Roodman, A.; Ross, A. J.; Rozo, E.; Rykoff, E. S.; Samuroff, S.; Sevilla-Noarbe, I.; Sharp, R.; Sommer, N. E.; Tucker, B. E.; Uddin, S. A.; Varga, T. N.; Vielzeuf, P.; Yuan, F.; Zhang, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Busha, M. T.; Capozzi, D.; Carretero, J.; Crocce, M.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Eifler, T. F.; Estrada, J.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Jarvis, M.; Jeltema, T.; Johnson, M. W. G.; Johnson, M. D.; Kirk, D.; Krause, E.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Miquel, R.; Nord, B.; O'Neill, C. R.; Plazas, A. A.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Smith, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Vikram, V.; Walker, A. R.; Weller, J.; Wester, W.; Wolf, R. C.; Yanny, B.; Zuntz, J.

    2018-07-01

    We describe the derivation and validation of redshift distribution estimates and their uncertainties for the populations of galaxies used as weak-lensing sources in the Dark Energy Survey (DES) Year 1 cosmological analyses. The Bayesian Photometric Redshift (BPZ) code is used to assign galaxies to four redshift bins between z ≈ 0.2 and ≈1.3, and to produce initial estimates of the lensing-weighted redshift distributions n^i_PZ(z)∝ dn^i/dz for members of bin i. Accurate determination of cosmological parameters depends critically on knowledge of ni, but is insensitive to bin assignments or redshift errors for individual galaxies. The cosmological analyses allow for shifts n^i(z)=n^i_PZ(z-Δ z^i) to correct the mean redshift of ni(z) for biases in n^i_PZ. The Δzi are constrained by comparison of independently estimated 30-band photometric redshifts of galaxies in the Cosmic Evolution Survey (COSMOS) field to BPZ estimates made from the DES griz fluxes, for a sample matched in fluxes, pre-seeing size, and lensing weight to the DES weak-lensing sources. In companion papers, the Δzi of the three lowest redshift bins are further constrained by the angular clustering of the source galaxies around red galaxies with secure photometric redshifts at 0.15 < z < 0.9. This paper details the BPZ and COSMOS procedures, and demonstrates that the cosmological inference is insensitive to details of the ni(z) beyond the choice of Δzi. The clustering and COSMOS validation methods produce consistent estimates of Δzi in the bins where both can be applied, with combined uncertainties of σ_{Δ z^i}=0.015, 0.013, 0.011, and 0.022 in the four bins. Repeating the photo-z procedure instead using the Directional Neighbourhood Fitting algorithm, or using the ni(z) estimated from the matched sample in COSMOS, yields no discernible difference in cosmological inferences.

  13. Dark Energy Survey Year 1 Results: Redshift distributions of the weak lensing source galaxies

    NASA Astrophysics Data System (ADS)

    Hoyle, B.; Gruen, D.; Bernstein, G. M.; Rau, M. M.; De Vicente, J.; Hartley, W. G.; Gaztanaga, E.; DeRose, J.; Troxel, M. A.; Davis, C.; Alarcon, A.; MacCrann, N.; Prat, J.; Sánchez, C.; Sheldon, E.; Wechsler, R. H.; Asorey, J.; Becker, M. R.; Bonnett, C.; Carnero Rosell, A.; Carollo, D.; Carrasco Kind, M.; Castander, F. J.; Cawthon, R.; Chang, C.; Childress, M.; Davis, T. M.; Drlica-Wagner, A.; Gatti, M.; Glazebrook, K.; Gschwend, J.; Hinton, S. R.; Hoormann, J. K.; Kim, A. G.; King, A.; Kuehn, K.; Lewis, G.; Lidman, C.; Lin, H.; Macaulay, E.; Maia, M. A. G.; Martini, P.; Mudd, D.; Möller, A.; Nichol, R. C.; Ogando, R. L. C.; Rollins, R. P.; Roodman, A.; Ross, A. J.; Rozo, E.; Rykoff, E. S.; Samuroff, S.; Sevilla-Noarbe, I.; Sharp, R.; Sommer, N. E.; Tucker, B. E.; Uddin, S. A.; Varga, T. N.; Vielzeuf, P.; Yuan, F.; Zhang, B.; Abbott, T. M. C.; Abdalla, F. B.; Allam, S.; Annis, J.; Bechtol, K.; Benoit-Lévy, A.; Bertin, E.; Brooks, D.; Buckley-Geer, E.; Burke, D. L.; Busha, M. T.; Capozzi, D.; Carretero, J.; Crocce, M.; D'Andrea, C. B.; da Costa, L. N.; DePoy, D. L.; Desai, S.; Diehl, H. T.; Doel, P.; Eifler, T. F.; Estrada, J.; Evrard, A. E.; Fernandez, E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Giannantonio, T.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; James, D. J.; Jarvis, M.; Jeltema, T.; Johnson, M. W. G.; Johnson, M. D.; Kirk, D.; Krause, E.; Kuhlmann, S.; Kuropatkin, N.; Lahav, O.; Li, T. S.; Lima, M.; March, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Miquel, R.; Nord, B.; O'Neill, C. R.; Plazas, A. A.; Romer, A. K.; Sako, M.; Sanchez, E.; Santiago, B.; Scarpine, V.; Schindler, R.; Schubnell, M.; Smith, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, D. L.; Vikram, V.; Walker, A. R.; Weller, J.; Wester, W.; Wolf, R. C.; Yanny, B.; Zuntz, J.; DES Collaboration

    2018-04-01

    We describe the derivation and validation of redshift distribution estimates and their uncertainties for the populations of galaxies used as weak lensing sources in the Dark Energy Survey (DES) Year 1 cosmological analyses. The Bayesian Photometric Redshift (BPZ) code is used to assign galaxies to four redshift bins between z ≈ 0.2 and ≈1.3, and to produce initial estimates of the lensing-weighted redshift distributions n^i_PZ(z)∝ dn^i/dz for members of bin i. Accurate determination of cosmological parameters depends critically on knowledge of ni but is insensitive to bin assignments or redshift errors for individual galaxies. The cosmological analyses allow for shifts n^i(z)=n^i_PZ(z-Δ z^i) to correct the mean redshift of ni(z) for biases in n^i_PZ. The Δzi are constrained by comparison of independently estimated 30-band photometric redshifts of galaxies in the COSMOS field to BPZ estimates made from the DES griz fluxes, for a sample matched in fluxes, pre-seeing size, and lensing weight to the DES weak-lensing sources. In companion papers, the Δzi of the three lowest redshift bins are further constrained by the angular clustering of the source galaxies around red galaxies with secure photometric redshifts at 0.15 < z < 0.9. This paper details the BPZ and COSMOS procedures, and demonstrates that the cosmological inference is insensitive to details of the ni(z) beyond the choice of Δzi. The clustering and COSMOS validation methods produce consistent estimates of Δzi in the bins where both can be applied, with combined uncertainties of σ_{Δ z^i}=0.015, 0.013, 0.011, and 0.022 in the four bins. Repeating the photo-z procedure instead using the Directional Neighborhood Fitting (DNF) algorithm, or using the ni(z) estimated from the matched sample in COSMOS, yields no discernible difference in cosmological inferences.

  14. Allocation of solid waste collection bins and route optimisation using geographical information system: A case study of Dhanbad City, India.

    PubMed

    Khan, D; Samadder, S R

    2016-07-01

    Collection of municipal solid waste is one of the most important elements of municipal waste management and consumes the largest share of the funds allocated for waste management. The cost of collection and transportation can be reduced relative to the present scenario if the solid waste collection bins are located at suitable places so that collection routes are minimized. This study presents a method for allocating solid waste collection bins at appropriate, uniformly spaced, and easily accessible locations so that the collection vehicle routes are minimized, for the city of Dhanbad, India. The network analyst toolset available in ArcGIS was used to find the optimized route for solid waste collection, considering all the parameters required for efficient solid waste collection. These parameters include the positions of solid waste collection bins, the road network, the population density, waste collection schedules, truck capacities and their characteristics. The present study also demonstrates the significant cost reductions that can be obtained compared with the current practices in the study area. The vehicle routing problem solver tool of ArcGIS was used to identify the cost-effective scenario for waste collection, to estimate its running costs and to simulate its application considering both travel time and travel distance simultaneously. © The Author(s) 2016.

  15. An Enhanced Differential Evolution Algorithm Based on Multiple Mutation Strategies.

    PubMed

    Xiang, Wan-li; Meng, Xue-lei; An, Mei-qing; Li, Yin-zhen; Gao, Ming-xia

    2015-01-01

    Differential evolution (DE) is a simple yet efficient metaheuristic for global optimization over continuous spaces. However, standard DE suffers from premature convergence, especially DE/best/1/bin. In order to take advantage of the direction guidance information of the best individual in DE/best/1/bin while avoiding local traps, an enhanced differential evolution algorithm based on multiple mutation strategies, named EDE, is proposed in this paper. The EDE algorithm integrates an initialization technique, opposition-based learning initialization, to improve initial solution quality; a new combined mutation strategy composed of DE/current/1/bin together with DE/pbest/1/bin, for the sake of accelerating standard DE and preventing DE from clustering around the global best individual; and a perturbation scheme for further avoiding premature convergence. In addition, we introduce two linear time-varying functions, which decide which solution search equation is chosen in the mutation and perturbation phases, respectively. Experimental results on twenty-five benchmark functions show that EDE is far better than standard DE. In further comparisons with five other state-of-the-art approaches, EDE is superior to, or at least equal to, these methods on most benchmark functions.
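The baseline DE/best/1/bin scheme that EDE builds on can be sketched as follows (a generic textbook implementation of the classic operator, not the proposed EDE algorithm):

```python
import numpy as np

def de_best_1_bin_step(pop, f, F=0.5, CR=0.9, rng=None):
    """One generation of classic DE/best/1/bin (minimisation).

    Mutation: v = x_best + F * (x_r1 - x_r2), followed by binomial ("bin")
    crossover with rate CR and greedy selection.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = pop.shape
    fit = np.array([f(x) for x in pop])
    best = pop[np.argmin(fit)]
    new_pop = pop.copy()
    for i in range(n):
        candidates = [j for j in range(n) if j != i]
        r1, r2 = rng.choice(candidates, size=2, replace=False)
        v = best + F * (pop[r1] - pop[r2])      # DE/best/1 mutation
        mask = rng.random(d) < CR               # binomial crossover mask
        mask[rng.integers(d)] = True            # at least one gene from v
        trial = np.where(mask, v, pop[i])
        if f(trial) <= fit[i]:                  # greedy selection
            new_pop[i] = trial
    return new_pop

# Usage: minimise the 5-D sphere function for a few generations.
rng = np.random.default_rng(42)
sphere = lambda x: float(np.sum(np.asarray(x) ** 2))
pop = rng.uniform(-5.0, 5.0, size=(20, 5))
for _ in range(50):
    pop = de_best_1_bin_step(pop, sphere, rng=rng)
best_val = min(sphere(x) for x in pop)
```

Because every mutant is anchored at x_best, the population tends to collapse toward it, which is the premature-convergence weakness the abstract's combined mutation strategies are designed to counter.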

  16. Assessing the uncertainty in a normal tissue complication probability difference (∆NTCP): radiation-induced liver disease (RILD) in liver tumour patients treated with proton vs X-ray therapy.

    PubMed

    Kobashi, Keiji; Prayongrat, Anussara; Kimoto, Takuya; Toramatsu, Chie; Dekura, Yasuhiro; Katoh, Norio; Shimizu, Shinichi; Ito, Yoichi M; Shirato, Hiroki

    2018-03-01

    Modern radiotherapy technologies such as proton beam therapy (PBT) permit dose escalation to the tumour and minimize unnecessary doses to normal tissues. To achieve appropriate patient selection for PBT, a normal tissue complication probability (NTCP) model can be applied to estimate the risk of treatment-related toxicity relative to X-ray therapy (XRT). A methodology for estimating the difference in NTCP (∆NTCP), including its uncertainty as a function of dose to normal tissue, is described in this study using the Delta method, a statistical method for evaluating the variance of functions, considering the variance-covariance matrix. We used a virtual individual patient dataset of radiation-induced liver disease (RILD) in liver tumour patients who were treated with XRT as a study model. As an alternative option for individual patient data, dose-bin data, which consists of the number of patients who developed toxicity in each dose level/bin and the total number of patients in that dose level/bin, are useful for multi-institutional data sharing. It provides comparable accuracy with individual patient data when using the Delta method. With reliable NTCP models, the ∆NTCP with uncertainty might potentially guide the use of PBT; however, clinical validation and a cost-effectiveness study are needed to determine the appropriate ∆NTCP threshold.
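The Delta method the authors apply can be stated compactly: for an estimator θ̂ with covariance matrix Σ, Var[g(θ̂)] ≈ ∇g(θ̂)ᵀ Σ ∇g(θ̂). A minimal numeric sketch of that general statistical tool (the gradient and covariance values below are made up for illustration, not taken from the study):

```python
import numpy as np

def delta_method_var(grad, cov):
    """First-order (Delta method) variance of g(theta_hat):
    Var[g] ~= grad' @ Sigma @ grad, with grad = dg/dtheta at theta_hat."""
    grad = np.asarray(grad, dtype=float)
    cov = np.asarray(cov, dtype=float)
    return float(grad @ cov @ grad)

# Variance of a difference g = p1 - p2 of two correlated NTCP estimates
# (illustrative numbers only): gradient of g is (1, -1).
grad = [1.0, -1.0]
cov = [[0.04, 0.01],
       [0.01, 0.09]]
var = delta_method_var(grad, cov)   # 0.04 + 0.09 - 2*0.01 = 0.11
```

The variance-covariance matrix matters: the positive covariance between the two NTCP estimates reduces the variance of their difference below the sum of the individual variances.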

  17. Assessing the uncertainty in a normal tissue complication probability difference (∆NTCP): radiation-induced liver disease (RILD) in liver tumour patients treated with proton vs X-ray therapy

    PubMed Central

    Kobashi, Keiji; Kimoto, Takuya; Toramatsu, Chie; Dekura, Yasuhiro; Katoh, Norio; Shimizu, Shinichi; Ito, Yoichi M; Shirato, Hiroki

    2018-01-01

    Abstract Modern radiotherapy technologies such as proton beam therapy (PBT) permit dose escalation to the tumour and minimize unnecessary doses to normal tissues. To achieve appropriate patient selection for PBT, a normal tissue complication probability (NTCP) model can be applied to estimate the risk of treatment-related toxicity relative to X-ray therapy (XRT). A methodology for estimating the difference in NTCP (∆NTCP), including its uncertainty as a function of dose to normal tissue, is described in this study using the Delta method, a statistical method for evaluating the variance of functions, considering the variance–covariance matrix. We used a virtual individual patient dataset of radiation-induced liver disease (RILD) in liver tumour patients who were treated with XRT as a study model. As an alternative option for individual patient data, dose-bin data, which consists of the number of patients who developed toxicity in each dose level/bin and the total number of patients in that dose level/bin, are useful for multi-institutional data sharing. It provides comparable accuracy with individual patient data when using the Delta method. With reliable NTCP models, the ∆NTCP with uncertainty might potentially guide the use of PBT; however, clinical validation and a cost-effectiveness study are needed to determine the appropriate ∆NTCP threshold. PMID:29538699

  18. Retrospective data-driven respiratory gating for PET/CT

    NASA Astrophysics Data System (ADS)

    Schleyer, Paul J.; O'Doherty, Michael J.; Barrington, Sally F.; Marsden, Paul K.

    2009-04-01

    Respiratory motion can adversely affect both PET and CT acquisitions. Respiratory gating allows an acquisition to be divided into a series of motion-reduced bins according to the respiratory signal, which is typically hardware acquired. In order that the effects of motion can potentially be corrected for, we have developed a novel, automatic, data-driven gating method which retrospectively derives the respiratory signal from the acquired PET and CT data. PET data are acquired in listmode and analysed in sinogram space, and CT data are acquired in cine mode and analysed in image space. Spectral analysis is used to identify regions within the CT and PET data which are subject to respiratory motion, and the variation of counts within these regions is used to estimate the respiratory signal. Amplitude binning is then used to create motion-reduced PET and CT frames. The method was demonstrated with four patient datasets acquired on a 4-slice PET/CT system. To assess the accuracy of the data-derived respiratory signal, a hardware-based signal was acquired for comparison. Data-driven gating was successfully performed on PET and CT datasets for all four patients. Gated images demonstrated respiratory motion throughout the bin sequences for all PET and CT series, and image analysis and direct comparison of the traces derived from the data-driven method with the hardware-acquired traces indicated accurate recovery of the respiratory signal.
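The amplitude binning step described above can be sketched with quantile-spaced amplitude bins (a sketch of the general idea, not the authors' implementation):

```python
import numpy as np

def amplitude_bin(signal, n_bins):
    """Assign each sample of a respiratory trace to an amplitude bin.

    Bin edges are equally spaced quantiles of the signal amplitude, so each
    motion state collects a comparable amount of data for reconstruction.
    """
    inner_edges = np.quantile(signal, np.linspace(0, 1, n_bins + 1))[1:-1]
    return np.digitize(signal, inner_edges)   # labels in 0 .. n_bins-1

t = np.linspace(0.0, 60.0, 1500)
resp = np.sin(2 * np.pi * t / 4.5)            # idealised ~4.5 s breathing cycle
labels = amplitude_bin(resp, n_bins=5)        # bin index per time sample
```

Unlike phase binning, amplitude binning groups samples by displacement rather than by position in the breathing cycle, so irregular breathing still maps similar motion states into the same bin.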

  19. Testing the accuracy of clustering redshifts with simulations

    NASA Astrophysics Data System (ADS)

    Scottez, V.; Benoit-Lévy, A.; Coupon, J.; Ilbert, O.; Mellier, Y.

    2018-03-01

    We explore the accuracy of clustering-based redshift inference within the MICE2 simulation. This method uses the spatial clustering of galaxies between a spectroscopic reference sample and an unknown sample. This study gives an estimate of the reachable accuracy of the method. First, we discuss the requirements on the number of objects in the two samples, confirming that this method does not require a representative spectroscopic sample for calibration. In the context of the next generation of cosmological surveys, we estimate that the density of the Quasi Stellar Objects in BOSS allows us to reach 0.2 per cent accuracy in the mean redshift. Secondly, we estimate individual redshifts for galaxies in the densest regions of colour space (˜30 per cent of the galaxies) without using the photometric redshift procedure. The advantage of this procedure is threefold. It allows: (i) the use of cluster-zs for any field in astronomy, (ii) the possibility to combine photo-zs and cluster-zs to get an improved redshift estimation, and (iii) the use of cluster-zs to define tomographic bins for weak lensing. Finally, we explore this last option and build five cluster-z selected tomographic bins from redshift 0.2 to 1. We find a bias on the mean redshift estimate of 0.002 per bin. We conclude that cluster-zs could be used as a primary redshift estimator by the next generation of cosmological surveys.

  20. A decision-analytic approach to the optimal allocation of resources for endangered species consultation

    USGS Publications Warehouse

    Converse, Sarah J.; Shelley, Kevin J.; Morey, Steve; Chan, Jeffrey; LaTier, Andrea; Scafidi, Carolyn; Crouse, Deborah T.; Runge, Michael C.

    2011-01-01

    The resources available to support conservation work, whether time or money, are limited. Decision makers need methods to help them identify the optimal allocation of limited resources to meet conservation goals, and decision analysis is uniquely suited to assist with the development of such methods. In recent years, a number of case studies have been described that examine optimal conservation decisions under fiscal constraints; here we develop methods to look at other types of constraints, including limited staff and regulatory deadlines. In the US, Section Seven consultation, an important component of protection under the federal Endangered Species Act, requires that federal agencies overseeing projects consult with federal biologists to avoid jeopardizing species. A benefit of consultation is negotiation of project modifications that lessen impacts on species, so staff time allocated to consultation supports conservation. However, some offices have experienced declining staff, potentially reducing the efficacy of consultation. This is true of the US Fish and Wildlife Service's Washington Fish and Wildlife Office (WFWO) and its consultation work on federally-threatened bull trout (Salvelinus confluentus). To improve effectiveness, WFWO managers needed a tool to help allocate this work to maximize conservation benefits. We used a decision-analytic approach to score projects based on the value of staff time investment, and then identified an optimal decision rule for how scored projects would be allocated across bins, where projects in different bins received different time investments. We found that, given current staff, the optimal decision rule placed 80% of informal consultations (those where expected effects are beneficial, insignificant, or discountable) in a short bin where they would be completed without negotiating changes. The remaining 20% would be placed in a long bin, warranting an investment of seven days, including time for negotiation. 
For formal consultations (those where expected effects are significant), 82% of projects would be placed in a long bin, with an average time investment of 15 days. The WFWO is using this decision-support tool to help allocate staff time. Because workload allocation decisions are iterative, we describe a monitoring plan designed to increase the tool's efficacy over time. This work has general application beyond Section Seven consultation, in that it provides a framework for efficient investment of staff time in conservation when such time is limited and when regulatory deadlines prevent an unconstrained approach. © 2010.
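    The allocation rule described above (score projects, route a fixed fraction of the highest-scoring ones to a long bin with negotiation time) can be sketched minimally. The project names, scores, and the 20% fraction for informal consultations are illustrative, not taken from the study's actual scoring model.

    ```python
    def allocate(projects, long_fraction=0.20):
        """projects: list of (name, score) pairs.

        Returns a dict routing the top-scoring fraction to the 'long' bin
        (time invested in negotiating project modifications) and the rest
        to the 'short' bin (completed without negotiation).
        """
        ranked = sorted(projects, key=lambda p: p[1], reverse=True)
        n_long = round(long_fraction * len(ranked))
        return {"long": [p[0] for p in ranked[:n_long]],
                "short": [p[0] for p in ranked[n_long:]]}

    # Hypothetical informal consultations with conservation-value scores.
    consultations = [("A", 0.9), ("B", 0.4), ("C", 0.7), ("D", 0.2), ("E", 0.8)]
    plan = allocate(consultations)   # 20% of 5 projects -> 1 in the long bin
    ```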

  1. Compass: a hybrid method for clinical and biobank data mining.

    PubMed

    Krysiak-Baltyn, K; Nordahl Petersen, T; Audouze, K; Jørgensen, Niels; Angquist, L; Brunak, S

    2014-02-01

    We describe a new method for identification of confident associations within large clinical data sets. The method is a hybrid of two existing methods: Self-Organizing Maps and Association Mining. We use Self-Organizing Maps as an initial step to reduce the search space, and then apply Association Mining to find association rules. We demonstrate that this procedure has a number of advantages over traditional Association Mining: it can handle numerical variables without a priori binning, and it generates variable groups which act as "hotspots" for statistically significant associations. We showcase the method on infertility-related data from Danish military conscripts. The clinical data we analyzed contained both categorical questionnaire data and continuous variables generated from biological measurements, including missing values. From this data set we successfully generated a number of interesting association rules, each relating an observation to a specific consequence together with the p-value for that finding. Additionally, we demonstrate that the method can be used on non-clinical data containing chemical-disease associations to find associations between different phenotypes, such as prostate cancer and breast cancer. Copyright © 2013 Elsevier Inc. All rights reserved.
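    The two-stage idea (reduce the search space by grouping records, then mine rules only within each group) can be sketched as follows. This is a simplified stand-in, not Compass itself: nearest-prototype assignment replaces the trained Self-Organizing Map, the rule miner only checks pairwise joint support, and all records are hypothetical.

    ```python
    import numpy as np
    from itertools import combinations

    def mine_rules(transactions, min_support=0.5):
        """Return (item_a, item_b, support) for item pairs passing the support cut."""
        n = len(transactions)
        items = sorted({i for t in transactions for i in t})
        return [(a, b, s) for a, b in combinations(items, 2)
                if (s := sum(1 for t in transactions if a in t and b in t) / n)
                >= min_support]

    # Stage 1: group records by nearest prototype (stand-in for the SOM step).
    X = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.15, 0.25]])
    prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
    labels = np.argmin(((X[:, None, :] - prototypes[None]) ** 2).sum(-1), axis=1)

    # Stage 2: mine association rules inside one group only.
    records = [{"low_count", "smoker"}, {"low_count", "smoker"},
               {"high_count"}, {"low_count", "nonsmoker"}]
    group0 = [records[i] for i in range(len(records)) if labels[i] == 0]
    rules = mine_rules(group0, min_support=0.6)
    ```

    Restricting mining to a group keeps the combinatorial search small, which is the practical benefit the abstract attributes to the SOM pre-step.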

  2. 14 CFR 125.183 - Carriage of cargo in passenger compartments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... emergency landing conditions applicable to the passenger seats of the airplane in which the bin is installed... bin. (3) The bin may not impose any load on the floor or other structure of the airplane that exceeds the load limitations of that structure. (4) The bin must be attached to the seat tracks or to the...

  3. 18. VIEW OF CRUDE ORE BINS FROM WEST. WEST CRUDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. VIEW OF CRUDE ORE BINS FROM WEST. WEST CRUDE ORE BIN AND TRESTLE FROM TWO JOHNS TRAMLINE TO SOUTH, CRUDE ORE BIN IN FOREGROUND. MACHINE SHOP IN BACKGROUND. THE TRAM TO PORTLAND PASSED TO NORTH OF MACHINE SHOP. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  4. 4. TROJAN MILL, DETAIL OF CRUDE ORE BINS FROM NORTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. TROJAN MILL, DETAIL OF CRUDE ORE BINS FROM NORTH, c. 1912. SHOWS TIMBER FRAMING UNDER CONSTRUCTION FOR EAST AND WEST CRUDE ORE BINS AT PREVIOUS LOCATION OF CRUSHER HOUSE, AND SNOW SHED PRESENT OVER SOUTH CRUDE ORE BIN WITH PHASE CHANGE IN SNOW SHED CONSTRUCTION INDICATED AT EAST END OF EAST CRUDE ORE BIN. THIS PHOTOGRAPH IS THE FIRST IMAGE OF THE MACHINE SHOP, UPPER LEFT CORNER. CREDIT JW. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  5. A general method for motion compensation in x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Biguri, Ander; Dosanjh, Manjit; Hancock, Steven; Soleimani, Manuchehr

    2017-08-01

    Motion during data acquisition is a known source of error in medical tomography, resulting in blur artefacts in the regions that move. It is critical to reduce these artefacts in applications such as image-guided radiation therapy as a clearer image translates into a more accurate treatment and the sparing of healthy tissue close to a tumour site. Most research in 4D x-ray tomography involving the thorax relies on respiratory phase binning of the acquired data and reconstructing each of a set of images using the limited subset of data per phase. In this work, we demonstrate a motion-compensation method to reconstruct images from the complete dataset taken during breathing without recourse to phase-binning or breath-hold techniques. As long as the motion is sufficiently well known, the new method can accurately reconstruct an image at any time during the acquisition time span. It can be applied to any iterative reconstruction algorithm.

  6. A general method for motion compensation in x-ray computed tomography.

    PubMed

    Biguri, Ander; Dosanjh, Manjit; Hancock, Steven; Soleimani, Manuchehr

    2017-07-24

    Motion during data acquisition is a known source of error in medical tomography, resulting in blur artefacts in the regions that move. It is critical to reduce these artefacts in applications such as image-guided radiation therapy as a clearer image translates into a more accurate treatment and the sparing of healthy tissue close to a tumour site. Most research in 4D x-ray tomography involving the thorax relies on respiratory phase binning of the acquired data and reconstructing each of a set of images using the limited subset of data per phase. In this work, we demonstrate a motion-compensation method to reconstruct images from the complete dataset taken during breathing without recourse to phase-binning or breath-hold techniques. As long as the motion is sufficiently well known, the new method can accurately reconstruct an image at any time during the acquisition time span. It can be applied to any iterative reconstruction algorithm.

  7. Cyber Fundamental Exercises

    DTIC Science & Technology

    2013-03-01

    the /bin, /sbin, /etc, /var/log, /home, /proc, /root, /dev, /tmp, and /lib directories • Describe the purpose of the /etc/shadow and /etc/passwd ... 2.6.2 /etc/passwd and /etc/shadow The /etc/shadow file didn't exist on early Linux distributions. Originally only root could access the /etc/passwd file, which stored user names, user configuration information, and passwords. However, when common programs such as ls running under

  8. Microbial metabolic networks in a complex electrogenic biofilm recovered from a stimulus-induced metatranscriptomics approach

    PubMed Central

    Ishii, Shun’ichi; Suzuki, Shino; Tenney, Aaron; Norden-Krichmar, Trina M.; Nealson, Kenneth H.; Bretschger, Orianna

    2015-01-01

    Microorganisms almost always exist as mixed communities in nature. While the significance of microbial community activities is well appreciated, a thorough understanding about how microbial communities respond to environmental perturbations has not yet been achieved. Here we have used a combination of metagenomic, genome binning, and stimulus-induced metatranscriptomic approaches to estimate the metabolic network and stimuli-induced metabolic switches existing in a complex microbial biofilm that was producing electrical current via extracellular electron transfer (EET) to a solid electrode surface. Two stimuli were employed: to increase EET and to stop EET. An analysis of cell activity marker genes after stimuli exposure revealed that only two strains within eleven binned genomes had strong transcriptional responses to increased EET rates, with one responding positively and the other responding negatively. Potential metabolic switches between eleven dominant members were mainly observed for acetate, hydrogen, and ethanol metabolisms. These results have enabled the estimation of a multi-species metabolic network and the associated short-term responses to EET stimuli that induce changes to metabolic flow and cooperative or competitive microbial interactions. This systematic meta-omics approach represents a next step towards understanding complex microbial roles within a community and how community members respond to specific environmental stimuli. PMID:26443302

  9. Spatial clustering of dark matter haloes: secondary bias, neighbour bias, and the influence of massive neighbours on halo properties

    NASA Astrophysics Data System (ADS)

    Salcedo, Andrés N.; Maller, Ariyeh H.; Berlind, Andreas A.; Sinha, Manodeep; McBride, Cameron K.; Behroozi, Peter S.; Wechsler, Risa H.; Weinberg, David H.

    2018-04-01

    We explore the phenomenon commonly known as halo assembly bias, whereby dark matter haloes of the same mass are found to be more or less clustered when a second halo property is considered, for haloes in the mass range 3.7 × 1011-5.0 × 1013 h-1 M⊙. Using the Large Suite of Dark Matter Simulations (LasDamas) we consider nine commonly used halo properties and find that a clustering bias exists if haloes are binned by mass or by any other halo property. This secondary bias implies that no single halo property encompasses all the spatial clustering information of the halo population. The mean values of some halo properties depend on their halo's distance to a more massive neighbour. Halo samples selected by having high values of one of these properties therefore inherit a neighbour bias such that they are much more likely to be close to a much more massive neighbour. This neighbour bias largely accounts for the secondary bias seen in haloes binned by mass and split by concentration or age. However, haloes binned by other mass-like properties still show a secondary bias even when the neighbour bias is removed. The secondary bias of haloes selected by their spin behaves differently than that for other halo properties, suggesting that the origin of the spin bias is different than of other secondary biases.

  10. Haystack, a web-based tool for metabolomics research

    PubMed Central

    2014-01-01

    Background Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. Results To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Conclusion Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. 
It offers users a range of data visualization options and supports non-biased differential profiling studies through a unique and flexible binning function that provides an alternative to conventional peak deconvolution analysis methods. PMID:25350247
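    Haystack's core binning idea, converting the continuous m/z dimension into fixed-width interval variables whose summed intensities fingerprint a sample, can be sketched minimally. This is an illustration of the general technique, not Haystack's code; the peak list, mass range, and 1-Da bin width are hypothetical.

    ```python
    import numpy as np

    def bin_masses(mz, intensity, mz_min=100.0, mz_max=1000.0, width=1.0):
        """Sum peak intensities into fixed-width m/z bins; returns the bin vector."""
        n_bins = int(np.ceil((mz_max - mz_min) / width))
        idx = ((np.asarray(mz) - mz_min) / width).astype(int)
        vec = np.zeros(n_bins)
        np.add.at(vec, idx, intensity)   # accumulate peaks landing in the same bin
        return vec

    # Hypothetical centroided peaks from one LCMS run.
    mz = [180.06, 180.07, 342.12, 505.99]
    inten = [1e5, 2e5, 5e4, 3e4]
    fingerprint = bin_masses(mz, inten)
    ```

    Stacking one such vector per sample yields the matrix of interval variables that downstream methods like PCA operate on.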

  11. Haystack, a web-based tool for metabolomics research.

    PubMed

    Grace, Stephen C; Embry, Stephen; Luo, Heng

    2014-01-01

    Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. 
It offers users a range of data visualization options and supports non-biased differential profiling studies through a unique and flexible binning function that provides an alternative to conventional peak deconvolution analysis methods.

  12. Construction of a High-Density Genetic Map from RNA-Seq Data for an Arabidopsis Bay-0 × Shahdara RIL Population

    PubMed Central

    Serin, Elise A. R.; Snoek, L. B.; Nijveen, Harm; Willems, Leo A. J.; Jiménez-Gómez, Jose M.; Hilhorst, Henk W. M.; Ligterink, Wilco

    2017-01-01

    High-density genetic maps are essential for high resolution mapping of quantitative traits. Here, we present a new genetic map for an Arabidopsis Bayreuth × Shahdara recombinant inbred line (RIL) population, built on RNA-seq data. RNA-seq analysis on 160 RILs of this population identified 30,049 single-nucleotide polymorphisms (SNPs) covering the whole genome. Based on a 100-kbp window SNP binning method, 1059 bin-markers were identified, physically anchored on the genome. The total length of the RNA-seq genetic map spans 471.70 centimorgans (cM) with an average marker distance of 0.45 cM and a maximum marker distance of 4.81 cM. This high resolution genotyping revealed new recombination breakpoints in the population. To highlight the advantages of such high-density map, we compared it to two publicly available genetic maps for the same population, comprising 69 PCR-based markers and 497 gene expression markers derived from microarray data, respectively. In this study, we show that SNP markers can effectively be derived from RNA-seq data. The new RNA-seq map closes many existing gaps in marker coverage, saturating the previously available genetic maps. Quantitative trait locus (QTL) analysis for published phenotypes using the available genetic maps showed increased QTL mapping resolution and reduced QTL confidence interval using the RNA-seq map. The new high-density map is a valuable resource that facilitates the identification of candidate genes and map-based cloning approaches. PMID:29259624
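    The 100-kbp window binning step can be sketched as follows. This is a hedged illustration of the general approach, not the authors' pipeline: each SNP is assigned to the 100-kb window containing its position, and here the bin genotype is simply the majority call among the SNPs in that window (positions and parental calls are hypothetical).

    ```python
    from collections import Counter, defaultdict

    def bin_markers(snps, window=100_000):
        """snps: list of (chrom, pos, genotype) calls for one RIL.

        Returns {(chrom, bin_index): genotype}, taking the majority call
        among the SNPs falling inside each 100-kbp window.
        """
        bins = defaultdict(list)
        for chrom, pos, gt in snps:
            bins[(chrom, pos // window)].append(gt)
        return {key: Counter(calls).most_common(1)[0][0]
                for key, calls in bins.items()}

    # Hypothetical SNP calls: 'bay' = Bayreuth allele, 'sha' = Shahdara allele.
    snps = [("chr1", 12_345, "bay"), ("chr1", 67_890, "bay"),
            ("chr1", 99_000, "sha"), ("chr1", 150_000, "sha")]
    markers = bin_markers(snps)
    ```

    Collapsing ~30,000 SNPs into ~1000 such bin-markers removes redundant co-segregating markers while keeping each marker physically anchored on the genome.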

  13. 9. 5TH FLOOR, INTERIOR DETAIL TO EAST OF SOAP BIN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. 5TH FLOOR, INTERIOR DETAIL TO EAST OF SOAP BIN No. 4: UPPER SCREWS MOVED SOAP CHIPS HORIZONTALLY FROM BIN TO BIN; LOWER LEFT-AND RIGHT-HAND SCREWS MOVED CHIPS TO CHUTE LEADING TO 3RD FLOOR SOAP MILLS - Colgate & Company Jersey City Plant, Building No. B-14, 54-58 Grand Street, Jersey City, Hudson County, NJ

  14. 13. OBLIQUE VIEW OF UPPER ORE BIN, LOOKING WEST NORTHWEST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. OBLIQUE VIEW OF UPPER ORE BIN, LOOKING WEST NORTHWEST. THIS ORE BIN WAS ADDED IN THE LATE 1930'S. IT IS TRAPEZOIDAL IN SHAPE, WIDER AT THE REAR THAN THE FRONT, AND DIVIDED INTO THREE BINS, EACH WITH ITS OWN CONTROL DOOR (SEE CA-290-15). - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  15. 3. EAGLE MILL, DETAIL OF CRUDE ORE BIN FROM NORTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. EAGLE MILL, DETAIL OF CRUDE ORE BIN FROM NORTH, c. 1908-10. SHOWS EXPOSED CRUSHER HOUSE IN FRONT OF (SOUTH) CRUDE ORE BIN AND SNOW SHED ADDED OVER TRAM TRACKS. NOTE LACK OF EAST OR WEST CRUDE ORE BINS. CREDIT JW. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  16. Loss of Bin1 Promotes the Propagation of Tau Pathology.

    PubMed

    Calafate, Sara; Flavin, William; Verstreken, Patrik; Moechars, Diederik

    2016-10-18

    Tau pathology propagates within synaptically connected neuronal circuits, but the underlying mechanisms are unclear. BIN1-amphiphysin2 is the second most prevalent genetic risk factor for late-onset Alzheimer's disease. In diseased brains, the BIN1-amphiphysin2 neuronal isoform is downregulated. Here, we show that lowering BIN1-amphiphysin2 levels in neurons promotes Tau pathology propagation whereas overexpression of neuronal BIN1-amphiphysin2 inhibits the process in two in vitro models. Increased Tau propagation is caused by increased endocytosis, given our finding that BIN1-amphiphysin2 negatively regulates endocytic flux. Furthermore, blocking endocytosis by inhibiting dynamin also reduces Tau pathology propagation. Using a galectin-3-binding assay, we show that internalized Tau aggregates damage the endosomal membrane, allowing internalized aggregates to leak into the cytoplasm to propagate pathology. Our work indicates that lower BIN1 levels promote the propagation of Tau pathology by efficiently increasing aggregate internalization by endocytosis and endosomal trafficking. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Mosquito larvicide BinAB revealed by de novo phasing with an X-ray laser

    PubMed Central

    Colletier, Jacques-Philippe; Sawaya, Michael R.; Gingery, Mari; Rodriguez, Jose A.; Cascio, Duilio; Brewster, Aaron S.; Michels-Clark, Tara; Hice, Robert H.; Coquelle, Nicolas; Boutet, Sébastien; Williams, Garth J.; Messerschmidt, Marc; DePonte, Daniel P.; Sierra, Raymond G.; Laksmono, Hartawan; Koglin, Jason E.; Hunter, Mark S.; Park, Hyun-Woo; Uervirojnangkoorn, Monarin; Bideshi, Dennis K.; Brunger, Axel T.; Federici, Brian A.; Sauter, Nicholas K.; Eisenberg, David S.

    2016-01-01

    Summary BinAB is a naturally occurring paracrystalline larvicide distributed worldwide to combat the devastating diseases borne by mosquitoes. These crystals are composed of homologous molecules, BinA and BinB, which play distinct roles in the multi-step intoxication process, transforming from harmless, robust crystals, to soluble protoxin heterodimers, to internalized mature toxin, and finally toxic oligomeric pores. The small size of the crystals, 50 unit cells per edge, on average, has impeded structural characterization by conventional means. Here, we report the structure of BinAB solved de novo by serial-femtosecond crystallography at an X-ray free-electron laser (XFEL). The structure reveals tyrosine and carboxylate-mediated contacts acting as pH switches to release soluble protoxin in the alkaline larval midgut. An enormous heterodimeric interface appears responsible for anchoring BinA to receptor-bound BinB for co-internalization. Remarkably, this interface is largely composed of propeptides, suggesting that proteolytic maturation would trigger dissociation of the heterodimer and progression to pore formation. PMID:27680699

  18. On the Mathematical Consequences of Binning Spike Trains.

    PubMed

    Cessac, Bruno; Le Ny, Arnaud; Löcherbach, Eva

    2017-01-01

    We initiate a mathematical analysis of hidden effects induced by binning spike trains of neurons. Assuming that the original spike train has been generated by a discrete Markov process, we show that binning generates a stochastic process that is no longer Markov but is instead a variable-length Markov chain (VLMC) with unbounded memory. We also show that the law of the binned raster is a Gibbs measure in the DLR (Dobrushin-Lanford-Ruelle) sense coined in mathematical statistical mechanics. This allows the derivation of several important consequences on statistical properties of binned spike trains. In particular, we introduce the DLR framework as a natural setting to mathematically formalize anticipation, that is, to tell "how good" our nervous system is at making predictions. In a probabilistic sense, this corresponds to condition a process by its future, and we discuss how binning may affect our conclusions on this ability. We finally comment on the possible consequences of binning in the detection of spurious phase transitions or in the detection of incorrect evidence of criticality.
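    The binning operation the paper analyzes can be stated in a few lines. This sketch only shows the transformation itself (a 0/1 raster at the native time step collapsed into coarser bins, with a bin set to 1 if it contains at least one spike); the raster is hypothetical, and the paper's point is that this binned process is no longer Markov even when the original raster is.

    ```python
    import numpy as np

    def bin_spikes(raster, bin_size):
        """Collapse a binary spike raster into bins of `bin_size` steps (OR rule)."""
        raster = np.asarray(raster)
        n = len(raster) // bin_size * bin_size        # drop a ragged tail, if any
        return raster[:n].reshape(-1, bin_size).max(axis=1)

    # Hypothetical raster at the native time step.
    raster = np.array([0, 1, 0, 0, 0, 0, 1, 1, 0, 0])
    binned = bin_spikes(raster, bin_size=2)
    ```

    The OR rule discards how many spikes fell in each bin and where, which is precisely the information loss that induces the unbounded-memory (VLMC) structure of the binned process.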

  19. Efficient Entanglement Concentration of Nonlocal Two-Photon Polarization-Time-Bin Hyperentangled States

    NASA Astrophysics Data System (ADS)

    Wang, Zi-Hang; Yu, Wen-Xuan; Wu, Xiao-Yuan; Gao, Cheng-Yan; Alzahrani, Faris; Hobiny, Aatef; Deng, Fu-Guo

    2018-03-01

    We present two different hyperentanglement concentration protocols (hyper-ECPs) for two-photon systems in nonlocal polarization-time-bin hyperentangled states with known parameters, including Bell-like and cluster-like states, resorting to the parameter-splitting method. They require only one of the two parties in quantum communication to operate on her photon in the process of entanglement concentration, not both, and they have the maximal success probability. They work with linear optical elements and are feasible in experiment, especially when a large amount of quantum data is exchanged, as the parties can obtain the information about the parameters of the nonlocal hyperentangled states by sampling a subset of the nonlocal hyperentangled two-photon systems and measuring them. As the quantum state of photons in the time-bin degree of freedom suffers less noise in an optical-fiber channel, these hyper-ECPs may have good applications in practical long-distance quantum communication in the future.

  20. Kinematical Comparison Analysis on the Discus Athletes Throwing Techniques Based on Data Project

    NASA Astrophysics Data System (ADS)

    Junming, Li; Jihe, Zhou; Ting, Long

    2017-09-01

    In the discus final of the throwing-event series of China's track and field competition in April 2015, a three-dimensional camera analytical method, an application of kinematical data processing, was applied to the female discus athletes' throwing technique. The top four throwers' final-exertion actions were analysed and the related kinematic parameters obtained. The analysis shows that: first, Lu Xiaoxin achieves the best tight body-twist effect at left-foot touchdown and the best "beyond the implement" position, followed by Su Xinyue and Tan Jian, with Feng Bin relatively weaker; second, our athletes' release speed still needs improvement compared with the world's elite female discus throwers; third, the discus is released slightly early, with Tan Jian throwing at a reasonable release angle, Feng Bin and Lu Xiaoxin at larger angles, and Su Xinyue at a smaller angle. Feng Bin has the greatest release height, followed by Lu Xiaoxin and Tan Jian.

  1. 7. TROJAN MILL, EXTERIOR FROM NORTHWEST, c. 191828. ADDITIONS FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. TROJAN MILL, EXTERIOR FROM NORTHWEST, c. 1918-28. ADDITIONS FOR PRIMARY THICKENERS No. 1 AND No. 2, SECONDARY THICKENERS No. 1, No. 2, AND No. 3, AGITATORS, AIR COMPRESSOR, AND PORTLAND FILTERS ARE SHOWN COMPLETE. STAIR ON NORTH SIDE OF CRUDE ORE BINS IS PRESENT AS IS THE LIME BIN ADJACENT TO THE WEST CRUDE ORE BIN, AND THE SNOW SHED ADDED OVER THE TRAMLINE SERVING THE EAST AND WEST CRUDE ORE BINS. ALSO PRESENT IS THE BABBITT HOUSE AND ROCK BIN. CREDIT JW. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  2. Reconstructing the Genomic Content of Microbiome Taxa through Shotgun Metagenomic Deconvolution

    PubMed Central

    Carr, Rogan; Shen-Orr, Shai S.; Borenstein, Elhanan

    2013-01-01

    Metagenomics has transformed our understanding of the microbial world, allowing researchers to bypass the need to isolate and culture individual taxa and to directly characterize both the taxonomic and gene compositions of environmental samples. However, associating the genes found in a metagenomic sample with the specific taxa of origin remains a critical challenge. Existing binning methods, based on nucleotide composition or alignment to reference genomes allow only a coarse-grained classification and rely heavily on the availability of sequenced genomes from closely related taxa. Here, we introduce a novel computational framework, integrating variation in gene abundances across multiple samples with taxonomic abundance data to deconvolve metagenomic samples into taxa-specific gene profiles and to reconstruct the genomic content of community members. This assembly-free method is not bounded by various factors limiting previously described methods of metagenomic binning or metagenomic assembly and represents a fundamentally different approach to metagenomic-based genome reconstruction. An implementation of this framework is available at http://elbo.gs.washington.edu/software.html. We first describe the mathematical foundations of our framework and discuss considerations for implementing its various components. We demonstrate the ability of this framework to accurately deconvolve a set of metagenomic samples and to recover the gene content of individual taxa using synthetic metagenomic samples. We specifically characterize determinants of prediction accuracy and examine the impact of annotation errors on the reconstructed genomes. We finally apply metagenomic deconvolution to samples from the Human Microbiome Project, successfully reconstructing genus-level genomic content of various microbial genera, based solely on variation in gene count. These reconstructed genera are shown to correctly capture genus-specific properties. 
With the accumulation of metagenomic data, this deconvolution framework provides an essential tool for characterizing microbial taxa never before seen, laying the foundation for addressing fundamental questions concerning the taxa comprising diverse microbial communities. PMID:24146609
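    The deconvolution model underlying this framework can be sketched as a linear system. This is a hedged, noise-free toy version, not the authors' implementation: across S samples, the observed abundance of each gene is modeled as the taxonomic abundance matrix T (S x K taxa) times an unknown per-taxon gene-content vector, which least squares recovers when enough samples vary independently. All matrices below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    K, S, G = 3, 20, 5                                      # taxa, samples, genes

    truth = rng.integers(0, 4, size=(K, G)).astype(float)   # gene copies per taxon
    T = rng.random((S, K))                                  # taxonomic abundances per sample
    gene_abund = T @ truth                                  # observed S x G gene abundances

    # Per-gene least squares, clipped to nonnegative copy numbers.
    est = np.clip(np.linalg.lstsq(T, gene_abund, rcond=None)[0], 0, None)
    ```

    Real metagenomic counts are noisy and compositional, so the actual framework adds constraints and regularization; the sketch shows only why cross-sample variation makes taxon-specific gene profiles identifiable.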

  3. Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il

    A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann's mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
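    The von Neumann approach to time-lag determination can be sketched directly: for each trial lag, shift the second light curve in time, merge the two series, order by time, and compute the mean squared difference of successive fluxes; the correct lag makes the combined curve smoothest. This is a bare-bones illustration on noise-free synthetic light curves, without the optimizations (weighting, edge handling) the paper develops.

    ```python
    import numpy as np

    def von_neumann(t1, f1, t2, f2, lags):
        """Von Neumann statistic of the lag-shifted, merged light curve per trial lag."""
        stats = []
        for lag in lags:
            t = np.concatenate([t1, t2 - lag])   # undo the trial delay on curve 2
            f = np.concatenate([f1, f2])
            f = f[np.argsort(t)]                 # time-order the combined curve
            stats.append(np.mean(np.diff(f) ** 2))
        return np.asarray(stats)

    # Synthetic continuum and echo light curves, irregularly sampled.
    rng = np.random.default_rng(1)
    t1 = np.sort(rng.uniform(0, 100, 120))
    t2 = np.sort(rng.uniform(0, 100, 120))
    signal = lambda t: np.sin(2 * np.pi * t / 37.0)
    true_lag = 8.0
    f1, f2 = signal(t1), signal(t2 - true_lag)   # curve 2 echoes curve 1 after 8 days

    lags = np.arange(-20.0, 20.5, 0.5)
    best = lags[np.argmin(von_neumann(t1, f1, t2, f2, lags))]
    ```

    Note that no interpolation, stochastic model, or correlation-space binning is needed, which is the selling point of this estimator class for sparse, irregular sampling.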

  4. Quantitative analysis of multiple high-resolution mass spectrometry images using chemometric methods: quantitation of chlordecone in mouse liver.

    PubMed

    Mohammadi, Saeedeh; Parastar, Hadi

    2018-05-15

In this work, a chemometrics-based strategy is developed for quantitative mass spectrometry imaging (MSI). In this regard, quantification of chlordecone, a carcinogenic organochlorinated pesticide (C10Cl10O), in mouse liver using the matrix-assisted laser desorption ionization MSI (MALDI-MSI) method is used as a case study. The MSI datasets corresponded to 1, 5 and 10 days of mouse exposure to standard chlordecone in the quantity range of 0 to 450 μg g-1. A binning approach in the m/z direction is used to group high-resolution m/z values and to reduce the data size. To consider the effect of bin size on the quality of results, three different bin sizes of 0.25, 0.5 and 1.0 were chosen. Afterwards, three-way MSI data arrays (two spatial and one m/z dimensions) for seven standards and four unknown samples were column-wise augmented with m/z values as the common mode. These datasets were then analyzed using multivariate curve resolution-alternating least squares (MCR-ALS) with proper constraints. The resolved mass spectra were used for identification of chlordecone in the presence of a complex background and interference. Additionally, the augmented spatial profiles were post-processed, and 2D images for each component were obtained in calibration and unknown samples. The sum of these profiles was utilized to construct the calibration curve and to obtain the analytical figures of merit (AFOMs). Inspection of the results showed that the smallest bin size (i.e., 0.25) provides the most accurate results. Finally, the results obtained by MCR for the three datasets were compared with those of gas chromatography-mass spectrometry (GC-MS) and MALDI-MSI. The MCR-assisted method gives a higher amount of chlordecone than MALDI-MSI and a lower amount than GC-MS. It is concluded that a combination of chemometric methods with MSI can be considered as an alternative way for MSI quantification.
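The m/z binning step described above amounts to summing intensities over fixed-width windows; a minimal sketch (the bin edges and sample peak values below are hypothetical, not from the paper's data):

```python
import numpy as np

def bin_mz(mz, intensity, bin_size=0.25, mz_min=None, mz_max=None):
    """Sum intensities of high-resolution m/z values into fixed-width
    bins, reducing data size before multivariate analysis."""
    mz = np.asarray(mz, float)
    intensity = np.asarray(intensity, float)
    lo = mz.min() if mz_min is None else mz_min
    hi = mz.max() if mz_max is None else mz_max
    edges = np.arange(lo, hi + bin_size, bin_size)
    idx = np.clip(np.digitize(mz, edges) - 1, 0, len(edges) - 2)
    binned = np.zeros(len(edges) - 1)
    np.add.at(binned, idx, intensity)   # accumulate peaks sharing a bin
    centers = edges[:-1] + bin_size / 2
    return centers, binned

centers, binned = bin_mz([100.01, 100.12, 100.40, 101.9],
                         [10, 5, 2, 7],
                         bin_size=0.25, mz_min=100.0, mz_max=102.0)
print(binned[:2])   # first bin [100.0, 100.25) collects 10 + 5
```

A coarser `bin_size` (0.5 or 1.0, the other values tested) merges more neighboring peaks per bin, which is why the abstract finds the finest binning most accurate.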

  5. SU-F-J-136: Impact of Audiovisual Biofeedback On Interfraction Motion Over a Course of Liver Cancer Stereotactic Body Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, S; Tse, R; Martin, D

Purpose: In abdominal radiotherapy, inconsistent interfraction respiratory motion can result in deviations during treatment from what was planned in terms of target position and motion. Audiovisual biofeedback (AVB) is an interactive respiratory guide that produces a guiding interface for the patient to follow over a course of radiotherapy to facilitate regular respiratory motion. This study assessed the impact of AVB on interfraction motion consistency over a course of liver cancer SBRT. Methods: Five liver cancer patients were recruited into this study; 3 followed AVB over their course of SBRT and 2 were free breathing (FB). Respiratory signals from the Varian RPM were obtained during 4DCT and each treatment fraction. Respiratory signals were organized into 10 respiratory bins, and interfraction consistency was quantified by the difference between each treatment fraction respiratory bin and each respiratory bin from 4DCT. Interfraction consistency was considered as both the relative difference (as a percentage) and the absolute difference (in centimeters) between treatment respiratory bins and 4DCT respiratory bins. Results: The relative difference between 4DCT and treatment respiratory bins was 22 ± 16% for FB and 15 ± 10% for AVB, an improvement of 32% (p < 0.001) with AVB. The absolute difference between 4DCT and treatment respiratory bins was 0.15 ± 0.10 cm for FB and 0.14 ± 0.13 cm for AVB, an improvement of 4% (p = 0.6) with AVB. Conclusion: This was the first study to compare the impact of AVB breathing guidance on interfraction motion consistency over a course of radiotherapy. AVB was shown to significantly reduce the relative difference between 4DCT and treatment respiratory motion, but the absolute differences were comparable, largely due to one AVB patient exhibiting a larger amplitude than the other patients. This study demonstrates the potential benefit of AVB in reducing motion variations during treatment from what was planned.
Paul Keall, Sean Pollock, Ricky O'Brien and Kuldeep Makhija are shareholders of Respiratory Innovations, an Australian company that is developing a device to improve breathing stability. No funding or support was provided by Respiratory Innovations. Paul Keall is one of the inventors of US patent # 7955270.
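The bin-wise consistency metrics described above reduce to elementwise differences between 10 respiratory-bin displacements; a sketch with hypothetical displacement values (the real study compares every treatment fraction against 4DCT):

```python
import numpy as np

def interfraction_consistency(ct_bins, tx_bins):
    """Mean relative (%) and absolute (cm) differences between the
    4DCT respiratory bins and one treatment fraction's bins."""
    ct = np.asarray(ct_bins, float)
    tx = np.asarray(tx_bins, float)
    abs_diff = np.abs(tx - ct)                 # centimeters
    rel_diff = 100.0 * abs_diff / np.abs(ct)   # percent of planned value
    return rel_diff.mean(), abs_diff.mean()

# hypothetical 10-bin displacement traces (cm): plan vs. a fraction
# in which the patient breathes 10% deeper than at simulation
ct = np.array([0.1, 0.3, 0.6, 0.9, 1.1, 1.0, 0.8, 0.5, 0.3, 0.15])
tx = ct * 1.10
rel, ab = interfraction_consistency(ct, tx)
print(round(rel, 1))  # → 10.0
```

The two metrics can disagree, as in the abstract: a uniformly scaled trace gives a fixed relative difference, while its absolute difference scales with breathing amplitude.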

  6. Precision of FLEET Velocimetry Using High-speed CMOS Camera Systems

    NASA Technical Reports Server (NTRS)

    Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.

    2015-01-01

Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. We also compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the measurement precision were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement for the un-intensified camera systems, which had low signal-to-noise ratio. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 μs, precisions of 0.5 m/s in air and 0.2 m/s in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision High Speed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and a longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
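Digital (post-processing) binning as described is just a row-group sum; a minimal sketch, assuming the paper's 8-row group and a hypothetical frame:

```python
import numpy as np

def digital_bin_rows(frame, factor=8):
    """Post-processing row-wise binning: sum each group of `factor`
    adjacent rows (trailing rows that don't fill a group are dropped)."""
    rows = (frame.shape[0] // factor) * factor
    return frame[:rows].reshape(-1, factor, frame.shape[1]).sum(axis=1)

frame = np.ones((20, 4))          # stand-in for a FLEET camera frame
out = digital_bin_rows(frame)
print(out.shape)  # → (2, 4)
```

Summing raises signal linearly while uncorrelated read noise grows only as the square root of the group size, which is why binning helped most on the low-SNR un-intensified cameras.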

  7. Low Frequency Variants, Collapsed Based on Biological Knowledge, Uncover Complexity of Population Stratification in 1000 Genomes Project Data

    PubMed Central

    Moore, Carrie B.; Wallace, John R.; Wolfe, Daniel J.; Frase, Alex T.; Pendergrass, Sarah A.; Weiss, Kenneth M.; Ritchie, Marylyn D.

    2013-01-01

Analyses investigating low frequency variants have the potential for explaining additional genetic heritability of many complex human traits. However, the natural frequencies of rare variation between human populations strongly confound genetic analyses. We have applied a novel collapsing method to identify biological features with low frequency variant burden differences in thirteen populations sequenced by the 1000 Genomes Project. Our flexible collapsing tool utilizes expert biological knowledge from multiple publicly available database sources to direct feature selection. Variants were collapsed according to genetically driven features, such as evolutionarily conserved regions, regulatory regions, genes, and pathways. We have conducted an extensive comparison of low frequency variant burden differences (MAF<0.03) between populations from 1000 Genomes Project Phase I data. We found that on average 26.87% of gene bins, 35.47% of intergenic bins, 42.85% of pathway bins, 14.86% of ORegAnno regulatory bins, and 5.97% of evolutionarily conserved region bins show statistically significant differences in low frequency variant burden across populations from the 1000 Genomes Project. The proportion of bins with significant differences in low frequency burden depends on the ancestral similarity of the two populations compared and the types of features tested. Even closely related populations had notable differences in low frequency burden, but fewer differences than populations from different continents. Furthermore, conserved or functionally relevant regions had fewer significant differences in low frequency burden than regions under less evolutionary constraint. This degree of low frequency variant differentiation across diverse populations and feature elements highlights the critical importance of considering population stratification in the new era of DNA sequencing and low frequency variant genomic analyses. PMID:24385916
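The collapsing step can be sketched as counting low-frequency variants (MAF < 0.03) per feature bin; all variant records, population labels, and feature names below are invented for illustration:

```python
from collections import defaultdict

def collapse_burden(variants, population, maf_cutoff=0.03):
    """Count low-frequency variants per feature bin for one population.

    Each variant record carries a per-population minor-allele frequency
    and the feature bins (gene, pathway, conserved region, ...) it hits.
    """
    burden = defaultdict(int)
    for v in variants:
        if v["maf"][population] < maf_cutoff:
            for feature in v["bins"]:
                burden[feature] += 1
    return dict(burden)

variants = [
    {"maf": {"CEU": 0.01, "YRI": 0.20}, "bins": ["GENE_A", "PATHWAY_X"]},
    {"maf": {"CEU": 0.02, "YRI": 0.02}, "bins": ["GENE_A"]},
    {"maf": {"CEU": 0.40, "YRI": 0.01}, "bins": ["PATHWAY_X"]},
]
print(collapse_burden(variants, "CEU"))  # → {'GENE_A': 2, 'PATHWAY_X': 1}
```

Comparing such per-bin counts between two populations (e.g. with a burden test on the counts) is the comparison the abstract reports percentages for; the same variant can be "rare" in one population and common in another, which is the stratification effect at issue.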

  8. a High Precision dem Extraction Method Based on Insar Data

    NASA Astrophysics Data System (ADS)

    Wang, Xinshuang; Liu, Lingling; Shi, Xiaoliang; Huang, Xitao; Geng, Wei

    2018-04-01

In the 13th Five-Year Plan for Geoinformatics Business, it is proposed that new InSAR technology be applied to surveying and mapping production, becoming an innovation driver for the geoinformatics industry. Following the new surveying and mapping outline, this paper uses X-band TerraSAR/TanDEM data over Bin County in Shaanxi Province. The processing steps are as follows: first, the baseline is estimated from the orbital data; second, the interferometric pairs of SAR images are accurately registered; third, the interferogram is generated; fourth, the interferometric correlation information is estimated and the flat-earth phase is removed. To address the phase noise and phase discontinuities in the interferometric phase image, a GAMMA adaptive filtering method is adopted. To handle the "hole" problem of missing data in low-coherence areas, an interpolation method with a low-coherence-area mask is used to assist phase unwrapping. The accuracy of the interferometric baseline is then estimated from ground control points. Finally, a 1 : 50000 DEM is generated, and existing DEM data are used to verify its accuracy through statistical analysis. The results show that the improved InSAR data processing method can obtain a high-precision DEM of the study area that closely matches the topography of the reference DEM; R2 reaches 0.9648, showing a strong positive correlation.

  9. 16. Coke 'fines' bin at Furnace D. After delivery to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

16. Coke 'fines' bin at Furnace D. After delivery to the trestle bins, the coke was screened and the coke 'fines', or breeze, were transported by conveyor to the coke fines bins, where they were collected and loaded into dump trucks. The coke fines were then sold for fuel to a sinter plant in Lorain, Ohio. - Central Furnaces, 2650 Broadway, east bank of Cuyahoga River, Cleveland, Cuyahoga County, OH

  10. Qatar: Background and U.S. Relations

    DTIC Science & Technology

    2014-11-04

    Ahmed bin Abdullah bin Ziad Al Mahmoud Foreign Minister Khalid Bin Mohammed Al Attiyah Minister of Energy and Industry Mohammed bin Saleh al Sada...about voter franchise extension were resolved.5 The Advisory Council would have oversight authority over the Council of Ministers and would be able...Sunni armed groups in Syria has the potential to have a more lasting impact on the region, but has challenged the traditional Qatari preference for

  11. Food powders flowability characterization: theory, methods, and applications.

    PubMed

    Juliano, Pablo; Barbosa-Cánovas, Gustavo V

    2010-01-01

    Characterization of food powders flowability is required for predicting powder flow from hoppers in small-scale systems such as vending machines or at the industrial scale from storage silos or bins dispensing into powder mixing systems or packaging machines. This review covers conventional and new methods used to measure flowability in food powders. The method developed by Jenike (1964) for determining hopper outlet diameter and hopper angle has become a standard for the design of bins and is regarded as a standard method to characterize flowability. Moreover, there are a number of shear cells that can be used to determine failure properties defined by Jenike's theory. Other classic methods (compression, angle of repose) and nonconventional methods (Hall flowmeter, Johanson Indicizer, Hosokawa powder tester, tensile strength tester, powder rheometer), used mainly for the characterization of food powder cohesiveness, are described. The effect of some factors preventing flow, such as water content, temperature, time consolidation, particle composition and size distribution, is summarized for the characterization of specific food powders with conventional and other methods. Whereas time-consuming standard methods established for hopper design provide flow properties, there is yet little comparative evidence demonstrating that other rapid methods may provide similar flow prediction.

  12. Toward zero waste events: Reducing contamination in waste streams with volunteer assistance.

    PubMed

    Zelenika, Ivana; Moreau, Tara; Zhao, Jiaying

    2018-06-01

    Public festivals and events generate a tremendous amount of waste, especially when they involve food and drink. To reduce contamination across waste streams, we evaluated three types of interventions at a public event. In a randomized control trial, we examined the impact of volunteer staff assistance, bin tops, and sample 3D items with bin tops, on the amount of contamination and the weight of the organics, recyclable containers, paper, and garbage bins at a public event. The event was the annual Apple Festival held at the University of British Columbia, which was attended by around 10,000 visitors. We found that contamination was the lowest in the volunteer staff condition among all conditions. Specifically, volunteer staff reduced contamination by 96.1% on average in the organics bin, 96.9% in the recyclable containers bin, 97.0% in the paper bin, and 84.9% in the garbage bin. Our interventions did not influence the weight of the materials in the bins. This finding highlights the impact of volunteers on reducing contamination in waste streams at events, and provides suggestions and implications for waste management for event organizers to minimize contamination in all waste streams to achieve zero waste goals. Copyright © 2018. Published by Elsevier Ltd.

  13. Improved Taxation Rate for Bin Packing Games

    NASA Astrophysics Data System (ADS)

    Kern, Walter; Qiu, Xian

A cooperative bin packing game is an N-person game, where the player set N consists of k bins of capacity 1 each and n items of sizes a_1, ..., a_n. The value of a coalition of players is defined to be the maximum total size of items in the coalition that can be packed into the bins of the coalition. We present an alternative proof for the non-emptiness of the 1/3-core for all bin packing games and show how to (slightly) improve this bound of ε = 1/3. We conjecture that the true best possible value is ε = 1/7.
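The coalition value defined above (maximum total size of the coalition's items packable into the coalition's unit-capacity bins) can be computed by brute force for tiny instances; a sketch, not an efficient algorithm (the general problem is NP-hard):

```python
def coalition_value(sizes, k):
    """Value of a bin packing coalition holding `k` bins of capacity 1
    and items of the given sizes: max total size that can be packed."""
    best = 0.0

    def pack(i, loads, packed):
        nonlocal best
        best = max(best, packed)
        if i == len(sizes):
            return
        pack(i + 1, loads, packed)              # leave item i unpacked
        for b in range(k):                      # or try it in each bin
            if loads[b] + sizes[i] <= 1.0 + 1e-9:
                loads[b] += sizes[i]
                pack(i + 1, loads, packed + sizes[i])
                loads[b] -= sizes[i]

    pack(0, [0.0] * k, 0.0)
    return best

print(coalition_value([0.6, 0.5, 0.5, 0.4], 2))  # → 2.0
```

Here both bins can be filled exactly (0.6 + 0.4 and 0.5 + 0.5), so the grand coalition's value equals its bin capacity of 2.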

  14. MetaABC--an integrated metagenomics platform for data adjustment, binning and clustering.

    PubMed

    Su, Chien-Hao; Hsu, Ming-Tsung; Wang, Tse-Yi; Chiang, Sufeng; Cheng, Jen-Hao; Weng, Francis C; Kao, Cheng-Yan; Wang, Daryi; Tsai, Huai-Kuang

    2011-08-15

MetaABC is a metagenomic platform that integrates several binning tools coupled with methods for removing artifacts, analyzing unassigned reads and controlling sampling biases. It allows users to arrive at a better interpretation via a series of distinct combinations of analysis tools. After execution, MetaABC provides outputs in various visual formats such as tables, pie and bar charts as well as clustering result diagrams. MetaABC source code and documentation are available at http://bits2.iis.sinica.edu.tw/MetaABC/. Contact: dywang@gate.sinica.edu.tw; hktsai@iis.sinica.edu.tw. Supplementary data are available at Bioinformatics online.

  15. Optimized detection of shear peaks in weak lensing maps

    NASA Astrophysics Data System (ADS)

    Marian, Laura; Smith, Robert E.; Hilbert, Stefan; Schneider, Peter

    2012-06-01

We present a new method to extract cosmological constraints from weak lensing (WL) peak counts, which we denote as ‘the hierarchical algorithm’. The idea of this method is to combine information from WL maps sequentially smoothed with a series of filters of different size, from the largest down to the smallest, thus increasing the cosmological sensitivity of the resulting peak function. We compare the cosmological constraints resulting from the peak abundance measured in this way and the abundance obtained by using a filter of fixed size, which is the standard practice in WL peak studies. For this purpose, we employ a large set of WL maps generated by ray tracing through N-body simulations, and the Fisher matrix formalism. We find that if low signal-to-noise ratio (?) peaks are included in the analysis (?), the hierarchical method yields constraints significantly better than the single-sized filtering. For a large future survey such as Euclid or Large Synoptic Survey Telescope, combined with information from a cosmic microwave background experiment like Planck, the results for the hierarchical (single-sized) method are Δns = 0.0039 (0.004), ΔΩm = 0.002 (0.0045), Δσ8 = 0.003 (0.006) and Δw = 0.019 (0.0525). This forecast is conservative, as we assume no knowledge of the redshifts of the lenses, and consider a single broad bin for the redshifts of the sources. If only peaks with ? are considered, then there is little difference between the results of the two methods. We also examine the statistical properties of the hierarchical peak function: Its covariance matrix has off-diagonal terms for bins with ? and aperture mass of M < 3 × 10^14 h^-1 M⊙, the higher bins being largely uncorrelated and therefore well described by a Poisson distribution.

  16. Conformational Smear Characterization and Binning of Single-Molecule Conductance Measurements for Enhanced Molecular Recognition.

    PubMed

    Korshoj, Lee E; Afsari, Sepideh; Chatterjee, Anushree; Nagpal, Prashant

    2017-11-01

    Electronic conduction or charge transport through single molecules depends primarily on molecular structure and anchoring groups and forms the basis for a wide range of studies from molecular electronics to DNA sequencing. Several high-throughput nanoelectronic methods such as mechanical break junctions, nanopores, conductive atomic force microscopy, scanning tunneling break junctions, and static nanoscale electrodes are often used for measuring single-molecule conductance. In these measurements, "smearing" due to conformational changes and other entropic factors leads to large variances in the observed molecular conductance, especially in individual measurements. Here, we show a method for characterizing smear in single-molecule conductance measurements and demonstrate how binning measurements according to smear can significantly enhance the use of individual conductance measurements for molecular recognition. Using quantum point contact measurements on single nucleotides within DNA macromolecules, we demonstrate that the distance over which molecular junctions are maintained is a measure of smear, and the resulting variance in unbiased single measurements depends on this smear parameter. Our ability to identify individual DNA nucleotides at 20× coverage increases from 81.3% accuracy without smear analysis to 93.9% with smear characterization and binning (SCRIB). Furthermore, merely 7 conductance measurements (7× coverage) are needed to achieve 97.8% accuracy for DNA nucleotide recognition when only low molecular smear measurements are used, which represents a significant improvement over contemporary sequencing methods. These results have important implications in a broad range of molecular electronics applications from designing robust molecular switches to nanoelectronic DNA sequencing.

  17. 8. EAST ELEVATION OF SKIDOO MILL AND UPPER ORE BIN, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

8. EAST ELEVATION OF SKIDOO MILL AND UPPER ORE BIN, LOOKING WEST FROM ACCESS ROAD. THE ROADWAY ON THIS LEVEL (CENTER) WAS USED FOR UNLOADING ORE BROUGHT ON BURROS INTO THE ORE BIN AT THE TOP LEVEL OF THE MILL. THE ORE BIN IN THE UPPER LEFT WAS ADDED LATER WHEN ORE WAS BROUGHT TO THE MILL BY TRUCKS. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  18. Effect of horizontal pick and place locations on shoulder kinematics.

    PubMed

    Könemann, R; Bosch, T; Kingma, I; Van Dieën, J H; De Looze, M P

    2015-01-01

    In this study the effects of horizontal bin locations in an order picking workstation on upper arm elevation, trunk inclination and hand use were investigated. Eight subjects moved (self-paced) light or heavy products (0.2 and 3.0 kg) from a central product bin to an inner or outer order bin (at 60 or 150 cm) on the left or right side of the workstation, while movements were recorded. The outer compared to inner bin location resulted in more upper arm elevation and trunk inclination per work cycle, both in terms of number of peak values and in terms of time integrals of angles (which is a dose measure over time). Considering the peak values and time integrals per minute (instead of per work cycle), these effects are reduced, due to the higher cycle times for outer bins. Hand use (left, right or both) was not affected by order bin locations.

  19. Distribution of hybrid entanglement and hyperentanglement with time-bin for secure quantum channel under noise via weak cross-Kerr nonlinearity.

    PubMed

    Heo, Jino; Kang, Min-Sung; Hong, Chang-Ho; Yang, Hyung-Jin; Choi, Seong-Gon; Hong, Jong-Phil

    2017-08-31

    We design schemes to generate and distribute hybrid entanglement and hyperentanglement correlated with degrees of freedom (polarization and time-bin) via weak cross-Kerr nonlinearities (XKNLs) and linear optical devices (including time-bin encoders). In our scheme, the multi-photon gates (which consist of XKNLs, quantum bus [qubus] beams, and photon-number-resolving [PNR] measurement) with time-bin encoders can generate hyperentanglement or hybrid entanglement. And we can also purify the entangled state (polarization) of two photons using only linear optical devices and time-bin encoders under a noisy (bit-flip) channel. Subsequently, through local operations (using a multi-photon gate via XKNLs) and classical communications, it is possible to generate a four-qubit hybrid entangled state (polarization and time-bin). Finally, we discuss how the multi-photon gate using XKNLs, qubus beams, and PNR measurement can be reliably performed under the decoherence effect.

  20. Selection of 3013 Containers for Field Surveillance. Fiscal Year 2016 Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelly, Elizabeth J.; Berg, John M.; Cheadle, Jesse

    2016-04-19

This update is the eighth in a series of reports that document the binning and sample selection of 3013 containers for the Field Surveillance program as part of the Integrated Surveillance Program. This report documents changes made to both the container binning assignments and the sample selection approach. Binning changes documented in this update are a result of changes to the prompt gamma calibration curves and the reassignment of a small number of Hanford items from the Pressure bin to the Pressure and Corrosion (P&C) bin. Field Surveillance sample selection changes are primarily a result of focusing future destructive examinations (DEs) on the potential for stress corrosion cracking in higher moisture containers in the P&C bin. The decision to focus the Field Surveillance program on higher moisture items is based on findings from both the Shelf-life testing program and DEs.

  1. Mapping global biodiversity connections with DNA barcodes: Lepidoptera of Pakistan.

    PubMed

    Ashfaq, Muhammad; Akhtar, Saleem; Rafi, Muhammad Athar; Mansoor, Shahid; Hebert, Paul D N

    2017-01-01

    Sequences from the DNA barcode region of the mitochondrial COI gene are an effective tool for specimen identification and for the discovery of new species. The Barcode of Life Data Systems (BOLD) (www.boldsystems.org) currently hosts 4.5 million records from animals which have been assigned to more than 490,000 different Barcode Index Numbers (BINs), which serve as a proxy for species. Because a fourth of these BINs derive from Lepidoptera, BOLD has a strong capability to both identify specimens in this order and to support studies of faunal overlap. DNA barcode sequences were obtained from 4503 moths from 329 sites across Pakistan, specimens that represented 981 BINs from 52 families. Among 379 species with a Linnaean name assignment, all were represented by a single BIN excepting five species that showed a BIN split. Less than half (44%) of the 981 BINs had counterparts in other countries; the remaining BINs were unique to Pakistan. Another 218 BINs of Lepidoptera from Pakistan were coupled with the 981 from this study before being compared with all 116,768 BINs for this order. As expected, faunal overlap was highest with India (21%), Sri Lanka (21%), United Arab Emirates (20%) and with other Asian nations (2.1%), but it was very low with other continents including Africa (0.6%), Europe (1.3%), Australia (0.6%), Oceania (1.0%), North America (0.1%), and South America (0.1%). This study indicates the way in which DNA barcoding facilitates measures of faunal overlap even when taxa have not been assigned to a Linnean species.
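The faunal-overlap percentages reported above are set intersections over BIN lists; a toy sketch with invented BIN identifiers (real BINs are BOLD registry codes):

```python
def faunal_overlap(bins_a, bins_b):
    """Percentage of region A's BINs that also occur in region B."""
    a, b = set(bins_a), set(bins_b)
    return 100.0 * len(a & b) / len(a)

# hypothetical BIN inventories for two regions
pakistan = {"BOLD:AAA1", "BOLD:AAA2", "BOLD:AAA3", "BOLD:AAA4", "BOLD:AAA5"}
india = {"BOLD:AAA1", "BOLD:AAA5", "BOLD:ZZZ9"}
print(faunal_overlap(pakistan, india))  # → 40.0
```

Because BINs are assigned from sequence clusters alone, this comparison works even for specimens that have never received a Linnaean name, which is the point the abstract closes on.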

  2. Mapping global biodiversity connections with DNA barcodes: Lepidoptera of Pakistan

    PubMed Central

    Akhtar, Saleem; Rafi, Muhammad Athar; Mansoor, Shahid; Hebert, Paul D. N.

    2017-01-01

    Sequences from the DNA barcode region of the mitochondrial COI gene are an effective tool for specimen identification and for the discovery of new species. The Barcode of Life Data Systems (BOLD) (www.boldsystems.org) currently hosts 4.5 million records from animals which have been assigned to more than 490,000 different Barcode Index Numbers (BINs), which serve as a proxy for species. Because a fourth of these BINs derive from Lepidoptera, BOLD has a strong capability to both identify specimens in this order and to support studies of faunal overlap. DNA barcode sequences were obtained from 4503 moths from 329 sites across Pakistan, specimens that represented 981 BINs from 52 families. Among 379 species with a Linnaean name assignment, all were represented by a single BIN excepting five species that showed a BIN split. Less than half (44%) of the 981 BINs had counterparts in other countries; the remaining BINs were unique to Pakistan. Another 218 BINs of Lepidoptera from Pakistan were coupled with the 981 from this study before being compared with all 116,768 BINs for this order. As expected, faunal overlap was highest with India (21%), Sri Lanka (21%), United Arab Emirates (20%) and with other Asian nations (2.1%), but it was very low with other continents including Africa (0.6%), Europe (1.3%), Australia (0.6%), Oceania (1.0%), North America (0.1%), and South America (0.1%). This study indicates the way in which DNA barcoding facilitates measures of faunal overlap even when taxa have not been assigned to a Linnean species. PMID:28339501

  3. Untangling taxonomy: a DNA barcode reference library for Canadian spiders.

    PubMed

    Blagoev, Gergin A; deWaard, Jeremy R; Ratnasingham, Sujeevan; deWaard, Stephanie L; Lu, Liuqiong; Robertson, James; Telfer, Angela C; Hebert, Paul D N

    2016-01-01

    Approximately 1460 species of spiders have been reported from Canada, 3% of the global fauna. This study provides a DNA barcode reference library for 1018 of these species based upon the analysis of more than 30,000 specimens. The sequence results show a clear barcode gap in most cases with a mean intraspecific divergence of 0.78% vs. a minimum nearest-neighbour (NN) distance averaging 7.85%. The sequences were assigned to 1359 Barcode index numbers (BINs) with 1344 of these BINs composed of specimens belonging to a single currently recognized species. There was a perfect correspondence between BIN membership and a known species in 795 cases, while another 197 species were assigned to two or more BINs (556 in total). A few other species (26) were involved in BIN merges or in a combination of merges and splits. There was only a weak relationship between the number of specimens analysed for a species and its BIN count. However, three species were clear outliers with their specimens being placed in 11-22 BINs. Although all BIN splits need further study to clarify the taxonomic status of the entities involved, DNA barcodes discriminated 98% of the 1018 species. The present survey conservatively revealed 16 species new to science, 52 species new to Canada and major range extensions for 426 species. However, if most BIN splits detected in this study reflect cryptic taxa, the true species count for Canadian spiders could be 30-50% higher than currently recognized. © 2015 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.

  4. FPGA-based voltage and current dual drive system for high frame rate electrical impedance tomography.

    PubMed

    Khan, Shadab; Manwaring, Preston; Borsic, Andrea; Halter, Ryan

    2015-04-01

Electrical impedance tomography (EIT) is used to image the electrical property distribution of a tissue under test. An EIT system comprises complex hardware and software modules, which are typically designed for a specific application. Upgrading these modules is a time-consuming process, and requires rigorous testing to ensure proper functioning of new modules with the existing ones. To this end, we developed a modular and reconfigurable data acquisition (DAQ) system using National Instruments' (NI) hardware and software modules, which offer inherent compatibility over generations of hardware and software revisions. The system can be configured to use up to 32 channels. This EIT system can be used to interchangeably apply current or voltage signal, and measure the tissue response in a semi-parallel fashion. A novel signal averaging algorithm, and a 512-point fast Fourier transform (FFT) computation block, was implemented on the FPGA. FFT output bins were classified as signal or noise. Signal bins constitute a tissue's response to a pure or mixed tone signal. Signal bins' data can be used for traditional applications, as well as synchronous frequency-difference imaging. Noise bins were used to compute noise power on the FPGA. Noise power represents a metric of signal quality, and can be used to ensure proper tissue-electrode contact. Allocation of these computationally expensive tasks to the FPGA reduced the required bandwidth between the PC and the FPGA for high frame rate EIT. In 16-channel configuration, with a signal-averaging factor of 8, the DAQ frame rate at 100 kHz exceeded 110 frames s^-1, and the signal-to-noise ratio exceeded 90 dB across the spectrum. Reciprocity error was found to be for frequencies up to 1 MHz. Static imaging experiments were performed on a high-conductivity inclusion placed in a saline filled tank; the inclusion was clearly localized in the reconstructions obtained for both absolute current and voltage mode data.
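The signal/noise bin classification described above can be sketched as follows; the abstract does not give the exact decision rule, so the 60 dB dynamic-range threshold used here is an assumption, and the tone is placed on an exact FFT bin for clarity:

```python
import numpy as np

def classify_bins(samples, fs, dynamic_range_db=60.0):
    """512-point FFT; bins within `dynamic_range_db` of the strongest
    bin count as signal, the rest as noise (assumed rule). Returns the
    signal-bin frequencies and the mean noise power."""
    spec = np.abs(np.fft.rfft(samples, n=512))
    signal_mask = spec > spec.max() * 10.0 ** (-dynamic_range_db / 20.0)
    noise_power = float(np.mean(spec[~signal_mask] ** 2))
    freqs = np.fft.rfftfreq(512, d=1.0 / fs)
    return freqs[signal_mask], noise_power

fs = 100e3                                  # 100 kHz sampling, as in the abstract
n = np.arange(512)
tone = np.sin(2 * np.pi * 50 * n / 512)     # stimulus on exact bin 50: 9765.625 Hz
sig_freqs, noise_power = classify_bins(tone, fs)
print(sig_freqs)
```

With the stimulus tone on an exact bin, only that bin survives the mask; the residual noise power then tracks electrode-contact quality, the signal-quality metric the abstract describes.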

  5. Measuring Fast Calcium Fluxes in Cardiomyocytes

    PubMed Central

    Golebiewska, Urszula; Scarlata, Suzanne

    2011-01-01

    Cardiomyocytes have multiple Ca2+ fluxes of varying duration that work together to optimize function 1,2. Changes in Ca2+ activity in response to extracellular agents are predominantly regulated by the phospholipase Cβ-Gαq pathway localized on the plasma membrane, which is stimulated by agents such as acetylcholine 3,4. We have recently found that plasma membrane protein domains called caveolae 5,6 can entrap activated Gαq 7. This entrapment stabilizes the activated state of Gαq, resulting in prolonged Ca2+ signals in cardiomyocytes and other cell types 8. We uncovered this surprising result by measuring dynamic calcium responses on a fast time scale in living cardiomyocytes. Briefly, cells are loaded with a fluorescent Ca2+ indicator; in our studies, we used Ca2+ Green (Invitrogen, Inc.), which exhibits an increase in fluorescence emission intensity upon binding of calcium ions. The fluorescence intensity is then recorded using the line-scan mode of a laser scanning confocal microscope. This method allows rapid acquisition of the time course of fluorescence intensity in pixels along a selected line, producing several hundred time traces on the microsecond time scale. These very fast traces are transferred into Excel and then into SigmaPlot for analysis, and are compared to traces obtained for electronic noise, free dye, and other controls. To dissect Ca2+ responses of different flux rates, we performed a histogram analysis that binned pixel intensities with time. Binning allows us to group over 500 scan traces and visualize the compiled results spatially and temporally on a single plot. Thus, the slow Ca2+ waves that are difficult to discern when the scans are overlaid, due to different peak placement and noise, can be readily seen in the binned histograms.
Very fast fluxes on the time scale of the measurement show a narrow distribution of intensities in the very short time bins, whereas longer Ca2+ waves show binned data with a broad distribution over longer time bins. These different time distributions allow us to dissect the timing of Ca2+ fluxes in the cells and to determine their impact on various cellular events. PMID:22143396
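The binning analysis described above amounts to compiling many line-scan traces into a single intensity-versus-time histogram. A minimal NumPy sketch follows; the array shapes, sampling interval, and bin widths are illustrative assumptions, not the authors' exact protocol.

```python
import numpy as np

def binned_intensity_histogram(traces, dt, time_bin, intensity_bins=16):
    """Group many fluorescence time traces into a 2-D histogram of
    intensity versus binned time, so that slow and fast fluxes separate
    by the width of their intensity distributions in each time bin."""
    n_traces, n_samples = traces.shape
    times = np.tile(np.arange(n_samples) * dt, n_traces)
    values = traces.ravel()
    t_edges = np.arange(0.0, n_samples * dt + time_bin, time_bin)
    i_edges = np.linspace(values.min(), values.max(), intensity_bins + 1)
    hist, _, _ = np.histogram2d(times, values, bins=[t_edges, i_edges])
    return hist   # rows: time bins, columns: intensity bins

# Example: 500 noisy traces of 64 samples each, compiled into one histogram
rng = np.random.default_rng(0)
traces = rng.normal(size=(500, 64))
hist = binned_intensity_histogram(traces, dt=1e-3, time_bin=8e-3)
```

A fast, brief flux would concentrate its counts in a narrow band of intensity columns within a few early time rows, while a slow wave would spread counts broadly across rows, matching the distinction drawn in the abstract.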

  6. Measuring fast calcium fluxes in cardiomyocytes.

    PubMed

    Golebiewska, Urszula; Scarlata, Suzanne

    2011-11-29

    Cardiomyocytes have multiple Ca(2+) fluxes of varying duration that work together to optimize function (1,2). Changes in Ca(2+) activity in response to extracellular agents are predominantly regulated by the phospholipase Cβ-Gα(q) pathway localized on the plasma membrane, which is stimulated by agents such as acetylcholine (3,4). We have recently found that plasma membrane protein domains called caveolae (5,6) can entrap activated Gα(q) (7). This entrapment stabilizes the activated state of Gα(q), resulting in prolonged Ca(2+) signals in cardiomyocytes and other cell types (8). We uncovered this surprising result by measuring dynamic calcium responses on a fast time scale in living cardiomyocytes. Briefly, cells are loaded with a fluorescent Ca(2+) indicator; in our studies, we used Ca(2+) Green (Invitrogen, Inc.), which exhibits an increase in fluorescence emission intensity upon binding of calcium ions. The fluorescence intensity is then recorded using the line-scan mode of a laser scanning confocal microscope. This method allows rapid acquisition of the time course of fluorescence intensity in pixels along a selected line, producing several hundred time traces on the microsecond time scale. These very fast traces are transferred into Excel and then into SigmaPlot for analysis, and are compared to traces obtained for electronic noise, free dye, and other controls. To dissect Ca(2+) responses of different flux rates, we performed a histogram analysis that binned pixel intensities with time. Binning allows us to group over 500 scan traces and visualize the compiled results spatially and temporally on a single plot. Thus, the slow Ca(2+) waves that are difficult to discern when the scans are overlaid, due to different peak placement and noise, can be readily seen in the binned histograms.
Very fast fluxes on the time scale of the measurement show a narrow distribution of intensities in the very short time bins, whereas longer Ca(2+) waves show binned data with a broad distribution over longer time bins. These different time distributions allow us to dissect the timing of Ca(2+) fluxes in the cells and to determine their impact on various cellular events.

  7. The fate of the recombinant DNA in corn during composting.

    PubMed

    Guan, Jiewen; Spencer, J Lloyd; Ma, Bao-Luo

    2005-01-01

    In order to make regulations that safeguard food and the environment, an understanding of the fate of transgenes from genetically modified (GM) plants is of crucial importance. A compost experiment including mature transgenic corn plants and seeds of event Bt 176 (Zea mays L.) was conducted to trace the fate of the transgene cryIA(b) during the period of composting. In bin 1, shredded corn plants including seeds were composted above a layer of cow manure, and samples from the corn layer were collected at intervals during a 12-month period. The samples were tested for transgene persistence and microbial counts, and the compost was monitored for temperature. In bin 2, piles of corn seeds, surrounded by sheep manure and straw, were composted for 12 months. A method combining nested polymerase chain reaction (PCR) and Southern hybridization was developed for detection of the transgene in compost. The detection sensitivity was 200 copies of the transgene per gram of dry composted corn material. Composting commenced on day 0, and the transgene was detected in specimens from bin 1 on days 0 and 7 but not on day 14 or thereafter. The transgene in corn seeds was not detectable after 12 months of composting in bin 2. Temperatures in both bins rose to about 50 degrees C within 2 weeks and remained above that temperature for about 3 months, even when the ambient temperature dropped below -20 degrees C. Extracts from compost were inoculated onto culture plates and incubated at 23 to 55 degrees C. Within the first 2 weeks of composting in bin 1, the counts of bacteria incubated at 55 degrees C increased from 3.5 to 7.5 log10, whereas those incubated at 23 degrees C remained at about 7.5 log10. The counts of fungi incubated at 45 degrees C increased slightly from 2.5 to 3.1 log10, but those incubated at 23 degrees C decreased from 6.3 to 3.0 log10.
The rapid degradation of the transgene during composting of Bt corn plants suggested that the composting process could be used for safe disposal of transgenic plant wastes.

  8. Binning in Gaussian Kernel Regularization

    DTIC Science & Technology

    2005-04-01

    ...(71.40%) on 966 randomly sampled data. Using the OSU-SVM Matlab package, the SVM trained on 966 bins has a comparable test classification rate as the SVM trained on 27,179 samples, and reduces the...

  9. General Khalid Bin Waleed: Understanding the 7th Century Campaign against Sassanid Persian Empire from the Perspective of Operational Art

    DTIC Science & Technology

    2012-12-06

    ABSTRACT This monograph investigates Khalid Bin Waleed's seventh century (AD 633-634) campaign against the Sassanid Persian Empire in Mesopotamia to trace the evidence that substantiates... Bin Waleed employed characteristics and elements of operational art to defeat the Persian forces in Mesopotamia. He established operational objectives

  10. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals, or bins, than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.
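For reference, two textbook rules of thumb (standard statistical results, not taken from this paper) give the order of magnitude that "statistical theory" suggests for the number of class intervals:

```python
import math

def sturges_bins(n):
    """Sturges' rule: k = ceil(log2 n) + 1 class intervals for n observations."""
    return math.ceil(math.log2(n)) + 1

def scott_bin_width(sigma, n):
    """Scott's rule: h = 3.49 * sigma * n**(-1/3), the asymptotically
    optimal bin width for roughly Gaussian data."""
    return 3.49 * sigma * n ** (-1.0 / 3.0)

# A sample of 1000 redshifts calls for on the order of ten bins,
# far fewer than many published histograms use.
k = sturges_bins(1000)
h = scott_bin_width(1.0, 1000)
```

Using hundreds of bins on a sample of this size subdivides the data until random fluctuations dominate each bin, which is exactly the mechanism behind the spurious patterns the authors describe.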

  11. SeaWiFS technical report series. Volume 32: Level-3 SeaWiFS data products. Spatial and temporal binning algorithms

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Campbell, Janet W.; Blaisdell, John M.; Darzi, Michael

    1995-01-01

    The level-3 data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) are statistical data sets derived from level-2 data. Each data set will be based on a fixed global grid of equal-area bins that are approximately 9 x 9 sq km. Statistics available for each bin include the sum and sum of squares of the natural logarithm of derived level-2 geophysical variables where sums are accumulated over a binning period. Operationally, products with binning periods of 1 day, 8 days, 1 month, and 1 year will be produced and archived. From these accumulated values and for each bin, estimates of the mean, standard deviation, median, and mode may be derived for each geophysical variable. This report contains two major parts: the first (Section 2) is intended as a users' guide for level-3 SeaWiFS data products. It contains an overview of level-0 to level-3 data processing, a discussion of important statistical considerations when using level-3 data, and details of how to use the level-3 data. The second part (Section 3) presents a comparative statistical study of several binning algorithms based on CZCS and moored fluorometer data. The operational binning algorithms were selected based on the results of this study.
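The accumulate-then-derive bookkeeping described in the report can be sketched as follows. This is a schematic of sum / sum-of-squares binning of a log-transformed variable only; the class and method names are illustrative, and the operational products also carry additional fields (e.g. weights) that are omitted here.

```python
import math

class BinAccumulator:
    """Accumulate the sum and sum of squares of ln(x) for one spatial bin,
    then derive the mean and standard deviation of the log variable."""
    def __init__(self):
        self.n = 0
        self.s = 0.0   # running sum of ln(x)
        self.ss = 0.0  # running sum of ln(x)**2

    def add(self, value):
        lx = math.log(value)
        self.n += 1
        self.s += lx
        self.ss += lx * lx

    def mean_log(self):
        return self.s / self.n

    def std_log(self):
        m = self.mean_log()
        return math.sqrt(max(self.ss / self.n - m * m, 0.0))

# Example: two samples whose logs are 1 and 3
acc = BinAccumulator()
for value in (math.e, math.e ** 3):
    acc.add(value)
```

Because only n, the sum, and the sum of squares are stored per bin, accumulations over different binning periods (day, 8-day, month, year) can be merged by simple addition of the stored fields.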

  12. Parameterizing correlations between hydrometeor species in mixed-phase Arctic clouds

    NASA Astrophysics Data System (ADS)

    Larson, Vincent E.; Nielsen, Brandon J.; Fan, Jiwen; Ovchinnikov, Mikhail

    2011-01-01

    Mixed-phase Arctic clouds, like other clouds, contain small-scale variability in hydrometeor fields, such as cloud water or snow mixing ratio. This variability may be worth parameterizing in coarse-resolution numerical models. In particular, for modeling multispecies processes such as accretion and aggregation, it would be useful to parameterize subgrid correlations among hydrometeor species. However, one difficulty is that there exist many hydrometeor species and many microphysical processes, leading to complexity and computational expense. Existing lower and upper bounds on linear correlation coefficients are too loose to serve directly as a method to predict subgrid correlations. Therefore, this paper proposes an alternative method that begins with the spherical parameterization framework of Pinheiro and Bates (1996), which expresses the correlation matrix in terms of its Cholesky factorization. The values of the elements of the Cholesky matrix are populated here using a "cSigma" parameterization that we introduce based on the aforementioned bounds on correlations. The method has three advantages: (1) the computational expense is tolerable; (2) the correlations are, by construction, guaranteed to be consistent with each other; and (3) the methodology is fairly general and hence may be applicable to other problems. The method is tested noninteractively using simulations of three Arctic mixed-phase cloud cases from two field experiments: the Indirect and Semi-Direct Aerosol Campaign and the Mixed-Phase Arctic Cloud Experiment. Benchmark simulations are performed using a large-eddy simulation (LES) model that includes a bin microphysical scheme. The correlations estimated by the new method satisfactorily approximate the correlations produced by the LES.
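The spherical (Cholesky-based) parameterization of Pinheiro and Bates (1996) can be illustrated concretely: any vector of angles yields a symmetric matrix with unit diagonal and non-negative eigenvalues, which is what guarantees mutually consistent correlations by construction. This is a minimal sketch of the framework only; how the paper's "cSigma" scheme populates the angles is not reproduced here.

```python
import numpy as np

def correlation_from_angles(theta, n):
    """Build an n x n correlation matrix C = L @ L.T from n*(n-1)/2
    angles parameterizing the rows of the Cholesky factor L.
    Each row of L has unit norm, so C has a unit diagonal for any angles."""
    L = np.zeros((n, n))
    L[0, 0] = 1.0
    k = 0
    for i in range(1, n):
        sin_prod = 1.0
        for j in range(i):
            L[i, j] = np.cos(theta[k]) * sin_prod
            sin_prod *= np.sin(theta[k])
            k += 1
        L[i, i] = sin_prod
    return L @ L.T

# Example: three hydrometeor species -> three angles
C = correlation_from_angles([0.5, 1.0, 1.2], n=3)
```

Optimizing or prescribing the angles is therefore unconstrained, which sidesteps the difficulty of keeping many pairwise correlations jointly consistent.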

  13. A histogram-free multicanonical Monte Carlo algorithm for the construction of analytical density of states

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eisenbach, Markus; Li, Ying Wai

    We report a new multicanonical Monte Carlo (MC) algorithm to obtain the density of states (DOS) for physical systems with continuous state variables in statistical mechanics. Our algorithm is able to obtain an analytical form for the DOS expressed in a chosen basis set, instead of a numerical array of finite resolution as in previous variants of this class of MC methods such as the multicanonical (MUCA) sampling and Wang-Landau (WL) sampling. This is enabled by storing the visited states directly in a data set and avoiding the explicit collection of a histogram. This practice also has the advantage of avoiding undesirable artificial errors caused by the discretization and binning of continuous state variables. Our results show that this scheme is capable of obtaining converged results with a much reduced number of Monte Carlo steps, leading to a significant speedup over existing algorithms.

  14. Spatial clustering of dark matter haloes: secondary bias, neighbour bias, and the influence of massive neighbours on halo properties

    DOE PAGES

    Salcedo, Andres N.; Maller, Ariyeh H.; Berlind, Andreas A.; ...

    2018-01-15

    Here, we explore the phenomenon commonly known as halo assembly bias, whereby dark matter haloes of the same mass are found to be more or less clustered when a second halo property is considered, for haloes in the mass range 3.7 × 10¹¹–5.0 × 10¹³ h⁻¹ M⊙. Using the Large Suite of Dark Matter Simulations (LasDamas) we consider nine commonly used halo properties and find that a clustering bias exists if haloes are binned by mass or by any other halo property. This secondary bias implies that no single halo property encompasses all the spatial clustering information of the halo population. The mean values of some halo properties depend on their halo's distance to a more massive neighbour. Halo samples selected by having high values of one of these properties therefore inherit a neighbour bias such that they are much more likely to be close to a much more massive neighbour. This neighbour bias largely accounts for the secondary bias seen in haloes binned by mass and split by concentration or age. However, haloes binned by other mass-like properties still show a secondary bias even when the neighbour bias is removed. The secondary bias of haloes selected by their spin behaves differently than that for other halo properties, suggesting that the origin of the spin bias is different from that of other secondary biases.

  15. Spatial clustering of dark matter haloes: secondary bias, neighbour bias, and the influence of massive neighbours on halo properties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salcedo, Andres N.; Maller, Ariyeh H.; Berlind, Andreas A.

    Here, we explore the phenomenon commonly known as halo assembly bias, whereby dark matter haloes of the same mass are found to be more or less clustered when a second halo property is considered, for haloes in the mass range 3.7 × 10¹¹–5.0 × 10¹³ h⁻¹ M⊙. Using the Large Suite of Dark Matter Simulations (LasDamas) we consider nine commonly used halo properties and find that a clustering bias exists if haloes are binned by mass or by any other halo property. This secondary bias implies that no single halo property encompasses all the spatial clustering information of the halo population. The mean values of some halo properties depend on their halo's distance to a more massive neighbour. Halo samples selected by having high values of one of these properties therefore inherit a neighbour bias such that they are much more likely to be close to a much more massive neighbour. This neighbour bias largely accounts for the secondary bias seen in haloes binned by mass and split by concentration or age. However, haloes binned by other mass-like properties still show a secondary bias even when the neighbour bias is removed. The secondary bias of haloes selected by their spin behaves differently than that for other halo properties, suggesting that the origin of the spin bias is different from that of other secondary biases.

  16. De novo phasing with X-ray laser reveals mosquito larvicide BinAB structure [A potent binary mosquito larvicide revealed by de novo phasing with an X-ray free-electron laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletier, Jacques -Philippe; Sawaya, Michael R.; Gingery, Mari

    BinAB is a naturally occurring paracrystalline larvicide distributed worldwide to combat the devastating diseases borne by mosquitoes. These crystals are composed of homologous molecules, BinA and BinB, which play distinct roles in the multi-step intoxication process, transforming from harmless, robust crystals, to soluble protoxin heterodimers, to internalized mature toxin, and finally to toxic oligomeric pores. The small size of the crystals—50 unit cells per edge, on average—has impeded structural characterization by conventional means. Here we report the structure of Lysinibacillus sphaericus BinAB solved de novo by serial-femtosecond crystallography at an X-ray free-electron laser. The structure reveals tyrosine- and carboxylate-mediated contacts acting as pH switches to release soluble protoxin in the alkaline larval midgut. An enormous heterodimeric interface appears to be responsible for anchoring BinA to receptor-bound BinB for co-internalization. Furthermore, this interface is largely composed of propeptides, suggesting that proteolytic maturation would trigger dissociation of the heterodimer and progression to pore formation.

  17. De novo phasing with X-ray laser reveals mosquito larvicide BinAB structure [A potent binary mosquito larvicide revealed by de novo phasing with an X-ray free-electron laser

    DOE PAGES

    Colletier, Jacques -Philippe; Sawaya, Michael R.; Gingery, Mari; ...

    2016-09-28

    BinAB is a naturally occurring paracrystalline larvicide distributed worldwide to combat the devastating diseases borne by mosquitoes. These crystals are composed of homologous molecules, BinA and BinB, which play distinct roles in the multi-step intoxication process, transforming from harmless, robust crystals, to soluble protoxin heterodimers, to internalized mature toxin, and finally to toxic oligomeric pores. The small size of the crystals—50 unit cells per edge, on average—has impeded structural characterization by conventional means. Here we report the structure of Lysinibacillus sphaericus BinAB solved de novo by serial-femtosecond crystallography at an X-ray free-electron laser. The structure reveals tyrosine- and carboxylate-mediated contacts acting as pH switches to release soluble protoxin in the alkaline larval midgut. An enormous heterodimeric interface appears to be responsible for anchoring BinA to receptor-bound BinB for co-internalization. Furthermore, this interface is largely composed of propeptides, suggesting that proteolytic maturation would trigger dissociation of the heterodimer and progression to pore formation.

  18. On the link between column density distribution and density scaling relation in star formation regions

    NASA Astrophysics Data System (ADS)

    Veltchev, Todor; Donkov, Sava; Stanchev, Orlin

    2017-07-01

    We present a method to derive the density scaling relation ρ ∝ L^{-α} in regions of star formation or in their turbulent vicinities from straightforward binning of the column-density distribution (N-pdf). The outcome of the method is studied for three types of N-pdf: power law (7/5 ≤ α ≤ 5/3), lognormal (0.7 ≲ α ≲ 1.4), and a combination of lognormals. In the last case, the method of Stanchev et al. (2015) was also applied for comparison, and a very weak (or close to zero) correlation was found. We conclude that the considered `binning approach' reflects rather the local morphology of the N-pdf, with no reference to the physical conditions in a considered region. The rough consistency of the derived slopes with the widely adopted Larson's (1981) value α ≈ 1.1 is suggested to support claims that the density-size relation in molecular clouds is indeed an artifact of the observed N-pdf.
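The "straightforward binning" step can be emulated generically: bin a column-density sample in logarithmic intervals and read a slope off the log-log histogram. This is an illustrative stand-in under assumed synthetic inputs, not the authors' actual prescription for deriving α.

```python
import numpy as np

def powerlaw_slope_from_binning(N, bins=20):
    """Estimate the slope of a column-density pdf by logarithmic
    binning followed by a log-log linear fit."""
    edges = np.logspace(np.log10(N.min()), np.log10(N.max()), bins + 1)
    hist, _ = np.histogram(N, bins=edges, density=True)
    centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centers
    mask = hist > 0
    slope, _ = np.polyfit(np.log10(centers[mask]), np.log10(hist[mask]), 1)
    return slope

# Synthetic sample with pdf proportional to N**(-2) on [1, 100],
# drawn by inverse-CDF sampling
rng = np.random.default_rng(1)
u = rng.random(100_000)
N = 1.0 / (1.0 - 0.99 * u)
slope = powerlaw_slope_from_binning(N)
```

As the abstract cautions, a slope extracted this way reflects the shape of the distribution being binned, not the physical conditions that produced it.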

  19. Clustering redshift distributions for the Dark Energy Survey

    NASA Astrophysics Data System (ADS)

    Helsby, Jennifer

    Accurate determination of photometric redshifts and their errors is critical for large scale structure and weak lensing studies for constraining cosmology from deep, wide imaging surveys. Current photometric redshift methods suffer from bias and scatter due to incomplete training sets. Exploiting the clustering between a sample of galaxies for which we have spectroscopic redshifts and a sample of galaxies for which the redshifts are unknown can allow us to reconstruct the true redshift distribution of the unknown sample. Here we use this method in both simulations and early data from the Dark Energy Survey (DES) to determine the true redshift distributions of galaxies in photometric redshift bins. We find that cross-correlating with the spectroscopic samples currently used for training provides a useful test of photometric redshifts and provides reliable estimates of the true redshift distribution in a photometric redshift bin. We discuss the use of the cross-correlation method in validating template- or learning-based approaches to redshift estimation and its future use in Stage IV surveys.

  20. 37. VIEW NORTH FROM EAST CRUDE ORE BIN TO CRUSHER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. VIEW NORTH FROM EAST CRUDE ORE BIN TO CRUSHER ADDITION AND CRUSHED OXIDIZED ORE BIN. VISIBLE ARE DINGS MAGNETIC PULLEY (CENTER), THE 100-TON STEEL CRUSHED UNOXIDIZED ORE BIN, AND UPPER PORTION OF THE STEPHENS-ADAMSON 25 TON/HR BUCKET ELEVATOR. THE UPPER TAILINGS POND LIES BEYOND THE MILL WITH THE UPPER TAILINGS DAM UNDER THE GRAVEL ROAD IN THE UPPER RIGHT CORNER. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  1. The cyclic-di-GMP phosphodiesterase BinA negatively regulates cellulose-containing biofilms in Vibrio fischeri.

    PubMed

    Bassis, Christine M; Visick, Karen L

    2010-03-01

    Bacteria produce different types of biofilms under distinct environmental conditions. Vibrio fischeri has the capacity to produce at least two distinct types of biofilms, one that relies on the symbiosis polysaccharide Syp and another that depends upon cellulose. A key regulator of biofilm formation in bacteria is the intracellular signaling molecule cyclic diguanylate (c-di-GMP). In this study, we focused on a predicted c-di-GMP phosphodiesterase encoded by the gene binA, located directly downstream of syp, a cluster of 18 genes critical for biofilm formation and the initiation of symbiotic colonization of the squid Euprymna scolopes. Disruption or deletion of binA increased biofilm formation in culture and led to increased binding of Congo red and calcofluor, which are indicators of cellulose production. Using random transposon mutagenesis, we determined that the phenotypes of the ΔbinA mutant strain could be disrupted by insertions in genes in the bacterial cellulose biosynthesis cluster (bcs), suggesting that cellulose production is negatively regulated by BinA. Replacement of critical amino acids within the conserved EAL residues of the EAL domain disrupted BinA activity, and deletion of binA increased c-di-GMP levels in the cell. Together, these data support the hypotheses that BinA functions as a phosphodiesterase and that c-di-GMP activates cellulose biosynthesis. Finally, overexpression of the syp regulator sypG induced binA expression. Thus, this work reveals a mechanism by which V. fischeri inhibits cellulose-dependent biofilm formation and suggests that the production of two different polysaccharides may be coordinated through the action of the cellulose inhibitor BinA.

  2. Critical Assessment of Metagenome Interpretation – a benchmark of computational metagenomics software

    PubMed Central

    Sczyrba, Alexander; Hofmann, Peter; Belmann, Peter; Koslicki, David; Janssen, Stefan; Dröge, Johannes; Gregor, Ivan; Majda, Stephan; Fiedler, Jessika; Dahms, Eik; Bremges, Andreas; Fritz, Adrian; Garrido-Oter, Ruben; Jørgensen, Tue Sparholt; Shapiro, Nicole; Blood, Philip D.; Gurevich, Alexey; Bai, Yang; Turaev, Dmitrij; DeMaere, Matthew Z.; Chikhi, Rayan; Nagarajan, Niranjan; Quince, Christopher; Meyer, Fernando; Balvočiūtė, Monika; Hansen, Lars Hestbjerg; Sørensen, Søren J.; Chia, Burton K. H.; Denis, Bertrand; Froula, Jeff L.; Wang, Zhong; Egan, Robert; Kang, Dongwan Don; Cook, Jeffrey J.; Deltel, Charles; Beckstette, Michael; Lemaitre, Claire; Peterlongo, Pierre; Rizk, Guillaume; Lavenier, Dominique; Wu, Yu-Wei; Singer, Steven W.; Jain, Chirag; Strous, Marc; Klingenberg, Heiner; Meinicke, Peter; Barton, Michael; Lingner, Thomas; Lin, Hsin-Hung; Liao, Yu-Chieh; Silva, Genivaldo Gueiros Z.; Cuevas, Daniel A.; Edwards, Robert A.; Saha, Surya; Piro, Vitor C.; Renard, Bernhard Y.; Pop, Mihai; Klenk, Hans-Peter; Göker, Markus; Kyrpides, Nikos C.; Woyke, Tanja; Vorholt, Julia A.; Schulze-Lefert, Paul; Rubin, Edward M.; Darling, Aaron E.; Rattei, Thomas; McHardy, Alice C.

    2018-01-01

    In metagenome analysis, computational methods for assembly, taxonomic profiling and binning are key components facilitating downstream biological data interpretation. However, a lack of consensus about benchmarking datasets and evaluation metrics complicates proper performance assessment. The Critical Assessment of Metagenome Interpretation (CAMI) challenge has engaged the global developer community to benchmark their programs on datasets of unprecedented complexity and realism. Benchmark metagenomes were generated from ~700 newly sequenced microorganisms and ~600 novel viruses and plasmids, including genomes with varying degrees of relatedness to each other and to publicly available ones and representing common experimental setups. Across all datasets, assembly and genome binning programs performed well for species represented by individual genomes, while performance was substantially affected by the presence of related strains. Taxonomic profiling and binning programs were proficient at high taxonomic ranks, with a notable performance decrease below the family level. Parameter settings substantially impacted performances, underscoring the importance of program reproducibility. While highlighting current challenges in computational metagenomics, the CAMI results provide a roadmap for software selection to answer specific research questions. PMID:28967888

  3. DETAIL VIEW OF LOWER TRAM TERMINAL, SECONDARY ORE BIN, CRUSHER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF LOWER TRAM TERMINAL, SECONDARY ORE BIN, CRUSHER FOUNDATION, AND BALL MILL FOUNDATIONS, LOOKING NORTH NORTHWEST. ORE FROM THE MINES WAS DUMPED FROM THE TRAM BUCKETS INTO THE PRIMARY ORE BIN UNDER THE TRAM TERMINAL. A SLIDING CONTROL DOOR INTRODUCED THE ORE INTO THE JAW CRUSHER (FOUNDATIONS, CENTER). THE CRUSHED ORE WAS THEN CONVEYED INTO THE SECONDARY ORE BIN AT CENTER LEFT. A HOLE IN THE FLOOR OF THE ORE BIN PASSED ORE ONTO ANOTHER CONVEYOR THAT BROUGHT IT OUT TO THE BALL MILL (FOUNDATIONS, CENTER BOTTOM). THIS SYSTEM IS MOST LIKELY NOT THE ORIGINAL SETUP, PROBABLY INSTALLED DURING THE MINE'S LAST OCCUPATION IN THE EARLY 1940s. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  4. 25. DETAIL OF STRUCTURAL TIMBERS, ORE BIN, AND STAIRWAY TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. DETAIL OF STRUCTURAL TIMBERS, ORE BIN, AND STAIRWAY TO TOP FLOOR OF MILL, LOOKING SOUTH FROM SECOND FLOOR OF MILL. PORTION OF ORE BIN ON RIGHT, STAIRS ON LEFT. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  5. 14. Interior view, grain tanks (bins). Barrel view of tunnel ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. Interior view, grain tanks (bins). Barrel view of tunnel for load-out belt conveyor system located below tanks. Square, numbered spouts gravity-feed grain from overhead bins onto belt. - Saint Anthony Elevator No. 3, 620 Malcom Avenue, Southeast, Minneapolis, Hennepin County, MN

  6. Spectral CT Reconstruction with Image Sparsity and Spectral Mean

    PubMed Central

    Zhang, Yi; Xi, Yan; Yang, Qingsong; Cong, Wenxiang; Zhou, Jiliu

    2017-01-01

    Photon-counting detectors can acquire x-ray intensity data in different energy bins. The signal-to-noise ratio of the resultant raw data in each energy bin is generally low due to the narrow bin width and quantum noise. To address this problem, here we propose an image reconstruction approach for spectral CT that simultaneously reconstructs the x-ray attenuation coefficients in all the energy bins. Because the measured spectral data are highly correlated among the x-ray energy bins, the intra-image sparsity and inter-image similarity are important prior knowledge for image reconstruction. Inspired by this observation, the total variation (TV) and spectral mean (SM) measures are combined to improve the quality of reconstructed images. For this purpose, a linear mapping function is used to minimize image differences between energy bins. The split Bregman technique is applied to perform image reconstruction. Our numerical and experimental results show that the proposed algorithms outperform competing iterative algorithms in this context. PMID:29034267
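The combined penalty can be written down compactly. Below is an illustrative evaluation of a TV-plus-spectral-mean objective for a stack of energy-bin images; the weight `lam` is a made-up parameter, and the paper's linear mapping function and split Bregman solver are deliberately omitted, so this is a sketch of the prior terms, not the reconstruction algorithm.

```python
import numpy as np

def tv_sm_objective(imgs, lam):
    """Anisotropic total variation summed over energy-bin images, plus a
    spectral-mean term penalizing deviation of each bin image from the
    across-bin mean image."""
    imgs = np.asarray(imgs, dtype=float)
    tv = sum(np.abs(np.diff(im, axis=0)).sum() + np.abs(np.diff(im, axis=1)).sum()
             for im in imgs)
    sm = ((imgs - imgs.mean(axis=0)) ** 2).sum()
    return tv + lam * sm

flat = np.ones((3, 8, 8))   # identical flat images in 3 energy bins
noisy = flat.copy()
noisy[0, 4, 4] += 1.0       # a single outlier pixel in one bin
```

A stack of identical smooth images incurs zero penalty, while bin-to-bin disagreement or in-image roughness raises the objective, which is the behavior the combined prior is meant to encode.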

  7. Exenatide once weekly versus daily basal insulin as add-on treatment to metformin with or without a sulfonylurea: a retrospective pooled analysis in patients with poor glycemic control.

    PubMed

    Grimm, Michael; Li, Yan; Brunell, Steven C; Blase, Erich

    2013-09-01

    Basal insulin (b-INS) is typically the add-on treatment of choice for patients with poor glycemic control (ie, glycated hemoglobin [HbA1c] level ≥ 8.5%), but it is unclear whether b-INS is the best option. In this post hoc analysis, the efficacy and tolerability of exenatide once weekly (EQW) were compared with those of b-INS in patients with type 2 diabetes mellitus and a baseline HbA1c level ≥ 8.5% who were undergoing treatment with metformin ± a sulfonylurea. Data were pooled from two 26-week, randomized, controlled trials (EQW vs insulin glargine and EQW vs insulin detemir [EQW, N = 137; b-INS, N = 126]). Treatment with either EQW or b-INS for 26 weeks was associated with significant improvements in HbA1c level compared with baseline, although patients treated with EQW experienced a significantly greater decrease in HbA1c level than those treated with b-INS (least squares [LS] mean ± SE: -2.0% ± 0.08% vs -1.6% ± 0.08%; P = 0.0008). Treatment with EQW was associated with a weight loss of 2.4 kg ± 0.23 kg (LS mean ± SE), whereas treatment with b-INS was associated with a weight gain of 2.0 kg ± 0.24 kg (LS mean difference between groups, -4.4 kg ± 0.33; P < 0.0001). Patients in the EQW group were significantly more likely to achieve the composite endpoint of an HbA1c level < 7.0%, no weight gain, and no hypoglycemic events (defined as a blood glucose level < 54 mg/dL requiring self-treatment or assistance to resolve) than patients in the b-INS group (33.6% vs 3.2%; P < 0.0001). The exposure-adjusted hypoglycemic event rates were 0.08 and 0.37 events per patient-year in the EQW and b-INS groups, respectively. Gastrointestinal adverse events occurred at a higher rate in patients who underwent EQW treatment than those who were treated with b-INS.
These results show that EQW treatment was associated with significantly greater improvement in HbA1c level compared with b-INS treatment among patients with poor glycemic control, with the added benefits of weight loss (vs weight gain with b-INS therapy) and a lower incidence of hypoglycemic events. These results suggest that EQW is an alternative treatment to b-INS for patients with type 2 diabetes mellitus and a baseline HbA1c level ≥ 8.5%.

  8. Dynamic Histogram Analysis To Determine Free Energies and Rates from Biased Simulations.

    PubMed

    Stelzl, Lukas S; Kells, Adam; Rosta, Edina; Hummer, Gerhard

    2017-12-12

    We present an algorithm to calculate free energies and rates from molecular simulations on biased potential energy surfaces. As input, it uses the accumulated times spent in each state or bin of a histogram and counts of transitions between them. Optimal unbiased equilibrium free energies for each of the states/bins are then obtained by maximizing the likelihood of a master equation (i.e., first-order kinetic rate model). The resulting free energies also determine the optimal rate coefficients for transitions between the states or bins on the biased potentials. Unbiased rates can be estimated, e.g., by imposing a linear free energy condition in the likelihood maximization. The resulting "dynamic histogram analysis method extended to detailed balance" (DHAMed) builds on the DHAM method. It is also closely related to the transition-based reweighting analysis method (TRAM) and the discrete TRAM (dTRAM). However, in the continuous-time formulation of DHAMed, the detailed balance constraints are more easily accounted for, resulting in compact expressions amenable to efficient numerical treatment. DHAMed produces accurate free energies in cases where the common weighted-histogram analysis method (WHAM) for umbrella sampling fails because of slow dynamics within the windows. Even in the limit of completely uncorrelated data, where WHAM is optimal in the maximum-likelihood sense, DHAMed results are nearly indistinguishable. We illustrate DHAMed with applications to ion channel conduction, RNA duplex formation, α-helix folding, and rate calculations from accelerated molecular dynamics. DHAMed can also be used to construct Markov state models from biased or replica-exchange molecular dynamics simulations. By using binless WHAM formulated as a numerical minimization problem, the bias factors for the individual states can be determined efficiently in a preprocessing step and, if needed, optimized globally afterward.
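DHAMed itself maximizes a master-equation likelihood over bin occupancies and transition counts. As a much simpler illustration of the underlying idea of recovering unbiased free energies from biased histogram data, the sketch below unbiases a single umbrella window by subtracting the known harmonic bias from the apparent free energy; the function name and signature are illustrative, not DHAMed's API.

```python
import numpy as np

kT = 2.494  # kJ/mol at 300 K

def unbias_window(counts, bin_centers, k_spring, x0, kT=kT):
    """Recover an unbiased free-energy profile from one umbrella window.

    counts      -- histogram counts per bin from the biased simulation
    bin_centers -- coordinate value at each bin center
    k_spring,x0 -- harmonic bias U(x) = 0.5*k*(x-x0)**2
    """
    counts = np.asarray(counts, dtype=float)
    bin_centers = np.asarray(bin_centers, dtype=float)
    mask = counts > 0
    bias = 0.5 * k_spring * (bin_centers - x0) ** 2
    f = np.full_like(counts, np.nan)
    # F_i = -kT ln p_i^biased - U_bias(x_i), up to an additive constant
    f[mask] = -kT * np.log(counts[mask] / counts[mask].sum()) - bias[mask]
    f -= np.nanmin(f)  # anchor the minimum at zero
    return f
```

Combining many such windows consistently, with correlated data and kinetics, is exactly where WHAM, DHAM, TRAM, and DHAMed differ.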

  9. LROC Investigation of Three Strategies for Reducing the Impact of Respiratory Motion on the Detection of Solitary Pulmonary Nodules in SPECT

    NASA Astrophysics Data System (ADS)

    Smyczynski, Mark S.; Gifford, Howard C.; Dey, Joyoni; Lehovich, Andre; McNamara, Joseph E.; Segars, W. Paul; King, Michael A.

    2016-02-01

    The objective of this investigation was to determine the effectiveness of three motion-reducing strategies in diminishing the degrading impact of respiratory motion on the detection of small solitary pulmonary nodules (SPNs) in single-photon emission computed tomographic (SPECT) imaging, in comparison to a standard clinical acquisition and the ideal case of imaging in the absence of respiratory motion. To do this, nonuniform rational B-spline cardiac-torso (NCAT) phantoms based on human-volunteer CT studies were generated spanning the respiratory cycle for a normal background distribution of Tc-99m NeoTect. Similarly, spherical phantoms of 1.0-cm diameter were generated to model small SPN for each of the 150 uniquely located sites within the lungs whose respiratory motion was based on the motion of normal structures in the volunteer CT studies. The SIMIND Monte Carlo program was used to produce SPECT projection data from these. Normal and single-lesion-containing SPECT projection sets with a clinically realistic Poisson noise level were created for the cases of 1) the end-expiration (EE) frame with all counts, 2) respiration-averaged motion with all counts, 3) one fourth of the 32 frames centered around EE (Quarter Binning), 4) one half of the 32 frames centered around EE (Half Binning), and 5) eight temporally binned frames spanning the respiratory cycle. Each of the sets of combined projection data was reconstructed with RBI-EM with system spatial-resolution compensation (RC). Based on the known motion for each of the 150 different lesions, the reconstructed volumes of respiratory bins were shifted so as to superimpose the locations of the SPN onto that in the first bin (Reconstruct and Shift). Five human observers performed localization receiver operating characteristics (LROC) studies of SPN detection.
    The observer results were analyzed for statistically significant differences in SPN detection accuracy among the three correction strategies, the standard acquisition, and the ideal case of the absence of respiratory motion. Our human-observer LROC determined that the Quarter Binning and Half Binning strategies resulted in SPN detection accuracy statistically significantly below that of the standard clinical acquisition, whereas the Reconstruct and Shift strategy resulted in a detection accuracy not statistically significantly different from that of the ideal case. This investigation demonstrates that tumor detection based on acquisitions that use fewer than all of the available counts may result in poorer detection despite limiting the motion of the lesion. The Reconstruct and Shift method results in tumor detection that is equivalent to ideal motion correction.
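The Quarter/Half Binning schemes above keep only the respiratory frames nearest end-expiration. A minimal sketch of that frame selection, assuming 32 cyclic phase frames and a hypothetical `frames_around_ee` helper (the wraparound handling is the only subtlety):

```python
def frames_around_ee(ee_index, n_keep, n_frames=32):
    """Indices of n_keep respiratory frames centered on end-expiration,
    wrapping around the cyclic respiratory phase."""
    start = ee_index - n_keep // 2
    return [(start + i) % n_frames for i in range(n_keep)]

# Quarter Binning keeps 8 of 32 frames; Half Binning keeps 16.
quarter = frames_around_ee(ee_index=0, n_keep=8)
half = frames_around_ee(ee_index=0, n_keep=16)
```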

  10. Weather data for simplified energy calculation methods. Volume II. Middle United States: TRY data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsen, A.R.; Moreno, S.; Deringer, J.

    1984-08-01

    The objective of this report is to provide a source of weather data for direct use with a number of simplified energy calculation methods available today. Complete weather data for a number of cities in the United States are provided for use in the following methods: degree hour, modified degree hour, bin, modified bin, and variable degree day. This report contains sets of weather data for 22 cities in the continental United States using Test Reference Year (TRY) source weather data. The weather data at each city have been summarized in a number of ways to provide differing levels of detail necessary for alternative simplified energy calculation methods. Weather variables summarized include dry bulb and wet bulb temperature, percent relative humidity, humidity ratio, wind speed, percent possible sunshine, percent diffuse solar radiation, total solar radiation on horizontal and vertical surfaces, and solar heat gain through standard DSA glass. Monthly and annual summaries, in some cases by time of day, are available. These summaries are produced in a series of nine computer-generated tables.
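The "bin" method named above groups the year's hours into dry-bulb temperature bins and multiplies each bin's hours by the load at that temperature. A minimal sketch of the heating-only version, assuming a simple linear load model (UA times the deficit below a balance-point temperature); function and parameter names are illustrative:

```python
import numpy as np

def bin_method_heating(dry_bulb_f, ua_btu_per_hr_f, balance_f=65.0, bin_width=5.0):
    """Estimate annual heating energy with the temperature bin method.

    Hours are grouped into dry-bulb bins; the load in each bin is the
    building loss coefficient UA (Btu/hr-F) times the bin's deficit below
    the balance-point temperature, times the hours in that bin.
    """
    t = np.asarray(dry_bulb_f, dtype=float)
    lo, hi = np.floor(t.min()), np.ceil(t.max())
    edges = np.arange(lo, hi + bin_width, bin_width)
    hours, _ = np.histogram(t, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    deficit = np.clip(balance_f - centers, 0.0, None)  # no heating load above balance point
    return float(np.sum(ua_btu_per_hr_f * deficit * hours))  # Btu/yr
```

The "modified bin" method refines this by splitting bins by time of day, which is why the report provides time-of-day summaries.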

  11. Automated segmentation of linear time-frequency representations of marine-mammal sounds.

    PubMed

    Dadouchi, Florian; Gervaise, Cedric; Ioana, Cornel; Huillery, Julien; Mars, Jérôme I

    2013-09-01

    Many marine mammals produce highly nonlinear frequency modulations. Determining the time-frequency support of these sounds offers various applications, which include recognition, localization, and density estimation. This study introduces a low parameterized automated spectrogram segmentation method that is based on a theoretical probabilistic framework. In the first step, the background noise in the spectrogram is fitted with a Chi-squared distribution and thresholded using a Neyman-Pearson approach. In the second step, the number of false detections in time-frequency regions is modeled as a binomial distribution, and then through a Neyman-Pearson strategy, the time-frequency bins are gathered into regions of interest. The proposed method is validated on real data of large sequences of whistles from common dolphins, collected in the Bay of Biscay (France). The proposed method is also compared with two alternative approaches: the first is smoothing and thresholding of the spectrogram; the second is thresholding of the spectrogram followed by the use of morphological operators to gather the time-frequency bins and to remove false positives. This method is shown to increase the probability of detection for the same probability of false alarms.
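The first step above thresholds spectrogram bins whose noise power follows a Chi-squared distribution. A minimal sketch of that Neyman-Pearson thresholding, simplified to the exponential case (Chi-squared with 2 degrees of freedom) with a robust median-based noise estimate; this is an illustration of the idea, not the paper's exact estimator:

```python
import numpy as np

def np_threshold_mask(spec_power, pfa=1e-3):
    """Neyman-Pearson detection mask for a power spectrogram.

    Assumes the noise power in each time-frequency bin is exponentially
    distributed (Chi-squared, 2 degrees of freedom). The noise mean is
    estimated from the median (median = mean * ln 2 for an exponential),
    and the threshold solves P(power > T) = pfa.
    """
    mu = np.median(spec_power) / np.log(2.0)  # robust noise-mean estimate
    threshold = -mu * np.log(pfa)             # exponential survival: exp(-T/mu) = pfa
    return spec_power > threshold

rng = np.random.default_rng(0)
noise = rng.exponential(scale=1.0, size=(256, 256))
mask = np_threshold_mask(noise, pfa=1e-3)
# On pure noise, the fraction of detections should be close to pfa.
```

The paper's second step then gathers surviving bins into regions of interest using a binomial model of false detections, which plain thresholding alone does not do.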

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hobbs, R; Le, Y; Armour, E

    Purpose: Dose-response studies in radiation therapy typically use single response values for tumors across ensembles of tumors. Using the high dose rate (HDR) treatment plan dose grid and pre- and post-therapy FDG-PET images, we look for correlations between voxelized dose and FDG uptake response in individual tumors. Methods: Fifteen patients were treated for localized rectal cancer using 192Ir HDR brachytherapy in conjunction with surgery. FDG-PET images were acquired before HDR therapy and 6–8 weeks after treatment (prior to surgery). Treatment planning was done on a commercial workstation and the dose grid was calculated. The two PET images and the treatment dose grid were registered to each other using non-rigid registration. The difference in PET SUV values before and after HDR was plotted versus absorbed radiation dose for each voxel. The voxels were then separated into bins for every 400 cGy of absorbed dose and the bin average values plotted similarly. Results: Individual voxel doses did not correlate with PET response; however, when grouped into tumor subregions corresponding to dose bins, eighty percent of the patients showed a significant positive correlation (R2 > 0) between PET uptake difference in the targeted region and the absorbed dose. Conclusion: By considering larger ensembles of voxels, such as organ average absorbed dose or the dose bins considered here, valuable information may be obtained. The dose-response correlations as measured by FDG-PET difference potentially underline the importance of FDG-PET as a measure of response, as well as the value of voxelized information.
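The 400 cGy dose binning described above is a straightforward grouping of (dose, ΔSUV) voxel pairs. A minimal sketch, with an illustrative function name; real analyses would add registration and masking steps:

```python
import numpy as np

def bin_dose_response(dose_cgy, delta_suv, bin_width=400.0):
    """Average PET uptake change within fixed-width absorbed-dose bins.

    Returns (bin_center_doses, mean_delta_suv) for non-empty bins only.
    """
    dose = np.asarray(dose_cgy, dtype=float)
    dsuv = np.asarray(delta_suv, dtype=float)
    idx = np.floor(dose / bin_width).astype(int)  # 0 -> [0,400), 1 -> [400,800), ...
    centers, means = [], []
    for b in np.unique(idx):
        sel = idx == b
        centers.append((b + 0.5) * bin_width)
        means.append(dsuv[sel].mean())
    return np.array(centers), np.array(means)
```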

  13. United States Department of Agriculture-Agricultural Research Service stored-grain areawide integrated pest management program.

    PubMed

    Flinn, Paul W; Hagstrum, David W; Reed, Carl; Phillips, Tom W

    2003-01-01

    The USDA Agricultural Research Service (ARS) funded a demonstration project (1998-2002) for areawide IPM for stored wheat in Kansas and Oklahoma. This project was a collaboration of researchers at the ARS Grain Marketing and Production Research Center in Manhattan, Kansas, Kansas State University, and Oklahoma State University. The project utilized two elevator networks, one in each state, for a total of 28 grain elevators. These elevators stored approximately 31 million bushels of wheat, which is approximately 1.2% of the annual national production. Stored wheat was followed as it moved from farm to the country elevator and finally to the terminal elevator. During this study, thousands of grain samples were taken in concrete elevator silos. Wheat stored at elevators was frequently infested by several insect species, which sometimes reached high numbers and damaged the grain. Fumigation using aluminum phosphide pellets was the main method for managing these insect pests in elevators in the USA. Fumigation decisions tended to be based on past experience with controlling stored-grain insects, or were calendar based. Integrated pest management (IPM) requires sampling and risk benefit analysis. We found that the best sampling method for estimating insect density, without turning the grain from one bin to another, was the vacuum probe sampler. Decision support software, Stored Grain Advisor Pro (SGA Pro) was developed that interprets insect sampling data, and provides grain managers with a risk analysis report detailing which bins are at low, moderate or high risk for insect-caused economic losses. Insect density was predicted up to three months in the future based on current insect density, grain temperature and moisture. Because sampling costs money, there is a trade-off between frequency of sampling and the cost of fumigation. The insect growth model in SGA Pro reduces the need to sample as often, thereby making the program more cost-effective. 
    SGA Pro was validated during the final year of the areawide program. Based on data from 533 bins, SGA Pro accurately predicted which bins were at low, moderate or high risk. Only in two out of 533 bins did SGA Pro incorrectly predict bins as being low risk and, in both cases, insect density was only high (> two insects kg⁻¹) at the surface, which suggested recent immigration. SGA Pro is superior to calendar-based management because it ensures that grain is only treated when insect densities exceed economic thresholds (two insects kg⁻¹). This approach will reduce the frequency of fumigation while maintaining high grain quality. Minimizing the use of fumigant improves worker safety and reduces both control costs and harm to the environment.
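A toy sketch of the low/moderate/high classification in the spirit of SGA Pro's risk report. The 2 insects/kg economic threshold is from the text; the 1 insect/kg boundary for the "moderate" band is an illustrative assumption, not SGA Pro's actual rule:

```python
def classify_bin_risk(insects_per_kg):
    """Classify a grain bin by sampled insect density.

    2 insects/kg is the economic threshold stated in the text;
    the 1 insect/kg 'moderate' boundary is assumed for illustration.
    """
    if insects_per_kg >= 2.0:
        return "high"
    if insects_per_kg >= 1.0:
        return "moderate"
    return "low"
```

The real system additionally projects density up to three months ahead from grain temperature and moisture before classifying.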

  14. Efficient characterization of inhomogeneity in contraction strain pattern.

    PubMed

    Nazzal, Christina M; Mulligan, Lawrence J; Criscione, John C

    2012-05-01

    Cardiac dyssynchrony often accompanies patients with heart failure (HF) and can lead to an increase in mortality rate. Cardiac resynchronization therapy (CRT) has been shown to provide substantial benefits to the HF population with ventricular dyssynchrony; however, there still exists a group of patients who do not respond to this treatment. In order to better understand patient response to CRT, it is necessary to quantitatively characterize both electrical and mechanical dyssynchrony. The quantification of mechanical dyssynchrony via characterization of contraction strain field inhomogeneity is the focus of this modeling investigation. Raw data from a 3D finite element (FE) model were received from Roy Kerckhoffs et al. and analyzed in MATLAB. The FE model consisted of canine left and right ventricles coupled to a closed circulation with the effects of the pericardium acting as a pressure on the epicardial surface. For each of three simulations (normal synchronous, SYNC, right ventricular apical pacing, RVA, and left ventricular free wall pacing, LVFW) the Gauss point locations and values were used to generate lookup tables (LUTs) with each entry representing a location in the heart. In essence, we employed piecewise cubic interpolation to generate a fine point cloud (LUTs) from a coarse point cloud (Gauss points). Strain was calculated in the fiber direction and was then displayed in multiple ways to better characterize strain inhomogeneity. By plotting average strain and standard deviation over time, the point of maximum contraction and the point of maximal inhomogeneity were found for each simulation. Strain values were organized into seven strain bins to show operative strain ranges and extent of inhomogeneity throughout the heart wall. In order to visualize strain propagation, magnitude, and inhomogeneity over time, we created 2D area maps displaying strain over the entire cardiac cycle.
    To visualize spatial strain distribution at the time point of maximum inhomogeneity, a 3D point cloud was created for each simulation, and a CURE index was calculated. We found that both the RVA and LVFW simulations took longer to reach maximum contraction than the SYNC simulation, while also exhibiting larger disparities in strain values during contraction. Strain in the hoop direction was also analyzed and was found to be similar to the fiber strain results. It was found that our method of analyzing contraction strain pattern yielded more detailed spatial and temporal information about fiber strain in the heart over the cardiac cycle than the more conventional CURE index method. We also observed that our method of strain binning aids in visualization of the strain fields, and in particular, the separation of the mass points into separate images associated with each strain bin allows the strain pattern to be explicitly compartmentalized.

  15. Suppression of Prostate Tumor Progression by Bin 1

    DTIC Science & Technology

    2006-02-01

    (…experiment). The full protocol was approved by IACUC review. Cohort A: castration + testosterone propionate s.c. + MNU i.v. Cohort B: castration + testosterone propionate s.c. + MNU i.v. + testosterone pellet. Strain 1: mosaic Bin1 flox/KO. Strain 2: Bin1 flox/+ (control for Strain 1).

  16. Bioinformatics and Astrophysics Cluster (BinAc)

    NASA Astrophysics Data System (ADS)

    Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas

    2017-09-01

    BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.

  17. EFFECTS OF NUMBER AND LOCATION OF BINS ON PLASTIC RECYCLING AT A UNIVERSITY

    PubMed Central

    O'Connor, Ryan T; Lerman, Dorothea C; Fritz, Jennifer N; Hodde, Henry B

    2010-01-01

    The proportion of plastic bottles that consumers placed in appropriate recycling receptacles rather than trash bins was examined across 3 buildings on a university campus. We extended previous research on interventions to increase recycling by controlling the number of recycling receptacles across conditions and by examining receptacle location without the use of posted signs. Manipulating the appearance or number of recycling bins in common areas did not increase recycling. Consumers recycled substantially more plastic bottles when the recycling bins were located in classrooms. PMID:21541154

  19. BinMag: Widget for comparing stellar observed with theoretical spectra

    NASA Astrophysics Data System (ADS)

    Kochukhov, O.

    2018-05-01

    BinMag examines theoretical stellar spectra computed with the Synth/SynthMag/Synmast/Synth3/SME spectrum synthesis codes and compares them to observations. An IDL widget program, BinMag applies radial velocity shifts and broadening to the theoretical spectra to account for the effects of stellar rotation, radial-tangential macroturbulence, and instrumental smearing. The code can also simulate spectra of spectroscopic binary stars by appropriate coaddition of two synthetic spectra. Additionally, BinMag can be used to measure equivalent widths, to fit line profile shapes with analytical functions, and to automatically determine radial velocity and broadening parameters. BinMag interfaces with the Synth3 (ascl:1212.010) and SME (ascl:1202.013) codes, allowing the user to determine chemical abundances and stellar atmospheric parameters from the observed spectra.
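The shift-and-broaden step described above can be sketched in a few lines: Doppler-shift the wavelength grid, resample back, and convolve with a broadening kernel. This is a simplified stand-in (Gaussian kernel only; real codes like BinMag use dedicated rotational and radial-tangential kernels), and the function name is illustrative:

```python
import numpy as np

C_KMS = 299792.458  # speed of light, km/s

def shift_and_broaden(wave, flux, rv_kms, fwhm_kms):
    """Doppler-shift a synthetic spectrum and apply Gaussian broadening.

    Assumes near-uniform fractional wavelength steps so a single
    velocity-space Gaussian kernel is a reasonable approximation.
    """
    shifted_wave = wave * (1.0 + rv_kms / C_KMS)
    flux_shifted = np.interp(wave, shifted_wave, flux)  # resample to original grid
    # Gaussian kernel in velocity space
    dv = C_KMS * np.median(np.diff(wave) / wave[:-1])   # km/s per pixel
    sigma = fwhm_kms / (2.0 * np.sqrt(2.0 * np.log(2.0))) / dv  # in pixels
    half = int(4 * sigma) + 1
    x = np.arange(-half, half + 1)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(flux_shifted, kernel, mode="same")
```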

  20. Parameterizing correlations between hydrometeor species in mixed-phase Arctic clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larson, Vincent E.; Nielsen, Brandon J.; Fan, Jiwen

    2011-08-16

    Mixed-phase Arctic clouds, like other clouds, contain small-scale variability in hydrometeor fields, such as cloud water or snow mixing ratio. This variability may be worth parameterizing in coarse-resolution numerical models. In particular, for modeling processes such as accretion and aggregation, it would be useful to parameterize subgrid correlations among hydrometeor species. However, one difficulty is that there exist many hydrometeor species and many microphysical processes, leading to complexity and computational expense. Existing lower and upper bounds (inequalities) on linear correlation coefficients provide useful guidance, but these bounds are too loose to serve directly as a method to predict subgrid correlations. Therefore, this paper proposes an alternative method that is based on a blend of theory and empiricism. The method begins with the spherical parameterization framework of Pinheiro and Bates (1996), which expresses the correlation matrix in terms of its Cholesky factorization. The values of the elements of the Cholesky matrix are parameterized here using a cosine row-wise formula that is inspired by the aforementioned bounds on correlations. The method has three advantages: 1) the computational expense is tolerable; 2) the correlations are, by construction, guaranteed to be consistent with each other; and 3) the methodology is fairly general and hence may be applicable to other problems. The method is tested non-interactively using simulations of three Arctic mixed-phase cloud cases from two different field experiments: the Indirect and Semi-Direct Aerosol Campaign (ISDAC) and the Mixed-Phase Arctic Cloud Experiment (M-PACE). Benchmark simulations are performed using a large-eddy simulation (LES) model that includes a bin microphysical scheme. The correlations estimated by the new method satisfactorily approximate the correlations produced by the LES.
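The spherical (Pinheiro & Bates 1996) parameterization mentioned above writes each row of the Cholesky factor in spherical coordinates, so that any choice of angles yields a valid (unit-diagonal, positive semidefinite) correlation matrix. A minimal sketch of that construction, without the paper's cosine row-wise formula for choosing the angles:

```python
import numpy as np

def corr_from_angles(theta):
    """Build a valid correlation matrix via the spherical parameterization
    of its Cholesky factor (Pinheiro & Bates 1996).

    theta -- square array; theta[i, j] in (0, pi) is used for j < i only.
    """
    n = theta.shape[0]
    L = np.zeros((n, n))
    for i in range(n):
        prod = 1.0
        for j in range(i):
            L[i, j] = np.cos(theta[i, j]) * prod
            prod *= np.sin(theta[i, j])
        L[i, i] = prod  # each row has unit Euclidean norm by construction
    return L @ L.T
```

Because the rows of L have unit norm, the diagonal of R = L Lᵀ is exactly 1 and R is positive semidefinite for any angles, which is the "consistent by construction" advantage the abstract cites.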

  2. Disentangling Redshift-Space Distortions and Nonlinear Bias using the 2D Power Spectrum

    DOE PAGES

    Jennings, Elise; Wechsler, Risa H.

    2015-08-07

    We present the nonlinear 2D galaxy power spectrum, P(k, µ), in redshift space, measured from the Dark Sky simulations, using galaxy catalogs constructed with both halo occupation distribution and subhalo abundance matching methods, chosen to represent an intermediate redshift sample of luminous red galaxies. We find that the information content in individual µ (cosine of the angle to the line of sight) bins is substantially richer than in multipole moments, and show that this can be used to isolate the impact of nonlinear growth and redshift space distortion (RSD) effects. Using the µ < 0.2 simulation data, which we show is not impacted by RSD effects, we can successfully measure the nonlinear bias to an accuracy of ~ 5% at k < 0.6 h Mpc⁻¹. This use of individual µ bins to extract the nonlinear bias successfully removes a large parameter degeneracy when constraining the linear growth rate of structure. We carry out a joint parameter estimation, using the low µ simulation data to constrain the nonlinear bias, and µ > 0.2 to constrain the growth rate, and show that f can be constrained to ~ 26(22)% to a kmax < 0.4(0.6) h Mpc⁻¹ from clustering alone using a simple dispersion model, for a range of galaxy models. Our analysis of individual µ bins also reveals interesting physical effects which arise simply from different methods of populating halos with galaxies. We also find a prominent turnaround scale, at which RSD damping effects are greater than the nonlinear growth, which differs not only for each µ bin but also for each galaxy model. These features may provide unique signatures which could be used to shed light on the galaxy–dark matter connection. Furthermore, the idea of separating nonlinear growth and RSD effects making use of the full information in the 2D galaxy power spectrum yields significant improvements in constraining cosmological parameters and may be a promising probe of galaxy formation models.
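The contrast drawn above is between per-µ-bin averages of P(k, µ) and its Legendre multipoles, which integrate over all µ. A minimal sketch of both views on a gridded P(k, µ), assuming µ-symmetry so the µ ≥ 0 half suffices; the function names are illustrative:

```python
import numpy as np
from numpy.polynomial import legendre

def multipole(P_kmu, mu, ell):
    """P_ell = (2*ell + 1) * integral_0^1 P(mu) L_ell(mu) dmu (mu-symmetric P),
    via trapezoidal integration along the last (mu) axis."""
    coeffs = np.zeros(ell + 1)
    coeffs[ell] = 1.0
    L_ell = legendre.legval(mu, coeffs)  # Legendre polynomial L_ell(mu)
    integrand = P_kmu * L_ell
    integral = np.sum(0.5 * (integrand[..., 1:] + integrand[..., :-1]) * np.diff(mu),
                      axis=-1)
    return (2 * ell + 1) * integral

def mu_bin_average(P_kmu, mu, edges):
    """Average P(k, mu) inside each mu bin -- the per-bin view used above."""
    idx = np.digitize(mu, edges) - 1
    return np.stack([P_kmu[..., idx == b].mean(axis=-1)
                     for b in range(len(edges) - 1)], axis=-1)
```

The µ-bin view keeps the low-µ modes (least affected by RSD) separate, which is what lets the nonlinear bias be constrained independently of the growth rate.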

  3. On the derivation of selection functions from redshift survey data

    NASA Technical Reports Server (NTRS)

    Strauss, Michael A.; Yahil, Amos; Davis, Marc

    1991-01-01

    A previously unrecognized effect is described in the derivation of luminosity functions and selection functions from existing redshift survey data, due to binning of quoted magnitudes and diameters. Corrections are made for this effect in the Center for Astrophysics (CfA) and Southern Sky (SSRS) Redshift Surveys. The correction makes subtle but systematic changes in the derived density fields of the CfA survey, especially within 2000 km/s of the Local Group. The effect on the density field of the SSRS survey is negligible.

  4. diffHic: a Bioconductor package to detect differential genomic interactions in Hi-C data.

    PubMed

    Lun, Aaron T L; Smyth, Gordon K

    2015-08-19

    Chromatin conformation capture with high-throughput sequencing (Hi-C) is a technique that measures the in vivo intensity of interactions between all pairs of loci in the genome. Most conventional analyses of Hi-C data focus on the detection of statistically significant interactions. However, an alternative strategy involves identifying significant changes in the interaction intensity (i.e., differential interactions) between two or more biological conditions. This is more statistically rigorous and may provide more biologically relevant results. Here, we present the diffHic software package for the detection of differential interactions from Hi-C data. diffHic provides methods for read pair alignment and processing, counting into bin pairs, filtering out low-abundance events and normalization of trended or CNV-driven biases. It uses the statistical framework of the edgeR package to model biological variability and to test for significant differences between conditions. Several options for the visualization of results are also included. The use of diffHic is demonstrated with real Hi-C data sets. Performance against existing methods is also evaluated with simulated data. On real data, diffHic is able to successfully detect interactions with significant differences in intensity between biological conditions. It also compares favourably to existing software tools on simulated data sets. These results suggest that diffHic is a viable approach for differential analyses of Hi-C data.
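The "counting into bin pairs" step mentioned above reduces to mapping each read pair's two coordinates into genomic bins and counting the unordered bin pair. A toy single-chromosome sketch (real pipelines like diffHic also track chromosome names, strand, and alignment-quality filters); the function name is illustrative:

```python
from collections import Counter

def count_bin_pairs(read_pairs, bin_size):
    """Count Hi-C read pairs into unordered (bin_i, bin_j) pairs, i <= j.

    read_pairs -- iterable of (pos1, pos2) genomic coordinates on one
                  chromosome (a deliberate simplification).
    """
    counts = Counter()
    for p1, p2 in read_pairs:
        b1, b2 = p1 // bin_size, p2 // bin_size
        counts[(min(b1, b2), max(b1, b2))] += 1
    return counts
```

The resulting per-bin-pair counts are what feed the edgeR-style dispersion modeling and the differential tests between conditions.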

  5. Evaluation of respiratory and cardiac motion correction schemes in dual gated PET/CT cardiac imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamare, F., E-mail: frederic.lamare@chu-bordeaux.fr; Fernandez, P.; CNRS, INCIA, UMR 5287, F-33400 Talence

    Purpose: Cardiac imaging suffers from both respiratory and cardiac motion. One of the proposed solutions involves double gated acquisitions. Although such an approach may lead to both respiratory and cardiac motion compensation, there are issues associated with (a) the combination of data from cardiac and respiratory motion bins, and (b) poor statistical quality images as a result of using only part of the acquired data. The main objective of this work was to evaluate different schemes of combining binned data in order to identify the best strategy to reconstruct motion free cardiac images from dual gated positron emission tomography (PET) acquisitions. Methods: A digital phantom study as well as seven human studies were used in this evaluation. PET data were acquired in list mode (LM). A real-time position management system and an electrocardiogram device were used to provide the respiratory and cardiac motion triggers registered within the LM file. Acquired data were subsequently binned considering four and six cardiac gates, or the diastole only, in combination with eight respiratory amplitude gates. PET images were corrected for attenuation, but no randoms or scatter corrections were included. Reconstructed images from each of the bins considered above were subsequently used in combination with an affine or an elastic registration algorithm to derive transformation parameters allowing the combination of all acquired data in a particular position in the cardiac and respiratory cycles. Images were assessed in terms of signal-to-noise ratio (SNR), contrast, image profile, coefficient-of-variation (COV), and relative difference of the recovered activity concentration. Results: Regardless of the considered motion compensation strategy, the nonrigid motion model performed better than the affine model, leading to higher SNR and contrast combined with a lower COV.
    Nevertheless, when compensating for respiration only, no statistically significant differences were observed in the performance of the two motion models considered. Superior image SNR and contrast were seen using the affine respiratory motion model in combination with the diastole cardiac bin in comparison to the use of the whole cardiac cycle. In contrast, when simultaneously correcting for cardiac beating and respiration, the elastic respiratory motion model outperformed the affine model. In this context, four cardiac bins associated with eight respiratory amplitude bins seemed to be adequate. Conclusions: Considering the compensation of respiratory motion effects only, both affine and elastic based approaches led to an accurate resizing and positioning of the myocardium. The use of the diastolic phase combined with an affine model based respiratory motion correction may therefore be a simple approach leading to significant quality improvements in cardiac PET imaging. However, the best performance was obtained with the combined correction for both cardiac and respiratory movements considering all the dual-gated bins independently through the use of an elastic model based motion compensation.
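The dual-gating step described above assigns each list-mode event to a joint (cardiac, respiratory) bin: cardiac phase from the fraction of the current R-R interval elapsed, respiratory bin from the amplitude signal. A minimal sketch with illustrative names and the same 4 x 8 binning the study found adequate:

```python
import bisect

def dual_gate(event_t, event_amp, r_times, amp_edges, n_cardiac=4):
    """Assign one list-mode event to a (cardiac, respiratory) dual gate.

    Cardiac bin: fraction of the current R-R interval elapsed at event_t.
    Respiratory bin: amplitude bin index from sorted amp_edges.
    Returns None if the event falls outside the triggers or amplitude range.
    """
    i = bisect.bisect_right(r_times, event_t) - 1
    if i < 0 or i + 1 >= len(r_times):
        return None
    phase = (event_t - r_times[i]) / (r_times[i + 1] - r_times[i])
    cardiac = min(int(phase * n_cardiac), n_cardiac - 1)
    resp = bisect.bisect_right(amp_edges, event_amp) - 1
    if resp < 0 or resp >= len(amp_edges) - 1:
        return None
    return cardiac, resp
```

Events in each joint gate are then reconstructed separately and recombined via the affine or elastic transforms discussed above.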

  6. SNR improvement for hyperspectral application using frame and pixel binning

    NASA Astrophysics Data System (ADS)

    Rehman, Sami Ur; Kumar, Ankush; Banerjee, Arup

    2016-05-01

    Hyperspectral imaging spectrometer systems are increasingly being used in the field of remote sensing for a variety of civilian and military applications. The ability of such instruments to discriminate finer spectral features, along with improved spatial and radiometric performance, has made them a powerful tool in the field of remote sensing. The design and development of spaceborne hyperspectral imaging spectrometers poses many technological challenges in terms of optics, dispersion elements, detectors, electronics and mechanical systems. The main factors that define the type of detector are the spectral region, SNR, dynamic range, pixel size, number of pixels, frame rate, operating temperature, etc. Detectors with higher quantum efficiency and higher well depth are the preferred choice for such applications. CCD-based Si detectors serve the requirement of high well depth for VNIR-band spectrometers but suffer from smear. Smear can be controlled by using CMOS detectors, and Si CMOS detectors with large format arrays are available; however, these detectors generally have a smaller pitch and low well depth. A binning technique can be used with available CMOS detectors to meet the large swath, higher resolution and high SNR requirements. The larger dwell time available to the satellite can be used to bin multiple frames to increase the signal collection even with lower-well-depth detectors, and ultimately to increase the SNR. Lab measurements reveal that the SNR improvement from frame binning is greater than that from pixel binning. The effect of pixel binning as compared to frame binning is discussed, and the degradation of SNR relative to the theoretical value for pixel binning is analyzed.
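The SNR gain from binning can be illustrated with a toy noise model. The sketch below is an assumption for illustration, not the authors' lab setup: it sums N samples (frames or pixels), each carrying shot noise (Gaussian approximation to Poisson) plus Gaussian read noise, and estimates the resulting SNR by Monte Carlo. The electron counts and read-noise figures are hypothetical.

```python
import math
import random

def simulate_snr(signal_e, read_noise_e, n_bin, trials=20000, seed=1):
    """Monte Carlo SNR of summing n_bin independent samples (frames
    or pixels), each with shot noise (Gaussian approximation to
    Poisson) plus Gaussian read noise. Illustrative model only."""
    rng = random.Random(seed)
    sums = []
    for _ in range(trials):
        total = 0.0
        for _ in range(n_bin):
            total += rng.gauss(signal_e, math.sqrt(signal_e))  # shot noise
            total += rng.gauss(0.0, read_noise_e)              # read noise
        sums.append(total)
    mean = sum(sums) / trials
    var = sum((s - mean) ** 2 for s in sums) / (trials - 1)
    return mean / math.sqrt(var)

# Hypothetical detector: 2000 e- signal, 30 e- read noise per sample.
snr_single = simulate_snr(2000, 30, n_bin=1)
snr_binned = simulate_snr(2000, 30, n_bin=4)
```

Binning 4 samples roughly doubles the SNR here, consistent with the ~sqrt(N) scaling expected when each sample contributes independent noise.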

  7. Meta-analysis of 32 genome-wide linkage studies of schizophrenia

    PubMed Central

    Ng, MYM; Levinson, DF; Faraone, SV; Suarez, BK; DeLisi, LE; Arinami, T; Riley, B; Paunio, T; Pulver, AE; Irmansyah; Holmans, PA; Escamilla, M; Wildenauer, DB; Williams, NM; Laurent, C; Mowry, BJ; Brzustowicz, LM; Maziade, M; Sklar, P; Garver, DL; Abecasis, GR; Lerer, B; Fallin, MD; Gurling, HMD; Gejman, PV; Lindholm, E; Moises, HW; Byerley, W; Wijsman, EM; Forabosco, P; Tsuang, MT; Hwu, H-G; Okazaki, Y; Kendler, KS; Wormley, B; Fanous, A; Walsh, D; O’Neill, FA; Peltonen, L; Nestadt, G; Lasseter, VK; Liang, KY; Papadimitriou, GM; Dikeos, DG; Schwab, SG; Owen, MJ; O’Donovan, MC; Norton, N; Hare, E; Raventos, H; Nicolini, H; Albus, M; Maier, W; Nimgaonkar, VL; Terenius, L; Mallet, J; Jay, M; Godard, S; Nertney, D; Alexander, M; Crowe, RR; Silverman, JM; Bassett, AS; Roy, M-A; Mérette, C; Pato, CN; Pato, MT; Roos, J Louw; Kohn, Y; Amann-Zalcenstein, D; Kalsi, G; McQuillin, A; Curtis, D; Brynjolfson, J; Sigmundsson, T; Petursson, H; Sanders, AR; Duan, J; Jazin, E; Myles-Worsley, M; Karayiorgou, M; Lewis, CM

    2009-01-01

    A genome scan meta-analysis (GSMA) was carried out on 32 independent genome-wide linkage scan analyses that included 3255 pedigrees with 7413 genotyped cases affected with schizophrenia (SCZ) or related disorders. The primary GSMA divided the autosomes into 120 bins, rank-ordered the bins within each study according to the most positive linkage result in each bin, summed these ranks (weighted for study size) for each bin across studies and determined the empirical probability of a given summed rank (PSR) by simulation. Suggestive evidence for linkage was observed in two single bins, on chromosomes 5q (142-168 Mb) and 2q (103-134 Mb). Genome-wide evidence for linkage was detected on chromosome 2q (119-152 Mb) when bin boundaries were shifted to the middle of the previous bins. The primary analysis met empirical criteria for ‘aggregate’ genome-wide significance, indicating that some or all of 10 bins are likely to contain loci linked to SCZ, including regions of chromosomes 1, 2q, 3q, 4q, 5q, 8p and 10q. In a secondary analysis of 22 studies of European-ancestry samples, suggestive evidence for linkage was observed on chromosome 8p (16-33 Mb). Although the newer genome-wide association methodology has greater power to detect weak associations to single common DNA sequence variants, linkage analysis can detect diverse genetic effects that segregate in families, including multiple rare variants within one locus or several weakly associated loci in the same region. Therefore, the regions supported by this meta-analysis deserve close attention in future studies. PMID:19349958
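The rank-summation step of a GSMA can be sketched in a few lines. This is a minimal illustration of the procedure described above (within-study ranking of bins, weighted summation across studies, and an empirical probability of the summed rank by simulation), not the actual GSMA software; all inputs are made up.

```python
import random

def gsma_summed_ranks(study_scores, weights):
    """Within each study, rank bins by linkage score (1 = weakest,
    n_bins = strongest), then sum weighted ranks across studies."""
    n_bins = len(study_scores[0])
    summed = [0.0] * n_bins
    for scores, w in zip(study_scores, weights):
        order = sorted(range(n_bins), key=lambda b: scores[b])
        for rank, b in enumerate(order, start=1):
            summed[b] += w * rank
    return summed

def empirical_psr(summed, weights, n_sim=2000, seed=0):
    """Empirical probability of a summed rank at least as large as the
    observed one, under random within-study bin orderings."""
    rng = random.Random(seed)
    n_bins = len(summed)
    ranks = list(range(1, n_bins + 1))
    exceed = [0] * n_bins
    for _ in range(n_sim):
        sim = [0.0] * n_bins
        for w in weights:
            rng.shuffle(ranks)
            for b in range(n_bins):
                sim[b] += w * ranks[b]
        for b in range(n_bins):
            if sim[b] >= summed[b]:
                exceed[b] += 1
    return [e / n_sim for e in exceed]

# Toy example: bin 0 carries the strongest signal in all 5 studies.
scores = [[10, 1, 2, 3, 4, 5, 6, 7, 8, 9] for _ in range(5)]
weights = [1.0] * 5
summed = gsma_summed_ranks(scores, weights)
psr = empirical_psr(summed, weights)
```

A bin that ranks highest in every study accumulates the maximum possible summed rank, so its empirical probability under random ordering is near zero.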

  8. Altered Splicing of the BIN1 Muscle-Specific Exon in Humans and Dogs with Highly Progressive Centronuclear Myopathy

    PubMed Central

    Böhm, Johann; Vasli, Nasim; Maurer, Marie; Cowling, Belinda; Shelton, G. Diane; Kress, Wolfram; Toussaint, Anne; Prokic, Ivana; Schara, Ulrike; Anderson, Thomas James; Weis, Joachim; Tiret, Laurent; Laporte, Jocelyn

    2013-01-01

    Amphiphysin 2, encoded by BIN1, is a key factor for membrane sensing and remodelling in different cell types. Homozygous BIN1 mutations in ubiquitously expressed exons are associated with autosomal recessive centronuclear myopathy (CNM), a mildly progressive muscle disorder typically showing abnormal nuclear centralization on biopsies. In addition, misregulation of BIN1 splicing partially accounts for the muscle defects in myotonic dystrophy (DM). However, the muscle-specific function of amphiphysin 2 and its pathogenicity in both muscle disorders are not well understood. In this study we identified and characterized the first mutation affecting the splicing of the muscle-specific BIN1 exon 11 in a consanguineous family with rapidly progressive and ultimately fatal centronuclear myopathy. In parallel, we discovered a mutation in the same BIN1 exon 11 acceptor splice site as the genetic cause of the canine Inherited Myopathy of Great Danes (IMGD). Analysis of RNA from patient muscle demonstrated complete skipping of exon 11 and BIN1 constructs without exon 11 were unable to promote membrane tubulation in differentiated myotubes. Comparative immunofluorescence and ultrastructural analyses of patient and canine biopsies revealed common structural defects, emphasizing the importance of amphiphysin 2 in membrane remodelling and maintenance of the skeletal muscle triad. Our data demonstrate that the alteration of the muscle-specific function of amphiphysin 2 is a common pathomechanism for centronuclear myopathy, myotonic dystrophy, and IMGD. The IMGD dog is the first faithful model for human BIN1-related CNM and represents a mammalian model available for preclinical trials of potential therapies. PMID:23754947

  9. Precision of FLEET Velocimetry Using High-Speed CMOS Camera Systems

    NASA Technical Reports Server (NTRS)

    Peters, Christopher J.; Danehy, Paul M.; Bathel, Brett F.; Jiang, Naibo; Calvert, Nathan D.; Miles, Richard B.

    2015-01-01

    Femtosecond laser electronic excitation tagging (FLEET) is an optical measurement technique that permits quantitative velocimetry of unseeded air or nitrogen using a single laser and a single camera. In this paper, we seek to determine the fundamental precision of the FLEET technique using high-speed complementary metal-oxide semiconductor (CMOS) cameras. Also, we compare the performance of several different high-speed CMOS camera systems for acquiring FLEET velocimetry data in air and nitrogen free-jet flows. The precision was defined as the standard deviation of a set of several hundred single-shot velocity measurements. Methods of enhancing the precision of the measurement were explored, such as row-wise digital binning of the signal in adjacent pixels (similar in concept to on-sensor binning, but done in post-processing) and increasing the time delay between successive exposures. These techniques generally improved precision; however, binning provided the greatest improvement to the un-intensified camera systems, which had low signal-to-noise ratios. When binning row-wise by 8 pixels (about the thickness of the tagged region) and using an inter-frame delay of 65 microseconds, precisions of 0.5 meters per second in air and 0.2 meters per second in nitrogen were achieved. The camera comparison included a pco.dimax HD, a LaVision Imager scientific CMOS (sCMOS) and a Photron FASTCAM SA-X2, along with a two-stage LaVision HighSpeed IRO intensifier. Excluding the LaVision Imager sCMOS, the cameras were tested with and without intensification and with both short and long inter-frame delays. Use of intensification and a longer inter-frame delay generally improved precision. Overall, the Photron FASTCAM SA-X2 exhibited the best performance in terms of greatest precision and highest signal-to-noise ratio, primarily because it had the largest pixels.
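Row-wise digital binning as described above is simply a post-processing sum over groups of adjacent rows. A minimal sketch (hypothetical image data; nested lists stand in for camera frames):

```python
def bin_rows(image, factor):
    """Digitally bin an image row-wise: sum each group of `factor`
    adjacent rows (post-processing, not on-sensor). Trailing rows
    beyond a whole multiple of `factor` are dropped."""
    n_rows = (len(image) // factor) * factor
    n_cols = len(image[0])
    return [
        [sum(image[r + k][c] for k in range(factor)) for c in range(n_cols)]
        for r in range(0, n_rows, factor)
    ]

# 16 rows x 4 columns of unit signal, binned by 8 rows
# (about the thickness of the tagged region in the paper).
frame = [[1] * 4 for _ in range(16)]
binned = bin_rows(frame, 8)
```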

  10. An imperialist competitive algorithm for virtual machine placement in cloud computing

    NASA Astrophysics Data System (ADS)

    Jamali, Shahram; Malektaji, Sepideh; Analoui, Morteza

    2017-05-01

    Cloud computing, the recently emerged revolution in the IT industry, is empowered by virtualisation technology. In this paradigm, the user's applications run on virtual machines (VMs). The process of selecting proper physical machines to host these virtual machines is called virtual machine placement. It plays an important role in the resource utilisation and power efficiency of a cloud computing environment. In this paper, we propose an imperialist competitive-based algorithm for the virtual machine placement problem, called ICA-VMPLC. The base optimisation algorithm is chosen to be ICA because of its ease of neighbourhood movement, good convergence rate and suitable terminology. The proposed algorithm investigates the search space in a unique manner to efficiently obtain an optimal placement solution that simultaneously minimises power consumption and total resource wastage. Its final solution performance is compared with several existing methods, such as grouping genetic and ant colony-based algorithms, as well as a bin packing heuristic. The simulation results show that the proposed method is superior to the other tested algorithms in terms of power consumption, resource wastage, CPU usage efficiency and memory usage efficiency.
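One of the baselines mentioned above is a bin packing heuristic. As a point of reference, a minimal first-fit decreasing sketch for a single resource dimension is shown below; this is a common textbook heuristic assumed here for illustration, and the paper's actual comparison heuristic may differ.

```python
def first_fit_decreasing(demands, capacity):
    """Bin packing heuristic: sort VM demands in decreasing order and
    place each on the first host with enough remaining capacity,
    opening a new host when none fits. Returns per-host demand lists."""
    hosts = []
    for d in sorted(demands, reverse=True):
        for h in hosts:
            if sum(h) + d <= capacity:
                h.append(d)
                break
        else:
            hosts.append([d])  # no existing host fits: open a new one
    return hosts

# Hypothetical CPU demands packed onto hosts of capacity 8.
placement = first_fit_decreasing([5, 4, 3, 2, 2], capacity=8)
```

Fewer open hosts means fewer powered-on physical machines, which is why bin packing serves as a natural power-consumption baseline for VM placement.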

  11. Uncertainty in Modeling Dust Mass Balance and Radiative Forcing from Size Parameterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Chun; Chen, Siyu; Leung, Lai-Yung R.

    2013-11-05

    This study examines the uncertainties in simulating the mass balance and radiative forcing of mineral dust due to biases in the aerosol size parameterization. Simulations are conducted quasi-globally (180°W-180°E and 60°S-70°N) using the WRF-Chem model with three different approaches to represent the aerosol size distribution (8-bin, 4-bin, and 3-mode). The biases of the 3-mode and 4-bin approaches against a relatively more accurate 8-bin approach in simulating dust mass balance and radiative forcing are identified. Compared to the 8-bin approach, the 4-bin approach simulates similar but coarser size distributions of dust particles in the atmosphere, while the 3-mode approach retains more fine dust particles but fewer coarse dust particles due to the prescribed σg of each mode. Although the 3-mode approach yields up to 10 days longer dust mass lifetime over the remote oceanic regions than the 8-bin approach, the three size approaches produce similar dust mass lifetimes (3.2 to 3.5 days) on quasi-global average, reflecting that the global dust mass lifetime is mainly determined by the dust mass lifetime near the dust source regions. With the same global dust emission (~6000 Tg yr⁻¹), the 8-bin approach produces a dust mass loading of 39 Tg, while the 4-bin and 3-mode approaches produce 3% (40.2 Tg) and 25% (49.1 Tg) higher dust mass loadings, respectively. The difference in dust mass loading between the 8-bin approach and the 4-bin or 3-mode approaches has large spatial variations, with generally smaller relative differences (<10%) near the surface over the dust source regions. The three size approaches also result in significantly different dry and wet deposition fluxes and number concentrations of dust. The difference in dust aerosol optical depth (AOD) (a factor of 3) among the three size approaches is much larger than their difference (25%) in dust mass loading. 
Compared to the 8-bin approach, the 4-bin approach yields stronger dust absorptivity, while the 3-mode approach yields weaker dust absorptivity. Overall, on quasi-global average, the three size parameterizations result in a significant difference of a factor of 2-3 in dust surface cooling (-1.02 to -2.87 W m⁻²) and atmospheric warming (0.39 to 0.96 W m⁻²), and in a tremendous difference of a factor of ~10 in dust TOA cooling (-0.24 to -2.20 W m⁻²). An uncertainty of a factor of 2 is quantified in dust emission estimation due to the different size parameterizations. This study also highlights the uncertainties in modeling dust mass and number loading, deposition fluxes, and radiative forcing resulting from different size parameterizations, and motivates further investigation of the impact of size parameterizations on modeling dust impacts on air quality, climate, and ecosystems.
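The effect of bin resolution on a size distribution can be sketched as follows: discretizing the same lognormal mass distribution into 8 or 4 bins conserves total mass but changes how it is partitioned. The median diameter and geometric standard deviation below are hypothetical, chosen only for illustration, not the values used in the study.

```python
import math

def lognormal_mass_fractions(edges_um, median_um, sigma_g):
    """Mass fraction of a lognormal size distribution falling in each
    bin, via the lognormal CDF. Parameters are illustrative only."""
    def cdf(d):
        z = math.log(d / median_um) / (math.sqrt(2.0) * math.log(sigma_g))
        return 0.5 * (1.0 + math.erf(z))
    return [cdf(hi) - cdf(lo) for lo, hi in zip(edges_um, edges_um[1:])]

# Same distribution discretized with 8 bins vs. 4 bins (diameters in um).
edges8 = [0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8, 25.6]
edges4 = [0.1, 0.8, 3.2, 12.8, 25.6]
bins8 = lognormal_mass_fractions(edges8, median_um=3.5, sigma_g=2.0)
bins4 = lognormal_mass_fractions(edges4, median_um=3.5, sigma_g=2.0)
```

Total mass within the covered range is identical for both discretizations, but the coarse one cannot resolve how mass is spread inside each wide bin, which is one source of the transport and optical differences discussed above.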

  12. Low-Temperature Effects on the Design and Performance of Composting of Explosives-Contaminated Soils

    DTIC Science & Technology

    1991-03-01

    Figure list (extract): 7. Aerated bins used in field composting tests on dairy manure. 8. Typical temperature developed during bin composting of dairy manure under conditions of constant airflow and optimum moisture. 9. Effect of agitation on the temperature profile during bin composting of dairy manure.

  13. 16 CFR 1301.6 - Test conditions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Test conditions. 1301.6 Section 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  14. 16 CFR 1301.6 - Test conditions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Test conditions. 1301.6 Section 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  15. 16 CFR 1301.6 - Test conditions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Test conditions. 1301.6 Section 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  16. 90. Photographic copy of plan of bins, section of boot, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    90. Photographic copy of plan of bins, section of boot, and photograph of construction originally published in Plans of Grain Elevators (Chicago; Grain Dealers Journal, 1918), p.53. PLAN OF BINS; SECTION OF BOOT; VIEW OF CONSTRUCTION LOOKING NORTHWEST - Northwestern Consolidated Elevator "A", 119 Fifth Avenue South, Minneapolis, Hennepin County, MN

  17. 64. NORTH WALL OF CRUSHED OXIDIZED ORE BIN. THE PRIMARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    64. NORTH WALL OF CRUSHED OXIDIZED ORE BIN. THE PRIMARY MILL FEEDS AT BOTTOM. MILL SOLUTION TANKS WERE TO THE LEFT (EAST) AND BARREN SOLUTION TANK TO THE RIGHT (WEST) OR THE CRUSHED ORE BIN. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  18. FDD Massive MIMO Channel Estimation With Arbitrary 2D-Array Geometry

    NASA Astrophysics Data System (ADS)

    Dai, Jisheng; Liu, An; Lau, Vincent K. N.

    2018-05-01

    This paper addresses the problem of downlink channel estimation in frequency-division duplexing (FDD) massive multiple-input multiple-output (MIMO) systems. The existing methods usually exploit hidden sparsity under a discrete Fourier transform (DFT) basis to estimate the downlink channel. However, there are at least two shortcomings of these DFT-based methods: 1) they are applicable to uniform linear arrays (ULAs) only, since the DFT basis requires the special structure of ULAs, and 2) they always suffer from a performance loss due to the leakage of energy over some DFT bins. To deal with the above shortcomings, we introduce an off-grid model for downlink channel sparse representation with arbitrary 2D-array antenna geometry, and propose an efficient sparse Bayesian learning (SBL) approach for the sparse channel recovery and off-grid refinement. The main idea of the proposed off-grid method is to consider the sampled grid points as adjustable parameters. Utilizing an inexact block majorization-minimization (MM) algorithm, the grid points are refined iteratively to minimize the off-grid gap. Finally, we further extend the solution to uplink-aided channel estimation by exploiting the angular reciprocity between downlink and uplink channels, which brings enhanced recovery performance.

  19. Model Uncertainty Quantification Methods For Data Assimilation In Partially Observed Multi-Scale Systems

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; van Leeuwen, P. J.

    2017-12-01

    Model uncertainty quantification remains one of the central challenges of effective Data Assimilation (DA) in complex, partially observed non-linear systems. Stochastic parameterization methods have been proposed in recent years as a means of capturing the uncertainty associated with unresolved sub-grid scale processes. Such approaches generally require some knowledge of the true sub-grid scale process or rely on full observations of the larger scale resolved process. We present a methodology for estimating the statistics of sub-grid scale processes using only partial observations of the resolved process. It finds model error realisations over a training period by minimizing their conditional variance, constrained by available observations. A distinctive feature is that these realisations are binned conditioned on the previous model state during the minimization process, allowing for the recovery of complex error structures. The efficacy of the approach is demonstrated through numerical experiments on the multi-scale Lorenz '96 model. We consider different parameterizations of the model with both small and large time scale separations between slow and fast variables. Results are compared to two existing methods for accounting for model uncertainty in DA and shown to provide improved analyses and forecasts.
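The state-conditioned binning of error realisations can be sketched as follows. This is a 1D toy version with hypothetical bin edges and made-up training data; the actual method operates on the multi-scale model state.

```python
import random

def bin_errors_by_state(states, errors, edges):
    """Group model-error realisations into bins conditioned on the
    previous (1D) model state; `edges` are hypothetical bin edges."""
    binned = {i: [] for i in range(len(edges) - 1)}
    for x, e in zip(states, errors):
        for i, (lo, hi) in enumerate(zip(edges, edges[1:])):
            if lo <= x < hi:
                binned[i].append(e)
                break
    return binned

def sample_error(binned, edges, x, rng):
    """Draw a stochastic model-error sample from the bin matching x."""
    for i, (lo, hi) in enumerate(zip(edges, edges[1:])):
        if lo <= x < hi and binned[i]:
            return rng.choice(binned[i])
    return 0.0  # fall back when the bin is empty or x is out of range

# Toy training set: the error depends on the sign of the state.
states = [-1.0, -0.5, 0.5, 1.0, -0.8, 0.7]
errors = [-2.0, -2.0, 3.0, 3.0, -2.0, 3.0]
edges = [-2.0, 0.0, 2.0]
binned = bin_errors_by_state(states, errors, edges)
```

At forecast time, drawing from the bin that matches the current state reproduces the state-dependent error structure rather than a single global error distribution.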

  20. Political orientation moderates worldview defense in response to Osama bin Laden’s death

    PubMed Central

    Chopik, William J.; Konrath, Sara H.

    2016-01-01

    The current study examines Americans’ psychological responses to Osama bin Laden’s death. We tracked changes in how different participants responded to dissimilar others from the night of bin Laden’s death for five weeks. Liberal participants reported lower worldview defense (i.e., a defensive reaction to uphold one’s cultural worldview) immediately after bin Laden’s death but then returned to levels similar to their conservative counterparts over time. Conservative participants reported greater worldview defense at each point of the study and did not significantly change over time. These temporal differences between liberals and conservatives were present only in the year of bin Laden’s death and not one year prior. The current findings demonstrate that liberals and conservatives may react differently after major societal events in predictable ways, considering their moral foundations. PMID:28239251

  1. Use of nonpathogenic, green fluorescent protein-marked Escherichia coli Biotype I cultures to evaluate the self-cleansing capabilities of a commercial beef grinding system after a contamination event.

    PubMed

    Wages, Jennifer A; Williams, Jennifer; Adams, Jacquelyn; George, Bruce; Oxford, Eric; Zelenka, Dan

    2014-11-01

    Inoculated beef trim containing a cocktail of green fluorescent protein-marked Escherichia coli biotype I cultures as surrogates for E. coli O157:H7 was introduced into two large, commercial grinding facilities capable of producing 180,000 kg of ground product in 1 day. Three repetitions were performed over 3 days. Sampling occurred at three different points within the process: postprimary grind, postsecondary grind-blender, and postpackaging. The resulting data show that, as the inoculated meat passes through the system, the presence of the marked surrogate quickly diminishes. The depletion rates are directly related to the amount of product in kilograms (represented by time) that has passed through the system, but these rates vary with each step of the process. The primary grinder appears to rid itself of the contaminant most quickly; in all repetitions, the contaminant was not detected within 5 min of introduction of the contaminated combo bin into the system, which, in all cases, was prior to the introduction of a second combo bin and within 1,800 kg of product. After the blending step and subsequent secondary grinding, the contaminant was detected in product produced from both the parent combo bin and the combo bin added directly after the parent combo bin; however, for those days on which three combo bins (approximately 2,700 kg) were available for sampling, the contaminant was not detected in product representing the third combo bin. Similarly, at the packaging step, the contaminant was detected in the product produced by both the parent and second combo bins; however, on those days when a third combo bin was available for sampling (repetitions 2 and 3), the contaminant was not detected in product produced from the third combo bin.

  2. Small cell ovarian carcinoma: genomic stability and responsiveness to therapeutics.

    PubMed

    Gamwell, Lisa F; Gambaro, Karen; Merziotis, Maria; Crane, Colleen; Arcand, Suzanna L; Bourada, Valerie; Davis, Christopher; Squire, Jeremy A; Huntsman, David G; Tonin, Patricia N; Vanderhyden, Barbara C

    2013-02-21

    The biology of small cell ovarian carcinoma of the hypercalcemic type (SCCOHT), which is a rare and aggressive form of ovarian cancer, is poorly understood. Tumourigenicity, in vitro growth characteristics, genetic and genomic anomalies, and sensitivity to standard and novel chemotherapeutic treatments were investigated in the unique SCCOHT cell line, BIN-67, to provide further insight into the biology of this rare type of ovarian cancer. The tumourigenic potential of BIN-67 cells was determined, and the tumours formed in a xenograft model were compared to human SCCOHT. DNA sequencing, spectral karyotyping and high density SNP array analyses were performed. The sensitivity of the BIN-67 cells to standard chemotherapeutic agents and to vesicular stomatitis virus (VSV) and the JX-594 vaccinia virus was tested. BIN-67 cells were capable of forming spheroids in hanging drop cultures. When xenografted into immunodeficient mice, BIN-67 cells developed into tumours that reflected the hypercalcemia and histology of human SCCOHT, notably intense expression of WT-1 and vimentin, and lack of expression of inhibin. Somatic mutations in TP53 and the most common activating mutations in KRAS and BRAF were not found in BIN-67 cells by DNA sequencing. Spectral karyotyping revealed a largely normal diploid karyotype (in greater than 95% of cells) with a visibly shorter chromosome 20 contig. High density SNP array analysis also revealed few genomic anomalies in BIN-67 cells, which included loss of heterozygosity of an estimated 16.7 Mb interval on chromosome 20. SNP array analyses of four SCCOHT samples also indicated a low frequency of genomic anomalies in the majority of cases. Although resistant to platinum chemotherapeutic drugs, BIN-67 cell viability in vitro was reduced by > 75% after infection with oncolytic viruses. These results show that SCCOHT differs from high-grade serous carcinomas by exhibiting few chromosomal anomalies and lacking TP53 mutations.
Although BIN-67 cells are resistant to standard chemotherapeutic agents, their sensitivity to oncolytic viruses suggests that their therapeutic use in SCCOHT should be considered.

  3. Structural, electronic, vibrational and optical properties of Bin clusters

    NASA Astrophysics Data System (ADS)

    Liang, Dan; Shen, Wanting; Zhang, Chunfang; Lu, Pengfei; Wang, Shumin

    2017-10-01

    The neutral, anionic and cationic bismuth clusters with sizes n up to 14 are investigated using the B3LYP functional within density functional theory and the LANL2DZ basis set. From an analysis of the geometries of the Bin (n = 2-14) clusters, in which the cationic and anionic bismuth clusters are largely similar to the neutral ones, a periodic growth pattern is identified for the stable structures of bismuth clusters, whereby larger clusters form by adding units of one to four atoms to smaller ones. An even-odd alternation is shown in the properties of the clusters, such as the calculated binding energies and dissociation energies, as well as the frontier orbital energies, electron affinities and ionization energies. All the properties indicate that the Bi4 cluster is the species most likely to exist in bismuth-containing materials, which supports the most recent experiment. The orbital compositions, infrared and Raman activities and the ultraviolet absorption of the most probable tetramer bismuth cluster are given in detail to reveal the periodic tendency of adding bismuth atoms and the stability of the tetramer bismuth cluster.

  4. Improved workflow for quantification of left ventricular volumes and mass using free-breathing motion corrected cine imaging.

    PubMed

    Cross, Russell; Olivieri, Laura; O'Brien, Kendall; Kellman, Peter; Xue, Hui; Hansen, Michael

    2016-02-25

    Traditional cine imaging for cardiac functional assessment requires breath-holding, which can be problematic in some situations. Free-breathing techniques have relied on multiple averages or real-time imaging, producing images that can be spatially and/or temporally blurred. To overcome this, methods have been developed to acquire real-time images over multiple cardiac cycles, which are subsequently motion corrected and reformatted to yield a single image series displaying one cardiac cycle with high temporal and spatial resolution. Application of these algorithms has required significant additional reconstruction time. The use of distributed computing was recently proposed as a way to improve clinical workflow with such algorithms. In this study, we have deployed a distributed computing version of motion corrected re-binning reconstruction for free-breathing evaluation of cardiac function. Twenty five patients and 25 volunteers underwent cardiovascular magnetic resonance (CMR) for evaluation of left ventricular end-systolic volume (ESV), end-diastolic volume (EDV), and end-diastolic mass. Measurements using motion corrected re-binning were compared to those using breath-held SSFP and to free-breathing SSFP with multiple averages, and were performed by two independent observers. Pearson correlation coefficients and Bland-Altman plots tested agreement across techniques. Concordance correlation coefficient and Bland-Altman analysis tested inter-observer variability. Total scan plus reconstruction times were tested for significant differences using paired t-test. Measured volumes and mass obtained by motion corrected re-binning and by averaged free-breathing SSFP compared favorably to those obtained by breath-held SSFP (r = 0.9863/0.9813 for EDV, 0.9550/0.9685 for ESV, 0.9952/0.9771 for mass). Inter-observer variability was good with concordance correlation coefficients between observers across all acquisition types suggesting substantial agreement. 
Both motion corrected re-binning and averaged free-breathing SSFP acquisition and reconstruction times were shorter than breath-held SSFP techniques (p < 0.0001). On average, motion corrected re-binning required 3 min less than breath-held SSFP imaging, a 37% reduction in acquisition and reconstruction time. The motion corrected re-binning image reconstruction technique provides robust cardiac imaging that can be used for quantification that compares favorably to breath-held SSFP as well as multiple average free-breathing SSFP, but can be obtained in a fraction of the time when using cloud-based distributed computing reconstruction.

  5. An efficient computational approach to model statistical correlations in photon counting x-ray detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faby, Sebastian; Maier, Joscha; Sawall, Stefan

    2016-07-15

    Purpose: To introduce and evaluate an increment matrix approach (IMA) describing the signal statistics of energy-selective photon counting detectors, including spatial–spectral correlations between energy bins of neighboring detector pixels. The importance of the occurring correlations for image-based material decomposition is studied. Methods: An IMA describing the counter increase patterns in a photon counting detector is proposed. This IMA has the potential to decrease the number of required random numbers compared to Monte Carlo simulations by pursuing an approach based on convolutions. To validate and demonstrate the IMA, an approximate semirealistic detector model is provided, simulating a photon counting detector in a simplified manner, e.g., by neglecting count rate-dependent effects. In this way, the spatial–spectral correlations on the detector level are obtained and fed into the IMA. The importance of these correlations in reconstructed energy bin images and the corresponding detector performance in image-based material decomposition is evaluated using a statistically optimal decomposition algorithm. Results: The results of IMA together with the semirealistic detector model were compared to other models and measurements using the spectral response and the energy bin sensitivity, finding a good agreement. Correlations between the different reconstructed energy bin images could be observed, and turned out to be of a weak nature. These correlations were found to be not relevant in image-based material decomposition. An even simpler simulation procedure based on the energy bin sensitivity was tested instead and yielded similar results for the image-based material decomposition task, as long as the fact that one incident photon can increase multiple counters across neighboring detector pixels is taken into account. 
Conclusions: The IMA is computationally efficient, as it required about 10^2 random numbers per ray incident on a detector pixel instead of an estimated 10^8 random numbers per ray as Monte Carlo approaches would need. The spatial–spectral correlations as described by IMA are not important for the studied image-based material decomposition task. Respecting the absolute photon counts, and thus the multiple counter increases by a single x-ray photon, the same material decomposition performance could be obtained with a simpler detector description using the energy bin sensitivity.

  6. DNA barcoding for species identification in deep-sea clams (Mollusca: Bivalvia: Vesicomyidae).

    PubMed

    Liu, Jun; Zhang, Haibin

    2018-01-15

    Deep-sea clams (Bivalvia: Vesicomyidae) have been found in reduced environments throughout the world's oceans, but the taxonomy of this group remains confusing at the species and supraspecific levels due to their high morphological similarity and plasticity. In the present study, we collected mitochondrial COI sequences to evaluate the utility of DNA barcoding for identifying vesicomyid species. The COI dataset identified 56 well-supported putative species/operational taxonomic units (OTUs), approximately covering half of the extant vesicomyid species. One species (OTU2) was detected for the first time, and may represent a new species. Average distances between species ranged from 1.65 to 29.64%, generally higher than average intraspecific distances (0-1.41%) when excluding Pliocardia sp.10 cf. venusta (average intraspecific distance 1.91%). A local barcoding gap, assessed by comparing the maximum intraspecific with the minimum interspecific distance, existed in 33 of the 35 species examined, with two exceptions (Abyssogena southwardae and Calyptogena rectimargo-starobogatovi). The barcode index number (BIN) system determined 41 of the 56 species/OTUs, each with a unique BIN, indicating their validity. Three species were found to have two BINs, which, together with their high level of intraspecific variation, implies cryptic diversity within them. Although fewer 16S sequences were collected, similar results were obtained: nineteen putative species were determined and no overlap was observed between intra- and inter-specific variation. Implications of DNA barcoding for vesicomyid taxonomy are then discussed. The findings of this study provide important evidence for taxonomic revision in this problematic clam group, and will accelerate the discovery of new vesicomyid species in the future.
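The local barcoding gap criterion used above (maximum intraspecific distance smaller than minimum interspecific distance, per species) can be sketched as follows. The distance matrix and labels are made-up toy data, and the sketch assumes at least two species are present.

```python
def barcoding_gap(dist, labels):
    """For each species, return (max intraspecific distance,
    min interspecific distance). A local barcoding gap exists when
    the first is smaller than the second. Assumes >= 2 species."""
    out = {}
    n = len(labels)
    for sp in set(labels):
        intra, inter = [0.0], []  # 0.0 guard for single-individual species
        for i in range(n):
            for j in range(i + 1, n):
                if labels[i] != sp and labels[j] != sp:
                    continue  # pair does not involve this species
                if labels[i] == labels[j]:
                    intra.append(dist[i][j])
                else:
                    inter.append(dist[i][j])
        out[sp] = (max(intra), min(inter))
    return out

# Toy COI distance matrix: two 'A' individuals and one 'B'.
labels = ["A", "A", "B"]
dist = [
    [0.00, 0.01, 0.20],
    [0.01, 0.00, 0.22],
    [0.20, 0.22, 0.00],
]
gaps = barcoding_gap(dist, labels)
```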

  7. Survival and aging of a small laboratory population of a marine mollusc, Aplysia californica.

    PubMed

    Hirsch, H R; Peretz, B

    1984-09-01

    In an investigation of the postmetamorphic survival of a population of 112 Aplysia californica, five animals died before 100 days of age and five after 200 days. The number of survivors among the 102 animals which died between 100 and 220 days declined approximately linearly with age. The median age at death was 155 days. The animals studied were those that died of natural causes within a laboratory population that was established to provide Aplysia for sacrifice in an experimental program. Actuarial separation of the former group from the latter was justified by theoretical consideration. Age-specific mortality rates were calculated from the survival data. Statistical fluctuation arising from the small size of the population was reduced by grouping the data in bins of unequal age duration. The durations were specified such that each bin contained approximately the same number of data points. An algorithm for choosing the number of data bins was based on the requirement that the precision with which the age of a group is determined should equal the precision with which the number of deaths in the groups is known. The Gompertz and power laws of mortality were fitted to the age-specific mortality-rate data with equally good results. The positive values of slope associated with the mortality-rate functions as well as the linear shape of the curve of survival provide actuarial evidence that Aplysia age. Since Aplysia grow linearly without approaching a limiting size, the existence of senescence indicates especially clearly the falsity of Bidder's hypothesis that aging is a by-product of the cessation of growth.
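The equal-occupancy binning described above can be sketched briefly: bin edges are placed at quantiles of the ages at death so that each bin holds roughly the same number of deaths, and the age-specific mortality rate is then deaths per survivor per unit age. The toy ages below are synthetic, not the Aplysia data.

```python
import numpy as np

def equal_count_bins(ages, n_bins):
    """Bin edges at quantiles: unequal widths, roughly equal counts per bin."""
    edges = np.quantile(np.sort(ages), np.linspace(0, 1, n_bins + 1))
    counts, edges = np.histogram(ages, bins=edges)
    return counts, edges

def mortality_rates(ages, n_bins):
    ages = np.sort(np.asarray(ages, float))
    counts, edges = equal_count_bins(ages, n_bins)
    rates, alive = [], len(ages)
    for deaths, lo, hi in zip(counts, edges[:-1], edges[1:]):
        # deaths per survivor per unit age within the bin
        rates.append(deaths / (alive * (hi - lo)))
        alive -= deaths
    return rates, edges

rng = np.random.default_rng(0)
ages = rng.uniform(100, 220, 102)  # a linear survival decline ~ uniform death ages
rates, edges = mortality_rates(ages, 5)
```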

  8. Comprehensive DNA barcoding of the herpetofauna of Germany.

    PubMed

    Hawlitschek, O; Morinière, J; Dunz, A; Franzen, M; Rödder, D; Glaw, F; Haszprunar, G

    2016-01-01

    We present the first comprehensive DNA barcoding study of German reptiles and amphibians representing likewise the first on the European herpetofauna. A total of 248 barcodes for all native species and subspecies in the country and a few additional taxa were obtained in the framework of the projects 'Barcoding Fauna Bavarica' (BFB) and 'German Barcode of Life' (GBOL). In contrast to many invertebrate groups, the success rate of the identification of mitochondrial lineages representing species via DNA barcode was almost 100% because no cases of Barcode Index Number (BIN) sharing were detected within German native reptiles and amphibians. However, as expected, a reliable identification of the hybridogenetic species complex in the frog genus Pelophylax was not possible. Deep conspecific lineages resulting in the identification of more than one BIN were found in Lissotriton vulgaris, Natrix natrix and the hybridogenetic Pelophylax complex. A high variety of lineages with different BINs was also found in the barcodes of wall lizards (Podarcis muralis), confirming the existence of many introduced lineages and the frequent occurrence of multiple introductions. Besides the reliable species identification of all life stages and even of tissue remains, our study highlights other potential applications of DNA barcoding concerning German amphibians and reptiles, such as the detection of allochthonous lineages, monitoring of gene flow and also noninvasive sampling via environmental DNA. DNA barcoding based on COI has now proven to be a reliable and efficient tool for studying most amphibians and reptiles as it is already for many other organism groups in zoology. © 2015 John Wiley & Sons Ltd.

  9. 40 CFR 1037.520 - Modeling CO2 emissions to show compliance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW HEAVY-DUTY MOTOR VEHICLES Test and Modeling..., expressed in m2 and rounded to two decimal places. Where we allow you to group multiple configurations... bin based on the drag area bin of an equivalent high-roof tractor. If the high-roof tractor is in Bin...

  10. 40 CFR 1037.520 - Modeling CO2 emissions to show compliance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW HEAVY-DUTY MOTOR VEHICLES Test and Modeling... to two decimal places. Where we allow you to group multiple configurations together, measure the drag... the drag area bin of an equivalent high-roof tractor. If the high-roof tractor is in Bin I or Bin II...

  11. 40 CFR 1037.520 - Modeling CO2 emissions to show compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW HEAVY-DUTY MOTOR VEHICLES Test and Modeling... to two decimal places. Where we allow you to group multiple configurations together, measure the drag... the drag area bin of an equivalent high-roof tractor. If the high-roof tractor is in Bin I or Bin II...

  12. A compost bin for handling privy wastes: its fabrication and use

    Treesearch

    R.E. Leonard; S.C. Fay

    1978-01-01

    A 24-ft³ (0.68-m³) fiberglass bin was constructed and tested for its effectiveness in composting privy wastes. A mixture of ground hardwood bark and raw sewage was used for composting. Temperatures in excess of 60°C for 36 hours were produced in the bin by aerobic, thermophilic composting. This temperature is...

  13. Development of a Novel Therapeutic Paradigm Utilizing a Mammary Gland-Targeted, Bin-1 Knockout Mouse Model

    DTIC Science & Technology

    2007-03-01

    Cell. Biol. 23, 4295 (Jun, 2003). Bin1 Ablation in Mammary Gland Delays Tissue Remodeling and Drives Cancer Progression Mee Young Chang, 1...Basu A, et al. Bin1 functionally interacts with Myc in cells and inhibits cell proliferation by multiple mechanisms. Oncogene 1999;18:3564–73. 5. Pineda

  14. Causes of Students' Violence at Al-Hussein Bin Talal University

    ERIC Educational Resources Information Center

    Alrawwad, Theeb M.; Alrfooh, Atif Eid

    2014-01-01

    This study aimed to identify the causes of students' violence from the students' point of view and to investigate proper solutions to reduce the spread of violence at Al-Hussein Bin Talal University. The study sample consisted of (906) male and female students from Al-Hussein Bin Talal University who enrolled in the summer…

  15. Conservative Bin-to-Bin Fractional Collisions

    DTIC Science & Technology

    2016-06-28

    BIN FRACTIONAL COLLISIONS Robert Martin ERC INC., SPACECRAFT PROPULSION BRANCH AIR FORCE RESEARCH LABORATORY EDWARDS AIR FORCE BASE, CA USA 30th...IMPORTANCE OF COLLISION PHYSICS Important Collisions in Spacecraft Propulsion : Discharge and Breakdown in FRC Collisional Radiative Cooling/Ionization...UNLIMITED; PA #16326 3 / 18 IMPORTANCE OF COLLISION PHYSICS Important Collisions in Spacecraft Propulsion : Discharge and Breakdown in FRC Collisional

  16. Development and preliminary evaluation of a new bin filler for apple harvesting and infield sorting

    USDA-ARS?s Scientific Manuscript database

    The bin filler, which is used for filling the fruit container or bin with apples coming from the sorting system, plays a critical role for the self-propelled apple harvest and infield sorting (HIS) machine that is being developed in our laboratory. Two major technical challenges in developing the bi...

  17. A Software Assurance Framework for Mitigating the Risks of Malicious Software in Embedded Systems Used in Aircraft

    DTIC Science & Technology

    2011-09-01

    to show cryptographic signature # generation on a UNIX system # SHA=/bin/sha256 CSDB=/tmp/csdb CODEBASE=. touch "$CSDB" find "$CODEBASE" -type f...artifacts generated earlier. #!/bin/sh # # Demo program to show cryptographic signature # verification on a UNIX system # SHA=/bin/sha256 CSDB=/tmp

  18. 14. OBLIQUE VIEW OF UPPER ORE BIN AND LOADING DECK, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. OBLIQUE VIEW OF UPPER ORE BIN AND LOADING DECK, LOOKING WEST. DETAIL OF SUPPORTING TIMBERS. THE LOCATION OF THIS ORE BIN IN RELATION TO THE MILL CAN BE SEEN IN MANY OF THE MILL OVERVIEWS. (CA-290-4 THROUGH CA-290-8). - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  19. COCACOLA: binning metagenomic contigs using sequence COmposition, read CoverAge, CO-alignment and paired-end read LinkAge.

    PubMed

    Lu, Yang Young; Chen, Ting; Fuhrman, Jed A; Sun, Fengzhu

    2017-03-15

    The advent of next-generation sequencing technologies enables researchers to sequence complex microbial communities directly from the environment. Because assembly typically produces only genome fragments, also known as contigs, instead of an entire genome, it is crucial to group them into operational taxonomic units (OTUs) for further taxonomic profiling and downstream functional analysis. OTU clustering is also referred to as binning. We present COCACOLA, a general framework that automatically bins contigs into OTUs based on sequence composition and coverage across multiple samples. The effectiveness of COCACOLA is demonstrated on both simulated and real datasets in comparison with state-of-the-art binning approaches such as CONCOCT, GroopM, MaxBin and MetaBAT. The superior performance of COCACOLA relies on two aspects. One is using the L1 distance instead of the Euclidean distance for better taxonomic identification during initialization. More importantly, COCACOLA takes advantage of both hard clustering and soft clustering through sparsity regularization. In addition, the COCACOLA framework seamlessly incorporates customized knowledge to improve binning accuracy. In our study, we have investigated two types of additional knowledge, co-alignment to reference genomes and linkage of contigs provided by paired-end reads, as well as the ensemble of both. We find that both co-alignment and linkage information further improve binning in the majority of cases. COCACOLA is scalable and faster than CONCOCT, GroopM, MaxBin and MetaBAT. The software is available at https://github.com/younglululu/COCACOLA . fsun@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
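The design choice this abstract highlights — nearest-centre assignment under the L1 distance rather than the Euclidean distance — can be illustrated with a minimal sketch (not COCACOLA's actual code). The feature vectors stand in for contig composition-plus-coverage profiles; all values are toy data.

```python
import numpy as np

def assign_bins(features, centres, metric="l1"):
    """Assign each contig feature vector to the nearest cluster centre."""
    # features: (n_contigs, n_dims); centres: (k, n_dims)
    diff = features[:, None, :] - centres[None, :, :]
    if metric == "l1":
        d = np.abs(diff).sum(axis=2)          # L1 (Manhattan) distance
    else:
        d = np.sqrt((diff ** 2).sum(axis=2))  # Euclidean distance
    return d.argmin(axis=1)

features = np.array([[0.1, 0.9], [0.2, 0.8], [0.9, 0.1]])
centres = np.array([[0.15, 0.85], [0.85, 0.15]])
print(assign_bins(features, centres))  # [0 0 1]
```

The L1 distance is less dominated by a single large coordinate difference than the Euclidean distance, which is the robustness property the abstract credits for better initialization.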

  20. Solar Radiation Pressure Binning for the Geosynchronous Orbit

    NASA Technical Reports Server (NTRS)

    Hejduk, M. D.; Ghrist, R. W.

    2011-01-01

    Orbital maintenance parameters for individual satellites or groups of satellites have traditionally been set by examining orbital parameters alone, such as through apogee and perigee height binning; this approach ignored the other factors that governed an individual satellite's susceptibility to non-conservative forces. In the atmospheric drag regime, this problem has been addressed by the introduction of the "energy dissipation rate," a quantity that represents the amount of energy being removed from the orbit; such an approach is able to consider both atmospheric density and satellite frontal area characteristics and thus serve as a mechanism for binning satellites of similar behavior. The geo-synchronous orbit (of broader definition than the geostationary orbit -- here taken to be from 1300 to 1800 minutes in orbital period) is not affected by drag; rather, its principal non-conservative force is that of solar radiation pressure -- the momentum imparted to the satellite by solar radiometric energy. While this perturbation is solved for as part of the orbit determination update, no binning or division scheme, analogous to the drag regime, has been developed for the geo-synchronous orbit. The present analysis has begun such an effort by examining the behavior of geosynchronous rocket bodies and non-stabilized payloads as a function of solar radiation pressure susceptibility. A preliminary examination of binning techniques used in the drag regime gives initial guidance regarding the criteria for useful bin divisions. Applying these criteria to the object type, solar radiation pressure, and resultant state vector accuracy for the analyzed dataset, a single division of "large" satellites into two bins for the purposes of setting related sensor tasking and orbit determination (OD) controls is suggested. When an accompanying analysis of high area-to-mass objects is complete, a full set of binning recommendations for the geosynchronous orbit will be available.

  1. Vicarious revenge and the death of Osama bin Laden.

    PubMed

    Gollwitzer, Mario; Skitka, Linda J; Wisneski, Daniel; Sjöström, Arne; Liberman, Peter; Nazir, Syed Javed; Bushman, Brad J

    2014-05-01

    Three hypotheses were derived from research on vicarious revenge and tested in the context of the assassination of Osama bin Laden in 2011. In line with the notion that revenge aims at delivering a message (the "message hypothesis"), Study 1 shows that Americans' vengeful desires in the aftermath of 9/11 predicted a sense of justice achieved after bin Laden's death, and that this effect was mediated by perceptions that his assassination sent a message to the perpetrators to not "mess" with the United States. In line with the "blood lust hypothesis," his assassination also sparked a desire to take further revenge and to continue the "war on terror." Finally, in line with the "intent hypothesis," Study 2 shows that Americans (but not Pakistanis or Germans) considered the fact that bin Laden was killed intentionally more satisfactory than the possibility of bin Laden being killed accidentally (e.g., in an airplane crash).

  2. Garbage monitoring system using IoT

    NASA Astrophysics Data System (ADS)

    Anitha, A.

    2017-11-01

    Nowadays, various actions are being taken to improve the level of cleanliness in the country. People are becoming more active in doing everything possible to clean their surroundings, and various movements have also been started by the government to increase cleanliness. We aim to build a system that notifies the municipal corporation to empty a bin on time. In this system, a sensor placed on top of the garbage bin detects the level of garbage inside it relative to the total size of the bin. When the garbage reaches the maximum level, a notification is sent to the corporation's office so that employees can take further action to empty the bin. This system will help in cleaning the city in a better way: instead of checking every bin manually, staff receive a notification whenever a bin is full.
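The notification logic described above can be sketched in a few lines, assuming a top-mounted ultrasonic sensor that reports the distance (in cm) down to the garbage surface. The function names and the 90% threshold are illustrative, not from the paper.

```python
def fill_percent(bin_depth_cm, distance_cm):
    """Convert a top-mounted distance reading into a fill percentage."""
    level = max(0.0, min(bin_depth_cm, bin_depth_cm - distance_cm))
    return 100.0 * level / bin_depth_cm

def check_bin(bin_depth_cm, distance_cm, threshold=90.0):
    """Return an alert string once the fill level crosses the threshold."""
    pct = fill_percent(bin_depth_cm, distance_cm)
    if pct >= threshold:
        return f"ALERT: bin {pct:.0f}% full, schedule pickup"
    return f"bin {pct:.0f}% full"

print(check_bin(100, 5))   # ALERT: bin 95% full, schedule pickup
print(check_bin(100, 60))  # bin 40% full
```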

  3. Deterministically swapping frequency-bin entanglement from photon-photon to atom-photon hybrid systems

    NASA Astrophysics Data System (ADS)

    Ou, Bao-Quan; Liu, Chang; Sun, Yuan; Chen, Ping-Xing

    2018-02-01

    Inspired by the recent developments of the research on the atom-photon quantum interface and energy-time entanglement between single-photon pulses, we are motivated to study the deterministic protocol for the frequency-bin entanglement of the atom-photon hybrid system, which is analogous to the frequency-bin entanglement between single-photon pulses. We show that such entanglement arises naturally in considering the interaction between a frequency-bin entangled single-photon pulse pair and a single atom coupled to an optical cavity, via straightforward atom-photon phase gate operations. Its anticipated properties and preliminary examples of its potential application in quantum networking are also demonstrated. Moreover, we construct a specific quantum entanglement witness tool to detect such extended frequency-bin entanglement from a reasonably general set of separable states, and prove its capability theoretically. We focus on the energy-time considerations throughout the analysis.

  4. Real-time hyperspectral imaging for food safety applications

    USDA-ARS?s Scientific Manuscript database

    Multispectral imaging systems with selected bands can commonly be used for real-time applications of food processing. Recent research has demonstrated several image processing methods including binning, noise removal filter, and appropriate morphological analysis in real-time mode can remove most fa...

  5. Multibin long-range correlations

    NASA Astrophysics Data System (ADS)

    Bialas, A.; Zalewski, K.

    2011-06-01

    A new method to study the long-range correlations in multiparticle production is developed. It is proposed to measure the joint factorial moments or cumulants of multiplicity distribution in several (more than two) bins. It is shown that this step dramatically increases the discriminative power of data.
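The multibin observable proposed above can be illustrated with a toy computation: the normalized joint factorial moment ⟨n₁n₂n₃⟩ / (⟨n₁⟩⟨n₂⟩⟨n₃⟩) of multiplicities recorded event by event in three bins. The synthetic data below use a common fluctuating source strength to induce long-range correlations; everything here is illustrative, not the paper's formalism in full.

```python
import numpy as np

rng = np.random.default_rng(1)
n_events = 50_000
# a common fluctuating "source" strength correlates all three bins
source = rng.gamma(shape=4.0, scale=1.0, size=n_events)
n = rng.poisson(lam=np.outer(source, [1.0, 1.0, 1.0]))  # (events, 3 bins)

mean_prod = n.prod(axis=1).mean()          # <n1 n2 n3>
prod_means = n.mean(axis=0).prod()         # <n1><n2><n3>
F = mean_prod / prod_means
print(F > 1.0)  # correlated bins give F above the independent-bin value of 1
```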

  6. Resolving prokaryotic taxonomy without rRNA: longer oligonucleotide word lengths improve genome and metagenome taxonomic classification.

    PubMed

    Alsop, Eric B; Raymond, Jason

    2013-01-01

    Oligonucleotide signatures, especially tetranucleotide signatures, have been used as a method for homology binning by exploiting an organism's inherent biases towards the use of specific oligonucleotide words. Tetranucleotide signatures have been especially useful for environmental metagenomics samples, as many of these samples contain organisms from poorly classified phyla which cannot be easily identified using traditional homology methods, including NCBI BLAST. This study examines oligonucleotide signatures across 1,424 completed genomes from across the tree of life, substantially expanding upon previous work. A comprehensive analysis of mononucleotide through nonanucleotide word lengths suggests that longer word lengths substantially improve the classification of DNA fragments across a range of sizes of relevance to high-throughput sequencing. We find that, at present, heptanucleotide signatures represent an optimal balance between prediction accuracy and computational time for resolving taxonomy using both genomic and metagenomic fragments. We directly compare the ability of tetranucleotide and heptanucleotide word lengths (tetranucleotide signatures are the current standard for oligonucleotide word usage analyses) for taxonomic binning of metagenome reads. We present evidence that heptanucleotide word lengths consistently provide more taxonomic resolving power, particularly in distinguishing between closely related organisms that are often present in metagenomic samples. This implies that longer oligonucleotide word lengths should replace tetranucleotide signatures for most analyses. Finally, we show that the application of longer word lengths to metagenomic datasets leads to more accurate taxonomic binning of DNA scaffolds and has the potential to substantially improve taxonomic assignment and assembly of metagenomic data.
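Oligonucleotide-signature binning as described above can be sketched compactly (this is illustrative, not the paper's pipeline): each sequence is mapped to a normalized vector of overlapping k-mer frequencies, and a fragment is assigned to the reference with the closest signature. The reference sequences below are toy data.

```python
from collections import Counter
import math

def signature(seq, k):
    """Normalized frequencies of overlapping k-mer words."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def distance(sig_a, sig_b):
    """Euclidean distance between two sparse signature vectors."""
    words = set(sig_a) | set(sig_b)
    return math.sqrt(sum((sig_a.get(w, 0) - sig_b.get(w, 0)) ** 2 for w in words))

def classify(fragment, references, k):
    """Bin a fragment with the reference genome of closest signature."""
    frag_sig = signature(fragment, k)
    return min(references, key=lambda name: distance(frag_sig, signature(references[name], k)))

refs = {"genome_A": "ATATATATGCGCATATATAT" * 20,
        "genome_B": "GGCCGGCCTTAAGGCCGGCC" * 20}
fragment = "ATATATGCGCATAT" * 5
print(classify(fragment, refs, k=4))  # genome_A
```

Raising k (e.g. from 4 to 7) enlarges the word space from 256 to 16,384 possible words, which is the source of the extra resolving power the study reports, at the cost of sparser counts on short fragments.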

  7. Comparison through a prospective and randomized study of two replenishment methods at polyvalent hospitalization units with two-bin storage systems

    PubMed

    Bernal, José Luis; Mera-Flores, Ana María; Baena Lázaro, Pedro Pablo; Sebastián Viana, Tomás

    2017-11-27

    Two-bin storage systems increase nursing staff satisfaction and decrease inventories, but the implications of having logistics staff determine replenishment needs are unknown. This study aimed to evaluate whether entrusting this responsibility to logistics staff at polyvalent hospitalization units with two-bin storage is associated with a higher risk of outstanding orders. This was a prospective randomized experiment with masking. Outstanding orders were the response variable; assessments made by the logistics staff were assigned to the intervention group and those made by the nursing staff to the control group. Concordance between observers was analyzed using the Bland-Altman method; the difference between groups, with the Mann-Whitney U test; and the cumulative incidence of outstanding orders and their relative risk were calculated. The mean amount requested by the logistics and nursing staff was 29.9 (SD: 167.4) and 36 (SD: 190) units, respectively; the mean difference between observers was 6.11 (SD: 128.95) units, and no significant differences were found between groups (p = 0.430). The incidence of outstanding orders was 0.64% in the intervention group and 0.15% in the control group; the relative risk was 2.31 (0.83-6.48), and the number of cases required for an outstanding order was 516. The relative risk of outstanding orders is not associated with the category of the staff that identifies replenishment needs at polyvalent hospitalization units.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lazaris, Andreas; Hwang, Kristy S.; Goukasian, Naira

    Objective: We investigated the association between apoE protein plasma levels and brain amyloidosis and the effect of the top 10 Alzheimer disease (AD) risk genes on this association. Methods: Our dataset consisted of 18 AD, 52 mild cognitive impairment, and 3 cognitively normal Alzheimer's Disease Neuroimaging Initiative 1 (ADNI1) participants with available [11C]-Pittsburgh compound B (PiB) and peripheral blood protein data. We used cortical pattern matching to study associations between plasma apoE and cortical PiB binding and the effect of carrier status for the top 10 AD risk genes. Results: Low plasma apoE was significantly associated with high PiB SUVR, except in the sensorimotor and entorhinal cortex. For BIN1 rs744373, the association was observed only in minor allele carriers. For CD2AP rs9349407 and CR1 rs3818361, the association was preserved only in minor allele noncarriers. We did not find evidence for modulation by CLU, PICALM, ABCA7, BIN1, and MS4A6A. Conclusions: Our data show that BIN1 rs744373, CD2AP rs9349407, and CR1 rs3818361 genotypes modulate the association between apoE protein plasma levels and brain amyloidosis, implying a potential epigenetic/downstream interaction.

  9. Getting coal to go with the flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumbaugh, G.D.

    1984-01-01

    There are three accepted methods of recovering storage piles: surface reclaiming, sub-grade hopper sections or bins, and flat surface storage with ground-level ports. In general, the decision to use either approach is a matter of economics, reliability, labor intensity, and other related practical factors. The concept of induced vertical flow of bulk solids was initiated in 1962 with the birth of the bin activator. Its performance was at times questionable until the elusive cycle-type operation was finally discovered. This solved the problems of coupling induced vertical flow units with feeders. Surprisingly, an operator in a cement plant was the first to demonstrate this principle of operation in 1965, but it needed at least five more years to be fully understood. The storage pile discharger with its drawdown skirt and unique stroke action was developed out of sheer necessity in 1964. However, it was not until 1979 that the railcar discharger was introduced. Frankly, it took that long to recognize that a railcar could be temporarily converted to a huge rectangular-shaped activated bin! Significantly, all induced vertical flow units are designed and operated for the sole purpose of bulk solid storage withdrawal. They have no other function. For many reasons, the successful evolution of the concept of induced vertical flow of bulk solids has been one of more perspiration than of meditation. Armed with time-proven application guidelines and cycle-type operation to minimize the effects of feeder flow streams, bin activators, activated bins, storage pile dischargers, and railcar dischargers can be applied confidently and predictably.

  10. Development of a global aerosol model using a two-dimensional sectional method: 1. Model design

    NASA Astrophysics Data System (ADS)

    Matsui, H.

    2017-08-01

    This study develops an aerosol module, the Aerosol Two-dimensional bin module for foRmation and Aging Simulation version 2 (ATRAS2), and implements the module into a global climate model, Community Atmosphere Model. The ATRAS2 module uses a two-dimensional (2-D) sectional representation with 12 size bins for particles from 1 nm to 10 μm in dry diameter and 8 black carbon (BC) mixing state bins. The module can explicitly calculate the enhancement of absorption and cloud condensation nuclei activity of BC-containing particles by aging processes. The ATRAS2 module is an extension of a 2-D sectional aerosol module ATRAS used in our previous studies within a framework of a regional three-dimensional model. Compared with ATRAS, the computational cost of the aerosol module is reduced by more than a factor of 10 by simplifying the treatment of aerosol processes and 2-D sectional representation, while maintaining good accuracy of aerosol parameters in the simulations. Aerosol processes are simplified for condensation of sulfate, ammonium, and nitrate, organic aerosol formation, coagulation, and new particle formation processes, and box model simulations show that these simplifications do not substantially change the predicted aerosol number and mass concentrations and their mixing states. The 2-D sectional representation is simplified (the number of advected species is reduced) primarily by the treatment of chemical compositions using two interactive bin representations. The simplifications do not change the accuracy of global aerosol simulations. In part 2, comparisons with measurements and the results focused on aerosol processes such as BC aging processes are shown.
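The 2-D sectional representation described above — 12 logarithmically spaced dry-diameter bins from 1 nm to 10 µm crossed with 8 black-carbon mixing-state bins — can be sketched as a simple bin-index lookup. The edge placement below is illustrative (even log spacing for size, even spacing in BC mass fraction), not necessarily ATRAS2's exact grid.

```python
import numpy as np

size_edges = np.logspace(np.log10(1e-9), np.log10(1e-5), 13)  # 12 size bins, 1 nm - 10 um
bc_edges = np.linspace(0.0, 1.0, 9)                           # 8 BC mass-fraction bins

def bin_index(dry_diameter_m, bc_mass_fraction):
    """Map a particle to its (size bin, BC mixing-state bin) cell."""
    i = np.clip(np.searchsorted(size_edges, dry_diameter_m, side="right") - 1, 0, 11)
    j = np.clip(np.searchsorted(bc_edges, bc_mass_fraction, side="right") - 1, 0, 7)
    return int(i), int(j)

print(bin_index(1.5e-7, 0.3))  # a 150 nm particle with 30% BC by mass -> (6, 2)
```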

  11. Looking through the same lens: Shear calibration for LSST, Euclid, and WFIRST with stage 4 CMB lensing

    NASA Astrophysics Data System (ADS)

    Schaan, Emmanuel; Krause, Elisabeth; Eifler, Tim; Doré, Olivier; Miyatake, Hironao; Rhodes, Jason; Spergel, David N.

    2017-06-01

    The next-generation weak lensing surveys (i.e., LSST, Euclid, and WFIRST) will require exquisite control over systematic effects. In this paper, we address shear calibration and present the most realistic forecast to date for LSST/Euclid/WFIRST and CMB lensing from a stage 4 CMB experiment ("CMB S4"). We use the cosmolike code to simulate a joint analysis of all the two-point functions of galaxy density, galaxy shear, and CMB lensing convergence. We include the full Gaussian and non-Gaussian covariances and explore the resulting joint likelihood with Monte Carlo Markov chains. We constrain shear calibration biases while simultaneously varying cosmological parameters, galaxy biases, and photometric redshift uncertainties. We find that CMB lensing from CMB S4 enables the calibration of the shear biases down to 0.2%-3% in ten tomographic bins for LSST (below the ˜0.5 % requirements in most tomographic bins), down to 0.4%-2.4% in ten bins for Euclid, and 0.6%-3.2% in ten bins for WFIRST. For a given lensing survey, the method works best at high redshift where shear calibration is otherwise most challenging. This self-calibration is robust to Gaussian photometric redshift uncertainties and to a reasonable level of intrinsic alignment. It is also robust to changes in the beam and the effectiveness of the component separation of the CMB experiment, and slowly dependent on its depth, making it possible with third-generation CMB experiments such as AdvACT and SPT-3G, as well as the Simons Observatory.

  12. Multi-Level Bitmap Indexes for Flash Memory Storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Madduri, Kamesh; Canon, Shane

    2010-07-23

    Due to their low access latency, high read speed, and power-efficient operation, flash memory storage devices are rapidly emerging as an attractive alternative to traditional magnetic storage devices. However, tests show that the most efficient indexing methods are not able to take advantage of the flash memory storage devices. In this paper, we present a set of multi-level bitmap indexes that can effectively take advantage of flash storage devices. These indexing methods use coarsely binned indexes to answer queries approximately, and then use finely binned indexes to refine the answers. Our new methods read significantly lower volumes of data at the expense of an increased disk access count, thus taking full advantage of the improved read speed and low access latency of flash devices. To demonstrate the advantage of these new indexes, we measure their performance on a number of storage systems using a standard data warehousing benchmark called the Set Query Benchmark. We observe that multi-level strategies on flash drives are up to 3 times faster than traditional indexing strategies on magnetic disk drives.
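The coarse-then-refine idea in the abstract above can be illustrated with a minimal sketch (not the paper's bitmap implementation): a coarse binned index answers a range query approximately, and only rows in the single boundary bin need a "fine" check against the actual values.

```python
import bisect

class TwoLevelBinnedIndex:
    def __init__(self, values, coarse_edges):
        self.values = values
        self.edges = coarse_edges
        self.bins = [[] for _ in range(len(coarse_edges) + 1)]
        for row, v in enumerate(values):
            self.bins[bisect.bisect_right(coarse_edges, v)].append(row)

    def query_less_than(self, x):
        b = bisect.bisect_right(self.edges, x)
        hits = [r for bin_rows in self.bins[:b] for r in bin_rows]  # bins fully inside
        candidates = self.bins[b]                                   # boundary bin only
        refined = [r for r in candidates if self.values[r] < x]     # the "fine" check
        return sorted(hits + refined), len(candidates)

vals = [3, 17, 8, 25, 14, 1, 22, 9]
idx = TwoLevelBinnedIndex(vals, coarse_edges=[10, 20])
rows, checked = idx.query_less_than(15)
print(rows, checked)  # [0, 2, 4, 5, 7] 2
```

Only the two rows in the boundary bin (values between 10 and 20) were examined individually; everything else was answered from the coarse level, which is the data-volume saving the paper exploits on flash storage.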

  13. Taking Halo-Independent Dark Matter Methods Out of the Bin

    DOE PAGES

    Fox, Patrick J.; Kahn, Yonatan; McCullough, Matthew

    2014-10-30

    We develop a new halo-independent strategy for analyzing emerging DM hints, utilizing the method of extended maximum likelihood. This approach does not require the binning of events, making it uniquely suited to the analysis of emerging DM direct detection hints. It determines a preferred envelope, at a given confidence level, for the DM velocity integral which best fits the data using all available information and can be used even in the case of a single anomalous scattering event. All of the halo-independent information from a direct detection result may then be presented in a single plot, allowing simple comparisons between multiple experiments. This results in the halo-independent analogue of the usual mass and cross-section plots found in typical direct detection analyses, where limit curves may be compared with best-fit regions in halo-space. The method is straightforward to implement, using already-established techniques, and its utility is demonstrated through the first unbinned halo-independent comparison of the three anomalous events observed in the CDMS-Si detector with recent limits from the LUX experiment.

  14. 16 CFR § 1301.6 - Test conditions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Test conditions. § 1301.6 Section § 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  15. 29 CFR 1917.49 - Spouts, chutes, hoppers, bins, and associated equipment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the entry; and (2) The power supply to the equipment carrying the cargo to the bin shall be turned off... been notified of the entry; (2) The power supply to the equipment carrying the cargo to the bin is... adjustments are made to a power shovel, wire, or associated equipment, the power supply to the shovel shall be...

  16. 29 CFR 1917.49 - Spouts, chutes, hoppers, bins, and associated equipment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the entry; and (2) The power supply to the equipment carrying the cargo to the bin shall be turned off... been notified of the entry; (2) The power supply to the equipment carrying the cargo to the bin is... adjustments are made to a power shovel, wire, or associated equipment, the power supply to the shovel shall be...

  17. DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN. CONVEYOR PLATFORM,TRAM TRESTLE, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN. CONVEYOR PLATFORM,TRAM TRESTLE, AND LOADING PLATFORM. LOOKING SOUTHWEST. THE HOLE IN THE ORE BIN FLOOR CAN BE SEEN, AND BALL MILL FOUNDATION AT LOWER LEFT CORNER. SEE CA-291-47(CT) FOR IDENTICAL COLOR TRANSPARENCY. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  18. DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN, CONVEYOR PLATFORM TRAM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN, CONVEYOR PLATFORM TRAM TRESTLE, AND LOADING PLATFORM, LOOKING SOUTHWEST. THE HOLE IN THE ORE BIN FLOOR CAN BE SEEN, AND BALL MILL FOUNDATION AT LOWER LEFT CORNER. SEE CA-291-13 FOR IDENTICAL B&W NEGATIVE. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  19. 19 CFR 19.29 - Sealing of bins or other bonded space.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Bonded for the Storage of Wheat § 19.29 Sealing of bins or other bonded space. The outlets to all bins or other space bonded for the storage of imported wheat shall be sealed by affixing locks or in bond seals... which will effectively prevent the removal of, or access to, the wheat in the bonded space except under...

  20. Multi-step-ahead Method for Wind Speed Prediction Correction Based on Numerical Weather Prediction and Historical Measurement Data

    NASA Astrophysics Data System (ADS)

    Wang, Han; Yan, Jie; Liu, Yongqian; Han, Shuang; Li, Li; Zhao, Jing

    2017-11-01

    Increasing the accuracy of wind speed prediction lays a solid foundation for reliable wind power forecasting. Most traditional correction methods for wind speed prediction establish a mapping between the wind speed of the numerical weather prediction (NWP) and the historical measurement data (HMD) at the corresponding time slot, ignoring the time-dependent structure of the wind speed time series. In this paper, a multi-step-ahead wind speed prediction correction method is proposed that accounts for the passing effect of the wind speed at the previous time slot. To this end, the proposed method employs both NWP and HMD as model inputs, with HMD also serving as the training labels. First, a probabilistic analysis of the NWP deviation for different wind speed bins is presented to illustrate the inadequacy of the traditional time-independent mapping strategy. Then, a support vector machine (SVM) is used to implement the proposed mapping strategy and to establish the correction model for each wind speed bin. A wind farm in northern China is used to validate the proposed method, with three benchmark wind speed prediction methods used for comparison. The results show that the proposed model performs best over all tested time horizons.
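
    The time-dependent mapping described above can be sketched in a few lines. This is an illustration only, not the paper's exact model: the synthetic data, SVR settings, and feature choice (NWP at time t plus measured speed at t-1) are assumptions standing in for the study's NWP/HMD inputs.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Synthetic stand-in data (assumption): an autocorrelated measured wind
# speed series and a biased, noisy NWP forecast of it.
measured = 8 + 2 * np.sin(np.linspace(0, 20, 500)) + rng.normal(0, 0.5, 500)
nwp = measured + 1.0 + rng.normal(0, 1.0, 500)   # NWP with bias and noise

# Features: NWP at time t and the measured speed at t-1 (HMD),
# capturing the passing effect from the previous time slot.
X = np.column_stack([nwp[1:], measured[:-1]])
y = measured[1:]                       # training labels: measured speed at t

model = SVR(kernel="rbf", C=10.0).fit(X[:400], y[:400])
corrected = model.predict(X[400:])

# The corrected forecast should beat the raw NWP on held-out data.
raw_rmse = np.sqrt(np.mean((nwp[401:] - measured[401:]) ** 2))
corr_rmse = np.sqrt(np.mean((corrected - measured[401:]) ** 2))
print(raw_rmse, corr_rmse)
```

    In practice one such model would be trained per wind speed bin, as the abstract describes.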

  1. Time-frequency analysis-based time-windowing algorithm for the inverse synthetic aperture radar imaging of ships

    NASA Astrophysics Data System (ADS)

    Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong

    2018-01-01

    An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is the one with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatterer. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatterer is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies with the largest amplitudes in the time-frequency spectrum at each moment of this range bin is used to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulated and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image contrast-based algorithm (ICBA) are provided. The proposed algorithm achieves image contrast similar to, and entropy lower than, the ICBA.
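
    The range-bin selection step can be sketched as follows. The dominance criterion used here (a single strong spectral line relative to the mean spectrum) is an illustrative stand-in, not the paper's actual criterion, and the data are synthetic.

```python
import numpy as np

def select_range_bin(echoes, dominance_ratio=3.0):
    """Pick the range bin with the largest mean amplitude among bins
    whose slow-time signal looks dominated by a single scatterer.

    echoes: complex array of shape (n_range_bins, n_pulses). The
    dominance test (peak spectral line vs. mean spectrum) is a
    simplified stand-in for the paper's judgment criterion.
    """
    amp = np.abs(echoes)
    candidates = []
    for i in range(echoes.shape[0]):
        spec = np.abs(np.fft.fft(echoes[i]))
        if spec.max() > dominance_ratio * spec.mean():   # one strong line
            candidates.append(i)
    if not candidates:
        return None
    # Among the candidates, choose the bin with maximum mean amplitude.
    return max(candidates, key=lambda i: amp[i].mean())

# Example: bin 2 holds a strong single-frequency (dominant) scatterer.
n = np.random.default_rng(1).normal
t = np.arange(256)
data = n(0, 0.1, (5, 256)) + 1j * n(0, 0.1, (5, 256))
data[2] += 5 * np.exp(1j * 2 * np.pi * 32 * t / 256)
print(select_range_bin(data))   # 2
```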

  2. Fungal volatiles associated with moldy grain in ventilated and non-ventilated bin-stored wheat.

    PubMed

    Sinha, R N; Tuma, D; Abramson, D; Muir, W E

    1988-01-01

    The fungal odor compounds 3-methyl-1-butanol, 1-octen-3-ol and 3-octanone were monitored in nine experimental bins in Winnipeg, Manitoba containing a hard red spring wheat during the autumn, winter and summer seasons of 1984-85. Quality changes were associated with seed-borne microflora and moisture content in both ventilated and non-ventilated bins containing wheat of 15.6 and 18.2% initial moisture content. All three odor compounds occurred in considerably greater amounts in bulk wheat in non-ventilated than in ventilated bins, particularly in those with wheat having 18.2% moisture content. The presence of these compounds usually coincided with infection of the seeds by the fungi Alternaria alternata (Fr.) Keissler, Aspergillus repens de Bary, A. versicolor (Vuill.) Tiraboschi, Penicillium crustosum Thom, P. oxalicum Currie and Thom, P. aurantiogriseum Dierckx, and P. citrinum Thom. High production of all three odor compounds in damp wheat stored in non-ventilated bins was associated with heavy fungal infection of the seeds and reduction in seed germinability. High initial moisture content of the harvested grain accelerated the production of all three fungal volatiles in non-ventilated bins.

  3. Redshift distributions of galaxies in the Dark Energy Survey Science Verification shear catalogue and implications for weak lensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonnett, C.; Troxel, M. A.; Hartley, W.

    We present photometric redshift estimates for galaxies used in the weak lensing analysis of the Dark Energy Survey Science Verification (DES SV) data. Four model- or machine learning-based photometric redshift methods (annz2, bpz calibrated against BCC-UFig simulations, skynet, and tpz) are analysed. For training, calibration, and testing of these methods, we also construct a catalogue of spectroscopically confirmed galaxies matched against DES SV data. The performance of the methods is evaluated against the matched spectroscopic catalogue, focusing on metrics relevant for weak lensing analyses, with additional validation against COSMOS photo-zs. From the galaxies in the DES SV shear catalogue, which have mean redshift 0.72 ± 0.01 over the range 0.3 < z < 1.3, we construct three tomographic bins with mean redshifts z = {0.45, 0.67, 1.00}. These bins each have systematic uncertainties δz ≲ 0.05 in the mean of the fiducial skynet photo-z n(z). We propagate the errors in the redshift distributions through to their impact on cosmological parameters estimated with cosmic shear, and find that they cause shifts in the value of σ8 of approximately 3%. This shift is within the one-sigma statistical errors on σ8 for the DES SV shear catalogue. We also find that the potential impact of systematic differences on the critical surface density, Σ_crit, is safely below the statistical power of the DES SV data. We recommend a final Gaussian prior for the photo-z bias in the mean of n(z) of width 0.05 for each of the three tomographic bins, and show that this is a sufficient bias model for the corresponding cosmology analysis.

  4. NNvPDB: Neural Network based Protein Secondary Structure Prediction with PDB Validation.

    PubMed

    Sakthivel, Seethalakshmi; S K M, Habeeb

    2015-01-01

    The predicted secondary structural states are not cross-validated by any of the existing servers; hence, information on the level of accuracy for every sequence is not reported. This is overcome by NNvPDB, which not only reports a greater Q3 but also validates every prediction against homologous PDB entries. NNvPDB is based on the concept of a neural network, with a new and different approach of training the network each time with five PDB structures that are similar to the query sequence. The average accuracy for helix is 76%, for beta sheet 71%, and overall (helix, sheet and coil) 66%. http://bit.srmuniv.ac.in/cgi-bin/bit/cfpdb/nnsecstruct.pl.

  5. Dark Energy Survey Year 1 results: cross-correlation redshifts - methods and systematics characterization

    NASA Astrophysics Data System (ADS)

    Gatti, M.; Vielzeuf, P.; Davis, C.; Cawthon, R.; Rau, M. M.; DeRose, J.; De Vicente, J.; Alarcon, A.; Rozo, E.; Gaztanaga, E.; Hoyle, B.; Miquel, R.; Bernstein, G. M.; Bonnett, C.; Carnero Rosell, A.; Castander, F. J.; Chang, C.; da Costa, L. N.; Gruen, D.; Gschwend, J.; Hartley, W. G.; Lin, H.; MacCrann, N.; Maia, M. A. G.; Ogando, R. L. C.; Roodman, A.; Sevilla-Noarbe, I.; Troxel, M. A.; Wechsler, R. H.; Asorey, J.; Davis, T. M.; Glazebrook, K.; Hinton, S. R.; Lewis, G.; Lidman, C.; Macaulay, E.; Möller, A.; O'Neill, C. R.; Sommer, N. E.; Uddin, S. A.; Yuan, F.; Zhang, B.; Abbott, T. M. C.; Allam, S.; Annis, J.; Bechtol, K.; Brooks, D.; Burke, D. L.; Carollo, D.; Carrasco Kind, M.; Carretero, J.; Cunha, C. E.; D'Andrea, C. B.; DePoy, D. L.; Desai, S.; Eifler, T. F.; Evrard, A. E.; Flaugher, B.; Fosalba, P.; Frieman, J.; García-Bellido, J.; Gerdes, D. W.; Goldstein, D. A.; Gruendl, R. A.; Gutierrez, G.; Honscheid, K.; Hoormann, J. K.; Jain, B.; James, D. J.; Jarvis, M.; Jeltema, T.; Johnson, M. W. G.; Johnson, M. D.; Krause, E.; Kuehn, K.; Kuhlmann, S.; Kuropatkin, N.; Li, T. S.; Lima, M.; Marshall, J. L.; Melchior, P.; Menanteau, F.; Nichol, R. C.; Nord, B.; Plazas, A. A.; Reil, K.; Rykoff, E. S.; Sako, M.; Sanchez, E.; Scarpine, V.; Schubnell, M.; Sheldon, E.; Smith, M.; Smith, R. C.; Soares-Santos, M.; Sobreira, F.; Suchyta, E.; Swanson, M. E. C.; Tarle, G.; Thomas, D.; Tucker, B. E.; Tucker, D. L.; Vikram, V.; Walker, A. R.; Weller, J.; Wester, W.; Wolf, R. C.

    2018-06-01

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing source galaxies from the Dark Energy Survey Year 1 sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We apply the method to two photo-z codes run in our simulated data: Bayesian Photometric Redshift and Directional Neighbourhood Fitting. We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering versus photo-zs. The systematic uncertainty in the mean redshift bias of the source galaxy sample is Δz ≲ 0.02, though the precise value depends on the redshift bin under consideration. We discuss possible ways to mitigate the impact of our dominant systematics in future analyses.

  6. Weather data for simplified energy calculation methods. Volume IV. United States: WYEC data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olsen, A.R.; Moreno, S.; Deringer, J.

    The objective of this report is to provide a source of weather data for direct use with a number of simplified energy calculation methods available today. Complete weather data for a number of cities in the United States are provided for use in the following methods: degree hour, modified degree hour, bin, modified bin, and variable degree day. This report contains sets of weather data for 23 cities using Weather Year for Energy Calculations (WYEC) source weather data. Considerable overlap is present in cities (21) covered by both the TRY and WYEC data. The weather data at each city has been summarized in a number of ways to provide differing levels of detail necessary for alternative simplified energy calculation methods. Weather variables summarized include dry bulb and wet bulb temperature, percent relative humidity, humidity ratio, wind speed, percent possible sunshine, percent diffuse solar radiation, total solar radiation on horizontal and vertical surfaces, and solar heat gain through standard DSA glass. Monthly and annual summaries, in some cases by time of day, are available. These summaries are produced in a series of nine computer generated tables.
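
    The bin method named above estimates heating energy by weighting the building's heat loss in each outdoor-temperature bin by the number of hours observed in that bin. A minimal worked example; the UA value, balance-point temperature, and bin counts below are made-up illustrative numbers, not data from the report.

```python
# Simplified bin-method heating estimate (illustrative values only).
UA = 500.0          # building heat-loss coefficient, W/K (assumption)
t_balance = 18.0    # balance-point temperature, deg C (assumption)

# (bin centre temperature in deg C, hours per year in that bin) - made up
bins = [(-10, 100), (-5, 300), (0, 800), (5, 1200),
        (10, 1500), (15, 1300), (20, 900)]

# Heat loss accrues only when the outdoor bin is below the balance point.
heating_kwh = sum(
    UA * max(t_balance - t, 0.0) * hours / 1000.0   # W*h -> kWh
    for t, hours in bins
)
print(heating_kwh)   # 27800.0
```

    The modified bin and degree-hour methods refine the same hours-weighted summation with solar and internal-gain corrections.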

  7. Construction technique of disposable bin from sludge cake and its environmental risk.

    PubMed

    Kongmuang, Udomsak; Kiykaew, Duangta; Morioka, Ikuharu

    2015-01-01

    Many researchers have tried to make recycled rigid materials from the sludge cake produced by paper mills in order to decrease its volume. In this study, the researchers tried to make a disposable bin economically and to examine whether it is toxic to the outside environment. To make a disposable bin, the researchers used the sludge cake; a plastic basket as a fixed mold; white cloth or newspaper as a removable supporter wrapped around the mold; and latex or plaster as a binder. The strength of the samples was measured by tensile-stress testing. Water absorption was evaluated by the Cobb test. A leaching test and a seed germination test were selected as toxicological tests. It was possible to form the disposable bin from the cleaned sludge cake. Judging from the tensile-stress results, the bins seemed safe for carrying garbage in industry. Some of them showed low water absorption (high water resistance) in the Cobb test. The leaching test showed only small amounts of three heavy metals, lead, nickel and copper, in the leachate. The seed germination test suggested no adverse effect of the bins in clay and sand on tomato growth. These results suggest that the bins have good strength, sufficient water resistance and no toxicological effect on the environment. This new recycled bin may help solve the environmental and health problems of disposing of sludge cake.

  8. Sensitivity of alpine and subalpine lakes to acidification from atmospheric deposition in Grand Teton National Park and Yellowstone National Park, Wyoming

    USGS Publications Warehouse

    Nanus, Leora; Campbell, Donald H.; Williams, Mark W.

    2005-01-01

    The sensitivity of 400 lakes in Grand Teton and Yellowstone National Parks to acidification from atmospheric deposition of nitrogen and sulfur was estimated based on statistical relations between acid-neutralizing capacity concentrations and basin characteristics, to aid in the design of a long-term monitoring plan for Outstanding Natural Resource Waters. Acid-neutralizing capacity concentrations that were measured at 52 lakes in Grand Teton and 23 lakes in Yellowstone during synoptic surveys were used to calibrate the statistical models. Three acid-neutralizing capacity concentration bins were selected that are within the U.S. Environmental Protection Agency criteria of sensitive to acidification: less than 50 microequivalents per liter (µeq/L) (0-50), less than 100 µeq/L (0-100), and less than 200 µeq/L (0-200). The use of discrete bins gives resource managers the ability to change criteria based on the focus of their study. Basin-characteristic information was derived from Geographic Information System data sets. The explanatory variables that were considered included bedrock type, basin slope, basin aspect, basin elevation, lake area, basin area, inorganic nitrogen deposition, sulfate deposition, hydrogen ion deposition, basin precipitation, soil type, and vegetation type. A logistic regression model was developed and applied to lake basins greater than 1 hectare in Grand Teton (n = 106) and Yellowstone (n = 294). A higher percentage of lakes in Grand Teton than in Yellowstone were predicted to be sensitive to atmospheric deposition in all three bins.
For Grand Teton, 7 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-50 bin, 36 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-100 bin, and 59 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-200 bin. The elevation of the lake outlet and the area of the basin with northeast aspects were determined to be statistically significant and were used as the explanatory variables in the multivariate logistic regression model for the 0-100 bin. For Yellowstone, results indicated that 13 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-100 bin, and 27 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-200 bin. Only the elevation of the lake outlet was determined to be statistically significant and was used as the explanatory variable for the 0-100 bin. The lakes that exceeded 60-percent probability of having an acid-neutralizing capacity concentration in the 0-100 bin, and therefore had the greatest sensitivity to acidification from atmospheric deposition, are located at elevations greater than 2,790 meters in Grand Teton, and greater than 2,590 meters in Yellowstone.
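
    The modelling step can be sketched generically: a logistic regression predicts the probability that a lake's acid-neutralizing capacity falls in a sensitivity bin from a basin characteristic, and lakes above the 60-percent probability threshold are flagged. The elevations, labels, and single-predictor setup below are made-up illustrative data, not the study's calibration set.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Made-up calibration data: lake-outlet elevation (km) and whether the
# measured ANC fell in the 0-100 ueq/L bin (higher lakes more sensitive).
elev = rng.uniform(2.0, 3.5, 200)
sensitive = (elev + rng.normal(0, 0.15, 200) > 2.7).astype(int)

model = LogisticRegression().fit(elev.reshape(-1, 1), sensitive)

# Predict the probability of ANC < 100 ueq/L for unmeasured lakes and
# flag those exceeding the 60-percent threshold used in the study.
new_elev = np.array([[2.4], [2.8], [3.2]])
prob = model.predict_proba(new_elev)[:, 1]
flagged = new_elev[prob > 0.6].ravel()
print(prob, flagged)
```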

  9. The Relationship between Mono-abundance and Mono-age Stellar Populations in the Milky Way Disk

    NASA Astrophysics Data System (ADS)

    Minchev, I.; Steinmetz, M.; Chiappini, C.; Martig, M.; Anders, F.; Matijevic, G.; de Jong, R. S.

    2017-01-01

    Studying the Milky Way disk structure using stars in narrow bins of [Fe/H] and [α/Fe] has recently been proposed as a powerful method to understand the Galactic thick and thin disk formation. It has been assumed so far that these mono-abundance populations (MAPs) are also coeval, or mono-age, populations. Here we study this relationship for a Milky Way chemodynamical model and show that equivalence between MAPs and mono-age populations exists only for the high-[α/Fe] tail, where the chemical evolution curves of different Galactic radii are far apart. At lower [α/Fe]-values an MAP is composed of stars with a range in ages, even for small observational uncertainties and a small MAP bin size. Due to the disk inside-out formation, for these MAPs younger stars are typically located at larger radii, which results in negative radial age gradients that can be as large as 2 Gyr kpc-1. Positive radial age gradients can result for MAPs at the lowest [α/Fe] and highest [Fe/H] end. Such variations with age prevent the simple interpretation of observations for which accurate ages are not available. Studying the variation with radius of the stellar surface density and scale height in our model, we find good agreement to recent analyses of the APOGEE red-clump (RC) sample when 1-4 Gyr old stars dominate (as expected for the RC). Our results suggest that the APOGEE data are consistent with a Milky Way model for which mono-age populations flare for all ages. We propose observational tests for the validity of our predictions and argue that using accurate age measurements, such as from asteroseismology, is crucial for putting constraints on Galactic formation and evolution.

  10. Exploiting Defect Clustering to Screen Bare Die for Infant Mortality Failure: An Experimental Study

    NASA Technical Reports Server (NTRS)

    Lakin, David R., II; Singh, Adit D.

    1999-01-01

    We present the first experimental results to establish that a binning strategy based on defect clustering can be used to screen bare die for early life failures. The data for this study comes from the SEMATECH test methods experiment.

  11. Multifractal characteristics of multiparticle production in heavy-ion collisions at SPS energies

    NASA Astrophysics Data System (ADS)

    Khan, Shaista; Ahmad, Shakeel

    Entropy, dimensions and other multifractal characteristics of multiplicity distributions of relativistic charged hadrons produced in ion-ion collisions at SPS energies are investigated. The analysis of the experimental data is carried out in terms of the phase-space bin-size dependence of multiplicity distributions following Takagi's approach. Another method is also followed to study the multifractality which is not related to the bin width and/or the detector resolution, but instead involves the multiplicity distribution of charged particles in full phase space in terms of information entropy and its generalization, Rényi's order-q information entropy. The findings reveal the presence of multifractal structure, a remarkable property of the fluctuations. Nearly constant values of the multifractal specific heat “c” estimated by the two different methods of analysis indicate that the parameter “c” may be used as a universal characteristic of particle production in high energy collisions. The results obtained from the analysis of the experimental data agree well with the predictions of the Monte Carlo model AMPT.
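
    Rényi's order-q information entropy, the bin-free quantity used above, is H_q = ln(Σ p_i^q)/(1 − q), reducing to the Shannon entropy as q → 1. A minimal sketch for a multiplicity distribution (the example distributions are illustrative, not SPS data):

```python
import numpy as np

def renyi_entropy(p, q):
    """Renyi entropy H_q = ln(sum p_i**q) / (1 - q) for q != 1;
    the q -> 1 limit is the Shannon entropy -sum p_i ln p_i."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** q)) / (1.0 - q)

# For a uniform multiplicity distribution over N outcomes, H_q = ln N
# for every order q.
p_uniform = np.full(8, 1 / 8)
print(renyi_entropy(p_uniform, 2.0))   # ln 8
print(renyi_entropy(p_uniform, 1.0))   # ln 8

# A peaked distribution has lower entropy, and H_q decreases with q.
p_peaked = np.array([0.7, 0.1, 0.1, 0.05, 0.05])
print(renyi_entropy(p_peaked, 0.5) > renyi_entropy(p_peaked, 2.0))  # True
```

    The q-dependence of H_q is exactly what carries the multifractal information: for a monofractal distribution H_q is independent of q.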

  12. "Observation Obscurer" - Time Series Viewer, Editor and Processor

    NASA Astrophysics Data System (ADS)

    Andronov, I. L.

    The program described here contains a set of subroutines suitable for fast viewing and interactive filtering and processing of regularly and irregularly spaced time series. Being a 32-bit DOS application, it may be used as a default fast viewer/editor of time series in any computer shell ("commander") or in Windows. It allows one to view the data in "time" or "phase" mode; to remove ("obscure") or filter outstanding bad points; to make scale transformations and smoothing using several methods (e.g. the mean with phase binning, determination of the statistically optimal number of phase bins, or a "running parabola" fit (Andronov, 1997, As. Ap. Suppl., 125, 207)); and to carry out time series analysis using methods such as correlation, autocorrelation and histogram analysis, and determination of extrema. Some features have been developed specially for variable star observers, e.g. the barycentric correction and the creation and fast analysis of "O-C" diagrams. The manual for "hot keys" is presented. The computer code was compiled with 32-bit Free Pascal (www.freepascal.org).
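
    The "mean with phase binning" smoothing mentioned above folds a time series on a known period and averages the points falling in each phase bin. A minimal sketch in Python (the original program is Pascal; function and array names here are illustrative):

```python
import numpy as np

def phase_binned_means(times, values, period, n_bins=10):
    """Fold a time series on `period` and return the mean value in
    each of `n_bins` equal phase bins (NaN for empty bins)."""
    phase = (times / period) % 1.0
    idx = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    means = np.full(n_bins, np.nan)
    for b in range(n_bins):
        sel = idx == b
        if sel.any():
            means[b] = values[sel].mean()
    return means

# Irregularly sampled sinusoid with period 2.5: the binned curve
# recovers the underlying phased variation.
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 50, 400))
y = np.sin(2 * np.pi * t / 2.5) + rng.normal(0, 0.1, 400)
curve = phase_binned_means(t, y, 2.5, n_bins=10)
print(curve.round(2))
```

    Choosing the statistically optimal number of phase bins, as the program does, trades bin-count resolution against the noise of each bin mean.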

  13. Developing and Evaluating Prototype of Waste Volume Monitoring Using Internet of Things

    NASA Astrophysics Data System (ADS)

    Fathhan Arief, Mohamad; Lumban Gaol, Ford

    2017-06-01

    In Indonesia, and especially in Jakarta, garbage is often strewn about, which is an eyesore and also a source of pollution that can carry disease. One cause is that bins overflow when full and can no longer accommodate the waste people dump into them. The authors therefore created a more systematic method for waste disposal. This new method requires supporting technology, so the authors built a prototype for waste volume monitoring. Using the Internet of Things, the prototype can notify the sanitation agency that waste in a trash bin needs to be collected. In this study, a waste volume monitoring prototype was designed and built using an Arduino-based LinkIt ONE board and an ultrasonic sensor as the sensing device. Once the prototype was completed, it was evaluated to determine whether it functions properly. The results show that the expected functions of the waste volume monitoring prototype work well.
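
    The core logic of such a monitor is simple: the lid-mounted ultrasonic sensor measures the distance down to the garbage surface, and the fill percentage follows from the bin depth. A hypothetical sketch; the bin depth, threshold, and function names are made up, not taken from the paper:

```python
def fill_percent(distance_cm, bin_depth_cm=100.0):
    """Convert an ultrasonic lid-to-garbage distance into a fill level,
    clamped to the 0-100% range."""
    level = (bin_depth_cm - distance_cm) / bin_depth_cm * 100.0
    return max(0.0, min(100.0, level))

def needs_collection(distance_cm, threshold=80.0):
    """Decide whether to notify the sanitation agency (made-up threshold)."""
    return fill_percent(distance_cm) >= threshold

print(fill_percent(25.0))       # 75.0
print(needs_collection(25.0))   # False
print(needs_collection(10.0))   # True
```

    On the device itself this decision would run on the LinkIt ONE and trigger a network notification rather than a print.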

  14. BLIND EXTRACTION OF AN EXOPLANETARY SPECTRUM THROUGH INDEPENDENT COMPONENT ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waldmann, I. P.; Tinetti, G.; Hollis, M. D. J.

    2013-03-20

    Blind-source separation techniques are used to extract the transmission spectrum of the hot-Jupiter HD189733b recorded by the Hubble/NICMOS instrument. Such a 'blind' analysis of the data is based on the concept of independent component analysis. The detrending of Hubble/NICMOS data using the sole assumption that nongaussian systematic noise is statistically independent from the desired light-curve signals is presented. By not assuming any prior or auxiliary information but the data themselves, it is shown that spectroscopic errors only about 10%-30% larger than parametric methods can be obtained for 11 spectral bins with bin sizes of ≈0.09 µm. This represents a reasonable trade-off between a higher degree of objectivity for the non-parametric methods and smaller standard errors for the parametric detrending. Results are discussed in light of previous analyses published in the literature. The fact that three very different analysis techniques yield comparable spectra is a strong indication of the stability of these results.
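
    Independent component analysis, the core of the blind detrending above, separates linearly mixed signals under the sole assumption that the sources are statistically independent and non-Gaussian. A generic sketch with scikit-learn's FastICA on synthetic mixtures; this illustrates the technique only and is not the NICMOS pipeline:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 1000)

# Two independent non-Gaussian sources, observed only as linear
# mixtures in three "channels" (e.g. detector pixels) - assumptions.
s1 = np.sin(3 * t)                                   # periodic signal
s2 = 2 * ((1.9 * t) % 1.0) - 1.0                     # sawtooth systematic
S = np.column_stack([s1, s2])
A = np.array([[1.0, 0.4], [0.7, 1.0], [0.3, 0.8]])   # mixing matrix
X = S @ A.T + rng.normal(0, 1e-3, (1000, 3))         # observed mixtures

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)                     # estimated sources

# Each recovered component matches one true source up to the usual
# ICA ambiguities of sign, scale, and ordering.
corr = np.corrcoef(np.column_stack([S, recovered]).T)[:2, 2:]
print(np.abs(corr).max(axis=1))
```

    In the detrending application, the component resembling the transit light curve is kept and the independent systematic components are discarded.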

  15. Statistical technique for analysing functional connectivity of multiple spike trains.

    PubMed

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Benefits of a Hospital Two-Bin Kanban System

    DTIC Science & Technology

    2014-09-01

    Thesis Advisor: Nedialko Dimitrov; Co-Advisor: Rachel Silvestrini; Second Reader: Michael Dixon. ... stocks split between primary (in front) and secondary bins (directly behind). RFID tags are placed on the front of each bin. Photos taken at WRNMMC.

  17. SeaQuaKE: Sea-optimized Quantum Key Exchange

    DTIC Science & Technology

    2015-01-01

    ... of photon pairs in both polarization [3] and time-bin [4] degrees of freedom simultaneously. Entanglement analysis components in both the ... greater throughput per entangled photon pair compared to alternative sources that encode in only a ... [Figure residue: hyperentangled photon-pair source combining polarization entanglement and pair generation with time-bin multiplexing; noted trade-offs include wavelength availability, power, and pulse rate.]

  18. Optical Extinction Measurements of Dust Density in the GMRO Regolith Test Bin

    NASA Technical Reports Server (NTRS)

    Lane, J.; Mantovani, J.; Mueller, R.; Nugent, M.; Nick, A.; Schuler, J.; Townsend, I.

    2016-01-01

    A regolith simulant test bin was constructed and completed in the Granular Mechanics and Regolith Operations (GMRO) Lab in 2013. This Planetary Regolith Test Bed (PRTB) is a 64 sq m x 1 m deep test bin housed in a climate-controlled facility, and contains 120 MT of lunar-regolith simulant, called Black Point-1 or BP-1, from Black Point, AZ. One of the current uses of the test bin is to study the effects of difficult lighting and dust conditions on telerobotic perception systems, to better assess and refine regolith operations for asteroid, Mars and polar lunar missions. Low illumination and low angle-of-incidence lighting pose significant problems to computer vision and human perception. Levitated dust on asteroids interferes with imaging and degrades depth perception, and dust storms on Mars pose a significant problem. Due to these factors, the likely performance of telerobotics is poorly understood for future missions; current space telerobotic systems are only operated in bright lighting and dust-free conditions. This technology development testing will identify: (1) the impact of degraded lighting and environmental dust on computer vision and operator perception, (2) potential methods and procedures for mitigating these impacts, and (3) requirements for telerobotic perception systems for asteroid capture, Mars dust storms and lunar regolith ISRU missions. In order to solve some of the telerobotic perception system problems, a plume erosion sensor (PES) was developed in the Lunar Regolith Simulant Bin (LRSB), containing 2 MT of JSC-1a lunar simulant. The PES is simply a laser and digital camera with a white target. Two modes of operation have been investigated: (1) single laser spot, where the brightness of the spot depends on the optical extinction due to dust and is thus an indirect measure of particle number density, and (2) side-scatter, where the camera images the laser from the side, showing beam entrance into the dust cloud and the boundary between dust and void.
    Both methods must assume a mean particle size in order to extract a number density. The optical extinction measurement yields the product of the 2nd moment of the particle size distribution and the extinction efficiency Qe. For particle sizes in the range of interest (greater than 1 micrometer), Qe is approximately equal to 2. Scaling up of the PES single laser and camera system is underway in the PRTB, where an array of lasers penetrates a controlled dust cloud, illuminating multiple targets. Using high-speed HD GoPro video cameras, the evolution of the dust cloud and particle number density can be studied in detail.
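
    The single-spot measurement is an application of the Beer-Lambert law: transmission falls exponentially with the product of number density, extinction cross-section, and path length. A sketch under the stated assumptions (Qe ≈ 2; the mean particle radius, transmission, and path length below are made-up example values):

```python
import math

def number_density(transmission, path_m, radius_m, q_ext=2.0):
    """Invert Beer-Lambert: T = exp(-n * sigma_ext * L), with
    sigma_ext = q_ext * pi * r**2 for particles larger than ~1 um."""
    sigma_ext = q_ext * math.pi * radius_m ** 2
    return -math.log(transmission) / (sigma_ext * path_m)

# Example: beam attenuated to 60% over 2 m, assuming 10-um-radius dust.
n = number_density(0.60, path_m=2.0, radius_m=10e-6)
print(f"{n:.3e} particles per cubic metre")

# Round trip: recompute the transmission from the inferred density.
sigma = 2.0 * math.pi * (10e-6) ** 2
print(round(math.exp(-n * sigma * 2.0), 2))   # 0.6
```

    As the abstract notes, the result is only as good as the assumed mean particle size, since extinction actually constrains the second moment of the size distribution times Qe.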

  1. Multi-dimensional photonic states from a quantum dot

    NASA Astrophysics Data System (ADS)

    Lee, J. P.; Bennett, A. J.; Stevenson, R. M.; Ellis, D. J. P.; Farrer, I.; Ritchie, D. A.; Shields, A. J.

    2018-04-01

    Quantum states superposed across multiple particles or degrees of freedom offer an advantage in the development of quantum technologies. Creating these states deterministically and with high efficiency is an ongoing challenge. A promising approach is the repeated excitation of multi-level quantum emitters, which have been shown to naturally generate light with quantum statistics. Here we describe how to create one class of higher dimensional quantum state, a so called W-state, which is superposed across multiple time bins. We do this by repeated Raman scattering of photons from a charged quantum dot in a pillar microcavity. We show this method can be scaled to larger dimensions with no reduction in coherence or single-photon character. We explain how to extend this work to enable the deterministic creation of arbitrary time-bin encoded qudits.

  2. Project Worm Bin.

    ERIC Educational Resources Information Center

    McGuire, Daniel C.

    1987-01-01

    Describes a project centering around earthworm activity in a compost bin. Includes suggestions for exercises involving biological and conservation concepts, gardening skills, and dramatical presentations. (ML)

  3. Qatar: Background and U.S. Relations

    DTIC Science & Technology

    2014-01-30

    Industry Mohammed bin Saleh al Sada Minister of State for Defense Affairs MG Hamad bin Ali Al Attiyah Chief of Staff, Qatari Armed Forces MG Ghanim bin...May 2008 after concerns about voter franchise extension were resolved.5 The Advisory Council would have oversight authority over the Council of...have a more lasting impact on the region, but has challenged the traditional Qatari preference for remaining engaged with all sides in regional

  4. Afghanistan, the Taliban, and Osama bin Laden: The Background to September 11

    ERIC Educational Resources Information Center

    Social Education, 2011

    2011-01-01

    On May 1, 2011, a group of U.S. soldiers boarded helicopters at a base in Afghanistan, hoping to find a man named Osama bin Laden. Bin Laden, the leader of the al Qaeda terrorist network, was responsible for a number of terrorist attacks around the world, including those of September 11, 2001, that killed nearly 3,000 people in the United States.…

  5. Characterization of photon-counting multislit breast tomosynthesis.

    PubMed

    Berggren, Karl; Cederström, Björn; Lundqvist, Mats; Fredenberg, Erik

    2018-02-01

    It has been shown that breast tomosynthesis may improve sensitivity and specificity compared to two-dimensional mammography, resulting in increased detection rates of cancers or lowered call-back rates. The purpose of this study is to characterize a spectral photon-counting multislit breast tomosynthesis system that is able to do single-scan spectral imaging with multiple collimated x-ray beams. The system differs in many aspects from conventional tomosynthesis using energy-integrating flat-panel detectors. The investigated system was a prototype consisting of a dual-threshold photon-counting detector with 21 collimated line detectors scanning across the compressed breast. A review of the system is given in terms of detector, acquisition geometry, and reconstruction methods. Three reconstruction methods were used: simple back-projection, filtered back-projection, and an iterative algebraic reconstruction technique. The image quality was evaluated by measuring the modulation transfer function (MTF), normalized noise power spectrum, detective quantum efficiency (DQE), and artifact spread function (ASF) on reconstructed spectral tomosynthesis images for a total-energy bin (defined by a low-energy threshold calibrated to remove electronic noise) and for a high-energy bin (with a threshold calibrated to split the spectrum in roughly equal parts). Acquisition was performed using a 29 kVp W/Al x-ray spectrum at a 0.24 mGy exposure. The difference in MTF between the two energy bins was negligible; that is, there was no energy dependence of resolution. The MTF dropped to 50% at 1.5 lp/mm to 2.3 lp/mm in the scan direction and 2.4 lp/mm to 3.3 lp/mm in the slit direction, depending on the reconstruction method. The full width at half maximum of the ASF ranged from 13.8 mm to 18.0 mm for the different reconstruction methods. The zero-frequency DQE of the system was found to be 0.72. 
The fraction of counts in the high-energy bin was measured to be 59% of the total detected spectrum. Scan times ranged from 4 s to 16.5 s depending on voltage and current settings. The characterized system generates spectral tomosynthesis images with a dual-energy photon-counting detector. Measurements show a high DQE, enabling high image quality at a low dose, which is beneficial for low-dose applications such as screening. The single-scan spectral images open up applications such as quantitative material decomposition and contrast-enhanced tomosynthesis. © 2017 American Association of Physicists in Medicine.
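
    The two thresholds can be illustrated with a toy dual-threshold counting sketch; the spectrum shape and keV thresholds below are assumptions for illustration, not the prototype's calibration:

    ```python
    import numpy as np

    # Dual-threshold photon counting: a low threshold rejects electronic noise
    # (defining the total-energy bin) and a high threshold splits the detected
    # spectrum roughly in half (defining the high-energy bin).
    rng = np.random.default_rng(2)
    energies_kev = rng.normal(20.0, 5.0, 100_000)      # toy detected x-ray spectrum
    low_thr, high_thr = 5.0, 20.0                      # illustrative thresholds (keV)
    total_bin = energies_kev[energies_kev > low_thr]   # electronic noise removed
    high_bin = energies_kev[energies_kev > high_thr]   # upper part of the spectrum
    fraction_high = high_bin.size / total_bin.size
    ```

    With a real polychromatic spectrum the high threshold is tuned until this fraction is near one half, which is the calibration the abstract describes.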

  6. An Accurate and Efficient Algorithm for Detection of Radio Bursts with an Unknown Dispersion Measure, for Single-dish Telescopes and Interferometers

    NASA Astrophysics Data System (ADS)

    Zackay, Barak; Ofek, Eran O.

    2017-01-01

    Astronomical radio signals are subjected to phase dispersion while traveling through the interstellar medium. To optimally detect a short-duration signal within a frequency band, we have to precisely compensate for the unknown pulse dispersion, which is a computationally demanding task. We present the “fast dispersion measure transform” algorithm for optimal detection of such signals. Our algorithm has a low theoretical complexity of 2N_fN_t + N_tN_Δ log2(N_f), where N_f, N_t, and N_Δ are the numbers of frequency bins, time bins, and dispersion measure bins, respectively. Unlike previously suggested fast algorithms, our algorithm conserves the sensitivity of brute-force dedispersion. Our tests indicate that this algorithm, running on a standard desktop computer and implemented in a high-level programming language, is already faster than the state-of-the-art dedispersion codes running on graphical processing units (GPUs). We also present a variant of the algorithm that can be efficiently implemented on GPUs. The latter algorithm’s computation and data-transport requirements are similar to those of a two-dimensional fast Fourier transform, indicating that incoherent dedispersion can now be considered a nonissue while planning future surveys. We further present a fast algorithm for sensitive detection of pulses shorter than the dispersive smearing limits of incoherent dedispersion. In typical cases, this algorithm is orders of magnitude faster than enumerating dispersion measures and coherently dedispersing by convolution. We analyze the computational complexity of pulsed signal searches by radio interferometers. We conclude that, using our suggested algorithms, maximally sensitive blind searches for dispersed pulses are feasible using existing facilities. We provide an implementation of these algorithms in Python and MATLAB.
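
    For context, the brute-force incoherent dedispersion baseline that the fast transform improves on (cost O(N_f N_t N_Δ)) can be sketched as follows; this is illustrative only and is not the paper's algorithm:

    ```python
    import numpy as np

    def dedisperse(dynspec, freqs_mhz, trial_dms, dt_s):
        """Brute-force incoherent dedispersion: for each trial DM, undo the
        cold-plasma delay of every frequency channel and sum over frequency.

        Cost is O(Nf * Nt * N_DM), the baseline the FDMT reduces to
        2*Nf*Nt + Nt*N_DM*log2(Nf).
        """
        out = np.zeros((len(trial_dms), dynspec.shape[1]))
        f_ref = freqs_mhz.max()
        for i, dm in enumerate(trial_dms):
            # channel delay (s) relative to the top of the band, using the
            # standard dispersion constant 4.149e3 s MHz^2 pc^-1 cm^3
            delays = 4.149e3 * dm * (freqs_mhz ** -2.0 - f_ref ** -2.0)
            for ch, shift in enumerate(np.round(delays / dt_s).astype(int)):
                out[i] += np.roll(dynspec[ch], -shift)
        return out

    # Inject a unit pulse dispersed at DM = 100; the matching trial DM
    # realigns all channels onto one time bin.
    freqs = np.linspace(1200.0, 1500.0, 64)                  # MHz
    dt, nt, t0 = 1e-3, 512, 100
    dynspec = np.zeros((64, nt))
    delays = 4.149e3 * 100.0 * (freqs ** -2.0 - freqs.max() ** -2.0)
    for ch in range(64):
        dynspec[ch, t0 + int(np.round(delays[ch] / dt))] = 1.0
    detections = dedisperse(dynspec, freqs, np.array([0.0, 50.0, 100.0]), dt)
    ```

    At the correct trial DM the 64 channels stack coherently into a single bright time bin, while wrong trial DMs spread the pulse across many bins.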

  7. Fitting the constitution type Ia supernova data with the redshift-binned parametrization method

    NASA Astrophysics Data System (ADS)

    Huang, Qing-Guo; Li, Miao; Li, Xiao-Dong; Wang, Shuang

    2009-10-01

    In this work, we explore the cosmological consequences of the recently released Constitution sample of 397 Type Ia supernovae (SNIa). By revisiting the Chevallier-Polarski-Linder (CPL) parametrization, we find that, for fitting the Constitution set alone, the behavior of dark energy (DE) significantly deviates from the cosmological constant Λ, where the equation of state (EOS) w and the energy density ρΛ of DE rapidly decrease with increasing redshift z. Inspired by this clue, we separate the redshifts into different bins, and discuss models with a constant w or a constant ρΛ in each bin, respectively. It is found that for fitting the Constitution set alone, w and ρΛ also rapidly decrease with increasing z, which is consistent with the result of the CPL model. Moreover, a step function model in which ρΛ rapidly decreases at redshift z ~ 0.331 presents a significant improvement (Δχ² = -4.361) over the CPL parametrization, and performs better than other DE models. We also plot the error bars of the DE density of this model, and find that this model deviates from the cosmological constant Λ at the 68.3% confidence level (CL); this may arise from some biasing systematic errors in the handling of SNIa data, or, more interestingly, from the nature of DE itself. In addition, for models with the same number of redshift bins, a piecewise constant ρΛ model always performs better than a piecewise constant w model; this shows the advantage of using ρΛ, instead of w, to probe the variation of DE.
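
    The binned parametrization can be made concrete: for a piecewise-constant w, integrating d ln(ρ) = 3(1 + w) d ln(1 + z) bin by bin gives a product of power laws. A minimal sketch of the w-binned variant (the paper's preferred model bins ρΛ directly; names here are illustrative):

    ```python
    def rho_de_ratio(z, bin_edges, w_bins):
        """rho_DE(z) / rho_DE(0) for a piecewise-constant equation of state
        w(z) = w_i inside each redshift bin [z_i, z_{i+1})."""
        ratio = 1.0
        for z_lo, z_hi, w in zip(bin_edges[:-1], bin_edges[1:], w_bins):
            if z <= z_lo:
                break                       # z lies below this bin; done
            z_top = min(z, z_hi)            # integrate only up to z within the bin
            ratio *= ((1.0 + z_top) / (1.0 + z_lo)) ** (3.0 * (1.0 + w))
        return ratio

    # w = -1 in every bin reproduces a cosmological constant (flat density);
    # w = 0 scales like matter, rho ~ (1 + z)^3
    const = rho_de_ratio(0.5, [0.0, 0.331, 2.0], [-1.0, -1.0])
    matter_like = rho_de_ratio(1.0, [0.0, 2.0], [0.0])
    ```

    The fit then treats the w_i (or the binned ρΛ values) as free parameters, one per redshift bin.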

  8. CFHTLenS and RCSLenS: testing photometric redshift distributions using angular cross-correlations with spectroscopic galaxy surveys

    NASA Astrophysics Data System (ADS)

    Choi, A.; Heymans, C.; Blake, C.; Hildebrandt, H.; Duncan, C. A. J.; Erben, T.; Nakajima, R.; Van Waerbeke, L.; Viola, M.

    2016-12-01

    We determine the accuracy of galaxy redshift distributions as estimated from photometric redshift probability distributions p(z). Our method utilizes measurements of the angular cross-correlation between photometric galaxies and an overlapping sample of galaxies with spectroscopic redshifts. We describe the redshift leakage from a galaxy photometric redshift bin j into a spectroscopic redshift bin i using the sum of the p(z) for the galaxies residing in bin j. We can then predict the angular cross-correlation between photometric and spectroscopic galaxies due to intrinsic galaxy clustering when i ≠ j as a function of the measured angular cross-correlation when i = j. We also account for enhanced clustering arising from lensing magnification using a halo model. The comparison of this prediction with the measured signal provides a consistency check on the validity of using the summed p(z) to determine galaxy redshift distributions in cosmological analyses, as advocated by the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS). We present an analysis of the photometric redshifts measured by CFHTLenS, which overlaps the Baryon Oscillation Spectroscopic Survey (BOSS). We also analyse the Red-sequence Cluster Lensing Survey, which overlaps both BOSS and the WiggleZ Dark Energy Survey. We find that the summed p(z) from both surveys are generally biased with respect to the true underlying distributions. If unaccounted for, this bias would lead to errors in cosmological parameter estimation from CFHTLenS of less than ~4 per cent. For photometric redshift bins which spatially overlap in 3D with our spectroscopic sample, we determine redshift bias corrections which can be used in future cosmological analyses that rely on accurate galaxy redshift distributions.
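
    The summed-p(z) estimate being tested can be sketched in a few lines; the names and toy Gaussian posteriors are illustrative, not from the CFHTLenS or RCSLenS pipelines:

    ```python
    import numpy as np

    def summed_pz(pz_stack, bin_assignment, j, z_grid):
        """Estimate the redshift distribution n_j(z) of photometric bin j by
        summing the per-galaxy photo-z posteriors p(z) of the galaxies placed
        in that bin, then normalizing to unit area (rectangle rule)."""
        dz = z_grid[1] - z_grid[0]
        stack = pz_stack[bin_assignment == j].sum(axis=0)
        return stack / (stack.sum() * dz)

    z = np.linspace(0.0, 2.0, 201)
    # two toy galaxies assigned to bin 0, with Gaussian p(z) centred at 0.4 and 0.6
    pz = np.exp(-0.5 * ((z[None, :] - np.array([[0.4], [0.6]])) / 0.05) ** 2)
    nz = summed_pz(pz, np.array([0, 0]), 0, z)
    ```

    The cross-correlation test then asks whether this n_j(z), inserted into the clustering prediction, reproduces the measured photometric-spectroscopic signal.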

  9. Quantitative computed tomography of lung parenchyma in patients with emphysema: analysis of higher-density lung regions

    NASA Astrophysics Data System (ADS)

    Lederman, Dror; Leader, Joseph K.; Zheng, Bin; Sciurba, Frank C.; Tan, Jun; Gur, David

    2011-03-01

    Quantitative computed tomography (CT) has been widely used to detect and evaluate the presence (or absence) of emphysema by applying density masks at specific thresholds, e.g., -910 or -950 Hounsfield units (HU). However, it has also been observed that subjects with similar density-mask-based emphysema scores can have varying lung function, possibly indicating differences in disease severity. To assess this possible discrepancy, we investigated whether the density distribution of "viable" lung parenchyma regions with pixel values > -910 HU correlates with lung function. A dataset of 38 subjects, who underwent both pulmonary function testing and CT examinations in a COPD SCCOR study, was assembled. After the lung regions depicted on CT images were automatically segmented by a computerized scheme, we systematically divided the lung parenchyma into different density groups (bins) and computed a number of statistical features (i.e., mean, standard deviation (STD), and skewness of the pixel value distributions) in these density bins. We then analyzed the correlations between each feature and lung function. The correlation between diffusion lung capacity (DLCO) and the STD of pixel values in the bin -910 HU <= PV < -750 HU was -0.43, compared with a correlation of -0.49 between the post-bronchodilator ratio of forced expiratory volume in 1 second to forced vital capacity (FEV1/FVC) and the STD of pixel values in the bin -1024 HU <= PV < -910 HU. The results showed an association between the distribution of pixel values in "viable" lung parenchyma and lung function, which indicates that, similar to the conventional density mask method, pixel value distribution features in "viable" lung parenchyma areas may also provide clinically useful information to improve assessments of lung disease severity as measured by lung function tests.
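
    The per-bin features described above can be sketched as follows; the toy pixel values are illustrative, and the bin edges follow the abstract's HU groupings:

    ```python
    import numpy as np

    def density_bin_stats(pixels_hu, edges):
        """Mean, standard deviation (STD) and skewness of segmented lung-CT
        pixel values inside each density bin [edges[i], edges[i+1])."""
        stats = []
        for lo, hi in zip(edges[:-1], edges[1:]):
            vals = pixels_hu[(pixels_hu >= lo) & (pixels_hu < hi)]
            mu, sd = vals.mean(), vals.std()
            # third standardized moment; guard against a zero-variance bin
            skew = ((vals - mu) ** 3).mean() / sd ** 3 if sd > 0 else 0.0
            stats.append({"bin": (lo, hi), "mean": mu, "std": sd, "skew": skew})
        return stats

    rng = np.random.default_rng(0)
    toy_parenchyma = rng.normal(-900.0, 60.0, 10_000)   # toy pixel values in HU
    features = density_bin_stats(toy_parenchyma, [-1024, -910, -750])
    ```

    Each feature vector can then be correlated against spirometry measures such as FEV1/FVC, as in the study.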

  11. Color inference in visual communication: the meaning of colors in recycling.

    PubMed

    Schloss, Karen B; Lessard, Laurent; Walmsley, Charlotte S; Foley, Kathleen

    2018-01-01

    People interpret abstract meanings from colors, which makes color a useful perceptual feature for visual communication. This process is complicated, however, because there is seldom a one-to-one correspondence between colors and meanings. One color can be associated with many different concepts (one-to-many mapping) and many colors can be associated with the same concept (many-to-one mapping). We propose that to interpret color-coding systems, people perform assignment inference to determine how colors map onto concepts. We studied assignment inference in the domain of recycling. Participants saw images of colored but unlabeled bins and were asked to indicate which bins they would use to discard different kinds of recyclables and trash. In Experiment 1, we tested two hypotheses for how people perform assignment inference. The local assignment hypothesis predicts that people simply match objects with their most strongly associated color. The global assignment hypothesis predicts that people also account for the association strengths between all other objects and colors within the scope of the color-coding system. Participants discarded objects in bins that optimized the color-object associations of the entire set, which is consistent with the global assignment hypothesis. This sometimes resulted in discarding objects in bins whose colors were weakly associated with the object, even when there was a stronger associated option available. In Experiment 2, we tested different methods for encoding color-coding systems and found that people were better at assignment inference when color sets simultaneously maximized the association strength between assigned color-object pairings while minimizing associations between unassigned pairings. Our study provides an approach for designing intuitive color-coding systems that facilitate communication through visual media such as graphs, maps, signs, and artifacts.
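
    Global assignment inference can be framed as an optimal assignment problem; a sketch using SciPy's assignment solver, with a made-up association matrix:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Choose the object-to-bin-color mapping that maximizes the *summed*
    # association strength, rather than greedily giving each object its
    # single best color. Values below are invented for illustration.
    assoc = np.array([
        [0.90, 0.80, 0.10],   # object A: likes color 0 best, color 1 almost as much
        [0.95, 0.20, 0.10],   # object B: also likes color 0 best
        [0.10, 0.20, 0.30],   # object C: weakly prefers color 2
    ])
    rows, cols = linear_sum_assignment(-assoc)   # negate to maximize total association
    # Globally, A yields color 0 to B (whose second choice is much worse)
    # and takes color 1 instead: the non-local behavior found in Experiment 1.
    ```

    A purely local rule would assign both A and B to color 0; the global optimum resolves the conflict by considering the whole set.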

  12. Development of a measure of asthma-specific quality of life among adults.

    PubMed

    Eberhart, Nicole K; Sherbourne, Cathy D; Edelen, Maria Orlando; Stucky, Brian D; Sin, Nancy L; Lara, Marielena

    2014-04-01

    A key goal in asthma treatment is improvement in quality of life (QoL), but existing measures often confound QoL with symptoms and functional impairment. The current study addresses these limitations and the need for valid patient-reported outcome measures by using state-of-the-art methods to develop an item bank assessing QoL in adults with asthma. This article describes the process for developing an initial item pool for field testing. Five focus group interviews were conducted with a total of 50 asthmatic adults. We used "pile sorting/binning" and "winnowing" methods to identify key QoL dimensions and develop a pool of items based on statements made in the focus group interviews. We then conducted a literature review and consulted with an expert panel to ensure that no key concepts were omitted. Finally, we conducted individual cognitive interviews to ensure that items were well understood and to inform final item refinement. Six hundred and sixty-one QoL statements were identified from focus group interview transcripts and subsequently used to generate a pool of 112 items in 16 different content areas. Items covering a broad range of content were developed that can serve as a valid gauge of individuals' perceptions of the effects of asthma and its treatment on their lives. These items do not directly measure symptoms or functional impairment, yet they include a broader range of content than most existing measures of asthma-specific QoL.

  13. The Effectiveness of the Curriculum Biography of the Prophet in the Development of Social Intelligence Skills of Al-Hussein Bin Talal University Students

    ERIC Educational Resources Information Center

    Al-Khateeb, Omar; Alrub, Mohammad Abo

    2015-01-01

    This study aimed to determine the effectiveness of the curriculum biography of the Prophet in developing the social intelligence skills of Al-Hussein Bin Talal University students. The study sample consisted of 365 students from Al-Hussein Bin Talal University in the first semester of 2014-2015; students were selected in an accessible manner.…

  14. Nuclear Terrorism: Assessing the Threat, Developing a Response

    DTIC Science & Technology

    2009-01-01

    these weapons fit within bin Laden’s worldview: Since the late 1980s and certainly since 1991, bin Laden has seen the United States as the principal...leaders and its current senior managers, including bin Laden, Zawahiri, and their key lieutenants; > A number of affiliated groups or “ franchises ...source of danger appears to have shifted toward the independent and quasi-independent franchises as well as local extremists.73 As a result of this

  15. Ain't No Neuroscience Mountain High Enough: Experiences of a Neurogardener.

    PubMed

    Abdullah, Jafri Malin

    2015-01-01

    Sixteen years have passed since the idea was mooted in 1999 by five neurosurgeons in the corridors of Hotel Perdana, Kota Bharu. They were Dato' Dr Johari Siregar Bin Adnan, Dato' Professor Dr Ahmad Zubaidi Abdul Latif, Dr Azmin Kass Bin Rosman, Dato' Dr Mohammed Saffari Bin Mohammed Haspani and Professor Dato' Dr Jafri Malin Abdullah. They initiated the first programme in Neurosurgery in Malaysia. The rest is history.

  16. Fundamental Limits of Delay and Security in Device-to-Device Communication

    DTIC Science & Technology

    2013-01-01

    systematic MDS (maximum distance separable) codes and random binning strategies that achieve a Pareto optimal delay-reconstruction tradeoff. The erasure MD...file, and a coding scheme based on erasure compression and Slepian-Wolf binning is presented. The coding scheme is shown to provide a Pareto optimal...ble) codes and random binning strategies that achieve a Pareto optimal delay-reconstruction tradeoff. The erasure MD setup is then used to propose a

  17. SOUTH ELEVATION OF GOLD HILL MILL, LOOKING NORTH. THE PRIMARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SOUTH ELEVATION OF GOLD HILL MILL, LOOKING NORTH. THE PRIMARY ORE BIN IS AT CENTER, WITH A JAW CRUSHER JUST TO THE RIGHT. A CONVEYOR (MISSING) WAS USED TO CARRY CRUSHED ORE UP AND INTO THE SECONDARY ORE BIN. THE STONE RAMP TO THE LEFT OF THE ORE BIN WAS USED TO DRIVE TRUCKS UP TO DUMPING LEVEL. - Gold Hill Mill, Warm Spring Canyon Road, Death Valley Junction, Inyo County, CA

  18. BAO from Angular Clustering: Optimization and Mitigation of Theoretical Systematics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crocce, M.; et al.

    We study the theoretical systematics and optimize the methodology in Baryon Acoustic Oscillations (BAO) detections using the angular correlation function with tomographic bins. We calibrate and optimize the pipeline for the Dark Energy Survey Year 1 dataset using 1800 mocks. We compare the BAO fitting results obtained with three estimators: the Maximum Likelihood Estimator (MLE), Profile Likelihood, and Markov Chain Monte Carlo. The MLE method yields the least bias in the fit results (bias/spread ~ 0.02) and the error bar derived is the closest to the Gaussian results (1% from the 68% Gaussian expectation). When there is a mismatch between the template and the data, either due to incorrect fiducial cosmology or photo-z error, the MLE again gives the least-biased results. The BAO angular shift estimated from the sound horizon and the angular diameter distance agrees with the numerical fit. Various analysis choices are further tested: the number of redshift bins, cross-correlations, and angular binning. We propose two methods to correct the mock covariance when the final sample properties are slightly different from those used to create the mock. We show that the sample changes can be accommodated with the help of the Gaussian covariance matrix or, more effectively, using the eigenmode expansion of the mock covariance. The eigenmode expansion is significantly less susceptible to statistical fluctuations than direct measurement of the covariance matrix because the number of free parameters is substantially reduced (p parameters versus p(p+1)/2 from direct measurement).
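
    The eigenmode-expansion idea can be sketched on a toy covariance; everything below (dimensions, eigenvalues, mock counts) is illustrative, not the DES pipeline:

    ```python
    import numpy as np

    # Expand the mock covariance in its eigenpairs and keep only the leading
    # p modes, so a change in sample properties requires re-estimating p mode
    # amplitudes instead of p(p+1)/2 independent matrix entries.
    rng = np.random.default_rng(3)
    true_cov = np.diag([4.0, 2.0, 1.0, 0.5])           # toy 4-bin covariance
    mocks = rng.multivariate_normal(np.zeros(4), true_cov, size=2000)
    sample_cov = np.cov(mocks, rowvar=False)           # direct mock estimate
    evals, evecs = np.linalg.eigh(sample_cov)          # eigenvalues, ascending
    p = 2                                              # keep the two leading modes
    approx_cov = (evecs[:, -p:] * evals[-p:]) @ evecs[:, -p:].T
    ```

    With far fewer free parameters, the truncated expansion is much less sensitive to the statistical noise of a finite mock ensemble, which is the advantage the abstract cites.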

  19. Alzheimer risk genes modulate the relationship between plasma apoE and cortical PiB binding

    DOE PAGES

    Lazaris, Andreas; Hwang, Kristy S.; Goukasian, Naira; ...

    2015-10-15

    Objective: We investigated the association between apoE protein plasma levels and brain amyloidosis and the effect of the top 10 Alzheimer disease (AD) risk genes on this association. Methods: Our dataset consisted of 18 AD, 52 mild cognitive impairment, and 3 cognitively normal Alzheimer's Disease Neuroimaging Initiative 1 (ADNI1) participants with available [11C]-Pittsburgh compound B (PiB) and peripheral blood protein data. We used cortical pattern matching to study associations between plasma apoE and cortical PiB binding and the effect of carrier status for the top 10 AD risk genes. Results: Low plasma apoE was significantly associated with high PiB SUVR, except in the sensorimotor and entorhinal cortex. For BIN1 rs744373, the association was observed only in minor allele carriers. For CD2AP rs9349407 and CR1 rs3818361, the association was preserved only in minor allele noncarriers. We did not find evidence for modulation by CLU, PICALM, ABCA7, BIN1, and MS4A6A. Conclusions: Our data show that BIN1 rs744373, CD2AP rs9349407, and CR1 rs3818361 genotypes modulate the association between apoE protein plasma levels and brain amyloidosis, implying a potential epigenetic/downstream interaction.

  1. Bayes Nets in Educational Assessment: Where Do the Numbers Come from? CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell G.; Yan, Duanli; Steinberg, Linda S.

    Educational assessments that exploit advances in technology and cognitive psychology can produce observations and pose student models that outstrip familiar test-theoretic models and analytic methods. Bayesian inference networks (BINs), which include familiar models and techniques as special cases, can be used to manage belief about students'…

  2. Learning Compact Binary Face Descriptor for Face Recognition.

    PubMed

    Lu, Jiwen; Liong, Venice Erin; Zhou, Xiuzhuang; Zhou, Jie

    2015-10-01

    Binary feature descriptors such as local binary patterns (LBP) and its variations have been widely used in many face recognition systems due to their excellent robustness and strong discriminative power. However, most existing binary face descriptors are hand-crafted, which require strong prior knowledge to engineer them by hand. In this paper, we propose a compact binary face descriptor (CBFD) feature learning method for face representation and recognition. Given each face image, we first extract pixel difference vectors (PDVs) in local patches by computing the difference between each pixel and its neighboring pixels. Then, we learn a feature mapping to project these pixel difference vectors into low-dimensional binary vectors in an unsupervised manner, where 1) the variance of all binary codes in the training set is maximized, 2) the loss between the original real-valued codes and the learned binary codes is minimized, and 3) binary codes evenly distribute at each learned bin, so that the redundancy information in PDVs is removed and compact binary codes are obtained. Lastly, we cluster and pool these binary codes into a histogram feature as the final representation for each face image. Moreover, we propose a coupled CBFD (C-CBFD) method by reducing the modality gap of heterogeneous faces at the feature level to make our method applicable to heterogeneous face recognition. Extensive experimental results on five widely used face datasets show that our methods outperform state-of-the-art face descriptors.
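
    The first step, extracting pixel difference vectors, can be sketched as follows (a minimal illustration of the described PDV extraction for a 3x3 neighborhood, not the authors' code):

    ```python
    import numpy as np

    def pixel_difference_vectors(img):
        """For each interior pixel, return the 8-dimensional vector of
        differences between its 8 neighbors and the pixel itself — the PDVs
        that CBFD subsequently projects to compact binary codes."""
        h, w = img.shape
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                   (0, 1), (1, -1), (1, 0), (1, 1)]
        pdvs = np.stack(
            [img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx] - img[1:-1, 1:-1]
             for dy, dx in offsets],
            axis=-1,
        )
        return pdvs.reshape(-1, 8)   # one PDV per interior pixel

    img = np.arange(25, dtype=float).reshape(5, 5)   # toy 5x5 patch
    pdvs = pixel_difference_vectors(img)
    ```

    The learned feature mapping then binarizes these vectors under the three criteria listed in the abstract (maximal variance, minimal quantization loss, even bin occupancy).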

  3. Gesture-controlled interfaces for self-service machines and other applications

    NASA Technical Reports Server (NTRS)

    Cohen, Charles J. (Inventor); Jacobus, Charles J. (Inventor); Paul, George (Inventor); Beach, Glenn (Inventor); Foulk, Gene (Inventor); Obermark, Jay (Inventor); Cavell, Brook (Inventor)

    2004-01-01

    A gesture recognition interface for use in controlling self-service machines and other devices is disclosed. A gesture is defined as motions and kinematic poses generated by humans, animals, or machines. Specific body features are tracked, and static and motion gestures are interpreted. Motion gestures are defined as a family of parametrically delimited oscillatory motions, modeled as a linear-in-parameters dynamic system with added geometric constraints to allow for real-time recognition using a small amount of memory and processing time. A linear least squares method is preferably used to determine the parameters which represent each gesture. Feature position measurement is used in conjunction with a bank of predictor bins seeded with the gesture parameters, and the system determines which bin best fits the observed motion. Recognizing static pose gestures is preferably performed by localizing the body/object from the rest of the image, describing that object, and identifying that description. The disclosure details methods for gesture recognition, as well as the overall architecture for using gesture recognition to control devices, including self-service machines.

  4. Modeling of a Robust Confidence Band for the Power Curve of a Wind Turbine.

    PubMed

    Hernandez, Wilmar; Méndez, Alfredo; Maldonado-Correa, Jorge L; Balleteros, Francisco

    2016-12-07

    Having an accurate model of the power curve of a wind turbine allows us to better monitor its operation and plan storage capacity. Since wind speed and direction are of a highly stochastic nature, the forecasting of the power generated by the wind turbine is of the same nature as well. In this paper, a method for obtaining a robust confidence band containing the power curve of a wind turbine under test conditions is presented. Here, the confidence band is bounded by two curves which are estimated using parametric statistical inference techniques. However, the observations that are used for carrying out the statistical analysis are obtained by using the binning method, and in each bin, the outliers are eliminated by using a censorship process based on robust statistical techniques. Then, the observations that are not outliers are divided into observation sets. Finally, both the power curve of the wind turbine and the two curves that define the robust confidence band are estimated using each of the previously mentioned observation sets.
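
    The binning-plus-censoring procedure can be sketched as follows; the MAD-based outlier rule and all constants are assumptions for illustration, not the authors' exact robust estimator:

    ```python
    import numpy as np

    def binned_power_curve(wind_speed, power, bin_width=0.5, k=3.0):
        """Power curve by the binning method with a robust censoring step:
        within each wind-speed bin, points farther than k robust standard
        deviations (1.4826 * MAD) from the bin median are discarded before
        the bin mean is taken."""
        edges = np.arange(wind_speed.min(), wind_speed.max() + bin_width, bin_width)
        centers, means = [], []
        for lo, hi in zip(edges[:-1], edges[1:]):
            p = power[(wind_speed >= lo) & (wind_speed < hi)]
            if p.size == 0:
                continue
            med = np.median(p)
            mad = 1.4826 * np.median(np.abs(p - med))   # robust scale estimate
            keep = p[np.abs(p - med) <= k * mad] if mad > 0 else p
            centers.append(0.5 * (lo + hi))
            means.append(keep.mean())
        return np.array(centers), np.array(means)

    rng = np.random.default_rng(1)
    v = rng.uniform(3.0, 12.0, 2000)                   # toy wind speeds (m/s)
    p = 0.5 * v ** 3 + rng.normal(0.0, 5.0, v.size)    # toy cubic power curve + noise
    p[::100] += 500.0                                  # inject gross outliers
    centers, curve = binned_power_curve(v, p)
    ```

    The censored per-bin observation sets are then what the paper's parametric inference step uses to fit the band's bounding curves.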

  1. SUGAR BIN WITH EAST WALL OF CRUSHING MILL TO ITS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    SUGAR BIN WITH EAST WALL OF CRUSHING MILL TO ITS RIGHT. CONVEYOR FROM BOILING HOUSE ABOVE. VIEW FROM THE NORTHEAST - Kekaha Sugar Company, Sugar Mill Building, 8315 Kekaha Road, Kekaha, Kauai County, HI

  2. 31. Panoramic shot, Huber Breaker (left), Retail Coal Storage Bins ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. Panoramic shot, Huber Breaker (left), Retail Coal Storage Bins (center), Boney Elevator (right) Photographs taken by Joseph E.B. Elliot - Huber Coal Breaker, 101 South Main Street, Ashley, Luzerne County, PA

  3. Dark Energy Survey Year 1 Results: Cross-Correlation Redshifts in the DES -- Calibration of the Weak Lensing Source Redshift Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, C.; et al.

    We present the calibration of the Dark Energy Survey Year 1 (DES Y1) weak lensing source galaxy redshift distributions from clustering measurements. By cross-correlating the positions of source galaxies with luminous red galaxies selected by the redMaGiC algorithm we measure the redshift distributions of the source galaxies as placed into different tomographic bins. These measurements constrain shifts in the source redshift distributions to an accuracy of ~0.02 and can be computed even when the clustering measurements do not span the full redshift range. The highest-redshift source bin is not constrained by the clustering measurements because of the minimal redshift overlap with the redMaGiC galaxies. We compare our constraints with those obtained from COSMOS 30-band photometry and find that our two very different methods produce consistent constraints.

  4. Hyperentanglement concentration for polarization-spatial-time-bin hyperentangled photon systems with linear optics

    NASA Astrophysics Data System (ADS)

    Wang, Hong; Ren, Bao-Cang; Alzahrani, Faris; Hobiny, Aatef; Deng, Fu-Guo

    2017-10-01

    Hyperentanglement has significant applications in quantum information processing. Here we present an efficient hyperentanglement concentration protocol (hyper-ECP) for partially hyperentangled Bell states simultaneously entangled in polarization, spatial-mode and time-bin degrees of freedom (DOFs) with the parameter-splitting method, where the parameters of the partially hyperentangled Bell states are known to the remote parties. In this hyper-ECP, only one remote party is required to perform some local operations on the three DOFs of a photon, only the linear optical elements are considered, and the success probability can achieve the maximal value. Our hyper-ECP can be easily generalized to concentrate the N-photon partially hyperentangled Greenberger-Horne-Zeilinger states with known parameters, where the multiple DOFs have largely improved the channel capacity of long-distance quantum communication. All of these make our hyper-ECP more practical and useful in high-capacity long-distance quantum communication.

  5. 50. VIEW OF CRUSHER ADDITION FROM EAST. SHOWS 100TON STEEL ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    50. VIEW OF CRUSHER ADDITION FROM EAST. SHOWS 100-TON STEEL UNOXIDIZED ORE BIN, STEPHENS-ADAMSON 15 TON/HR INCLINED BUCKET ELEVATOR, AND DUST COLLECTION BIN IN UPPER RIGHT QUADRANT. THE ROD MILL CIRCUIT STOOD IN FRONT OF THE BUCKET ELEVATOR AND BEHIND THE BAKER COOLER (LEFT CENTER). MILL SOLUTION TANKS WERE IN FRONT OF THE CRUSHED OXIDIZED ORE BIN (CENTER), AND THE MILL FLOOR WAS THE NEXT LEVEL DOWN (RIGHT). - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  6. 41. VIEW NORTH OF UPPER LEVEL OF CRUSHER ADDITION. DINGS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    41. VIEW NORTH OF UPPER LEVEL OF CRUSHER ADDITION. DINGS MAGNETIC PULLEY AT CENTER. ALSO SHOWS 100-TON CRUSHED UNOXIDIZED ORE BIN (RIGHT), PULLEY FOR 18 INCH BELT CONVEYOR CRUSHED OXIDIZED ORE BIN FEED AND STEPHENS-ADAMSON 25 TON/HR BUCKET ELEVATOR (UPPER CENTER). THE UPPER PORTION OF THE SAMPLING ELEVATOR IS ABOVE THE MAGNETIC PULLEY (CENTER LEFT) WITH THE ROUTE OF THE 16 INCH BELT CONVEYOR FINES FEED TO CRUSHED OXIDIZED ORE BIN TO ITS LEFT. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  7. Development of an aerosol microphysical module: Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS)

    NASA Astrophysics Data System (ADS)

    Matsui, H.; Koike, M.; Kondo, Y.; Fast, J. D.; Takigawa, M.

    2014-09-01

    Number concentrations, size distributions, and mixing states of aerosols are essential parameters for accurate estimations of aerosol direct and indirect effects. In this study, we develop an aerosol module, designated the Aerosol Two-dimensional bin module for foRmation and Aging Simulation (ATRAS), that can explicitly represent these parameters by considering new particle formation (NPF), black carbon (BC) aging, and secondary organic aerosol (SOA) processes. A two-dimensional bin representation is used for particles with dry diameters from 40 nm to 10 μm to resolve both aerosol sizes (12 bins) and BC mixing states (10 bins) for a total of 120 bins. The particles with diameters between 1 and 40 nm are resolved using additional eight size bins to calculate NPF. The ATRAS module is implemented in the WRF-Chem model and applied to examine the sensitivity of simulated mass, number, size distributions, and optical and radiative parameters of aerosols to NPF, BC aging, and SOA processes over East Asia during the spring of 2009. The BC absorption enhancement by coating materials is about 50% over East Asia during the spring, and the contribution of SOA processes to the absorption enhancement is estimated to be 10-20% over northern East Asia and 20-35% over southern East Asia. A clear north-south contrast is also found between the impacts of NPF and SOA processes on cloud condensation nuclei (CCN) concentrations: NPF increases CCN concentrations at higher supersaturations (smaller particles) over northern East Asia, whereas SOA increases CCN concentrations at lower supersaturations (larger particles) over southern East Asia. The application of ATRAS in East Asia also shows that the impact of each process on each optical and radiative parameter depends strongly on the process and the parameter in question. 
The module can be used in the future as a benchmark model to evaluate the accuracy of simpler aerosol models and examine interactions between NPF, BC aging, and SOA processes under different meteorological conditions and emissions.
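The 12 x 10 two-dimensional bin layout can be illustrated with a small index helper. This is a sketch under assumptions: the abstract gives the dry-diameter range (40 nm to 10 μm) and the bin counts, but the logarithmic size spacing, the linear BC-mass-fraction axis, and the `bin_index` helper are all guesses for illustration, not ATRAS code.

```python
import numpy as np

N_SIZE, N_MIX = 12, 10          # 12 size bins x 10 BC mixing-state bins = 120
D_MIN, D_MAX = 40e-9, 10e-6     # dry-diameter range from the abstract (m)

# Assumed spacings: log-spaced diameters, linear BC mass fraction in [0, 1].
size_edges = np.logspace(np.log10(D_MIN), np.log10(D_MAX), N_SIZE + 1)
mix_edges = np.linspace(0.0, 1.0, N_MIX + 1)

def bin_index(diameter, bc_fraction):
    """Map a particle's dry diameter (m) and BC mass fraction to a flat
    2-D bin index in 0..119."""
    i = np.clip(np.searchsorted(size_edges, diameter, side="right") - 1,
                0, N_SIZE - 1)
    j = np.clip(np.searchsorted(mix_edges, bc_fraction, side="right") - 1,
                0, N_MIX - 1)
    return i * N_MIX + j
```

The separate eight bins for 1-40 nm particles used for NPF would sit in front of this grid and are not shown.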

  8. Data Parallel Bin-Based Indexing for Answering Queries on Multi-Core Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gosink, Luke; Wu, Kesheng; Bethel, E. Wes

    2009-06-02

    The multi-core trend in CPUs and general purpose graphics processing units (GPUs) offers new opportunities for the database community. The increase of cores at exponential rates is likely to affect virtually every server and client in the coming decade, and presents database management systems with a huge, compelling disruption that will radically change how processing is done. This paper presents a new parallel indexing data structure for answering queries that takes full advantage of the increasing thread-level parallelism emerging in multi-core architectures. In our approach, our Data Parallel Bin-based Index Strategy (DP-BIS) first bins the base data, and then partitions and stores the values in each bin as a separate, bin-based data cluster. In answering a query, the procedures for examining the bin numbers and the bin-based data clusters offer the maximum possible level of concurrency; each record is evaluated by a single thread and all threads are processed simultaneously in parallel. We implement and demonstrate the effectiveness of DP-BIS on two multi-core architectures: a multi-core CPU and a GPU. The concurrency afforded by DP-BIS allows us to fully utilize the thread-level parallelism provided by each architecture--for example, our GPU-based DP-BIS implementation simultaneously evaluates over 12,000 records with an equivalent number of concurrently executing threads. In comparing DP-BIS's performance across these architectures, we show that the GPU-based DP-BIS implementation requires significantly less computation time to answer a query than the CPU-based implementation. We also demonstrate in our analysis that DP-BIS provides better overall performance than the commonly utilized CPU and GPU-based projection index. Finally, due to data encoding, we show that DP-BIS accesses significantly smaller amounts of data than index strategies that operate solely on a column's base data; this smaller data footprint is critical for parallel processors that possess limited memory resources (e.g., GPUs).
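The core indexing idea (bin the base data, store each bin's row ids as a separate cluster, and consult the base data only in the boundary bins of a range query) can be sketched sequentially. This is a simplified, CPU-only stand-in: the real DP-BIS evaluates clusters with thousands of concurrent threads and adds data encoding, neither of which is shown, and the class name is hypothetical.

```python
import numpy as np

class BinIndex:
    """Sequential sketch of a bin-based index (not the actual DP-BIS)."""
    def __init__(self, values, n_bins=16):
        self.values = values
        self.edges = np.linspace(values.min(), values.max(), n_bins + 1)
        bin_no = np.clip(np.searchsorted(self.edges, values, side="right") - 1,
                         0, n_bins - 1)
        # one data cluster per bin: the row ids falling in that bin
        self.clusters = [np.flatnonzero(bin_no == b) for b in range(n_bins)]

    def range_query(self, lo, hi):
        """Return row ids with lo <= value < hi."""
        last = len(self.clusters) - 1
        b_lo = np.clip(np.searchsorted(self.edges, lo, side="right") - 1, 0, last)
        b_hi = np.clip(np.searchsorted(self.edges, hi, side="right") - 1, 0, last)
        hits = []
        for b in range(b_lo, b_hi + 1):
            rows = self.clusters[b]
            if b in (b_lo, b_hi):  # boundary bins: verify against base data
                rows = rows[(self.values[rows] >= lo) & (self.values[rows] < hi)]
            hits.append(rows)      # interior bins are fully covered by the range
        return np.concatenate(hits) if hits else np.array([], dtype=int)
```

In DP-BIS the per-record checks in the boundary clusters are exactly the work distributed one-record-per-thread; interior bins need no base-data access at all.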

  9. DNA barcoding Neotropical fishes: recent advances from the Pampa Plain, Argentina.

    PubMed

    Rosso, J J; Mabragaña, E; Castro, M González; de Astarloa, J M Díaz

    2012-11-01

    The fish fauna of the Pampa Plain, the southernmost distribution range of many Neotropical species, was barcoded in this study. COI sequences were analysed by means of distance (K2P/NJ) and character-based (ML) models, as well as the Barcode Index Number (BIN). K2P/NJ analysis was able to discriminate among all previously identified species while also revealing the likely occurrence of two cryptic species that were further supported by BIN and ML analyses. On the other hand, both BIN and ML were not able to discriminate between two species of Rineloricaria. Despite the small genetic divergence between A. cf. pampa and A. eigenmanniorum, a tight array of haplotypes was observed for each species in both the distance and character-based methods. Deep intraspecific divergences were detected in Cnesterodon decemmaculatus (5%) and Salminus brasiliensis (6%). For Salminus brasiliensis, these findings were further supported by character-based (ML) evidence and meristic and morphological data. Our results also showed that Pampa Plain representatives of Salminus brasiliensis, Rhamdia quelen, Hoplias malabaricus, Synbranchus marmoratus, Australoheros facetus, Oligosarcus jenynsii and Corydoras paleatus differed by more than 3% from their conspecifics from other parts of South America. Overall, this study was able to highlight the likely occurrence of a cryptic species in Salminus brasiliensis and also illustrate the strong geographical structure in the COI sequence composition of seven fish species from South America. © 2012 Blackwell Publishing Ltd.
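The K2P distance underlying the distance-based (K2P/NJ) analysis has a closed form: with P the proportion of transitions and Q the proportion of transversions between two aligned sequences, d = -(1/2) ln(1 - 2P - Q) - (1/4) ln(1 - 2Q). A minimal sketch (the function name is illustrative, and gap/ambiguity handling is simplified):

```python
from math import log

PURINES, PYRIMIDINES = {"A", "G"}, {"C", "T"}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned DNA sequences."""
    pairs = [(a, b) for a, b in zip(seq1.upper(), seq2.upper())
             if a in "ACGT" and b in "ACGT"]       # skip gaps/ambiguous bases
    n = len(pairs)
    ts = sum(1 for a, b in pairs if a != b and
             ({a, b} <= PURINES or {a, b} <= PYRIMIDINES))  # transitions
    tv = sum(1 for a, b in pairs if a != b) - ts            # transversions
    P, Q = ts / n, tv / n
    return -0.5 * log(1 - 2 * P - Q) - 0.25 * log(1 - 2 * Q)
```

Pairwise distances like this, fed to neighbour joining, produce the K2P/NJ trees used to delimit barcode clusters.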

  10. Health State Monitoring of Bladed Machinery with Crack Growth Detection in BFG Power Plant Using an Active Frequency Shift Spectral Correction Method.

    PubMed

    Sun, Weifang; Yao, Bin; He, Yuchao; Chen, Binqiang; Zeng, Nianyin; He, Wangpeng

    2017-08-09

    Power generation using waste-gas is an effective and green way to reduce the emission of the harmful blast furnace gas (BFG) in the pig-iron producing industry. Condition monitoring of mechanical structures in the BFG power plant is of vital importance to guarantee their safe and efficient operation. In this paper, we describe the detection of crack growth of bladed machinery in the BFG power plant via vibration measurement combined with an enhanced spectral correction technique. This technique enables high-precision identification of the amplitude, frequency, and phase information (the harmonic information) belonging to deterministic harmonic components within the vibration signals. Rather than deriving all harmonic information using neighboring spectral bins in the fast Fourier transform spectrum, the proposed active frequency shift spectral correction method makes use of interpolated Fourier spectral bins and has a better noise-resisting capacity. We demonstrate that the harmonic information identified via the proposed method exhibits smaller numerical error when the same level of noise is present in the vibration signal, even in comparison with a Hanning-window-based correction method. With the proposed method, we investigated vibration signals collected from a centrifugal compressor. Spectral information of harmonic tones, related to the fundamental working frequency of the centrifugal compressor, is corrected. The extracted spectral information indicates the ongoing development of an impeller blade crack that occurred in the centrifugal compressor. This method proves to be a promising alternative for identifying blade cracks at early stages.
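The abstract does not spell out the active frequency shift method itself; as a generic illustration of the underlying idea (recovering a tone's frequency beyond the FFT bin spacing from neighboring spectral bins), here is the classic parabolic interpolation on the log-magnitude spectrum. This is an assumed stand-in, not the authors' algorithm.

```python
import numpy as np

def interpolated_peak(signal, fs):
    """Estimate a tone's frequency beyond FFT bin resolution by parabolic
    interpolation on the log-magnitudes of the peak bin and its neighbours."""
    n = len(signal)
    win = np.hanning(n)
    mag = np.abs(np.fft.rfft(signal * win))
    k = int(np.argmax(mag[1:-1])) + 1            # peak bin (avoid spectrum edges)
    a, b, c = np.log(mag[k - 1]), np.log(mag[k]), np.log(mag[k + 1])
    delta = 0.5 * (a - c) / (a - 2 * b + c)      # fractional-bin offset
    return (k + delta) * fs / n                  # corrected frequency in Hz
```

The paper's contribution is a correction of this general kind with better noise behaviour than Hanning-window-based formulas; the sketch only shows the bin-interpolation principle.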

  11. 12. Interior view, grain tanks (bins). Profile view of overhead ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. Interior view, grain tanks (bins). Profile view of overhead (fill) conveyor gallery bridge extending through tops of tanks just below roofs. - Saint Anthony Elevator No. 3, 620 Malcom Avenue, Southeast, Minneapolis, Hennepin County, MN

  12. VIEW OF MILL FROM KEKAHA ROAD, WITH SUGAR BIN, CANE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF MILL FROM KEKAHA ROAD, WITH SUGAR BIN, CANE CLEANING PLANT AND CRUSHING MILL TO THE FORE. VIEW FROM THE EAST - Kekaha Sugar Company, Sugar Mill Building, 8315 Kekaha Road, Kekaha, Kauai County, HI

  13. 53. VIEW OF CRUSHED OXIDIZED ORE BIN FROM EAST. SHOWS ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    53. VIEW OF CRUSHED OXIDIZED ORE BIN FROM EAST. SHOWS ACCESS STAIR TO FEED LEVEL; DUST COLLECTOR ON LEFT. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  14. 6. Detail of interior bin wall section, during demolition. Shows ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Detail of interior bin wall section, during demolition. Shows alternating courses of channel tile with steel bands and larger hollow tile. - Saint Anthony Elevator No. 3, 620 Malcom Avenue, Southeast, Minneapolis, Hennepin County, MN

  15. On the Security of a Two-Factor Authentication and Key Agreement Scheme for Telecare Medicine Information Systems.

    PubMed

    Arshad, Hamed; Teymoori, Vahid; Nikooghadam, Morteza; Abbassi, Hassan

    2015-08-01

    Telecare medicine information systems (TMISs) aim to deliver appropriate healthcare services in an efficient and secure manner to patients. A secure mechanism for authentication and key agreement is required to provide proper security in these systems. Recently, Bin Muhaya demonstrated some security weaknesses of Zhu's authentication and key agreement scheme and proposed a security enhanced authentication and key agreement scheme for TMISs. However, we show that Bin Muhaya's scheme is vulnerable to off-line password guessing attacks and does not provide perfect forward secrecy. Furthermore, in order to overcome the mentioned weaknesses, we propose a new two-factor anonymous authentication and key agreement scheme using the elliptic curve cryptosystem. Security and performance analyses demonstrate that the proposed scheme not only overcomes the weaknesses of Bin Muhaya's scheme, but also is about 2.73 times faster than Bin Muhaya's scheme.

  16. A multichannel block-matching denoising algorithm for spectral photon-counting CT images.

    PubMed

    Harrison, Adam P; Xu, Ziyue; Pourmorteza, Amir; Bluemke, David A; Mollura, Daniel J

    2017-06-01

    We present a denoising algorithm designed for a whole-body prototype photon-counting computed tomography (PCCT) scanner with up to 4 energy thresholds and associated energy-binned images. Spectral PCCT images can exhibit low signal to noise ratios (SNRs) due to the limited photon counts in each simultaneously-acquired energy bin. To help address this, our denoising method exploits the correlation and exact alignment between energy bins, adapting the highly-effective block-matching 3D (BM3D) denoising algorithm for PCCT. The original single-channel BM3D algorithm operates patch-by-patch. For each small patch in the image, a patch grouping action collects similar patches from the rest of the image, which are then collaboratively filtered together. The resulting performance hinges on accurate patch grouping. Our improved multi-channel version, called BM3D_PCCT, incorporates two improvements. First, BM3D_PCCT uses a more accurate shared patch grouping based on the image reconstructed from photons detected in all 4 energy bins. Second, BM3D_PCCT performs a cross-channel decorrelation, adding a further dimension to the collaborative filtering process. These two improvements produce a more effective algorithm for PCCT denoising. Preliminary results compare BM3D_PCCT against BM3D_Naive, which denoises each energy bin independently. Experiments use a three-contrast PCCT image of a canine abdomen. Within five regions of interest, selected from paraspinal muscle, liver, and visceral fat, BM3D_PCCT reduces the noise standard deviation by 65.0%, compared to 40.4% for BM3D_Naive. Attenuation values of the contrast agents in calibration vials also cluster much tighter to their respective lines of best fit. Mean angular differences (in degrees) for the original, BM3D_Naive, and BM3D_PCCT images, respectively, were 15.61, 7.34, and 4.45 (iodine); 12.17, 7.17, and 4.39 (gadolinium); and 12.86, 6.33, and 3.96 (bismuth).
We outline a multi-channel denoising algorithm tailored for spectral PCCT images, demonstrating improved performance over an independent, yet state-of-the-art, single-channel approach. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.
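The shared patch grouping idea (match patches on the image formed from all energy bins, then filter every channel with the same group) can be illustrated with a heavily simplified stand-in. Real BM3D applies 3D transform-domain collaborative filtering rather than the plain group average used here, the tiled non-overlapping patch grid is a simplification, and all names are hypothetical.

```python
import numpy as np

def grouped_mean_denoise(channels, patch=8, top_k=16):
    """Toy multi-channel denoiser: group similar patches on the summed-channel
    reference image, then average each group per channel.  A crude stand-in for
    BM3D-style collaborative filtering that only illustrates *shared* patch
    grouping.  Assumes H and W are multiples of `patch` (non-overlapping grid).
    `channels` has shape (n_bins, H, W)."""
    ref = channels.sum(axis=0)                      # all-bins reference image
    H, W = ref.shape
    coords = [(y, x) for y in range(0, H, patch) for x in range(0, W, patch)]
    ref_patches = np.stack([ref[y:y + patch, x:x + patch].ravel()
                            for y, x in coords])
    out = np.empty_like(channels)
    for i, (y, x) in enumerate(coords):
        d = ((ref_patches - ref_patches[i]) ** 2).sum(axis=1)
        group = np.argsort(d)[:top_k]               # same group for every channel
        for c in range(channels.shape[0]):
            stack = np.stack([channels[c,
                                       coords[j][0]:coords[j][0] + patch,
                                       coords[j][1]:coords[j][1] + patch]
                              for j in group])
            out[c, y:y + patch, x:x + patch] = stack.mean(axis=0)
    return out
```

Because the grouping uses the photon-rich all-bins image while the averaging acts per energy bin, each noisy channel benefits from matches it could not have found on its own, which is the intuition behind BM3D_PCCT's first improvement.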

  17. Hidden diversity revealed by genome-resolved metagenomics of iron-oxidizing microbial mats from Lō'ihi Seamount, Hawai'i.

    PubMed

    Fullerton, Heather; Hager, Kevin W; McAllister, Sean M; Moyer, Craig L

    2017-08-01

    The Zetaproteobacteria are ubiquitous in marine environments, yet this class of Proteobacteria is only represented by a few closely-related cultured isolates. In high-iron environments, such as diffuse hydrothermal vents, the Zetaproteobacteria are important members of the community driving its structure. Biogeography of Zetaproteobacteria has shown two ubiquitous operational taxonomic units (OTUs), yet much is unknown about their genomic diversity. Genome-resolved metagenomics allows for the specific binning of microbial genomes based on genomic signatures present in composite metagenome assemblies. Applying this approach to iron-oxidizing microbial mats from Lō'ihi Seamount resulted in the recovery of 93 genome bins, of which 34 were classified as Zetaproteobacteria. Form II ribulose 1,5-bisphosphate carboxylase genes were recovered from nearly all the Zetaproteobacteria genome bins. In addition, the Zetaproteobacteria genome bins contain genes for uptake and utilization of bioavailable nitrogen, detoxification of arsenic, and a terminal electron acceptor adapted for low oxygen concentration. Our results also support the hypothesis of a Cyc2-like protein as the site for iron oxidation, now detected across a majority of the Zetaproteobacteria genome bins. Whole genome comparisons showed a high genomic diversity across the Zetaproteobacteria OTUs and genome bins that were previously unidentified by SSU rRNA gene analysis. A single lineage of cosmopolitan Zetaproteobacteria (zOTU 2) was found to be monophyletic, based on cluster analysis of average nucleotide identity and average amino acid identity comparisons. From these data, we can begin to pinpoint genomic adaptations of the more ecologically ubiquitous Zetaproteobacteria, and further understand their environmental constraints and metabolic potential.

  18. Convection Fingerprints on the Vertical Profiles of Q1 and Q2

    NASA Astrophysics Data System (ADS)

    Chang, C.; Lin, H.; Chou, C.

    2013-12-01

    Different types of tropical convection leave their fingerprints on the vertical structures of the apparent heat source (Q1) and apparent moisture sink (Q2). The profile of condensation heating and drying in deep convection has been well documented, yet a direct assessment of shallow convection remains to be explored. Shallow convection prevails over the subtropical ocean, where large-scale subsidence is primarily balanced by radiative cooling and by moistening due to surface evaporation instead of moist convection. In this study a unified framework is designed to investigate the vertical structures of tropical marine convection in three reanalysis datasets: ERA-Interim, MERRA, and CFSR. It starts by sorting and binning data from the lightest to the heaviest rain. The differences between two neighboring bins are then used to examine the direct effects of precipitation change, in light of the fact that non-convective processes change slowly from bin to bin. All three reanalyses reveal the shallow convective processes in light rain bins, characterized by re-evaporation and detrainment at the top of the boundary layer and in the lower free troposphere. For heavy rain bins, the three reanalyses mainly differ in the numbers and altitudes of their heating and drying peaks, implying that no universal agreement has been reached on the partitioning of cloud populations. Coherent variations in temperature, moisture, and vertical motion are also discussed. This approach permits a systematic survey and comparison of tropical convection in GCM-type models, and preliminary studies of the three reanalyses suggest a certain degree of inconsistency in the simulated convective feedback to large-scale heat and moisture budgets.
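The sort-and-bin compositing step can be sketched directly: order the samples by rain rate, split them into equal-population bins, average the profiles in each bin, and difference neighbouring bins. The bin count and the synthetic data in the test are assumptions, not values from the study.

```python
import numpy as np

def bin_composites(rain, profiles, n_bins=10):
    """Sort samples by rain rate, split into equal-population bins, and return
    per-bin mean vertical profiles plus neighbouring-bin differences.
    `rain` has shape (n_samples,); `profiles` has shape (n_samples, n_levels)."""
    order = np.argsort(rain)                      # lightest -> heaviest rain
    binned = np.array_split(profiles[order], n_bins)
    means = np.stack([b.mean(axis=0) for b in binned])   # (n_bins, n_levels)
    diffs = np.diff(means, axis=0)                # bin-to-bin differences
    return means, diffs
```

The `diffs` array is the quantity the study interprets: since slowly varying non-convective terms largely cancel between neighbouring bins, it isolates the direct convective response to a precipitation increment.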

  19. Methodology and software to detect viral integration site hot-spots

    PubMed Central

    2011-01-01

    Background Modern gene therapy methods have limited control over where a therapeutic viral vector inserts into the host genome. Vector integration can activate local gene expression, which can cause cancer if the vector inserts near an oncogene. Viral integration site (VIS) hot-spots or 'common insertion sites' (CIS) are scrutinized to evaluate and predict patient safety. CIS are typically defined by a minimum density of insertions (such as 2-4 within a 30-100 kb region), which unfortunately depends on the total number of observed VIS. This is problematic for comparing hot-spot distributions across data sets and patients, where the VIS numbers may vary. Results We develop two new methods for defining hot-spots that are relatively independent of data set size. Both methods operate on distributions of VIS across consecutive 1 Mb 'bins' of the genome. The first method, 'z-threshold', tallies the number of VIS per bin, converts these counts to z-scores, and applies a threshold to define high density bins. The second method, 'BCP', applies a Bayesian change-point model to the z-scores to define hot-spots. The novel hot-spot methods are compared with a conventional CIS method using simulated data sets and data sets from five published human studies, including the X-linked ALD (adrenoleukodystrophy), CGD (chronic granulomatous disease) and SCID-X1 (X-linked severe combined immunodeficiency) trials. The BCP analysis of the human X-linked ALD data for two patients separately (774 and 1627 VIS) and combined (2401 VIS) resulted in 5-6 hot-spots covering 0.17-0.251% of the genome and containing 5.56-7.74% of the total VIS. In comparison, the CIS analysis resulted in 12-110 hot-spots covering 0.018-0.246% of the genome and containing 5.81-22.7% of the VIS, corresponding to a greater number of hot-spots as the data set size increased. Our hot-spot methods enable one to evaluate the extent of VIS clustering, and formally compare data sets in terms of hot-spot overlap.
Finally, we show that the BCP hot-spots from the repopulating samples coincide with greater gene and CpG island density than the median genome density. Conclusions The z-threshold and BCP methods are useful for comparing hot-spot patterns across data sets of disparate sizes. The methodology and software provided here should enable one to study hot-spot conservation across a variety of VIS data sets and evaluate vector safety for gene therapy trials. PMID:21914224
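The z-threshold method as described (tally VIS per consecutive 1 Mb bin, convert counts to z-scores, threshold) is straightforward to sketch. The threshold value and the synthetic positions in the test are assumptions; the function name is illustrative.

```python
import numpy as np

def z_threshold_hotspots(vis_positions, genome_length,
                         bin_size=1_000_000, z_cut=3.0):
    """Tally VIS per consecutive 1 Mb bin, convert counts to z-scores, and
    flag bins whose z-score exceeds the threshold as hot-spots."""
    n_bins = int(np.ceil(genome_length / bin_size))
    counts = np.bincount(np.asarray(vis_positions) // bin_size,
                         minlength=n_bins)
    z = (counts - counts.mean()) / counts.std()
    return np.flatnonzero(z > z_cut), counts, z
```

Because the z-scores are standardized against the data set's own mean and spread, the flagged bins depend far less on the total number of VIS than a fixed-density CIS rule does.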

  20. VIEW OF INTERIOR SPACE WITH ANODIZING TANK AND LIQUID BIN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    VIEW OF INTERIOR SPACE WITH ANODIZING TANK AND LIQUID BIN STORAGE TANK IN FOREGROUND, FACING NORTH. - Douglas Aircraft Company Long Beach Plant, Aircraft Parts Receiving & Storage Building, 3855 Lakewood Boulevard, Long Beach, Los Angeles County, CA

  1. OVERALL VIEW OF THE MILL WITH SUGAR BIN LEFT OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    OVERALL VIEW OF THE MILL WITH SUGAR BIN LEFT OF CENTER, CLEANING PLANT TO RIGHT, SEED TREATMENT PLANT TO LEFT. VIEW FROM THE EAST - Kekaha Sugar Company, Sugar Mill Building, 8315 Kekaha Road, Kekaha, Kauai County, HI

  2. 55. VIEW OF ROASTER ADDITION FROM NORTH. ELEVATOR/ORE BIN ADDITION ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    55. VIEW OF ROASTER ADDITION FROM NORTH. ELEVATOR/ORE BIN ADDITION TO RIGHT (WEST) WITH BAKER COOLER IN FRONT. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  3. Dark Energy Survey Year 1 Results: Cross-Correlation Redshifts - Methods and Systematics Characterization

    DOE PAGES

    Gatti, M.

    2018-02-22

    We use numerical simulations to characterize the performance of a clustering-based method to calibrate photometric redshift biases. In particular, we cross-correlate the weak lensing (WL) source galaxies from the Dark Energy Survey Year 1 (DES Y1) sample with redMaGiC galaxies (luminous red galaxies with secure photometric redshifts) to estimate the redshift distribution of the former sample. The recovered redshift distributions are used to calibrate the photometric redshift bias of standard photo-z methods applied to the same source galaxy sample. We also apply the method to three photo-z codes run in our simulated data: Bayesian Photometric Redshift (BPZ), Directional Neighborhood Fitting (DNF), and Random Forest-based photo-z (RF). We characterize the systematic uncertainties of our calibration procedure, and find that these systematic uncertainties dominate our error budget. The dominant systematics are due to our assumption of unevolving bias and clustering across each redshift bin, and to differences between the shapes of the redshift distributions derived by clustering vs photo-z's. The systematic uncertainty in the mean redshift bias of the source galaxy sample is z ≲ 0.02, though the precise value depends on the redshift bin under consideration. Here, we discuss possible ways to mitigate the impact of our dominant systematics in future analyses.


  5. TH-A-18C-03: Noise Correlation in CBCT Projection Data and Its Application for Noise Reduction in Low-Dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ZHANG, H; Huang, J; Ma, J

    2014-06-15

    Purpose: To study the noise correlation properties of cone-beam CT (CBCT) projection data and to incorporate the noise correlation information into a statistics-based projection restoration algorithm for noise reduction in low-dose CBCT. Methods: In this study, we systematically investigated the noise correlation properties among detector bins of CBCT projection data by analyzing repeated projection measurements. The measurements were performed on a TrueBeam on-board CBCT imaging system with a 4030CB flat panel detector. An anthropomorphic male pelvis phantom was used to acquire 500 repeated projection data at six different dose levels from 0.1 mAs to 1.6 mAs per projection at three fixed angles. To minimize the influence of the lag effect, lag correction was performed on the consecutively acquired projection data. The noise correlation coefficient between detector bin pairs was calculated from the corrected projection data. The noise correlation among CBCT projection data was then incorporated into the covariance matrix of the penalized weighted least-squares (PWLS) criterion for noise reduction of low-dose CBCT. Results: The analyses of the repeated measurements show that noise correlation coefficients are non-zero between the nearest neighboring bins of CBCT projection data. The average noise correlation coefficients for the first- and second-order neighbors are about 0.20 and 0.06, respectively. The noise correlation coefficients are independent of the dose level. Reconstruction of the pelvis phantom shows that the PWLS criterion with consideration of noise correlation (PWLS-Cor) results in a lower noise level as compared to the PWLS criterion without considering the noise correlation (PWLS-Dia) at the matched resolution. Conclusion: Noise is correlated among nearest neighboring detector bins of CBCT projection data. An accurate noise model of CBCT projection data can improve the performance of the statistics-based projection restoration algorithm for low-dose CBCT.
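Estimating noise correlation between neighbouring detector bins from repeated measurements can be sketched as follows: subtract the per-bin mean across repeats to remove the deterministic signal, then average the correlation coefficient over all bin pairs at a given lag. The synthetic data in the test simply shares noise between adjacent bins; it is not CBCT data, and the function name is hypothetical.

```python
import numpy as np

def neighbour_correlation(repeats, lag=1):
    """Average noise correlation between detector bins separated by `lag`,
    estimated from repeated measurements.
    `repeats` has shape (n_repeats, n_bins)."""
    noise = repeats - repeats.mean(axis=0, keepdims=True)  # remove the signal
    a, b = noise[:, :-lag], noise[:, lag:]
    r = (np.sum(a * b, axis=0) /
         np.sqrt(np.sum(a * a, axis=0) * np.sum(b * b, axis=0)))
    return float(np.mean(r))
```

Coefficients obtained this way at lags 1 and 2 correspond to the first- and second-order neighbour values (about 0.20 and 0.06) reported above, and they populate the off-diagonals of the PWLS covariance matrix.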

  6. An accelerated line-by-line option for MODTRAN combining on-the-fly generation of line center absorption within 0.1 cm-1 bins and pre-computed line tails

    NASA Astrophysics Data System (ADS)

    Berk, Alexander; Conforti, Patrick; Hawes, Fred

    2015-05-01

    A Line-By-Line (LBL) option is being developed for MODTRAN6. The motivation for this development is two-fold. Firstly, when MODTRAN is validated against an independent LBL model, it is difficult to isolate the source of discrepancies. One must verify consistency between pressure, temperature and density profiles, between column density calculations, between continuum and particulate data, between spectral convolution methods, and more. Introducing a LBL option directly within MODTRAN will ensure common elements for all calculations other than those used to compute molecular transmittances. The second motivation for the LBL upgrade is that it will enable users to compute high spectral resolution transmittances and radiances for the full range of current MODTRAN applications. In particular, introducing the LBL feature into MODTRAN will enable first-principle calculations of scattered radiances, an option that is often not readily available with LBL models. MODTRAN will compute LBL transmittances within one 0.1 cm-1 spectral bin at a time, marching through the full requested bandpass. The LBL algorithm will use the highly accurate, pressure- and temperature-dependent MODTRAN Padé approximant fits of the contribution from line tails to define the absorption from all molecular transitions centered more than 0.05 cm-1 from each 0.1 cm-1 spectral bin. The beauty of this approach is that the on-the-fly computations for each 0.1 cm-1 bin will only require explicit LBL summing of transitions centered within a 0.2 cm-1 spectral region. That is, the contribution from the more distant lines will be pre-computed via the Padé approximants. The status of the LBL effort will be presented. This will include initial thermal and solar radiance calculations, validation calculations, and self-validations of the MODTRAN band model against its own LBL calculations.
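The partitioning logic can be illustrated with a toy march over 0.1 cm-1 bins: lines centred within 0.05 cm-1 of a bin (a 0.2 cm-1 window) are summed explicitly, while everything more distant is delegated to a precomputed tail term. Here a plain Lorentzian stands in for the line shape and a stub callable stands in for the Padé approximant fits; neither is MODTRAN code.

```python
import numpy as np

BIN = 0.1  # spectral bin width, cm^-1

def lorentz(nu, nu0, S, gamma):
    """Lorentzian absorption line: strength S, centre nu0, half-width gamma."""
    return S * gamma / (np.pi * ((nu - nu0) ** 2 + gamma ** 2))

def lbl_absorption(grid, lines, tail_model):
    """Toy LBL march: for each grid point's 0.1 cm^-1 bin, explicitly sum
    lines centred within 0.05 cm^-1 of the bin (a 0.2 cm^-1 window) and add
    a precomputed tail term for all more distant lines (stub for the
    Pade-approximant fits).  `lines` is a list of (nu0, S, gamma) tuples."""
    k = np.zeros_like(grid)
    for i, nu in enumerate(grid):
        lo = BIN * np.floor(nu / BIN) - 0.05     # window start: bin - 0.05
        hi = lo + 0.2                            # 0.2 cm^-1 explicit window
        for nu0, S, gamma in lines:
            if lo <= nu0 < hi:
                k[i] += lorentz(nu, nu0, S, gamma)   # explicit line-centre sum
        k[i] += tail_model(nu)                   # pre-computed distant tails
    return k
```

The payoff shown here is the bookkeeping: per bin, the explicit sum touches only lines in a fixed 0.2 cm-1 window regardless of how many lines the full band contains.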

  7. Landscape-scale soil moisture heterogeneity and its influence on surface fluxes at the Jornada LTER site: Evaluating a new model parameterization for subgrid-scale soil moisture variability

    NASA Astrophysics Data System (ADS)

    Baker, I. T.; Prihodko, L.; Vivoni, E. R.; Denning, A. S.

    2017-12-01

    Arid and semiarid regions represent a large fraction of global land, with attendant importance of surface energy and trace gas flux to global totals. These regions are characterized by strong seasonality, especially in precipitation, that defines the level of ecosystem stress. Individual plants have been observed to respond non-linearly to increasing soil moisture stress, where plant function is generally maintained as soils dry down to a threshold at which rapid closure of stomates occurs. Incorporating this nonlinear mechanism into landscape-scale models can result in unrealistic binary "on-off" behavior that is especially problematic in arid landscapes. Subsequently, models have 'relaxed' their simulation of soil moisture stress on evapotranspiration (ET). Unfortunately, these relaxations are not physically based, but are imposed upon model physics as a means to force a more realistic response. Previously, we have introduced a new method to represent soil moisture regulation of ET, whereby the landscape is partitioned into 'BINS' of soil moisture wetness, each associated with a fractional area of the landscape or grid cell. A physically- and observationally-based nonlinear soil moisture stress function is applied, but when convolved with the relative area distribution represented by wetness BINS the system has the emergent property of 'smoothing' the landscape-scale response without the need for non-physical impositions on model physics. In this research we confront BINS simulations of Bowen ratio, soil moisture variability and trace gas flux with soil moisture and eddy covariance observations taken at the Jornada LTER dryland site in southern New Mexico. We calculate the mean annual wetting cycle and associated variability about the mean state and evaluate model performance against this variability and time series of land surface fluxes from the highly instrumented Tromble Weir watershed.
The BINS simulations capture the relatively rapid reaction to wetting events and more prolonged response to drying cycles, as opposed to binary behavior in the control.
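    The convolution of a sharp point-scale stress function with an area-weighted wetness distribution can be sketched in a few lines. The logistic stress function and all parameter values below are invented for illustration and are not the BINS parameterization itself.

```python
import math

def stress(theta, theta_crit=0.12, steep=80.0):
    """Nonlinear soil-moisture stress factor: near 1 when wet, closing
    sharply below a threshold (logistic sketch; parameters invented)."""
    return 1.0 / (1.0 + math.exp(-steep * (theta - theta_crit)))

def landscape_et_factor(bins):
    """Convolve the point stress function with the BINS area
    distribution: bins is a list of (fractional_area, theta) pairs
    summing to 1 over the landscape or grid cell."""
    return sum(area * stress(theta) for area, theta in bins)

wet_cell = [(1.0, 0.20)]                          # single-value grid cell
mixed = [(0.5, 0.20), (0.3, 0.13), (0.2, 0.08)]   # partitioned landscape
```

    A single-value grid cell flips abruptly as its one moisture value crosses the threshold, whereas the binned landscape's ET factor declines gradually as the dry fraction grows, which is the emergent smoothing the abstract describes.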

  8. Towards enhanced and interpretable clustering/classification in integrative genomics

    PubMed Central

    Lu, Yang Young; Lv, Jinchi; Fuhrman, Jed A.

    2017-01-01

    High-throughput technologies have led to large collections of different types of biological data that provide unprecedented opportunities to unravel molecular heterogeneity of biological processes. Nevertheless, how to jointly integrate data from multiple sources into a holistic, biologically meaningful interpretation remains challenging. In this work, we propose a scalable and tuning-free preprocessing framework, Heterogeneity Rescaling Pursuit (Hetero-RP), which weighs important features more highly than less important ones in accord with implicitly existing auxiliary knowledge. We demonstrate the effectiveness of Hetero-RP in diverse clustering and classification applications. More importantly, Hetero-RP offers an interpretation of feature importance, shedding light on the driving forces of the underlying biology. In metagenomic contig binning, Hetero-RP automatically weighs abundance and composition profiles according to the varying number of samples, resulting in markedly improved performance of contig binning. In RNA-binding protein (RBP) binding site prediction, Hetero-RP not only improves the prediction performance measured by the area under the receiver operating characteristic curves (AUC), but also uncovers evidence supported by independent studies, including the distribution of the binding sites of IGF2BP and PUM2, the binding competition between hnRNPC and U2AF2, and the intron–exon boundary of U2AF2 [availability: https://github.com/younglululu/Hetero-RP]. PMID:28977511
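    The core preprocessing idea, scaling each feature block by a weight before clustering, can be sketched generically. The weights below are placeholders chosen for the example; Hetero-RP itself derives them from auxiliary knowledge rather than taking them as input.

```python
# Feature rescaling in the spirit of Hetero-RP: multiply each feature
# block (e.g. abundance vs. composition profiles of contigs) by a
# per-block weight before handing the rows to a clustering algorithm.
def rescale(blocks, weights):
    """blocks: {name: list of row vectors}; weights: {name: scalar}.
    Returns concatenated rows with each block scaled by its weight."""
    names = list(blocks)
    rows = len(blocks[names[0]])
    out = []
    for i in range(rows):
        row = []
        for name in names:
            row.extend(weights[name] * x for x in blocks[name][i])
        out.append(row)
    return out

# two contigs, each with a 2-dim abundance and 3-dim composition profile
blocks = {"abundance": [[1.0, 2.0], [0.5, 0.5]],
          "composition": [[0.1] * 3, [0.2] * 3]}
scaled = rescale(blocks, {"abundance": 2.0, "composition": 0.5})
```

    Upweighting the abundance block makes distances between rows more sensitive to abundance differences, which mirrors the abstract's point that the useful balance between the two profiles shifts with the number of samples.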

  9. Ionization-potential depression and other dense plasma statistical property studies - Application to spectroscopic diagnostics.

    NASA Astrophysics Data System (ADS)

    Calisti, Annette; Ferri, Sandrine; Mossé, Caroline; Talin, Bernard

    2017-02-01

    The radiative properties of an emitter surrounded by a plasma, are modified through various mechanisms. For instance the line shapes emitted by bound-bound transitions are broadened and carry useful information for plasma diagnostics. Depending on plasma conditions the electrons occupying the upper quantum levels of radiators no longer exist as they belong to the plasma free electron population. All the charges present in the radiator environment contribute to the lowering of the energy required to free an electron in the fundamental state. This mechanism is known as ionization potential depression (IPD). The knowledge of IPD is useful as it affects both the radiative properties of the various ionic states and their populations. Its evaluation deals with highly complex n-body coupled systems, involving particles with different dynamics and attractive ion-electron forces. A classical molecular dynamics (MD) code, the BinGo-TCP code, has been recently developed to simulate neutral multi-component (various charge state ions and electrons) plasma accounting for all the charge correlations. In the present work, results on IPD and other dense plasma statistical properties obtained using the BinGo-TCP code are presented. The study focuses on aluminum plasmas for different densities and several temperatures in order to explore different plasma coupling conditions.

  10. Defense.gov Special Report: The Demise of Osama bin Laden

    Science.gov Websites

    News stories and transcripts from May 2011 on the death of Osama bin Laden, including reports on its possible impact in Afghanistan (Seymour Johnson Air Force Base, N.C.) and on leaders honoring 9/11 victims at Ground Zero (Washington, May 5, 2011).

  11. 5. Detail of bin wall, showing the thinner exterior wall ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Detail of bin wall, showing the thinner exterior wall next to the inner wall with its alternating courses of channel tile and hollow tile. - Saint Anthony Elevator No. 3, 620 Malcom Avenue, Southeast, Minneapolis, Hennepin County, MN

  12. Tensor network states in time-bin quantum optics

    NASA Astrophysics Data System (ADS)

    Lubasch, Michael; Valido, Antonio A.; Renema, Jelmer J.; Kolthammer, W. Steven; Jaksch, Dieter; Kim, M. S.; Walmsley, Ian; García-Patrón, Raúl

    2018-06-01

    The current shift in the quantum optics community towards experiments with many modes and photons necessitates new classical simulation techniques that efficiently encode many-body quantum correlations and go beyond the usual phase-space formulation. To address this pressing demand we formulate linear quantum optics in the language of tensor network states. We extensively analyze the quantum and classical correlations of time-bin interference in a single fiber loop. We then generalize our results to more complex time-bin quantum setups and identify different classes of architectures for high-complexity and low-overhead boson sampling experiments.

  13. Mycoflora and aflatoxin production in pigeon pea stored in jute sacks and iron bins.

    PubMed

    Bankole, S A; Eseigbe, D A; Enikuomehin, O A

    The mycoflora, moisture content and aflatoxin contamination of pigeon pea (Cajanus cajan (L.) Millsp.) stored in jute sacks and iron bins were determined at monthly intervals for a year. The predominant fungi on freshly harvested seeds were Alternaria spp., Botryodiplodia theobromae, Fusarium spp. and Phoma spp. These fungi gradually disappeared from stored seeds with time, and by 5-6 months most were no longer isolated. The fungi that succeeded the initially dominant ones were mainly members of the genera Aspergillus, Penicillium and Rhizopus. Populations of these fungi increased up to the end of the one-year storage period. A higher incidence of mycoflora and of Aspergillus flavus was recorded in jute-sack samples throughout the storage period. The moisture content of stored seeds fluctuated with the prevailing weather conditions, being lower during the dry season and slightly higher during the wet season. The stored seeds were free of aflatoxins for 3 and 5 months in jute sacks and iron bins, respectively. The level of aflatoxins detected in the jute-sack storage system was considerably higher than that in the iron-bin system. Of 196 isolates of A. flavus screened, 48% were toxigenic in liquid culture (54% from jute sacks and 41% from iron bins).

  14. Molecular typing of antibiotic-resistant Staphylococcus aureus in Nigeria.

    PubMed

    O'Malley, S M; Emele, F E; Nwaokorie, F O; Idika, N; Umeizudike, A K; Emeka-Nwabunnia, I; Hanson, B M; Nair, R; Wardyn, S E; Smith, T C

    2015-01-01

    Antibiotic-resistant Staphylococcus aureus including methicillin-resistant strains (MRSA) are a major concern in densely populated urban areas. Initial studies of S. aureus in Nigeria indicated the existence of antibiotic-resistant S. aureus strains in clinical and community settings. Seventy-three biological samples (40 throat, 23 nasal, 10 wound) were collected from patients and healthcare workers in three populations in Nigeria: Lagos University Teaching Hospital, Nigerian Institute of Medical Research, and Owerri General Hospital. S. aureus was isolated from 38 of 73 samples (52%). Of the 38 S. aureus samples, 9 (24%) carried the Panton-Valentine leukocidin gene (PVL) while 16 (42%) possessed methicillin resistance genes (mecA). Antibiotic susceptibility profiles indicated resistance to several broad-spectrum antibiotics. Antibiotic-resistant S. aureus isolates were recovered from clinical and community settings in Nigeria. Insight about S. aureus in Nigeria may be used to improve antibiotic prescription methods and minimize the spread of antibiotic-resistant organisms in highly populated urban communities similar to Lagos, Nigeria. Copyright © 2014 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  15. A 3.9 ps Time-Interval RMS Precision Time-to-Digital Converter Using a Dual-Sampling Method in an UltraScale FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Liu, Chong

    2016-10-01

    Field programmable gate arrays (FPGAs) manufactured with more advanced processing technology have faster carry chains and smaller delay elements, which are favorable for the design of tapped delay line (TDL)-style time-to-digital converters (TDCs) in FPGA. However, new challenges are posed in using them to implement TDCs with a high time precision. In this paper, we propose a bin realignment method and a dual-sampling method for TDC implementation in a Xilinx UltraScale FPGA. The former realigns the disordered time delay taps so that the TDC precision can approach the limit of its delay granularity, while the latter doubles the number of taps in the delay line so that the TDC precision beyond the cell delay limitation can be expected. Two TDC channels were implemented in a Kintex UltraScale FPGA, and the effectiveness of the new methods was evaluated. For fixed time intervals in the range from 0 to 440 ns, the average RMS precision measured by the two TDC channels reaches 5.8 ps using the bin realignment, and it further improves to 3.9 ps by using the dual-sampling method. The time precision has a 5.6% variation in the measured temperature range. Every part of the TDC, including dual-sampling, encoding, and on-line calibration, could run at a 500 MHz clock frequency. The system measurement dead time is only 4 ns.
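    The on-line calibration step common to TDL-style TDCs can be sketched with a code-density test: under uniformly distributed input hits, each tap's hit count is proportional to its bin width, and cumulative sums place each bin's center in time. The hit counts and clock period below are invented for the example; this is not the authors' realignment or dual-sampling logic.

```python
# Code-density calibration for a delay-line TDC (illustrative).
def calibrate(hits, clock_period_ps):
    """hits: per-tap hit counts from uniformly distributed events.
    Returns (bin widths in ps, bin centers in ps) spanning one clock."""
    total = sum(hits)
    widths = [clock_period_ps * h / total for h in hits]
    centers, edge = [], 0.0
    for w in widths:
        centers.append(edge + w / 2.0)  # center = left edge + half width
        edge += w
    return widths, centers

hits = [90, 110, 100, 100]                 # hypothetical per-tap counts
widths, centers = calibrate(hits, 2000.0)  # 500 MHz clock -> 2000 ps
```

    A raw timestamp (a tap index) is then converted to picoseconds by looking up that tap's calibrated center, which compensates for the unequal delay-cell granularity the abstract describes.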

  16. 35. Coal Fuel Elevator (diagonal in center), Fuel Elevator (left), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    35. Coal Fuel Elevator (diagonal in center), Fuel Elevator (left), Fuel Storage Bins (center), and Power Plant (far center), and Retail Coal Storage Bins (right) Photograph taken by George Harven - Huber Coal Breaker, 101 South Main Street, Ashley, Luzerne County, PA

  17. 34. Coal Fuel Elevator (diagonal in foreground), Fuel Elevator (left), ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    34. Coal Fuel Elevator (diagonal in foreground), Fuel Elevator (left), Fuel Storage Bins (center), and Power Plant (far center), and Retail Coal Storage Bins (right) Photograph taken by George Harven - Huber Coal Breaker, 101 South Main Street, Ashley, Luzerne County, PA

  18. 9. EMPIRE STATE MINE, BOTTOM ORE BIN/SHOOT. TIN ROOF OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. EMPIRE STATE MINE, BOTTOM ORE BIN/SHOOT. TIN ROOF OF SOUTHERN MOST BUILDING AND UPPER ORE SHOOT VISIBLE. CAMERA POINTED EAST-NORTHEAST. - Florida Mountain Mining Sites, Empire State Mine, West side of Florida Mountain, Silver City, Owyhee County, ID

  19. Technical Note: spektr 3.0—A computational tool for x-ray spectrum modeling and analysis

    PubMed Central

    Punnoose, J.; Xu, J.; Sisniega, A.; Zbijewski, W.; Siewerdsen, J. H.

    2016-01-01

    Purpose: A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP) spectral model. The toolkit includes a matlab (The Mathworks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. Methods: The spektr code generates x-ray spectra (photons/mm2/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20–150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. Results: The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30–140 kV with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. Conclusions: The computational toolkit, spektr, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the spektr function library, UI, and optimization tool are available. PMID:27487888
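    The added-filtration matching described in this abstract amounts to attenuating a binned spectrum by Beer-Lambert factors and searching for the thickness whose total output best matches a measurement. The sketch below uses toy spectrum values and made-up attenuation coefficients, not TASMICS data or the spektr MATLAB API.

```python
import math

def filtered_output(spectrum, mu_al, t_mm):
    """Total fluence after Beer-Lambert attenuation by t_mm of Al.
    spectrum: {energy_keV: photons per bin}; mu_al: {energy_keV: 1/mm}."""
    return sum(n * math.exp(-mu_al[e] * t_mm) for e, n in spectrum.items())

spectrum = {30: 1.0e5, 60: 3.0e5, 90: 1.5e5}   # toy photons per 1 keV bin
mu_al    = {30: 0.30, 60: 0.08, 90: 0.05}      # per mm (illustrative)
measured = 3.2e5                               # hypothetical tube output

# grid-search the Al thickness (0-10 mm, 0.1 mm steps) that best
# matches the measured output, in the spirit of the optimization tool
best_t = min((t / 10.0 for t in range(0, 101)),
             key=lambda t: abs(filtered_output(spectrum, mu_al, t) - measured))
```

    Because low-energy bins attenuate fastest, added filtration hardens the beam as well as reducing output; the real tool fits Al and W jointly against mGy/mAs or mR/mAs measurements.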

  20. Technical Note: SPEKTR 3.0—A computational tool for x-ray spectrum modeling and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, J.; Xu, J.; Sisniega, A.

    2016-08-15

    Purpose: A computational toolkit (SPEKTR 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP) spectral model. The toolkit includes a MATLAB (The Mathworks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. Methods: The SPEKTR code generates x-ray spectra (photons/mm2/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20–150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. Results: The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30–140 kV with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. Conclusions: The computational toolkit, SPEKTR, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the SPEKTR function library, UI, and optimization tool are available.

  1. First Measurement of the E Double-polarization Observable for the γη → Κ+Σ- with CLAS & a New Forward Tagger Hodoscope for CLAS12

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleming, Jamie

    Establishing the excitation spectrum of the nucleon would be a key advance to further our understanding of nucleon structure and Quantum Chromodynamics (QCD). Recent theoretical advances allow predictions of the excitation spectrum of the nucleon and other nucleon properties directly from QCD in the nonperturbative regime, via numerical methods (such as Lattice QCD), complementing existing constituent quark models. There is an ongoing world programme in meson photoproduction from the nucleon, which has already led to a number of nucleon resonances being discovered and established. This advance has largely been made possible by the first accurate measurement of polarisation observables. Available data has been obtained for proton targets, whereas for a complete picture of meson photoproduction, data from the neutron must also be obtained. This is important, as nucleon resonances can have very different photo-couplings to the proton and neutron. This thesis presents the first measurement of the E double-polarisation observable for the exclusive γη → Κ+Σ- reaction using a polarised hydrogen-deuterium target from the g14 run period at CLAS. Circularly polarised photons of energies between 1.1 and 2.3 GeV were used, with results shown in 200 MeV bins in E and bins of 0.4 in cos θC.M. of the Κ+. Further to this, CLAS has undergone a detector upgrade in order to facilitate electrons of up to 12 GeV from Jefferson Lab's upgraded accelerator. Essential to this is a new system for tagging quasi-real photons by detecting electrons scattered at very small angles. My work includes significant contributions to the design, realisation and construction of a hodoscope for this forward photon tagging apparatus. Presented in this thesis is a comprehensive overview of my work in developing and constructing the scintillating hodoscope for the CLAS12 Forward Tagger.

  2. Metagenomics Reveals Pervasive Bacterial Populations and Reduced Community Diversity across the Alaska Tundra Ecosystem

    DOE PAGES

    Johnston, Eric R.; Rodriguez-R, Luis M.; Luo, Chengwei; ...

    2016-04-25

    How soil microbial communities contrast with respect to taxonomic and functional composition within and between ecosystems remains an unresolved question that is central to predicting how global anthropogenic change will affect soil functioning and services. In particular, it remains unclear how small-scale observations of soil communities based on the typical volume sampled (1-2 g) are generalizable to ecosystem-scale responses and processes. This is especially relevant for remote, northern latitude soils, which are challenging to sample and are also thought to be more vulnerable to climate change compared to temperate soils. Here, we employed well-replicated shotgun metagenome and 16S rRNA gene amplicon sequencing to characterize community composition and metabolic potential in Alaskan tundra soils, combining our own datasets with those publically available from distant tundra and temperate grassland and agriculture habitats. We found that the abundance of many taxa and metabolic functions differed substantially between tundra soil metagenomes relative to those from temperate soils, and that a high degree of OTU-sharing exists between tundra locations. Tundra soils were an order of magnitude less complex than their temperate counterparts, allowing for near-complete coverage of microbial community richness (~92% breadth) by sequencing, and the recovery of 27 high-quality, almost complete ( > 80% completeness) population bins. These population bins, collectively, made up to ~10% of the metagenomic datasets, and represented diverse taxonomic groups and metabolic lifestyles tuned toward sulfur cycling, hydrogen metabolism, methanotrophy, and organic matter oxidation. Several population bins, including members of Acidobacteria, Actinobacteria, and Proteobacteria, were also present in geographically distant (~100-530 km apart) tundra habitats (full genome representation and up to 99.6% genome-derived average nucleotide identity). 
Collectively, our results revealed that Alaska tundra microbial communities are less diverse and more homogenous across spatial scales than previously anticipated, and provided DNA sequences of abundant populations and genes that would be relevant for future studies of the effects of environmental change on tundra ecosystems.

  3. Metagenomics Reveals Pervasive Bacterial Populations and Reduced Community Diversity across the Alaska Tundra Ecosystem.

    PubMed

    Johnston, Eric R; Rodriguez-R, Luis M; Luo, Chengwei; Yuan, Mengting M; Wu, Liyou; He, Zhili; Schuur, Edward A G; Luo, Yiqi; Tiedje, James M; Zhou, Jizhong; Konstantinidis, Konstantinos T

    2016-01-01

    How soil microbial communities contrast with respect to taxonomic and functional composition within and between ecosystems remains an unresolved question that is central to predicting how global anthropogenic change will affect soil functioning and services. In particular, it remains unclear how small-scale observations of soil communities based on the typical volume sampled (1-2 g) are generalizable to ecosystem-scale responses and processes. This is especially relevant for remote, northern latitude soils, which are challenging to sample and are also thought to be more vulnerable to climate change compared to temperate soils. Here, we employed well-replicated shotgun metagenome and 16S rRNA gene amplicon sequencing to characterize community composition and metabolic potential in Alaskan tundra soils, combining our own datasets with those publically available from distant tundra and temperate grassland and agriculture habitats. We found that the abundance of many taxa and metabolic functions differed substantially between tundra soil metagenomes relative to those from temperate soils, and that a high degree of OTU-sharing exists between tundra locations. Tundra soils were an order of magnitude less complex than their temperate counterparts, allowing for near-complete coverage of microbial community richness (~92% breadth) by sequencing, and the recovery of 27 high-quality, almost complete (>80% completeness) population bins. These population bins, collectively, made up to ~10% of the metagenomic datasets, and represented diverse taxonomic groups and metabolic lifestyles tuned toward sulfur cycling, hydrogen metabolism, methanotrophy, and organic matter oxidation. Several population bins, including members of Acidobacteria, Actinobacteria, and Proteobacteria, were also present in geographically distant (~100-530 km apart) tundra habitats (full genome representation and up to 99.6% genome-derived average nucleotide identity). 
Collectively, our results revealed that Alaska tundra microbial communities are less diverse and more homogenous across spatial scales than previously anticipated, and provided DNA sequences of abundant populations and genes that would be relevant for future studies of the effects of environmental change on tundra ecosystems.

  4. Metagenomics Reveals Pervasive Bacterial Populations and Reduced Community Diversity across the Alaska Tundra Ecosystem

    PubMed Central

    Johnston, Eric R.; Rodriguez-R, Luis M.; Luo, Chengwei; Yuan, Mengting M.; Wu, Liyou; He, Zhili; Schuur, Edward A. G.; Luo, Yiqi; Tiedje, James M.; Zhou, Jizhong; Konstantinidis, Konstantinos T.

    2016-01-01

    How soil microbial communities contrast with respect to taxonomic and functional composition within and between ecosystems remains an unresolved question that is central to predicting how global anthropogenic change will affect soil functioning and services. In particular, it remains unclear how small-scale observations of soil communities based on the typical volume sampled (1–2 g) are generalizable to ecosystem-scale responses and processes. This is especially relevant for remote, northern latitude soils, which are challenging to sample and are also thought to be more vulnerable to climate change compared to temperate soils. Here, we employed well-replicated shotgun metagenome and 16S rRNA gene amplicon sequencing to characterize community composition and metabolic potential in Alaskan tundra soils, combining our own datasets with those publically available from distant tundra and temperate grassland and agriculture habitats. We found that the abundance of many taxa and metabolic functions differed substantially between tundra soil metagenomes relative to those from temperate soils, and that a high degree of OTU-sharing exists between tundra locations. Tundra soils were an order of magnitude less complex than their temperate counterparts, allowing for near-complete coverage of microbial community richness (~92% breadth) by sequencing, and the recovery of 27 high-quality, almost complete (>80% completeness) population bins. These population bins, collectively, made up to ~10% of the metagenomic datasets, and represented diverse taxonomic groups and metabolic lifestyles tuned toward sulfur cycling, hydrogen metabolism, methanotrophy, and organic matter oxidation. Several population bins, including members of Acidobacteria, Actinobacteria, and Proteobacteria, were also present in geographically distant (~100–530 km apart) tundra habitats (full genome representation and up to 99.6% genome-derived average nucleotide identity). 
Collectively, our results revealed that Alaska tundra microbial communities are less diverse and more homogenous across spatial scales than previously anticipated, and provided DNA sequences of abundant populations and genes that would be relevant for future studies of the effects of environmental change on tundra ecosystems. PMID:27199914

  5. Balmer Filaments in Tycho’s Supernova Remnant: An Interplay between Cosmic-ray and Broad-neutral Precursors

    NASA Astrophysics Data System (ADS)

    Knežević, Sladjana; Läsker, Ronald; van de Ven, Glenn; Font, Joan; Raymond, John C.; Bailer-Jones, Coryn A. L.; Beckman, John; Morlino, Giovanni; Ghavamian, Parviz; Hughes, John P.; Heng, Kevin

    2017-09-01

    We present Hα spectroscopic observations and detailed modeling of the Balmer filaments in the supernova remnant (SNR) Tycho (SN 1572). We used GHαFaS (Galaxy Hα Fabry-Pérot Spectrometer) on the William Herschel Telescope with a 3.4′ × 3.4′ field of view, 0.2″ pixel scale, and σ_instr = 8.1 km s-1 resolution at 1″ seeing for ~10 hr, resulting in 82 spatial-spectral bins that resolve the narrow Hα line in the entire SN 1572 northeastern rim. For the first time, we can therefore mitigate artificial line broadening from unresolved differential motion and probe Hα emission parameters in varying shock and ambient medium conditions. The broad Hα line remains unresolved within the spectral coverage of 392 km s-1. We employed Bayesian inference to obtain reliable parameter confidence intervals and to quantify the evidence for models with multiple line components. The median Hα narrow-line (NL) FWHM of all bins and models is W_NL = (54.8 ± 1.8) km s-1 at the 95% confidence level, varying within [35, 72] km s-1 between bins and clearly broadened compared to the intrinsic (thermal) ≈20 km s-1. Possible line splits are accounted for, significant in ≈18% of the filament, and presumably due to remaining projection effects. We also find widespread evidence for intermediate-line emission of a broad-neutral precursor, with a median W_IL = (180 ± 14) km s-1 (95% confidence). Finally, we present a measurement of the remnant's systemic velocity, V_LSR = -34 km s-1, and map differential line-of-sight motions. Our results confirm the existence and interplay of shock precursors in Tycho's remnant. In particular, we show that suprathermal NL emission is near-universal in SN 1572, and that, in the absence of an alternative explanation, collisionless SNR shocks constitute a viable acceleration source for Galactic TeV cosmic-ray protons.

  6. Scuttle Flies (Diptera: Phoridae) Inhabiting Rabbit Carcasses Confined to Plastic Waste Bins in Malaysia Include New Records and an Undescribed Species.

    PubMed

    Zuha, Raja M; Huong-Wen, See; Disney, R Henry L; Omar, Baharudin

    2017-01-01

    Scuttle flies (Diptera: Phoridae) are small-sized insects of forensic importance. They are well known for diversified species and habitats, but in the context of forensic entomology, scuttle flies' inhabitance of corpses remains inadequately explored. With recent reports indicating the existence of more scuttle fly species possibly inhabiting these environments, a decomposition study using animal carcasses in enclosed environments was conducted. The aim was to record the occurrence of scuttle flies on rabbit carcasses placed in sealed plastic waste bins for a 40-day period. The study was conducted as two replicates in Bangi, Selangor. Sampling was carried out at different time intervals inside a modified mosquito net as a trap. Inside the trap, adult scuttle flies were aspirated and preserved in 70% ethanol. The fly larvae and pupae were reared until their adult stage to facilitate identification. From this study, six scuttle fly species were collected, i.e., Dahliphora sigmoides (Schmitz) ♂, Gymnoptera simplex (Brues) ♀, Megaselia scalaris (Loew) ♂♀, Puliciphora borinquenensis (Wheeler) ♂, Puliciphora obtecta Meijere ♀ and Spiniphora sp. ♀. Both D. sigmoides and P. obtecta were newly recorded in Malaysia, whilst the Spiniphora sp. was considered an unknown species until it was linked to its male counterpart. The sealed waste bins were found to be accessible for the scuttle flies with delayed arrival (day 4-5). Megaselia scalaris was the primary scuttle fly species attracted to the carcass, and its occurrence could be observed between days 4-7 (replicate 1) and days 5-33 (replicate 2). This study also revealed Sarcophaga spp. (Diptera: Sarcophagidae) as the earliest species to colonize the remains and the longest to inhabit them (days 2-40). The larvae of Hermetia illucens (Linnaeus) (Diptera: Stratiomyidae) and Fannia sp. (Diptera: Fanniidae) were found on the carcasses during the mid-advanced decay period. 
These findings expand the knowledge on the diversity of forensically important scuttle flies and coexisting dipterans in enclosed environments in Malaysia.

  7. RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOZLOWSKI, S.D.

    2007-05-30

    This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA) and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.
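    The frequency-times-consequence assignment at the heart of risk binning can be sketched as a simple category lookup. The category names, scoring rule, and bin thresholds below are a generic illustration invented for the example, not the matrix defined in DOE-STD-3009-94 or the RPP documents.

```python
# Risk-binning sketch: a representative accident is assigned a bin from
# its estimated frequency and consequence categories; higher bins drive
# the need for safety SSCs and TSR-level controls.
FREQ = ["beyond-extremely-unlikely", "extremely-unlikely",
        "unlikely", "anticipated"]
CONS = ["negligible", "low", "moderate", "high"]

def risk_bin(frequency, consequence):
    """Map categorical frequency and consequence to a risk bin I-IV
    (I = most severe, controls required; thresholds are illustrative)."""
    score = FREQ.index(frequency) + CONS.index(consequence)
    if score >= 5:
        return "I"
    if score >= 3:
        return "II"
    if score >= 2:
        return "III"
    return "IV"

bin_for_accident = risk_bin("unlikely", "moderate")
```

    In the actual process the public and collocated worker are binned separately, and facility-worker hazards bypass the matrix entirely, as the abstract notes.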

  8. Temporal binning of time-correlated single photon counting data improves exponential decay fits and imaging speed

    PubMed Central

    Walsh, Alex J.; Sharick, Joe T.; Skala, Melissa C.; Beier, Hope T.

    2016-01-01

    Time-correlated single photon counting (TCSPC) enables acquisition of fluorescence lifetime decays with high temporal resolution within the fluorescence decay. However, many thousands of photons per pixel are required for accurate lifetime decay curve representation, instrument response deconvolution, and lifetime estimation, particularly for two-component lifetimes. TCSPC imaging speed is inherently limited due to the single photon per laser pulse nature and low fluorescence event efficiencies (<10%) required to reduce bias towards short lifetimes. Here, simulated fluorescence lifetime decays are analyzed by SPCImage and SLIM Curve software to determine the limiting lifetime parameters and photon requirements of fluorescence lifetime decays that can be accurately fit. Data analysis techniques to improve fitting accuracy for low photon count data were evaluated. Temporal binning of the decays from 256 time bins to 42 time bins significantly (p<0.0001) improved fit accuracy in SPCImage and enabled accurate fits with low photon counts (as low as 700 photons/decay), a 6-fold reduction in required photons and therefore improvement in imaging speed. Additionally, reducing the number of free parameters in the fitting algorithm by fixing the lifetimes to known values significantly reduced the lifetime component error from 27.3% to 3.2% in SPCImage (p<0.0001) and from 50.6% to 4.2% in SLIM Curve (p<0.0001). Analysis of nicotinamide adenine dinucleotide–lactate dehydrogenase (NADH-LDH) solutions confirmed temporal binning of TCSPC data and a reduced number of free parameters improves exponential decay fit accuracy in SPCImage. Altogether, temporal binning (in SPCImage) and reduced free parameters are data analysis techniques that enable accurate lifetime estimation from low photon count data and enable TCSPC imaging speeds up to 6x and 300x faster, respectively, than traditional TCSPC analysis. PMID:27446663
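    The temporal binning the abstract describes (collapsing a 256-bin decay to 42 coarser bins before fitting) amounts to summing adjacent time bins while preserving the total photon count. A minimal sketch, assuming the decay arrives as a 1-D array of counts per time bin; `rebin_decay` is a hypothetical helper, not the SPCImage or SLIM Curve API:

    ```python
    import numpy as np

    def rebin_decay(counts, n_out):
        """Temporally rebin a TCSPC decay by summing adjacent time bins.

        counts : 1-D array of photon counts per original time bin.
        n_out  : target number of coarser time bins.
        """
        counts = np.asarray(counts)
        # Zero-pad so the length divides evenly into n_out groups,
        # then sum each group; total photon count is preserved.
        pad = (-len(counts)) % n_out
        padded = np.pad(counts, (0, pad))
        return padded.reshape(n_out, -1).sum(axis=1)

    # Illustrative bi-exponential decay over 256 bins (arbitrary parameters).
    t = np.linspace(0, 10, 256)
    decay = 700 * (0.6 * np.exp(-t / 0.4) + 0.4 * np.exp(-t / 2.5))
    coarse = rebin_decay(decay, 42)
    ```

    Each coarse bin then carries roughly 6x the counts of an original bin, which is what lets fits converge at photon budgets as low as ~700 photons per decay.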

  9. Sensor Technologies for Particulate Detection and Characterization

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.

    2008-01-01

    Planned Lunar missions have resulted in renewed attention to problems attributable to fine particulates. While the difficulties experienced during the sequence of Apollo missions did not prove critical in all cases, the comparatively long duration of impending missions may present a different situation. This situation creates the need for a spectrum of particulate sensing technologies. From a fundamental perspective, an improved understanding of the properties of the dust fraction is required. Described here is laboratory-based reference instrumentation for the measurement of fundamental particle size distribution (PSD) functions from 2.5 nanometers to 20 micrometers. Concomitant efforts for separating samples into fractional size bins are also presented. A requirement also exists for developing mission compatible sensors. Examples include provisions for air quality monitoring in spacecraft and remote habitation modules. Required sensor attributes such as low mass, volume, and power consumption, autonomy of operation, and extended reliability cannot be accommodated by existing technologies.

  10. Optimal updating magnitude in adaptive flat-distribution sampling

    NASA Astrophysics Data System (ADS)

    Zhang, Cheng; Drake, Justin A.; Ma, Jianpeng; Pettitt, B. Montgomery

    2017-11-01

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
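    The single-bin updating scheme with the inverse-time schedule mentioned in the abstract can be sketched as follows: keep the updating magnitude constant at 1/t0 during an initial stage, then decay it as 1/t. This is a hypothetical toy implementation over discrete histogram bins, not the authors' code:

    ```python
    import numpy as np

    def wl_flat_sampler(energies, n_steps, t0=1000, rng=None):
        """Single-bin Wang-Landau-style flat-distribution sampling.

        A bias potential is built by histogram-based updates; the updating
        magnitude follows the inverse-time schedule a(t) = 1/max(t, t0),
        i.e. constant early on, then the asymptotically optimal 1/t decay.
        """
        rng = np.random.default_rng(rng)
        n_bins = len(energies)
        bias = np.zeros(n_bins)  # adaptive bias potential (log scale)
        state = 0
        for t in range(1, n_steps + 1):
            # Propose a neighboring bin; accept by a biased Metropolis rule.
            prop = (state + rng.choice([-1, 1])) % n_bins
            d = (energies[prop] + bias[prop]) - (energies[state] + bias[state])
            if d <= 0 or rng.random() < np.exp(-d):
                state = prop
            # Single-bin update of the visited bin with the 1/t schedule.
            bias[state] += 1.0 / max(t, t0)
        return bias - bias.min()
    ```

    As the bias converges it offsets the potential of mean force, so all bins are visited with nearly equal frequency.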

  11. Optimal updating magnitude in adaptive flat-distribution sampling.

    PubMed

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.

  12. Variation in temperature and mechanical properties of asphaltic concrete due to storage in surge bins.

    DOT National Transportation Integrated Search

    1973-03-01

    The Louisiana Department of Highways specifications on asphaltic concrete allow the contractors to use silos or surge bins for storage of asphaltic concrete mixtures. However, the maximum allowable storage time of the hot mix, if the contractor elect...

  13. 13. Interior view, grain tanks (bins). Barrel view of overhead ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. Interior view, grain tanks (bins). Barrel view of overhead (fill) conveyor gallery bridge extending through tops of tanks just below roofs. Grain tripper straddles belt conveyor at mid-view. - Saint Anthony Elevator No. 3, 620 Malcom Avenue, Southeast, Minneapolis, Hennepin County, MN

  14. Building with integral solar-heat storage--Starkville, Mississippi

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Column supporting roof also houses rock-storage bin of solar-energy system supplying more than half building space heating load. Conventional heaters supply hot water. Since bin is deeper and narrower than normal, individual pebble size was increased to keep airflow resistance at minimum.

  15. 31. VIEW FROM SOUTHWEST TO CORNER WHERE SAMPLING/CRUSHING ADDITIONS ABUT ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    31. VIEW FROM SOUTHWEST TO CORNER WHERE SAMPLING/CRUSHING ADDITIONS ABUT CRUSHED OXIDIZED ORE BIN. INTACT BARREN SOLUTION TANK VISIBLE IN FRONT OF CRUSHED ORE BIN. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  16. Entanglement between a Photonic Time-Bin Qubit and a Collective Atomic Spin Excitation.

    PubMed

    Farrera, Pau; Heinze, Georg; de Riedmatten, Hugues

    2018-03-09

    Entanglement between light and matter combines the advantage of long distance transmission of photonic qubits with the storage and processing capabilities of atomic qubits. To distribute photonic states efficiently over long distances several schemes to encode qubits have been investigated-time-bin encoding being particularly promising due to its robustness against decoherence in optical fibers. Here, we demonstrate the generation of entanglement between a photonic time-bin qubit and a single collective atomic spin excitation (spin wave) in a cold atomic ensemble, followed by the mapping of the atomic qubit onto another photonic qubit. A magnetic field that induces a periodic dephasing and rephasing of the atomic excitation ensures the temporal distinguishability of the two time bins and plays a central role in the entanglement generation. To analyze the generated quantum state, we use largely imbalanced Mach-Zehnder interferometers to perform projective measurements in different qubit bases and verify the entanglement by violating a Clauser-Horne-Shimony-Holt Bell inequality.

  17. Entanglement between a Photonic Time-Bin Qubit and a Collective Atomic Spin Excitation

    NASA Astrophysics Data System (ADS)

    Farrera, Pau; Heinze, Georg; de Riedmatten, Hugues

    2018-03-01

    Entanglement between light and matter combines the advantage of long distance transmission of photonic qubits with the storage and processing capabilities of atomic qubits. To distribute photonic states efficiently over long distances several schemes to encode qubits have been investigated—time-bin encoding being particularly promising due to its robustness against decoherence in optical fibers. Here, we demonstrate the generation of entanglement between a photonic time-bin qubit and a single collective atomic spin excitation (spin wave) in a cold atomic ensemble, followed by the mapping of the atomic qubit onto another photonic qubit. A magnetic field that induces a periodic dephasing and rephasing of the atomic excitation ensures the temporal distinguishability of the two time bins and plays a central role in the entanglement generation. To analyze the generated quantum state, we use largely imbalanced Mach-Zehnder interferometers to perform projective measurements in different qubit bases and verify the entanglement by violating a Clauser-Horne-Shimony-Holt Bell inequality.

  18. Coagulation algorithms with size binning

    NASA Technical Reports Server (NTRS)

    Statton, David M.; Gans, Jason; Williams, Eric

    1994-01-01

    The Smoluchowski equation describes the time evolution of an aerosol particle size distribution due to aggregation or coagulation. Any algorithm for computerized solution of this equation requires a scheme for describing the continuum of aerosol particle sizes as a discrete set. One standard form of the Smoluchowski equation accomplishes this by restricting the particle sizes to integer multiples of a basic unit particle size (the monomer size). This can be inefficient when particle concentrations over a large range of particle sizes must be calculated. Two algorithms employing a geometric size binning convention are examined: the first assumes that the aerosol particle concentration as a function of size can be considered constant within each size bin; the second approximates the concentration as a linear function of particle size within each size bin. The output of each algorithm is compared to an analytical solution in a special case of the Smoluchowski equation for which an exact solution is known . The range of parameters more appropriate for each algorithm is examined.
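    The geometric size-binning convention the abstract contrasts with monomer-multiple sizes can be sketched directly: bin edges grow by a fixed ratio, and a coagulation product is assigned to whichever bin brackets its combined volume. A minimal illustration (hypothetical helper names, not the paper's code):

    ```python
    import numpy as np

    def geometric_bins(v_min, ratio, n_bins):
        """Geometric binning: bin k spans volumes
        [v_min * ratio**k, v_min * ratio**(k+1))."""
        return v_min * ratio ** np.arange(n_bins + 1)

    def bin_index(volume, edges):
        """Index of the size bin containing 'volume'; coagulation products
        land in the bin that brackets their combined volume."""
        return int(np.searchsorted(edges, volume, side="right") - 1)

    # Monomer volume 1, doubling bins: 10 bins cover a 2**10 size range.
    edges = geometric_bins(1.0, 2.0, 10)
    # A bin-3 particle (volume 8) coagulating with a bin-4 particle (volume 16)
    # yields volume 24, which falls back into bin 4.
    k = bin_index(edges[3] + edges[4], edges)
    ```

    This is why geometric binning is efficient over wide size ranges: the same number of bins that covers sizes 1..N under the monomer convention covers 1..ratio**N here, at the cost of the intra-bin concentration approximation (constant or linear) the paper compares.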

  19. Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.

    PubMed

    Yin, Guosheng; Ma, Yanyuan

    2013-01-01

    The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adjusts exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructs the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
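    The procedure the abstract outlines, fitting the model to a bootstrap resample and then counting the original data in the partitioned bins, can be sketched for a simple normal model. This is a hypothetical illustration of the idea, not the authors' implementation:

    ```python
    import math
    import numpy as np

    def norm_cdf(x, mu, sigma):
        """Normal CDF via the error function."""
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

    def bootstrap_pearson_stat(data, n_bins=8, rng=None):
        """Pearson chi-squared statistic using a bootstrap-sample MLE.

        The MLE (here, normal mean and standard deviation) is fitted to a
        bootstrap resample; observed counts come from the ORIGINAL data,
        expected counts from the bootstrap MLE on the same bins.
        """
        rng = np.random.default_rng(rng)
        data = np.asarray(data, dtype=float)
        n = len(data)
        # MLE from a bootstrap resample, not from the original data.
        boot = rng.choice(data, size=n, replace=True)
        mu, sigma = boot.mean(), boot.std()
        # Partition the line into bins (outer bins open-ended) set by the
        # original data's quantiles, and count the original data.
        inner = np.quantile(data, np.linspace(0, 1, n_bins + 1)[1:-1])
        edges = np.concatenate(([-np.inf], inner, [np.inf]))
        observed, _ = np.histogram(data, bins=edges)
        cdf = np.array([0.0] + [norm_cdf(e, mu, sigma) for e in inner] + [1.0])
        expected = n * np.diff(cdf)
        return ((observed - expected) ** 2 / expected).sum()
    ```

    Per the abstract, injecting the bootstrap randomness into the MLE is what restores the chi-squared reference distribution that the original-data MLE destroys.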

  20. Using fault tree analysis to identify contributing factors to engulfment in flowing grain in on-farm grain bins.

    PubMed

    Kingman, D M; Field, W E

    2005-11-01

    Findings reported by researchers at Illinois State University and Purdue University indicated that since 1980, an average of eight individuals per year have become engulfed and died in farm grain bins in the U.S. and Canada and that all these deaths are significant because they are believed to be preventable. During a recent effort to develop intervention strategies and recommendations for an ASAE farm grain bin safety standard, fault tree analysis (FTA) was utilized to identify contributing factors to engulfments in grain stored in on-farm grain bins. FTA diagrams provided a spatial perspective of the circumstances that occurred prior to engulfment incidents, a perspective never before presented in other hazard analyses. The FTA also demonstrated relationships and interrelationships of the contributing factors. FTA is a useful tool that should be applied more often in agricultural incident investigations to assist in the more complete understanding of the problem studied.
