Sample records for parallel bin-based indexing

  1. Bin-Hash Indexing: A Parallel Method for Fast Query Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bethel, Edward W; Gosink, Luke J.; Wu, Kesheng

    2008-06-27

    This paper presents a new parallel indexing data structure for answering queries. The index, called Bin-Hash, offers extremely high levels of concurrency, and is therefore well-suited for the emerging class of commodity parallel processors, such as multi-core CPUs, cell processors, and general-purpose graphics processing units (GPUs). The Bin-Hash approach first bins the base data, and then partitions and separately stores the values in each bin as a perfect spatial hash table. To answer a query, we first determine whether or not a record satisfies the query conditions based on the bin boundaries. For the bins with records that cannot be resolved, we examine the spatial hash tables. The procedures for examining the bin numbers and the spatial hash tables offer the maximum possible level of concurrency; all records are able to be evaluated by our procedure independently in parallel. Additionally, our Bin-Hash procedures access much smaller amounts of data than similar parallel methods, such as the projection index. This smaller data footprint is critical for certain parallel processors, like GPUs, where memory resources are limited. To demonstrate the effectiveness of Bin-Hash, we implement it on a GPU using the data-parallel programming language CUDA. The concurrency offered by the Bin-Hash index allows us to fully utilize the GPU's massive parallelism in our work; over 12,000 records can be simultaneously evaluated at any one time. We show that our new query processing method is an order of magnitude faster than current state-of-the-art CPU-based indexing technologies. Additionally, we compare our performance to existing GPU-based projection index strategies.
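The two-phase evaluation described above (resolve each record from its bin boundaries first, consult stored values only for bins that straddle a query boundary) can be sketched in plain Python. This is an illustrative sketch only: the equal-width binning is an assumption, and the perfect spatial hashing and GPU parallelism of the actual Bin-Hash index are omitted.

```python
def build_bins(values, n_bins):
    """Bin the base data: equal-width bins (an assumption here), keeping
    the bin edges and each record's bin number."""
    lo, hi = min(values), max(values)
    width = (hi - lo) / n_bins
    edges = [lo + i * width for i in range(n_bins + 1)]
    bin_ids = [min(int((v - lo) / width), n_bins - 1) for v in values]
    return edges, bin_ids

def range_query(values, edges, bin_ids, q_lo, q_hi):
    """Answer q_lo <= v < q_hi: resolve each record from its bin boundaries
    alone, touching the raw value only when its bin straddles a boundary."""
    hits = []
    for v, b in zip(values, bin_ids):
        b_lo, b_hi = edges[b], edges[b + 1]
        if q_lo <= b_lo and b_hi <= q_hi:       # bin fully inside: hit
            hits.append(True)
        elif b_hi <= q_lo or b_lo >= q_hi:      # bin fully outside: miss
            hits.append(False)
        else:                                    # candidate: check raw value
            hits.append(q_lo <= v < q_hi)
    return hits
```

Because each record's decision depends only on its own bin number (and, for candidates, its own value), every iteration of the loop is independent, which is the property that lets the real index evaluate all records in parallel.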

  2. Data Parallel Bin-Based Indexing for Answering Queries on Multi-Core Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gosink, Luke; Wu, Kesheng; Bethel, E. Wes

    2009-06-02

    The multi-core trend in CPUs and general purpose graphics processing units (GPUs) offers new opportunities for the database community. The increase of cores at exponential rates is likely to affect virtually every server and client in the coming decade, and presents database management systems with a huge, compelling disruption that will radically change how processing is done. This paper presents a new parallel indexing data structure for answering queries that takes full advantage of the increasing thread-level parallelism emerging in multi-core architectures. In our approach, our Data Parallel Bin-based Index Strategy (DP-BIS) first bins the base data, and then partitions and stores the values in each bin as a separate, bin-based data cluster. In answering a query, the procedures for examining the bin numbers and the bin-based data clusters offer the maximum possible level of concurrency; each record is evaluated by a single thread and all threads are processed simultaneously in parallel. We implement and demonstrate the effectiveness of DP-BIS on two multi-core architectures: a multi-core CPU and a GPU. The concurrency afforded by DP-BIS allows us to fully utilize the thread-level parallelism provided by each architecture--for example, our GPU-based DP-BIS implementation simultaneously evaluates over 12,000 records with an equivalent number of concurrently executing threads. In comparing DP-BIS's performance across these architectures, we show that the GPU-based DP-BIS implementation requires significantly less computation time to answer a query than the CPU-based implementation. We also demonstrate in our analysis that DP-BIS provides better overall performance than the commonly utilized CPU and GPU-based projection index. 
Finally, due to data encoding, we show that DP-BIS accesses significantly smaller amounts of data than index strategies that operate solely on a column's base data; this smaller data footprint is critical for parallel processors that possess limited memory resources (e.g., GPUs).
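The one-thread-per-record concurrency DP-BIS relies on can be mimicked with a Python thread pool standing in for GPU threads. This is a minimal sketch under stated assumptions: the bin edges are hypothetical, and the three-way hit/miss/candidate outcome is an illustrative reading of the bin-number pass, not the paper's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

EDGES = [0.0, 2.5, 5.0, 7.5, 10.0]   # hypothetical bin boundaries

def classify(bin_id, q_lo, q_hi):
    """Decide one record from its bin number alone; a record whose bin
    straddles a query boundary becomes a candidate for a second pass."""
    b_lo, b_hi = EDGES[bin_id], EDGES[bin_id + 1]
    if q_lo <= b_lo and b_hi <= q_hi:
        return "hit"
    if b_hi <= q_lo or b_lo >= q_hi:
        return "miss"
    return "candidate"

def classify_all(bin_ids, q_lo, q_hi, workers=4):
    # one task per record: no shared state, so all tasks run concurrently
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(lambda b: classify(b, q_lo, q_hi), bin_ids))
```

The absence of shared mutable state is what makes the per-record mapping embarrassingly parallel, whether the "threads" are pool workers here or the thousands of GPU threads in the paper.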

  3. Breaking the Curse of Cardinality on Bitmap Indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Stockinger, Kurt

    2008-04-04

    Bitmap indexes are known to be efficient for ad-hoc range queries that are common in data warehousing and scientific applications. However, they suffer from the curse of cardinality, that is, their efficiency deteriorates as attribute cardinalities increase. A number of strategies have been proposed, but none of them addresses the problem adequately. In this paper, we propose a novel binned bitmap index that greatly reduces the cost to answer queries, and therefore breaks the curse of cardinality. The key idea is to augment the binned index with an Order-preserving Bin-based Clustering (OrBiC) structure. This data structure significantly reduces the I/O operations needed to resolve records that cannot be resolved with the bitmaps. To further improve the proposed index structure, we also present a strategy to create single-valued bins for frequent values. This strategy reduces index sizes and improves query processing speed. Overall, the binned indexes with OrBiC greatly improve query processing speed, and are 3-25 times faster than the best available indexes for high-cardinality data.
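The interplay between per-bin bitmaps and the order-preserving per-bin value clusters can be sketched as follows, using Python integers as bitsets. The equal-width bins and the "value <= x" predicate are illustrative choices, not the paper's implementation; the point is that only the single boundary bin's cluster is scanned, and its values are stored contiguously, as in OrBiC.

```python
import bisect

def build_index(values, edges):
    nbins = len(edges) - 1
    bitmaps = [0] * nbins                  # one bitmap per bin (ints as bitsets)
    clusters = [[] for _ in range(nbins)]  # OrBiC-style: per-bin values, in row order
    for row, v in enumerate(values):
        b = min(bisect.bisect_right(edges, v) - 1, nbins - 1)
        bitmaps[b] |= 1 << row
        clusters[b].append((row, v))
    return bitmaps, clusters

def query_leq(bitmaps, clusters, edges, x):
    """Answer 'value <= x': OR the bitmaps of bins entirely below x, then
    scan only the one boundary bin's cluster (a single sequential read)."""
    result = 0
    b = min(bisect.bisect_right(edges, x) - 1, len(bitmaps) - 1)
    for i in range(b):
        result |= bitmaps[i]               # bins fully below the cut-off
    for row, v in clusters[b]:             # boundary bin: check raw values
        if v <= x:
            result |= 1 << row
    return result
```

Keeping each bin's values clustered and in order is what turns the candidate check into sequential I/O instead of scattered reads against the base data.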

  4. The volatile compound BinBase mass spectral database.

    PubMed

    Skogerson, Kirsten; Wohlgemuth, Gert; Barupal, Dinesh K; Fiehn, Oliver

    2011-08-04

    Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, currently comprising 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). Mass spectra with retention indices and volatile profiles are available as a free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). 
The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement.
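A toy version of the tiered matching idea (retention-index window first, then spectral similarity, then strict criteria before a new bin is created) might look like the sketch below. All thresholds, field names, and the cosine measure are hypothetical stand-ins; vocBinBase's actual filters also use unique ions and other metadata not modeled here.

```python
import math

def similarity(a, b):
    """Cosine similarity between sparse spectra given as {m/z: intensity}."""
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in set(a) | set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def annotate(peak, db, ri_window=2.0, sim_threshold=0.8,
             min_sn=25.0, min_purity=0.9):
    """Tier 1: retention-index window. Tier 2: spectral similarity.
    Tier 3: an unmatched peak creates a new bin only if it meets strict
    signal-to-noise and purity criteria (all thresholds hypothetical)."""
    for entry in db:
        if (abs(peak["ri"] - entry["ri"]) <= ri_window
                and similarity(peak["spectrum"], entry["spectrum"]) >= sim_threshold):
            return entry["name"]
    if peak["sn"] >= min_sn and peak["purity"] >= min_purity:
        db.append({"name": "bin_%d" % len(db), "ri": peak["ri"],
                   "spectrum": peak["spectrum"]})
        return db[-1]["name"]
    return None
```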

  5. The volatile compound BinBase mass spectral database

    PubMed Central

    2011-01-01

    Background Volatile compounds comprise diverse chemical groups with wide-ranging sources and functions. These compounds originate from major pathways of secondary metabolism in many organisms and play essential roles in chemical ecology in both plant and animal kingdoms. In past decades, sampling methods and instrumentation for the analysis of complex volatile mixtures have improved; however, design and implementation of database tools to process and store the complex datasets have lagged behind. Description The volatile compound BinBase (vocBinBase) is an automated peak annotation and database system developed for the analysis of GC-TOF-MS data derived from complex volatile mixtures. The vocBinBase DB is an extension of the previously reported metabolite BinBase software developed to track and identify derivatized metabolites. The BinBase algorithm uses deconvoluted spectra and peak metadata (retention index, unique ion, spectral similarity, peak signal-to-noise ratio, and peak purity) from the Leco ChromaTOF software, and annotates peaks using a multi-tiered filtering system with stringent thresholds. The vocBinBase algorithm assigns the identity of compounds existing in the database. Volatile compound assignments are supported by the Adams mass spectral-retention index library, which contains over 2,000 plant-derived volatile compounds. Novel molecules that are not found within vocBinBase are automatically added using strict mass spectral and experimental criteria. Users obtain fully annotated data sheets with quantitative information for all volatile compounds for studies that may consist of thousands of chromatograms. The vocBinBase database may also be queried across different studies, comprising currently 1,537 unique mass spectra generated from 1.7 million deconvoluted mass spectra of 3,435 samples (18 species). 
Mass spectra with retention indices and volatile profiles are available as free download under the CC-BY agreement (http://vocbinbase.fiehnlab.ucdavis.edu). Conclusions The BinBase database algorithms have been successfully modified to allow for tracking and identification of volatile compounds in complex mixtures. The database is capable of annotating large datasets (hundreds to thousands of samples) and is well-suited for between-study comparisons such as chemotaxonomy investigations. This novel volatile compound database tool is applicable to research fields spanning chemical ecology to human health. The BinBase source code is freely available at http://binbase.sourceforge.net/ under the LGPL 2.0 license agreement. PMID:21816034

  6. Effects of empty bins on image upscaling in capsule endoscopy

    NASA Astrophysics Data System (ADS)

    Rukundo, Olivier

    2017-07-01

    This paper presents a preliminary study of the effect of empty bins on image upscaling in capsule endoscopy. The presented study was conducted based on results of existing contrast enhancement and interpolation methods. A low contrast enhancement method based on pixels consecutiveness and modified bilinear weighting scheme has been developed to distinguish between necessary empty bins and unnecessary empty bins in the effort to minimize the number of empty bins in the input image, before further processing. Linear interpolation methods have been used for upscaling input images with stretched histograms. Upscaling error differences and similarity indices between pairs of interpolation methods have been quantified using the mean squared error and feature similarity index techniques. Simulation results demonstrated more promising effects using the developed method than other contrast enhancement methods mentioned.
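The notion of empty histogram bins, and how a contrast stretch redistributes them, can be illustrated with a generic min-max stretch. This is not the paper's modified bilinear weighting scheme; it only shows why stretching an image's histogram creates or moves empty intensity levels.

```python
def empty_bins(pixels, levels=256):
    """Return the intensity levels with zero counts in the image histogram."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    return [i for i, c in enumerate(hist) if c == 0]

def stretch(pixels, levels=256):
    """Min-max contrast stretch: spread the occupied range over all levels,
    which changes where the empty bins fall in the output histogram."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return list(pixels)
    return [round((p - lo) * (levels - 1) / (hi - lo)) for p in pixels]
```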

  7. Generalized index for spatial data sets as a measure of complete spatial randomness

    NASA Astrophysics Data System (ADS)

    Hackett-Jones, Emily J.; Davies, Kale J.; Binder, Benjamin J.; Landman, Kerry A.

    2012-06-01

    Spatial data sets, generated from a wide range of physical systems can be analyzed by counting the number of objects in a set of bins. Previous work has been limited to equal-sized bins, which are inappropriate for some domains (e.g., circular). We consider a nonequal size bin configuration whereby overlapping or nonoverlapping bins cover the domain. A generalized index, defined in terms of a variance between bin counts, is developed to indicate whether or not a spatial data set, generated from exclusion or nonexclusion processes, is at the complete spatial randomness (CSR) state. Limiting values of the index are determined. Using examples, we investigate trends in the generalized index as a function of density and compare the results with those using equal size bins. The smallest bin size must be much larger than the mean size of the objects. We can determine whether a spatial data set is at the CSR state or not by comparing the values of a generalized index for different bin configurations—the values will be approximately the same if the data is at the CSR state, while the values will differ if the data set is not at the CSR state. In general, the generalized index is lower than the limiting value of the index, since objects do not have access to the entire region due to blocking by other objects. These methods are applied to two applications: (i) spatial data sets generated from a cellular automata model of cell aggregation in the enteric nervous system and (ii) a known plant data distribution.
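An index defined through the variance between bin counts can be sketched as below. This is an illustrative construction in the spirit of the abstract, not the paper's exact definition: it takes the ratio of the observed variance of bin counts to the variance expected under complete spatial randomness, with bin probabilities proportional to (possibly unequal) bin areas.

```python
import statistics

def csr_index(counts, areas):
    """Illustrative variance-based index: ~1 at the CSR state, > 1 for
    clustered data, < 1 for data more regular than random. Under CSR each
    bin count is Binomial(n, p_i) with p_i = area_i / total_area."""
    n = sum(counts)
    total_area = sum(areas)
    ps = [a / total_area for a in areas]
    expected_var = statistics.mean(n * p * (1 - p) for p in ps)
    observed_var = statistics.mean((c - n * p) ** 2 for c, p in zip(counts, ps))
    return observed_var / expected_var
```

Comparing the value of such an index across different bin configurations, as the paper proposes, distinguishes CSR data (similar values everywhere) from non-CSR data (values that differ by configuration).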

  8. Starting a DNA barcode reference library for shallow water polychaetes from the southern European Atlantic coast.

    PubMed

    Lobo, Jorge; Teixeira, Marcos A L; Borges, Luisa M S; Ferreira, Maria S G; Hollatz, Claudia; Gomes, Pedro T; Sousa, Ronaldo; Ravara, Ascensão; Costa, Maria H; Costa, Filipe O

    2016-01-01

    Annelid polychaetes have seldom been the focus of dedicated DNA barcoding studies, despite their ecological relevance and often dominance, particularly in soft-bottom estuarine and coastal marine ecosystems. Here, we report the first assessment of the performance of DNA barcodes in the discrimination of shallow water polychaete species from the southern European Atlantic coast, focusing on specimens collected in estuaries and coastal ecosystems of Portugal. We analysed cytochrome oxidase I DNA barcodes (COI-5P) from 164 specimens, which were assigned to 51 morphospecies. To our data set from Portugal, we added available published sequences selected from the same species, genus or family, to inspect for taxonomic congruence among studies and collection location. The final data set comprised 290 specimens and 79 morphospecies, which generated 99 Barcode Index Numbers (BINs) within Barcode of Life Data Systems (BOLD). Among these, 22 BINs were singletons, 47 other BINs were concordant, confirming the initial identification based on morphological characters, and 30 were discordant, most of which consisted of multiple BINs found for the same morphospecies. Some of the most prominent cases in the latter category include Hediste diversicolor (O.F. Müller, 1776) (7), Eulalia viridis (Linnaeus, 1767) (2) and Owenia fusiformis (delle Chiaje, 1844) (5), all of them reported from Portugal and frequently used in ecological studies as environmental quality indicators. Our results for these species showed discordance between molecular lineages and morphospecies, or added additional relatively divergent lineages. The potential inaccuracies in environmental assessments, where underpinning polychaete species diversity is poorly resolved or clarified, demand additional and extensive investigation of the DNA barcode diversity in this group, in parallel with alpha taxonomy efforts. © 2015 John Wiley & Sons Ltd.

  9. Real-time autocorrelator for fluorescence correlation spectroscopy based on graphical-processor-unit architecture: method, implementation, and comparative studies

    NASA Astrophysics Data System (ADS)

    Laracuente, Nicholas; Grossman, Carl

    2013-03-01

    We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general purpose computing with inexpensive GPU hardware. These devices are more suited for emulating hardware autocorrelators than traditional CPU-based software applications by emphasizing parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics pci card in a 3.2 GHz Intel i5 CPU based computer running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College
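A simplified serial version of the multi-tau scheme conveys the binning structure: at each level the autocorrelation is computed at a configurable number of lags, then the signal is coarsened by summing pairs of samples, doubling the lag spacing for the next level. This sketch omits the GPU memory mapping that is the point of the paper; it only illustrates the scheme itself.

```python
def multi_tau_autocorr(signal, points_per_bin=8, levels=3):
    """Simplified multi-tau autocorrelator: per level, compute the normalized
    autocorrelation g(tau) = <I(t) I(t+tau)> / <I>^2 at points_per_bin lags,
    then halve the time resolution by pairwise summation."""
    result = []                         # (lag, g) pairs
    data, dt = list(signal), 1
    for _ in range(levels):
        n = len(data)
        if n < 2:
            break
        mean = sum(data) / n
        if mean == 0:
            break
        for k in range(1, points_per_bin + 1):
            if k >= n:
                break
            g = sum(data[i] * data[i + k] for i in range(n - k)) / (n - k) / (mean * mean)
            result.append((k * dt, g))
        data = [data[i] + data[i + 1] for i in range(0, n - n % 2, 2)]
        dt *= 2
    return result
```

The geometric growth of lag spacing is what lets hardware correlators (and their GPU emulations) cover many decades of correlation time with a fixed, small number of bins per level.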

  10. BinSanity: unsupervised clustering of environmental microbial assemblies using coverage and affinity propagation

    PubMed Central

    Heidelberg, John F.; Tully, Benjamin J.

    2017-01-01

    Metagenomics has become an integral part of defining microbial diversity in various environments. Many ecosystems have characteristically low biomass and few cultured representatives. Linking potential metabolisms to phylogeny in environmental microorganisms is important for interpreting microbial community functions and the impacts these communities have on geochemical cycles. However, with metagenomic studies there is the computational hurdle of ‘binning’ contigs into phylogenetically related units or putative genomes. Binning methods have been implemented with varying approaches such as k-means clustering, Gaussian mixture models, hierarchical clustering, neural networks, and two-way clustering; however, many of these suffer from biases against low coverage/abundance organisms and closely related taxa/strains. We are introducing a new binning method, BinSanity, that utilizes the clustering algorithm affinity propagation (AP), to cluster assemblies using coverage with compositional based refinement (tetranucleotide frequency and percent GC content) to optimize bins containing multiple source organisms. This separation of composition and coverage based clustering reduces bias for closely related taxa. BinSanity was developed and tested on artificial metagenomes varying in size and complexity. Results indicate that BinSanity has a higher precision, recall, and Adjusted Rand Index compared to five commonly implemented methods. When tested on a previously published environmental metagenome, BinSanity generated high completion and low redundancy bins corresponding with the published metagenome-assembled genomes. PMID:28289564
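The compositional signals used in BinSanity's refinement step (percent GC and tetranucleotide frequency) are straightforward to compute per contig; a minimal sketch is below. The affinity-propagation clustering over coverage, which is the core of the method, is omitted here.

```python
from itertools import product

def gc_content(seq):
    """Fraction of G/C bases in a contig."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def tetranucleotide_freq(seq):
    """Normalized counts of all 256 possible 4-mers: the compositional
    feature vector used (alongside GC content) to refine coverage bins."""
    seq = seq.upper()
    counts = {"".join(p): 0 for p in product("ACGT", repeat=4)}
    total = 0
    for i in range(len(seq) - 3):
        kmer = seq[i:i + 4]
        if kmer in counts:          # skips k-mers containing N or other codes
            counts[kmer] += 1
            total += 1
    return {k: v / total for k, v in counts.items()} if total else counts
```

Separating these composition features from the coverage-based clustering, rather than mixing them into one distance, is the design choice the abstract credits with reducing bias against closely related taxa.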

  11. Diviner lunar radiometer gridded brightness temperatures from geodesic binning of modeled fields of view

    NASA Astrophysics Data System (ADS)

    Sefton-Nash, E.; Williams, J.-P.; Greenhagen, B. T.; Aye, K.-M.; Paige, D. A.

    2017-12-01

    An approach is presented to efficiently produce high quality gridded data records from the large, global point-based dataset returned by the Diviner Lunar Radiometer Experiment aboard NASA's Lunar Reconnaissance Orbiter. The need to minimize data volume and processing time in production of science-ready map products is increasingly important with the growth in data volume of planetary datasets. Diviner makes on average >1400 observations per second of radiance that is reflected and emitted from the lunar surface, using 189 detectors divided into 9 spectral channels. Data management and processing bottlenecks are amplified by modeling every observation as a probability distribution function over the field of view, which can increase the required processing time by 2-3 orders of magnitude. Geometric corrections, such as projection of data points onto a digital elevation model, are numerically intensive and therefore it is desirable to perform them only once. Our approach reduces bottlenecks through parallel binning and efficient storage of a pre-processed database of observations. Database construction is via subdivision of a geodesic icosahedral grid, with a spatial resolution that can be tailored to suit the field of view of the observing instrument. Global geodesic grids with high spatial resolution are normally impractically memory intensive. We therefore demonstrate a minimum storage and highly parallel method to bin very large numbers of data points onto such a grid. A database of the pre-processed and binned points is then used for production of mapped data products that is significantly faster than if unprocessed points were used. We explore quality controls in the production of gridded data records by conditional interpolation, allowed only where data density is sufficient. The resultant effects on the spatial continuity and uncertainty in maps of lunar brightness temperatures are illustrated. 
We identify four binning regimes based on trades between the spatial resolution of the grid, the size of the FOV and the on-target spacing of observations. Our approach may be applicable and beneficial for many existing and future point-based planetary datasets.
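The minimum-storage idea, accumulating statistics only for cells that actually receive data, can be sketched with a dictionary keyed by cell index. Simple lat/lon cells stand in here for the geodesic icosahedral grid of the actual pipeline, and the per-cell mean stands in for the full gridded-record production.

```python
import math
from collections import defaultdict

def bin_points(points, res_deg=1.0):
    """Sparse binning sketch: accumulate (sum, count) per occupied cell, so
    a grid that would be impractically memory-intensive if stored densely
    costs memory only where observations exist."""
    acc = defaultdict(lambda: [0.0, 0])
    for lat, lon, value in points:
        key = (math.floor(lat / res_deg), math.floor(lon / res_deg))
        cell = acc[key]
        cell[0] += value
        cell[1] += 1
    return {k: s / n for k, (s, n) in acc.items()}
```

Because each point maps to its cell independently, the accumulation can be sharded across workers and merged, which is the parallelism the abstract describes.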

  12. ATLAS (Automatic Tool for Local Assembly Structures) - A Comprehensive Infrastructure for Assembly, Annotation, and Genomic Binning of Metagenomic and Metatranscriptomic Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Richard A.; Brown, Joseph M.; Colby, Sean M.

    ATLAS (Automatic Tool for Local Assembly Structures) is a comprehensive multiomics data analysis pipeline that is massively parallel and scalable. ATLAS contains a modular analysis pipeline for assembly, annotation, quantification and genome binning of metagenomics and metatranscriptomics data and a framework for reference metaproteomic database construction. ATLAS transforms raw sequence data into functional and taxonomic data at the microbial population level and provides genome-centric resolution through genome binning. ATLAS provides robust taxonomy based on majority voting of protein-coding open reading frames rolled up at the contig level using modified lowest common ancestor (LCA) analysis. ATLAS is user-friendly, easy to install through Bioconda, maintained as open source on GitHub, and implemented in Snakemake for modular, customizable workflows.

  13. An efficient parallel algorithm for the calculation of canonical MP2 energies.

    PubMed

    Baker, Jon; Pulay, Peter

    2002-09-01

    We present the parallel version of a previous serial algorithm for the efficient calculation of canonical MP2 energies (Pulay, P.; Saebo, S.; Wolinski, K. Chem Phys Lett 2001, 344, 543). It is based on the Saebo-Almlöf direct-integral transformation, coupled with an efficient prescreening of the AO integrals. The parallel algorithm avoids synchronization delays by spawning a second set of slaves during the bin-sort prior to the second half-transformation. Results are presented for systems with up to 2000 basis functions. MP2 energies for molecules with 400-500 basis functions can be routinely calculated to microhartree accuracy on a small number of processors (6-8) in a matter of minutes with modern PC-based parallel computers. Copyright 2002 Wiley Periodicals, Inc. J Comput Chem 23: 1150-1156, 2002

  14. Running Gaussian16 Software Jobs on the Peregrine System | High-Performance

    Science.gov Websites

    Describes how to run Gaussian16 software jobs on the Peregrine system. Parallel setup is taken care of automatically based on settings in the PBS batch script, and scratch space on the node-local filesystem /dev/shm is set automatically by the example script. An example script for batch submission is given, beginning #!/bin/bash with directives such as #PBS -l nodes=2.

  15. Intel Parallel Studio on the Peregrine System | High-Performance Computing

    Science.gov Websites

    Provides an example PBS batch script for using Intel Parallel Studio on the Peregrine system, with a #!/bin/bash --login header followed by directives such as #PBS -N, #PBS -q, #PBS -l nodes=<N>:ppn=<n>, #PBS -l walltime=00:30:00, and #PBS -A, along with notes on setting a temporary directory and disabling collection of MPI communication data.

  16. Does Your Optical Particle Counter Measure What You Think it Does? Calibration and Refractive Index Correction Methods.

    NASA Astrophysics Data System (ADS)

    Rosenberg, Phil; Dean, Angela; Williams, Paul; Dorsey, James; Minikin, Andreas; Pickering, Martyn; Petzold, Andreas

    2013-04-01

    Optical Particle Counters (OPCs) are the de-facto standard for in-situ measurements of airborne aerosol size distributions and small cloud particles over a wide size range. This is particularly the case on airborne platforms where fast response is important. OPCs measure scattered light from individual particles and generally bin particles according to the measured peak amount of light scattered (the OPC's response). Most manufacturers provide a table along with their instrument which indicates the particle diameters which represent the edges of each bin. It is important to correct the particle size reported by OPCs for the refractive index of the particles being measured, which is often not the same as for those used during calibration. However, the OPC's response is not a monotonic function of particle diameter and obvious problems occur when refractive index corrections are attempted, but multiple diameters correspond to the same OPC response. Here we recommend that OPCs are calibrated in terms of particle scattering cross section as this is a monotonic (usually linear) function of an OPC's response. We present a method for converting a bin's boundaries in terms of scattering cross section into a bin centre and bin width in terms of diameter for any aerosol species for which the scattering properties are known. The relationship between diameter and scattering cross section can be arbitrarily complex and does not need to be monotonic; it can be based on Mie-Lorenz theory or any other scattering theory. Software has been provided on the Sourceforge open source repository for scientific users to implement such methods in their own measurement and calibration routines. 
As a case study, data are presented from a Passive Cavity Aerosol Spectrometer Probe (PCASP) and a Cloud Droplet Probe (CDP) calibrated using polystyrene latex spheres and glass beads before being deployed as part of the Fennec project to measure airborne dust in the inaccessible regions of the Sahara.
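Converting a bin's boundaries from scattering cross section to a diameter-space centre and width reduces, for a monotonic sigma(d) relationship, to inverse interpolation of a precomputed curve. The sketch below assumes monotonicity for simplicity; real Mie curves violate it, which is exactly the complication the published method and software handle, and the grids used are illustrative, not calibration data.

```python
import bisect

def cross_section_to_diameter(sigma, diam_grid, sigma_grid):
    """Map a scattering cross-section value back to particle diameter by
    piecewise-linear interpolation of a precomputed sigma(d) curve
    (assumed monotonic increasing here)."""
    i = bisect.bisect_left(sigma_grid, sigma)
    if i == 0:
        return diam_grid[0]
    if i == len(sigma_grid):
        return diam_grid[-1]
    s0, s1 = sigma_grid[i - 1], sigma_grid[i]
    d0, d1 = diam_grid[i - 1], diam_grid[i]
    return d0 + (d1 - d0) * (sigma - s0) / (s1 - s0)

def bin_centre_and_width(sigma_lo, sigma_hi, diam_grid, sigma_grid):
    """Convert a bin's cross-section boundaries into a diameter-space
    centre and width for the species described by sigma_grid."""
    d_lo = cross_section_to_diameter(sigma_lo, diam_grid, sigma_grid)
    d_hi = cross_section_to_diameter(sigma_hi, diam_grid, sigma_grid)
    return (d_lo + d_hi) / 2.0, d_hi - d_lo
```

Calibrating in cross-section space keeps the instrument-side relationship monotonic; all of the refractive-index dependence is then confined to the sigma(d) curve supplied for each aerosol species.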

  17. Untangling taxonomy: a DNA barcode reference library for Canadian spiders.

    PubMed

    Blagoev, Gergin A; deWaard, Jeremy R; Ratnasingham, Sujeevan; deWaard, Stephanie L; Lu, Liuqiong; Robertson, James; Telfer, Angela C; Hebert, Paul D N

    2016-01-01

    Approximately 1460 species of spiders have been reported from Canada, 3% of the global fauna. This study provides a DNA barcode reference library for 1018 of these species based upon the analysis of more than 30,000 specimens. The sequence results show a clear barcode gap in most cases with a mean intraspecific divergence of 0.78% vs. a minimum nearest-neighbour (NN) distance averaging 7.85%. The sequences were assigned to 1359 Barcode index numbers (BINs) with 1344 of these BINs composed of specimens belonging to a single currently recognized species. There was a perfect correspondence between BIN membership and a known species in 795 cases, while another 197 species were assigned to two or more BINs (556 in total). A few other species (26) were involved in BIN merges or in a combination of merges and splits. There was only a weak relationship between the number of specimens analysed for a species and its BIN count. However, three species were clear outliers with their specimens being placed in 11-22 BINs. Although all BIN splits need further study to clarify the taxonomic status of the entities involved, DNA barcodes discriminated 98% of the 1018 species. The present survey conservatively revealed 16 species new to science, 52 species new to Canada and major range extensions for 426 species. However, if most BIN splits detected in this study reflect cryptic taxa, the true species count for Canadian spiders could be 30-50% higher than currently recognized. © 2015 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.

  18. Using the Parallel Computing Toolbox with MATLAB on the Peregrine System |

    Science.gov Websites

    Demonstrates using the MATLAB Parallel Computing Toolbox on the Peregrine system. An example script times the start-up of a parallel pool, runs a "single program multiple data" (spmd) block in which each worker prints a hello-world message tagged with its labindex, and closes the pool with delete(gcp). To run the script on a compute node, a submission file helloWorld.sub is created, beginning #!/bin/bash with directives such as #PBS -l walltime=05:00, #PBS -l nodes=1, and #PBS -N.

  19. Fast polyenergetic forward projection for image formation using OpenCL on a heterogeneous parallel computing platform.

    PubMed

    Zhou, Lili; Clifford Chao, K S; Chang, Jenghwa

    2012-11-01

    Simulated projection images of digital phantoms constructed from CT scans have been widely used for clinical and research applications but their quality and computation speed are not optimal for real-time comparison with the radiography acquired with an x-ray source of different energies. In this paper, the authors performed polyenergetic forward projections using open computing language (OpenCL) in a parallel computing ecosystem consisting of CPU and general purpose graphics processing unit (GPGPU) for fast and realistic image formation. The proposed polyenergetic forward projection uses a lookup table containing the NIST published mass attenuation coefficients (μ∕ρ) for different tissue types and photon energies ranging from 1 keV to 20 MeV. The CT images of interested sites are first segmented into different tissue types based on the CT numbers and converted to a three-dimensional attenuation phantom by linking each voxel to the corresponding tissue type in the lookup table. The x-ray source can be a radioisotope or an x-ray generator with a known spectrum described as weight w(n) for energy bin E(n). The Siddon method is used to compute the x-ray transmission line integral for E(n) and the x-ray fluence is the weighted sum of the exponential of line integral for all energy bins with added Poisson noise. To validate this method, a digital head and neck phantom constructed from the CT scan of a Rando head phantom was segmented into three (air, gray∕white matter, and bone) regions for calculating the polyenergetic projection images for the Mohan 4 MV energy spectrum. To accelerate the calculation, the authors partitioned the workloads using the task parallelism and data parallelism and scheduled them in a parallel computing ecosystem consisting of CPU and GPGPU (NVIDIA Tesla C2050) using OpenCL only. The authors explored the task overlapping strategy and the sequential method for generating the first and subsequent DRRs. 
A dispatcher was designed to drive the high-degree parallelism of the task overlapping strategy. Numerical experiments were conducted to compare the performance of the OpenCL∕GPGPU-based implementation with the CPU-based implementation. The projection images were similar to typical portal images obtained with a 4 or 6 MV x-ray source. For a phantom size of 512 × 512 × 223, the time for calculating the line integrals for a 512 × 512 image panel was 16.2 ms on GPGPU for one energy bin in comparison to 8.83 s on CPU. The total computation time for generating one polyenergetic projection image of 512 × 512 was 0.3 s (141 s for CPU). The relative difference between the projection images obtained with the CPU-based and OpenCL∕GPGPU-based implementations was on the order of 10^(-6) and was virtually indistinguishable. The task overlapping strategy was 5.84 and 1.16 times faster than the sequential method for the first and the subsequent digitally reconstructed radiographs, respectively. The authors have successfully built digital phantoms using anatomic CT images and NIST μ∕ρ tables for simulating realistic polyenergetic projection images and optimized the processing speed with parallel computing using GPGPU∕OpenCL-based implementation. The computation time was fast (0.3 s per projection image) enough for real-time IGRT (image-guided radiotherapy) applications.
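The core polyenergetic sum is compact to state: for each energy bin E(n) with weight w(n), attenuate along the ray and accumulate w(n) times the exponential of the line integral. In this sketch the Siddon ray tracing is replaced by precomputed per-tissue path lengths, Poisson noise is omitted for determinism, and the table layout is a hypothetical stand-in for the NIST-derived lookup table.

```python
import math

def polyenergetic_fluence(path_lengths, mu_table, spectrum):
    """Weighted sum over energy bins: fluence = sum_n w(n) * exp(-L(E_n)),
    where L(E_n) = sum over tissues of mu(tissue, E_n) * path length.
    path_lengths: {tissue: cm traversed}; mu_table: {(tissue, E_n): 1/cm};
    spectrum: [(E_n, w_n), ...]."""
    fluence = 0.0
    for energy, weight in spectrum:
        line_integral = sum(mu_table[(tissue, energy)] * length
                            for tissue, length in path_lengths.items())
        fluence += weight * math.exp(-line_integral)
    return fluence
```

Because each energy bin's term is independent, the sum parallelizes naturally across bins (and across pixels), which is what the OpenCL implementation exploits.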

  20. The intercrater plains of Mercury and the Moon: Their nature, origin and role in terrestrial planet evolution. Crater statistical data. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Leake, M. A.

    1982-01-01

    The total number of craters within a bin of mean diameter, and the number of craters of each degradational type within that bin are tabulated. Rim-to-rim diameters were measured at arbitrary azimuths for rectified photos or photos taken at vertical incidence (most lunar photos), and at azimuths paralleling a local tangent to the limb for oblique images.
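As a toy illustration of the tabulation described above, the snippet below bins hypothetical rim-to-rim diameters into √2-spaced diameter bins and counts each degradational type per bin (the diameters, class labels, and bin convention are invented for illustration):

```python
import numpy as np

# Hypothetical rim-to-rim diameters (km) and degradational classes (1 = fresh .. 4 = degraded).
diameters = np.array([3.2, 4.7, 6.1, 8.9, 12.5, 13.0, 18.4, 25.0])
classes   = np.array([1,   2,   2,   3,   1,    4,    3,    4])

# Square-root-of-2 diameter bins, a common convention in crater statistics.
edges = 2.0 ** np.arange(1, 6, 0.5)   # 2, 2.83, 4, 5.66, 8, ... km

# Total number of craters within each bin of mean diameter.
total_per_bin = np.histogram(diameters, bins=edges)[0]

# Number of craters of each degradational type within each bin.
per_class = {c: np.histogram(diameters[classes == c], bins=edges)[0]
             for c in np.unique(classes)}
```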

  1. Design and Performance of a 1 ms High-Speed Vision Chip with 3D-Stacked 140 GOPS Column-Parallel PEs †.

    PubMed

    Nose, Atsushi; Yamazaki, Tomohiro; Katayama, Hironobu; Uehara, Shuji; Kobayashi, Masatsugu; Shida, Sayaka; Odahara, Masaki; Takamiya, Kenichi; Matsumoto, Shizunori; Miyashita, Leo; Watanabe, Yoshihiro; Izawa, Takashi; Muramatsu, Yoshinori; Nitta, Yoshikazu; Ishikawa, Masatoshi

    2018-04-24

    We have developed a high-speed vision chip using 3D stacking technology to address the increasing demand for high-speed vision chips in diverse applications. The chip comprises a 1/3.2-inch, 1.27 Mpixel, 500 fps (0.31 Mpixel, 1000 fps, 2 × 2 binning) vision chip with 3D-stacked column-parallel Analog-to-Digital Converters (ADCs) and 140 Giga Operation per Second (GOPS) programmable Single Instruction Multiple Data (SIMD) column-parallel PEs for new sensing applications. The 3D-stacked structure and column parallel processing architecture achieve high sensitivity, high resolution, and high-accuracy object positioning.

  2. Modelling of ionospheric irregularities during geomagnetic storms over African low latitude region

    NASA Astrophysics Data System (ADS)

    Mungufeni, Patrick

    2016-07-01

In this study, empirical models of the occurrence of ionospheric irregularities over the low-latitude African region during geomagnetic storms have been developed. The geomagnetic storms considered were those with Dst ≤ -50 nT. GNSS-derived ionospheric Total Electron Content (TEC) data over Libreville, Gabon (NKLG) (0.35° N, 9.68° E, geographic, 8.05° S, magnetic) and Malindi, Kenya (MAL2) (2.99° S, 40.19° E, geographic, 12.42° S, magnetic) during 2000-2014 were used. Ionospheric irregularities at scale-lengths of a few kilometers and ~400 m were represented with the rate of change of TEC index (ROTI). The inputs for the models are the local time, solar flux index, Auroral Electrojet index, day of the year, and the Dst index, while the output is the median ROTI during these given conditions. To develop the models, the ROTI values were binned based on the input parameters and cubic B-splines were then fitted to the binned data. The models developed using data over NKLG and MAL2 were validated with independent data from stations within radii of 510 km and 680 km, respectively. The models captured the enhancements and inhibitions of the occurrence of the ionospheric irregularities during the storm period. The models even emulated these patterns in the various seasons, during medium and high solar activity conditions. The correlation coefficients for the validations were statistically significant and ranged from 0.58 to 0.73, while the percentage of the variance in the observed data explained by the modelled data ranged from 34% to 53%.
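ROTI is conventionally defined as the standard deviation of the rate of change of TEC (ROT, in TECU/min) over a short window. A minimal sketch follows; the 5-minute window and 30 s cadence are common choices in the literature, not details taken from this abstract:

```python
import numpy as np

def roti(tec, t_minutes, window=5.0):
    """Rate-of-change-of-TEC index: standard deviation of ROT (TECU/min)
    computed over consecutive windows (one ROTI value per window)."""
    rot = np.diff(tec) / np.diff(t_minutes)           # ROT, TECU/min
    t_mid = 0.5 * (t_minutes[1:] + t_minutes[:-1])    # midpoint times of ROT samples
    out = []
    for start in np.arange(t_minutes[0], t_minutes[-1], window):
        sel = (t_mid >= start) & (t_mid < start + window)
        if sel.any():
            out.append(rot[sel].std())
    return np.array(out)

# Quiet-time example: a linear TEC drift gives constant ROT, hence ROTI ~ 0.
t = np.arange(0.0, 30.0, 0.5)      # 30 s cadence, in minutes
tec = 12.0 + 0.1 * t               # TECU
quiet_roti = roti(tec, t)
```

Storm-time irregularities show up as rapid TEC fluctuations, i.e. large ROTI values, which is what the models above predict from the geomagnetic and solar inputs.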

  3. Mapping global biodiversity connections with DNA barcodes: Lepidoptera of Pakistan.

    PubMed

    Ashfaq, Muhammad; Akhtar, Saleem; Rafi, Muhammad Athar; Mansoor, Shahid; Hebert, Paul D N

    2017-01-01

    Sequences from the DNA barcode region of the mitochondrial COI gene are an effective tool for specimen identification and for the discovery of new species. The Barcode of Life Data Systems (BOLD) (www.boldsystems.org) currently hosts 4.5 million records from animals which have been assigned to more than 490,000 different Barcode Index Numbers (BINs), which serve as a proxy for species. Because a fourth of these BINs derive from Lepidoptera, BOLD has a strong capability to both identify specimens in this order and to support studies of faunal overlap. DNA barcode sequences were obtained from 4503 moths from 329 sites across Pakistan, specimens that represented 981 BINs from 52 families. Among 379 species with a Linnaean name assignment, all were represented by a single BIN excepting five species that showed a BIN split. Less than half (44%) of the 981 BINs had counterparts in other countries; the remaining BINs were unique to Pakistan. Another 218 BINs of Lepidoptera from Pakistan were coupled with the 981 from this study before being compared with all 116,768 BINs for this order. As expected, faunal overlap was highest with India (21%), Sri Lanka (21%), United Arab Emirates (20%) and with other Asian nations (2.1%), but it was very low with other continents including Africa (0.6%), Europe (1.3%), Australia (0.6%), Oceania (1.0%), North America (0.1%), and South America (0.1%). This study indicates the way in which DNA barcoding facilitates measures of faunal overlap even when taxa have not been assigned to a Linnean species.
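The faunal-overlap percentages above reduce to set intersections over BIN identifiers. A toy sketch with hypothetical BIN IDs (the real BOLD identifiers have this general shape but these specific values are invented):

```python
# Faunal overlap as shared Barcode Index Numbers (BINs) between two regions,
# expressed as a percentage of the first region's BINs.
bins_pakistan = {"BOLD:AAA1", "BOLD:AAA2", "BOLD:AAA3", "BOLD:AAA4", "BOLD:AAA5"}
bins_india    = {"BOLD:AAA2", "BOLD:AAA5", "BOLD:BBB1"}

shared = bins_pakistan & bins_india                      # BINs found in both regions
overlap_pct = 100.0 * len(shared) / len(bins_pakistan)   # % of Pakistan's BINs shared
```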

  5. Bin Ratio-Based Histogram Distances and Their Application to Image Classification.

    PubMed

    Hu, Weiming; Xie, Nianhua; Hu, Ruiguang; Ling, Haibin; Chen, Qiang; Yan, Shuicheng; Maybank, Stephen

    2014-12-01

Large variations in image background may cause partial matching and normalization problems for histogram-based representations, i.e., the histograms of the same category may have bins which are significantly different, and normalization may produce large changes in the differences between corresponding bins. In this paper, we deal with this problem by using the ratios between bin values of histograms, rather than the differences between bin values, which are used in the traditional histogram distances. We propose a bin ratio-based histogram distance (BRD), which is an intra-cross-bin distance, in contrast with previous bin-to-bin distances and cross-bin distances. The BRD is robust to partial matching and histogram normalization, and captures correlations between bins with only a linear computational complexity. We combine the BRD with the ℓ1 histogram distance and the χ² histogram distance to generate the ℓ1 BRD and the χ² BRD, respectively. These combinations exploit and benefit from the robustness of the BRD under partial matching and the robustness of the ℓ1 and χ² distances to small noise. We propose a method for assessing the robustness of histogram distances to partial matching. The BRDs and logistic regression-based histogram fusion are applied to image classification. The experimental results on synthetic data sets show the robustness of the BRDs to partial matching, and the experiments on seven benchmark data sets demonstrate promising results of the BRDs for image classification.
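The paper's exact BRD formula is not reproduced here; the toy distance below only illustrates the core idea that ratios between bins are invariant under a global rescaling of a histogram, via the cross products h_i·g_j − h_j·g_i (which all vanish when one histogram is a scalar multiple of the other):

```python
import numpy as np

def ratio_distance(h, g):
    """Toy cross-bin, ratio-based distance (NOT the paper's exact BRD):
    comparing the ratio h_i/h_j with g_i/g_j via the cross product
    h_i*g_j - h_j*g_i avoids division by zero and cancels any global
    scaling of either histogram."""
    h = np.asarray(h, dtype=float)
    g = np.asarray(g, dtype=float)
    cross = np.abs(np.outer(h, g) - np.outer(g, h))   # |h_i g_j - h_j g_i| for all i, j
    return cross.sum() / (np.linalg.norm(h) * np.linalg.norm(g))

h = np.array([2.0, 4.0, 6.0])
```

Rescaling `h` by any constant leaves the distance at zero, which is the normalization robustness the abstract emphasizes.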

  6. Kinematic Analysis and Performance Evaluation of Novel PRS Parallel Mechanism

    NASA Astrophysics Data System (ADS)

    Balaji, K.; Khan, B. Shahul Hamid

    2018-02-01

In this paper, a novel 3-DoF (Degree of Freedom) PRS (Prismatic-Revolute-Spherical) type parallel mechanism has been designed and presented. The combination of straight and arc type linkages for a 3-DoF parallel mechanism is introduced for the first time. The performances of the mechanisms are evaluated based on indices such as the Minimum Singular Value (MSV), Condition Number (CN), Local Conditioning Index (LCI), Kinematic Configuration Index (KCI) and Global Conditioning Index (GCI). The overall reachable workspaces of all mechanisms are presented. The kinematic measure, dexterity measure and workspace analysis for all the mechanisms have been evaluated and compared.
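Several of the indices named above derive from the singular values of the manipulator Jacobian: MSV is the smallest singular value, CN the ratio of largest to smallest, and LCI its reciprocal (GCI is the LCI averaged over the workspace). A minimal sketch, with an invented Jacobian standing in for one pose:

```python
import numpy as np

def kinematic_indices(J):
    """Minimum Singular Value (MSV), Condition Number (CN) and
    Local Conditioning Index (LCI = 1/CN) of a manipulator Jacobian."""
    s = np.linalg.svd(J, compute_uv=False)   # singular values, descending
    msv = s.min()
    cn = s.max() / s.min()
    return msv, cn, 1.0 / cn

# Hypothetical 3x3 Jacobian at one pose (illustrative numbers only).
J = np.array([[1.0, 0.2, 0.0],
              [0.0, 1.0, 0.3],
              [0.1, 0.0, 1.0]])
msv, cn, lci = kinematic_indices(J)
```

An LCI near 1 indicates an isotropic, well-conditioned pose; LCI near 0 indicates proximity to a singularity.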

  7. Storing files in a parallel computing system using list-based index to identify replica files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faibish, Sorin; Bent, John M.; Tzelnic, Percy

Improved techniques are provided for storing files in a parallel computing system using a list-based index to identify file replicas. A file and at least one replica of the file are stored in one or more storage nodes of the parallel computing system. An index for the file comprises at least one list comprising a pointer to a storage location of the file and a storage location of the at least one replica of the file. The file comprises one or more of a complete file and one or more sub-files. The index may also comprise a checksum value for one or more of the file and the replica(s) of the file. The checksum value can be evaluated to validate the file and/or the file replica(s). A query can be processed using the list.
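A hypothetical in-memory sketch of such a list-based index, keeping a list of storage locations (primary plus replicas) alongside a checksum that can validate the file or any replica. The names and layout are illustrative, not the patent's implementation:

```python
import hashlib

# Index: file name -> list of storage locations plus a content checksum.
index = {}

def store(name, data, locations):
    """Record the file's storage locations and a SHA-256 checksum of its contents."""
    index[name] = {"locations": list(locations),
                   "checksum": hashlib.sha256(data).hexdigest()}

def validate(name, data):
    """Re-compute the checksum to validate the file or one of its replicas."""
    return hashlib.sha256(data).hexdigest() == index[name]["checksum"]

# Primary copy on node3, replica on node7 (hypothetical paths).
store("sim.out", b"payload", ["node3:/data/sim.out", "node7:/data/sim.out"])
```

A query for `sim.out` can then be answered from whichever listed location is fastest, falling back to a replica if validation fails.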

  8. Measurement of the Microwave Refractive Index of Materials Based on Parallel Plate Waveguides

    NASA Astrophysics Data System (ADS)

    Zhao, F.; Pei, J.; Kan, J. S.; Zhao, Q.

    2017-12-01

An electric field scanning apparatus based on the parallel plate waveguide method is constructed, which collects the amplitude and phase matrices as a function of relative position. On the basis of these data, a method for calculating the refractive index of measured wedge samples is proposed in this paper. The measurement and calculation results for different PTFE samples reveal that the refractive index measured by the apparatus is substantially consistent with the refractive index inferred from the permittivity of the sample. The refractive index calculation method proposed in this paper is a competitive method for characterizing the refractive index of materials with a positive refractive index. Since the apparatus and method can measure and calculate along arbitrary directions of microwave propagation, it is believed that both can be applied to negative-refractive-index materials, such as metamaterials or “left-handed” materials.

  9. Optimizing 4DCBCT projection allocation to respiratory bins.

    PubMed

    O'Brien, Ricky T; Kipritidis, John; Shieh, Chun-Chien; Keall, Paul J

    2014-10-07

    4D cone beam computed tomography (4DCBCT) is an emerging image guidance strategy used in radiotherapy where projections acquired during a scan are sorted into respiratory bins based on the respiratory phase or displacement. 4DCBCT reduces the motion blur caused by respiratory motion but increases streaking artefacts due to projection under-sampling as a result of the irregular nature of patient breathing and the binning algorithms used. For displacement binning the streak artefacts are so severe that displacement binning is rarely used clinically. The purpose of this study is to investigate if sharing projections between respiratory bins and adjusting the location of respiratory bins in an optimal manner can reduce or eliminate streak artefacts in 4DCBCT images. We introduce a mathematical optimization framework and a heuristic solution method, which we will call the optimized projection allocation algorithm, to determine where to position the respiratory bins and which projections to source from neighbouring respiratory bins. Five 4DCBCT datasets from three patients were used to reconstruct 4DCBCT images. Projections were sorted into respiratory bins using equispaced, equal density and optimized projection allocation. The standard deviation of the angular separation between projections was used to assess streaking and the consistency of the segmented volume of a fiducial gold marker was used to assess motion blur. The standard deviation of the angular separation between projections using displacement binning and optimized projection allocation was 30%-50% smaller than conventional phase based binning and 59%-76% smaller than conventional displacement binning indicating more uniformly spaced projections and fewer streaking artefacts. The standard deviation in the marker volume was 20%-90% smaller when using optimized projection allocation than using conventional phase based binning suggesting more uniform marker segmentation and less motion blur. 
Images reconstructed using displacement binning and the optimized projection allocation algorithm were clearer, contained visibly fewer streak artefacts and produced more consistent marker segmentation than those reconstructed with either equispaced or equal-density binning. The optimized projection allocation algorithm significantly improves image quality in 4DCBCT images and provides, for the first time, a method to consistently generate high quality displacement binned 4DCBCT images in clinical applications.
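The streak metric used above, the standard deviation of the angular separation between projections within a respiratory bin, can be sketched directly (including the wrap-around gap for a full-rotation scan; the angle sets below are invented examples):

```python
import numpy as np

def angular_gap_std(angles_deg):
    """Standard deviation of the angular separation between consecutive
    projections in one respiratory bin. Smaller values mean more uniformly
    spaced projections and hence fewer streak artefacts."""
    a = np.sort(np.asarray(angles_deg, dtype=float) % 360.0)
    gaps = np.diff(np.append(a, a[0] + 360.0))   # include the wrap-around gap
    return gaps.std()

uniform = np.arange(0.0, 360.0, 10.0)   # perfectly even sampling over the rotation
clumped = np.concatenate([np.arange(0.0, 90.0, 2.5), [180.0, 270.0]])
```

Optimized projection allocation aims to drive every bin's gap standard deviation toward the `uniform` case.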

  10. DNA barcoding of odonates from the Upper Plata basin: Database creation and genetic diversity estimation.

    PubMed

    Koroiva, Ricardo; Pepinelli, Mateus; Rodrigues, Marciel Elio; Roque, Fabio de Oliveira; Lorenz-Lemke, Aline Pedroso; Kvist, Sebastian

    2017-01-01

    We present a DNA barcoding study of Neotropical odonates from the Upper Plata basin, Brazil. A total of 38 species were collected in a transition region of "Cerrado" and Atlantic Forest, both regarded as biological hotspots, and 130 cytochrome c oxidase subunit I (COI) barcodes were generated for the collected specimens. The distinct gap between intraspecific (0-2%) and interspecific variation (15% and above) in COI, and resulting separation of Barcode Index Numbers (BIN), allowed for successful identification of specimens in 94% of cases. The 6% fail rate was due to a shared BIN between two separate nominal species. DNA barcoding, based on COI, thus seems to be a reliable and efficient tool for identifying Neotropical odonate specimens down to the species level. These results underscore the utility of DNA barcoding to aid specimen identification in diverse biological hotspots, areas that require urgent action regarding taxonomic surveys and biodiversity conservation.

  11. Exploring Electrical and Magnetic Resonances from Coherently Correlated Long-Lived Radical Pairs towards Development of Negative Refractive-Index Materials

    DTIC Science & Technology

    2015-01-03

(Abstract not recovered; the record excerpt lists related publications by Yu-Che Hsiao, Ting Wu, Mingxing Li, Bin Hu, and co-authors on charge dissociation at electrode and donor/acceptor interfaces in perovskite and organic solar cells, including Advanced Materials, DOI: 10.1002/adma.201405946, 2015.)

  12. Approved Methods and Algorithms for DoD Risk-Based Explosives Siting

    DTIC Science & Technology

    2009-07-21

(Abstract not recovered; the record excerpt lists parameter definitions from the report's glossary, including Phit, the probability of hit by debris, an array value indexed by consequence and mass bin, with variants Phit(f) for fatality and Phit(maji) for major injury.)

  13. BusyBee Web: metagenomic data analysis by bootstrapped supervised binning and annotation

    PubMed Central

    Kiefer, Christina; Fehlmann, Tobias; Backes, Christina

    2017-01-01

    Abstract Metagenomics-based studies of mixed microbial communities are impacting biotechnology, life sciences and medicine. Computational binning of metagenomic data is a powerful approach for the culture-independent recovery of population-resolved genomic sequences, i.e. from individual or closely related, constituent microorganisms. Existing binning solutions often require a priori characterized reference genomes and/or dedicated compute resources. Extending currently available reference-independent binning tools, we developed the BusyBee Web server for the automated deconvolution of metagenomic data into population-level genomic bins using assembled contigs (Illumina) or long reads (Pacific Biosciences, Oxford Nanopore Technologies). A reversible compression step as well as bootstrapped supervised binning enable quick turnaround times. The binning results are represented in interactive 2D scatterplots. Moreover, bin quality estimates, taxonomic annotations and annotations of antibiotic resistance genes are computed and visualized. Ground truth-based benchmarks of BusyBee Web demonstrate comparably high performance to state-of-the-art binning solutions for assembled contigs and markedly improved performance for long reads (median F1 scores: 70.02–95.21%). Furthermore, the applicability to real-world metagenomic datasets is shown. In conclusion, our reference-independent approach automatically bins assembled contigs or long reads, exhibits high sensitivity and precision, enables intuitive inspection of the results, and only requires FASTA-formatted input. The web-based application is freely accessible at: https://ccb-microbe.cs.uni-saarland.de/busybee. PMID:28472498

  14. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi -LAT data

    DOE PAGES

    Lott, B.; Escande, L.; Larsson, S.; ...

    2012-07-19

Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any source. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The reported mean flux and spectral index (assuming the spectrum is a power-law distribution) in the interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
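A greatly simplified constant-count sketch of the adaptive-binning idea: each bin accumulates events until the Poisson relative uncertainty sqrt(N)/N drops below a target, so every bin carries roughly the same fractional flux uncertainty. The real method works through the LAT likelihood analysis, not raw event counts:

```python
import numpy as np

def adaptive_bins(event_times, rel_unc=0.25):
    """Close a bin once its count N satisfies sqrt(N)/N <= rel_unc,
    i.e. N >= (1/rel_unc)^2, so bins have ~constant fractional
    uncertainty under Poisson statistics (simplified sketch)."""
    n_min = int(np.ceil(1.0 / rel_unc ** 2))
    edges = [event_times[0]]
    count = 0
    for t in event_times:
        count += 1
        if count >= n_min:
            edges.append(t)     # close the bin at this event time
            count = 0
    return np.array(edges), n_min

# 100 uniformly spaced "photon arrival times" (invented data).
times = np.arange(100.0)
edges, n_min = adaptive_bins(times)
```

During flares the events arrive faster, so the bins automatically shrink; in quiescence they stretch, which is how more information is packed into the light curve than with fixed bins.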

  15. Assembling and auditing a comprehensive DNA barcode reference library for European marine fishes.

    PubMed

    Oliveira, L M; Knebelsberger, T; Landi, M; Soares, P; Raupach, M J; Costa, F O

    2016-12-01

A large-scale comprehensive reference library of DNA barcodes for European marine fishes was assembled, allowing the evaluation of taxonomic uncertainties and species genetic diversity that were otherwise hidden in geographically restricted studies. A total of 4118 DNA barcodes were assigned to 358 species, generating 366 Barcode Index Numbers (BINs). Initial examination revealed as many as 141 BIN discordances (more than one species in each BIN). After implementing an auditing and five-grade (A-E) annotation protocol, the number of discordant species BINs was reduced to 44 (13%, grade E), while concordant species BINs amounted to 271 (78%, grades A and B) and 14 others had insufficient data (grade D). Fifteen species displayed comparatively high intraspecific divergences ranging from 2.6 to 18.5% (grade C), which is biologically paramount information to be considered in fish species monitoring and stock assessment. On balance, this compilation contributed to the detection of 59 European fish species probably in need of taxonomic clarification or re-evaluation. The generalized implementation of an auditing and annotation protocol for reference libraries of DNA barcodes is recommended. © 2016 The Fisheries Society of the British Isles.

  16. Optofluidic refractive-index sensor in step-index fiber with parallel hollow micro-channel.

    PubMed

    Lee, H W; Schmidt, M A; Uebel, P; Tyagi, H; Joly, N Y; Scharrer, M; Russell, P St J

    2011-04-25

    We present a simple refractive index sensor based on a step-index fiber with a hollow micro-channel running parallel to its core. This channel becomes waveguiding when filled with a liquid of index greater than silica, causing sharp dips to appear in the transmission spectrum at wavelengths where the glass-core mode phase-matches to a mode of the liquid-core. The sensitivity of the dip-wavelengths to changes in liquid refractive index is quantified and the results used to study the dynamic flow characteristics of fluids in narrow channels. Potential applications of this fiber microstructure include measuring the optical properties of liquids, refractive index sensing, biophotonics and studies of fluid dynamics on the nanoscale.

  17. Development of a lifetime merit-based selection index for US dairy grazing systems

    USDA-ARS?s Scientific Manuscript database

    Pasture-based dairy producers in the US face costs, revenues and management challenges that differ from those associated with conventional dairy production systems. Three Grazing Merit indexes (GM$1, GM$2, and GM$3), parallel to the US Lifetime Net Merit (NM$) index, were constructed using economic ...

  18. CoMet: a workflow using contig coverage and composition for binning a metagenomic sample with high precision.

    PubMed

    Herath, Damayanthi; Tang, Sen-Lin; Tandon, Kshitij; Ackland, David; Halgamuge, Saman Kumara

    2017-12-28

    In metagenomics, the separation of nucleotide sequences belonging to an individual or closely matched populations is termed binning. Binning helps the evaluation of underlying microbial population structure as well as the recovery of individual genomes from a sample of uncultivable microbial organisms. Both supervised and unsupervised learning methods have been employed in binning; however, characterizing a metagenomic sample containing multiple strains remains a significant challenge. In this study, we designed and implemented a new workflow, Coverage and composition based binning of Metagenomes (CoMet), for binning contigs in a single metagenomic sample. CoMet utilizes coverage values and the compositional features of metagenomic contigs. The binning strategy in CoMet includes the initial grouping of contigs in guanine-cytosine (GC) content-coverage space and refinement of bins in tetranucleotide frequencies space in a purely unsupervised manner. With CoMet, the clustering algorithm DBSCAN is employed for binning contigs. The performances of CoMet were compared against four existing approaches for binning a single metagenomic sample, including MaxBin, Metawatt, MyCC (default) and MyCC (coverage) using multiple datasets including a sample comprised of multiple strains. Binning methods based on both compositional features and coverages of contigs had higher performances than the method which is based only on compositional features of contigs. CoMet yielded higher or comparable precision in comparison to the existing binning methods on benchmark datasets of varying complexities. MyCC (coverage) had the highest ranking score in F1-score. However, the performances of CoMet were higher than MyCC (coverage) on the dataset containing multiple strains. Furthermore, CoMet recovered contigs of more species and was 18 - 39% higher in precision than the compared existing methods in discriminating species from the sample of multiple strains. 
CoMet resulted in higher precision than MyCC (default) and MyCC (coverage) on a real metagenome. The approach proposed with CoMet for binning contigs, improves the precision of binning while characterizing more species in a single metagenomic sample and in a sample containing multiple strains. The F1-scores obtained from different binning strategies vary with different datasets; however, CoMet yields the highest F1-score with a sample comprised of multiple strains.
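The compositional features CoMet clusters on, GC content and tetranucleotide frequencies, are straightforward to sketch. The contig string below is invented, and the actual tool additionally uses coverage values and DBSCAN clustering, which are omitted here:

```python
from collections import Counter
from itertools import product

def gc_content(contig):
    """Fraction of G and C bases: the first axis of CoMet-style initial grouping."""
    return (contig.count("G") + contig.count("C")) / len(contig)

def tetranucleotide_freq(contig):
    """Normalized 4-mer frequency vector over all 256 possible tetranucleotides,
    the compositional space in which bins are refined."""
    kmers = Counter(contig[i:i + 4] for i in range(len(contig) - 3))
    total = sum(kmers.values())
    return {"".join(k): kmers["".join(k)] / total
            for k in product("ACGT", repeat=4)}

contig = "ATGCGCGCATATATGCGC"
freqs = tetranucleotide_freq(contig)
```

Contigs from the same genome tend to share both GC content and 4-mer profile, which is why clustering in these spaces recovers population-level bins.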

  19. Retrieval of the thickness and refractive index dispersion of parallel plate from a single interferogram recorded in both spectral and angular domains

    NASA Astrophysics Data System (ADS)

    Dong, Jingtao; Lu, Rongsheng

    2018-04-01

The principle of retrieving the thickness and refractive index dispersion of a parallel glass plate is reported based on single interferogram recording and phase analysis. With the parallel plate illuminated by a convergent light sheet, the transmitted light interfering in both spectral and angular domains is recorded. The phase recovered from the single interferogram by Fourier analysis is used to retrieve the thickness and refractive index dispersion without periodic ambiguity. Experimental results of an optical substrate standard show that the accuracy of the refractive index dispersion is better than 2.5 × 10⁻⁵ and the relative uncertainty of the thickness is 6 × 10⁻⁵ (3σ). The method is confirmed to be robust against intensity noise, indicating the capability of stable and accurate measurement.

  20. Transmission Index Research of Parallel Manipulators Based on Matrix Orthogonal Degree

    NASA Astrophysics Data System (ADS)

    Shao, Zhu-Feng; Mo, Jiao; Tang, Xiao-Qiang; Wang, Li-Ping

    2017-11-01

A performance index is the standard of performance evaluation and the foundation of both performance analysis and optimal design for a parallel manipulator. Finding suitable kinematic indices is a long-standing and challenging issue for parallel manipulators. So far, there are extensive studies in this field, but few existing indices meet all the requirements of being simple, intuitive, and universal. To address this problem, the matrix orthogonal degree is adopted, and generalized transmission indices that can evaluate the motion/force transmissibility of fully parallel manipulators are proposed. Transmission performance analysis of typical branches, end effectors, and parallel manipulators is given to illustrate the proposed indices and analysis methodology. Simulation and analysis results reveal that the proposed transmission indices possess significant advantages: they are normalized and finite (ranging from 0 to 1), dimensionally homogeneous, frame-free, intuitive, and easy to calculate. Moreover, the proposed indices indicate the good-transmission region and the proximity to singularity with better resolution than the traditional local conditioning index, and provide a novel tool for kinematic analysis and optimal design of fully parallel manipulators.

  1. Contrasting morphological and DNA barcode-suggested species boundaries among shallow-water amphipod fauna from the southern European Atlantic coast.

    PubMed

    Lobo, Jorge; Ferreira, Maria S; Antunes, Ilisa C; Teixeira, Marcos A L; Borges, Luisa M S; Sousa, Ronaldo; Gomes, Pedro A; Costa, Maria Helena; Cunha, Marina R; Costa, Filipe O

    2017-02-01

    In this study we compared DNA barcode-suggested species boundaries with morphology-based species identifications in the amphipod fauna of the southern European Atlantic coast. DNA sequences of the cytochrome c oxidase subunit I barcode region (COI-5P) were generated for 43 morphospecies (178 specimens) collected along the Portuguese coast which, together with publicly available COI-5P sequences, produced a final dataset comprising 68 morphospecies and 295 sequences. Seventy-five BINs (Barcode Index Numbers) were assigned to these morphospecies, of which 48 were concordant (i.e., 1 BIN = 1 species), 8 were taxonomically discordant, and 19 were singletons. Twelve species had matching sequences (<2% distance) with conspecifics from distant locations (e.g., North Sea). Seven morphospecies were assigned to multiple, and highly divergent, BINs, including specimens of Corophium multisetosum (18% divergence) and Dexamine spiniventris (16% divergence), which originated from sampling locations on the west coast of Portugal (only about 36 and 250 km apart, respectively). We also found deep divergence (4%-22%) among specimens of seven species from Portugal compared to those from the North Sea and Italy. The detection of evolutionarily meaningful divergence among populations of several amphipod species from southern Europe reinforces the need for a comprehensive re-assessment of the diversity of this faunal group.

  2. DNA barcoding a nightmare taxon: assessing barcode index numbers and barcode gaps for sweat bees.

    PubMed

    Gibbs, Jason

    2018-01-01

    There is an ongoing campaign to DNA barcode the world's >20 000 bee species. Recent revisions of Lasioglossum (Dialictus) (Hymenoptera: Halictidae) for Canada and the eastern United States were completed using integrative taxonomy. DNA barcode data from 110 species of L. (Dialictus) are examined for their value in identification and discovering additional taxonomic diversity. Specimen identification success was estimated using the best close match method. Error rates were 20% relative to current taxonomic understanding. Barcode Index Numbers (BINs) assigned using Refined Single Linkage Analysis (RESL) and barcode gaps using the Automatic Barcode Gap Discovery (ABGD) method were also assessed. RESL was incongruent for 44.5% of species, although some cryptic diversity may exist. Forty-three of 110 species were part of merged BINs with multiple species. The barcode gap is non-existent for the data set as a whole and ABGD showed levels of discordance similar to the RESL. The viridatum species-group is particularly problematic, so that DNA barcodes alone would be misleading for species delimitation and specimen identification. Character-based methods using fixed nucleotide substitutions could improve specimen identification success in some cases. The use of DNA barcoding for species discovery for standard taxonomic practice in the absence of a well-defined barcode gap is discussed.
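A simplified sketch of the best close match criterion used above: accept the nearest reference within a distance threshold, report "no match" when nothing is close enough, and "ambiguous" when the nearest hits tie between species. The 1% default threshold and the input format (species name to minimum barcode distance) are illustrative assumptions:

```python
def best_close_match(query_dist, threshold=0.01):
    """Simplified best-close-match identification: query_dist maps each
    candidate species to the minimum barcode distance from the query."""
    close = {sp: d for sp, d in query_dist.items() if d <= threshold}
    if not close:
        return "no match"            # nothing within the barcode threshold
    best = min(close.values())
    winners = {sp for sp, d in close.items() if d == best}
    return winners.pop() if len(winners) == 1 else "ambiguous"
```

When species lack a barcode gap, as in the viridatum group, many queries fall into the "ambiguous" outcome, which is the failure mode the abstract describes.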

  3. INDEXING MECHANISM

    DOEpatents

    Kock, L.J.

    1959-09-22

A device is presented for loading and unloading fuel elements containing material fissionable by neutrons of thermal energy. The device comprises a combination of mechanical features including a base, a lever pivotally attached to the base, an indexing plate on the base parallel to the plane of lever rotation and having a plurality of apertures, the apertures being disposed in rows, each aperture having a keyway, an index pin movably disposed on the lever normal to the plane of rotation, a key on the pin, a sleeve on the lever spaced from and parallel to the index pin, a pair of pulleys and a cable disposed between them, an open collar rotatably attached to the sleeve and linked to one of the pulleys, a pin extending from the collar, and a bearing movably mounted in the sleeve and having at least two longitudinal grooves in the outside surface.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Islam, Md. Shafiqul, E-mail: shafique@eng.ukm.my; Hannan, M.A., E-mail: hannan@eng.ukm.my; Basri, Hassan

    Highlights: • Solid waste bin level detection using Dynamic Time Warping (DTW). • Gabor wavelet filter is used to extract the solid waste image features. • Multi-Layer Perceptron classifier network is used for bin image classification. • The classification performance is evaluated by ROC curve analysis. - Abstract: The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge. Many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-Frequency Identification (RFID), or sensor-intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, while capturing the bin image it is challenging to position the camera so that the bin area is centred in the image. As yet, there is no ideal system that can correctly estimate the amount of SW. This paper briefly discusses an efficient image processing solution to overcome these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area, and Gabor wavelets (GW) were introduced for feature extraction from the waste bin image. The image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of this system are comparable to those of previous image processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level.
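
The bin-detection step above hinges on Dynamic Time Warping. As a minimal sketch of the classic dynamic-programming formulation (the paper's actual image pipeline is not reproduced here, so the 1-D profiles below are hypothetical stand-ins for image scan-line features):

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    INF = float("inf")
    n, m = len(a), len(b)
    # cost[i][j] = DTW distance between prefixes a[:i] and b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

# A template profile matches a time-warped copy of itself perfectly:
template = [0, 1, 2, 3, 2, 1, 0]
stretched = [0, 0, 1, 2, 3, 2, 1, 0]
print(dtw_distance(template, stretched))  # → 0.0 (pure warp, no amplitude change)
```

Because DTW tolerates local stretching, a template of the bin outline can match even when the camera is not perfectly positioned, which is the property the authors exploit.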

  5. Efficient visibility-driven medical image visualisation via adaptive binned visibility histogram.

    PubMed

    Jung, Younhyun; Kim, Jinman; Kumar, Ashnil; Feng, David Dagan; Fulham, Michael

    2016-07-01

    'Visibility' is a fundamental optical property that represents the proportion of the voxels in a volume that is observable to users during interactive volume rendering. The manipulation of this 'visibility' improves the volume rendering process; for instance, by ensuring the visibility of regions of interest (ROIs) or by guiding the identification of an optimal rendering view-point. The construction of visibility histograms (VHs), which represent the distribution of the visibility of all voxels in the rendered volume, enables users to explore the volume with real-time feedback about occlusion patterns among spatially related structures during volume rendering manipulations. Volume-rendered medical images have been a primary beneficiary of VHs, given the need to ensure that specific ROIs are visible relative to the surrounding structures, e.g. the visualisation of tumours that may otherwise be occluded by neighbouring structures. VH construction and its subsequent manipulations, however, are computationally expensive due to the histogram binning of the visibilities. This limits the real-time application of VHs to medical images that have large intensity ranges and volume dimensions and hence require a large number of histogram bins. In this study, we introduce an efficient adaptive binned visibility histogram (AB-VH) in which a smaller number of histogram bins is used to represent the visibility distribution of the full VH. We adaptively bin medical images by using a cluster analysis algorithm that groups the voxels according to their intensity similarities into a smaller subset of bins while preserving the distribution of the intensity range of the original images. We increase efficiency by exploiting the parallel computation and multiple render targets (MRT) extension of modern graphical processing units (GPUs), which enables efficient computation of the histogram. We show the application of our method to single-modality computed tomography (CT), magnetic resonance (MR) imaging and multi-modality positron emission tomography-CT (PET-CT). In our experiments, the AB-VH markedly improved the computational efficiency of VH construction and thus of the subsequent VH-driven volume manipulations. This efficiency was achieved without major visual or numerical differences between the AB-VH and its full-bin counterpart. We applied several variants of the K-means clustering algorithm with varying values of K (the number of clusters) and found that higher values of K resulted in better performance at a lower computational gain. The AB-VH also outperformed the conventional method of down-sampling the histogram bins (equal binning) for volume rendering visualisation. Copyright © 2016 Elsevier Ltd. All rights reserved.
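
The adaptive-binning step above relies on clustering voxel intensities. A minimal 1-D Lloyd's k-means sketch (the paper's exact clustering variant and GPU implementation are not specified here; the intensities and K below are illustrative):

```python
def kmeans_1d(values, k, iters=20):
    """Lloyd's algorithm on scalar intensities: returns bin centers and labels."""
    lo, hi = min(values), max(values)
    # spread initial centers evenly over the intensity range
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        # assignment step: each value joins its nearest center (= adaptive bin)
        for i, v in enumerate(values):
            labels[i] = min(range(k), key=lambda c: abs(v - centers[c]))
        # update step: move each center to the mean of its members
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return centers, labels

# Two intensity populations (e.g. soft tissue vs. bone) collapse into 2 adaptive bins:
centers, labels = kmeans_1d([10, 11, 12, 13, 200, 201, 202], k=2)
```

The resulting bins are narrow where intensities cluster and wide elsewhere, which is how a small K can still preserve the distribution of the original intensity range.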

  6. SeaWiFS technical report series. Volume 32: Level-3 SeaWiFS data products. Spatial and temporal binning algorithms

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Acker, James G. (Editor); Campbell, Janet W.; Blaisdell, John M.; Darzi, Michael

    1995-01-01

    The level-3 data products from the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) are statistical data sets derived from level-2 data. Each data set will be based on a fixed global grid of equal-area bins that are approximately 9 x 9 sq km. Statistics available for each bin include the sum and sum of squares of the natural logarithm of derived level-2 geophysical variables where sums are accumulated over a binning period. Operationally, products with binning periods of 1 day, 8 days, 1 month, and 1 year will be produced and archived. From these accumulated values and for each bin, estimates of the mean, standard deviation, median, and mode may be derived for each geophysical variable. This report contains two major parts: the first (Section 2) is intended as a users' guide for level-3 SeaWiFS data products. It contains an overview of level-0 to level-3 data processing, a discussion of important statistical considerations when using level-3 data, and details of how to use the level-3 data. The second part (Section 3) presents a comparative statistical study of several binning algorithms based on CZCS and moored fluorometer data. The operational binning algorithms were selected based on the results of this study.
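
The per-bin statistics described above can be recovered directly from the accumulators. A small sketch (the sample values are hypothetical and the operational quality flags are omitted), assuming each bin stores the sum and sum of squares of ln-transformed values:

```python
import math

def bin_stats(sum_ln, sum_ln_sq, n):
    """Recover mean and standard deviation of ln(x) from per-bin accumulators;
    exp(mean of logs) is the geometric mean of the binned variable."""
    mu = sum_ln / n
    var = max(sum_ln_sq / n - mu * mu, 0.0)  # clamp tiny negative round-off
    return math.exp(mu), math.sqrt(var)

# Accumulate ln of chlorophyll-like samples falling in one bin over a binning period:
samples = [0.5, 0.8, 1.1]
s = sum(math.log(x) for x in samples)
s2 = sum(math.log(x) ** 2 for x in samples)
geo_mean, sigma_ln = bin_stats(s, s2, len(samples))
```

Accumulating only (sum, sum of squares, count) is what lets 1-day bins roll up into 8-day, monthly, and yearly products without revisiting the level-2 data.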

  7. Bin recycling strategy for improving the histogram precision on GPU

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Rodríguez-Vázquez, Juan José; Vega-Rodríguez, Miguel A.

    2016-07-01

    Histograms are an easily comprehensible way to present data and analyses. In the current scientific context, with access to large volumes of data, the processing time for building histograms has dramatically increased. For this reason, parallel construction is necessary to alleviate the impact of the processing time on analysis activities. In this scenario, GPU computing is becoming widely used to reduce the processing time of histogram construction to affordable levels. Alongside this reduction in processing time, implementations are stressed with respect to bin-count accuracy. Accuracy issues arising from the particularities of an implementation are not usually taken into consideration when building histograms from very large data sets. In this work, a bin recycling strategy to create an accuracy-aware implementation for building histograms on GPU is presented. In order to evaluate the approach, this strategy was applied to the computation of the three-point angular correlation function, which is a relevant function in Cosmology for the study of the Large-Scale Structure of the Universe. As a result of this study, a high-accuracy implementation for histogram construction on GPU is proposed.
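
The parallel construction discussed above can be mimicked on a CPU with per-worker sub-histograms merged bin by bin, analogous to per-block histograms on a GPU (a generic sketch, not the paper's bin-recycling implementation):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_hist(chunk, nbins, lo, hi):
    """Per-worker sub-histogram, analogous to a per-block histogram on a GPU."""
    width = (hi - lo) / nbins
    h = [0] * nbins
    for x in chunk:
        b = min(int((x - lo) / width), nbins - 1)  # clamp values on the top edge
        h[b] += 1
    return h

def parallel_hist(data, nbins, lo, hi, workers=4):
    """Split the data, histogram each chunk concurrently, then merge bin-wise."""
    size = (len(data) + workers - 1) // workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(workers) as ex:
        parts = list(ex.map(lambda c: partial_hist(c, nbins, lo, hi), chunks))
    return [sum(col) for col in zip(*parts)]

hist = parallel_hist(list(range(1000)), nbins=10, lo=0, hi=1000)  # → [100] * 10
```

The merge step is where accuracy concerns arise on real hardware (atomic updates, integer overflow in narrow counters), which is the kind of issue the bin-recycling strategy targets.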

  8. DNA barcoding of odonates from the Upper Plata basin: Database creation and genetic diversity estimation

    PubMed Central

    Pepinelli, Mateus; Rodrigues, Marciel Elio; Roque, Fabio de Oliveira; Lorenz-Lemke, Aline Pedroso; Kvist, Sebastian

    2017-01-01

    We present a DNA barcoding study of Neotropical odonates from the Upper Plata basin, Brazil. A total of 38 species were collected in a transition region of “Cerrado” and Atlantic Forest, both regarded as biological hotspots, and 130 cytochrome c oxidase subunit I (COI) barcodes were generated for the collected specimens. The distinct gap between intraspecific (0–2%) and interspecific variation (15% and above) in COI, and resulting separation of Barcode Index Numbers (BIN), allowed for successful identification of specimens in 94% of cases. The 6% fail rate was due to a shared BIN between two separate nominal species. DNA barcoding, based on COI, thus seems to be a reliable and efficient tool for identifying Neotropical odonate specimens down to the species level. These results underscore the utility of DNA barcoding to aid specimen identification in diverse biological hotspots, areas that require urgent action regarding taxonomic surveys and biodiversity conservation. PMID:28763495
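
The "barcode gap" underlying this identification success has a simple operational definition: every intraspecific distance must be smaller than every interspecific distance. A minimal check (the distance values below are illustrative, chosen to match the 0-2% intra / 15%+ inter figures reported):

```python
from itertools import combinations

def has_barcode_gap(specimens, dist):
    """specimens: list of (specimen_id, species); dist: pairwise genetic distance.
    A barcode gap exists when max(intraspecific) < min(interspecific)."""
    intra, inter = [], []
    for (id1, sp1), (id2, sp2) in combinations(specimens, 2):
        (intra if sp1 == sp2 else inter).append(dist(id1, id2))
    return bool(intra) and bool(inter) and max(intra) < min(inter)

# Toy distances: 2% within species, 15-16% between species.
d = {frozenset({"a1", "a2"}): 0.02,
     frozenset({"a1", "b1"}): 0.15,
     frozenset({"a2", "b1"}): 0.16}
specs = [("a1", "sppA"), ("a2", "sppA"), ("b1", "sppB")]
print(has_barcode_gap(specs, lambda x, y: d[frozenset({x, y})]))  # → True
```

When the gap collapses, as in the Lasioglossum data set in record 2 above, this test fails and distance-based identification becomes unreliable.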

  9. Basic Research in Computer Science

    DTIC Science & Technology

    1993-10-01

    [Abstract garbled in the source scan. Recoverable fragments include two citations: Parallel Programming, ACM, April 1991; and [Laird et al. 90] Laird, J.E., C.B. Congdon, C.B. Altmann, and K. Swedlow, Soar User's Manual: Version 5.2.]

  10. FPGA-based voltage and current dual drive system for high frame rate electrical impedance tomography.

    PubMed

    Khan, Shadab; Manwaring, Preston; Borsic, Andrea; Halter, Ryan

    2015-04-01

    Electrical impedance tomography (EIT) is used to image the electrical property distribution of a tissue under test. An EIT system comprises complex hardware and software modules, which are typically designed for a specific application. Upgrading these modules is a time-consuming process, and requires rigorous testing to ensure proper functioning of new modules with the existing ones. To this end, we developed a modular and reconfigurable data acquisition (DAQ) system using National Instruments' (NI) hardware and software modules, which offer inherent compatibility over generations of hardware and software revisions. The system can be configured to use up to 32 channels. This EIT system can be used to interchangeably apply current or voltage signals, and to measure the tissue response in a semi-parallel fashion. A novel signal-averaging algorithm and a 512-point fast Fourier transform (FFT) computation block were implemented on the FPGA. FFT output bins were classified as signal or noise. Signal bins constitute a tissue's response to a pure or mixed tone signal. Signal-bin data can be used for traditional applications, as well as for synchronous frequency-difference imaging. Noise bins were used to compute noise power on the FPGA. Noise power represents a metric of signal quality, and can be used to ensure proper tissue-electrode contact. Allocation of these computationally expensive tasks to the FPGA reduced the required bandwidth between the PC and the FPGA for high frame rate EIT. In the 16-channel configuration, with a signal-averaging factor of 8, the DAQ frame rate at 100 kHz exceeded 110 frames s(-1), and the signal-to-noise ratio exceeded 90 dB across the spectrum. Reciprocity error was characterized for frequencies up to 1 MHz. Static imaging experiments were performed on a high-conductivity inclusion placed in a saline-filled tank; the inclusion was clearly localized in the reconstructions obtained for both absolute current and voltage mode data.
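
The signal/noise bin split described above can be sketched as follows (a simplified host-side model, not the authors' FPGA logic; the stimulus tone bins are assumed known a priori and the 8-bin spectrum is a toy example):

```python
def classify_bins(mag, tone_bins, guard=0):
    """Split FFT magnitude bins into signal bins (at known stimulus tones, plus an
    optional guard band) and noise bins; return the signal bins and noise power."""
    signal = {b for t in tone_bins for b in range(t - guard, t + guard + 1)}
    # noise power = sum of squared magnitudes over all non-signal bins
    noise_power = sum(m * m for i, m in enumerate(mag) if i not in signal)
    return sorted(signal), noise_power

# 8-bin toy spectrum: a stimulus tone in bin 2, a small noise floor elsewhere.
mag = [0.1, 0.1, 10.0, 0.1, 0.1, 0.1, 0.1, 0.1]
sig_bins, npow = classify_bins(mag, tone_bins=[2])
```

A rising noise power relative to the tone bins would flag poor tissue-electrode contact, which is how the metric is used in the paper.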

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad Allen

    EDENx is a multivariate data visualization tool that allows interactive, user-driven analysis of large-scale data sets with high dimensionality. EDENx builds on our earlier system, called EDEN, to enable analysis of more dimensions and larger-scale data sets. EDENx provides an initial overview of summary statistics for each variable in the data set under investigation. EDENx allows the user to interact with graphical summary plots of the data to investigate subsets and their statistical associations. These plots include histograms, binned scatterplots, binned parallel coordinate plots, timeline plots, and graphical correlation indicators. From the EDENx interface, a user can select a subsample of interest and launch a more detailed data visualization via the EDEN system. EDENx is best suited for high-level, aggregate analysis tasks, while EDEN is more appropriate for detailed data investigations.

  12. Uniform rovibrational collisional N2 bin model for DSMC, with application to atmospheric entry flows

    NASA Astrophysics Data System (ADS)

    Torres, E.; Bondar, Ye. A.; Magin, T. E.

    2016-11-01

    A state-to-state model for internal energy exchange and molecular dissociation allows for high-fidelity DSMC simulations. Elementary reaction cross sections for the N2(v, J) + N system were previously extracted from a quantum-chemical database originally compiled at NASA Ames Research Center. Due to the high computational cost of simulating the full range of inelastic collision processes (approx. 23 million reactions), a coarse-grain model, called the Uniform RoVibrational Collisional (URVC) bin model, can be used instead. This reduces the original 9390 rovibrational levels of N2 to 10 energy bins. In the present work, this reduced model is used to simulate a 2D flow configuration, which more closely reproduces the conditions of high-speed entry into Earth's atmosphere. For this purpose, the URVC bin model had to be adapted for integration into the "Rarefied Gas Dynamics Analysis System" (RGDAS), a separate high-performance DSMC code capable of handling complex geometries and parallel computation. RGDAS was developed at the Institute of Theoretical and Applied Mechanics in Novosibirsk, Russia for use by the European Space Agency (ESA) and shares many features with the well-known SMILE code developed by the same group. We show that the reduced mechanism developed previously can be implemented in RGDAS, and the results exhibit nonequilibrium effects consistent with those observed in previous 1D simulations.
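
The coarse-graining above maps each rovibrational level to one of a small number of energy bins. A uniform-width binning sketch (the toy energy ladder below is a stand-in for the actual N2(v, J) level energies, which are not reproduced here, and the uniform-width rule is an illustrative simplification):

```python
def uniform_energy_bins(level_energies, nbins):
    """Assign each level to one of nbins equal-width energy bins;
    returns the bin index for every level."""
    lo, hi = min(level_energies), max(level_energies)
    width = (hi - lo) / nbins
    # clamp the topmost level into the last bin
    return [min(int((e - lo) / width), nbins - 1) for e in level_energies]

# Toy stand-in for the 9390 rovibrational levels collapsed into 10 bins:
energies = [i * 0.001 for i in range(9390)]
bins = uniform_energy_bins(energies, 10)
```

Collision rates are then tabulated per bin pair rather than per level pair, which is what shrinks the ~23 million elementary processes to a tractable set.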

  13. GenoGAM: genome-wide generalized additive models for ChIP-Seq analysis.

    PubMed

    Stricker, Georg; Engelhardt, Alexander; Schulz, Daniel; Schmid, Matthias; Tresch, Achim; Gagneur, Julien

    2017-08-01

    Chromatin immunoprecipitation followed by deep sequencing (ChIP-Seq) is a widely used approach to study protein-DNA interactions. Often, the quantities of interest are the differential occupancies relative to controls, between genetic backgrounds, treatments, or combinations thereof. Current methods for differential occupancy of ChIP-Seq data rely however on binning or sliding window techniques, for which the choice of the window and bin sizes are subjective. Here, we present GenoGAM (Genome-wide Generalized Additive Model), which brings the well-established and flexible generalized additive models framework to genomic applications using a data parallelism strategy. We model ChIP-Seq read count frequencies as products of smooth functions along chromosomes. Smoothing parameters are objectively estimated from the data by cross-validation, eliminating ad hoc binning and windowing needed by current approaches. GenoGAM provides base-level and region-level significance testing for full factorial designs. Application to a ChIP-Seq dataset in yeast showed increased sensitivity over existing differential occupancy methods while controlling for type I error rate. By analyzing a set of DNA methylation data and illustrating an extension to a peak caller, we further demonstrate the potential of GenoGAM as a generic statistical modeling tool for genome-wide assays. Software is available from Bioconductor: https://www.bioconductor.org/packages/release/bioc/html/GenoGAM.html . gagneur@in.tum.de. Supplementary information is available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  14. A water quality management strategy for regionally protected water through health risk assessment and spatial distribution of heavy metal pollution in 3 marine reserves.

    PubMed

    Zhang, Yinan; Chu, Chunli; Li, Tong; Xu, Shengguo; Liu, Lei; Ju, Meiting

    2017-12-01

    Severe water pollution and resource scarcity are major problems in China, where it is necessary to establish water quality-oriented monitoring and intelligent watershed management. In this study, an effective watershed management method is explored, in which water quality is first assessed using the heavy metal pollution index and the human health risk index, and the pollution and management grade is then classified based on cluster analysis and GIS visualization. Three marine reserves in Tianjin were selected and analyzed, namely the Tianjin Ancient Coastal Wetland National Nature Reserve (Qilihai Natural Reserve), the Tianjin DaShentang Oyster Reef National Marine Special Reserve (DaShentang Reserve), and the Tianjin Coastal Wetland National Marine Special Reserve (BinHai Wetland Reserve), which is under construction. The water quality and potential human health risks of 5 heavy metals (Pb, As, Cd, Hg, Cr) in the three reserves were assessed using the Nemerow index and USEPA methods. Moreover, ArcGIS 10.2 software was used to visualize the heavy metal indices and display their spatial distribution. Cluster analysis enabled classification of the heavy metals into 4 categories, which allowed for identification of the heavy metals whose pollution index and health risks were highest and whose control in the reserve is therefore a priority. Results indicate that heavy metal pollution exists in the Qilihai Natural Reserve and in the north and east of the DaShentang Reserve; furthermore, human health risks exist in the Qilihai Natural Reserve and in the BinHai Wetland Reserve. In each reserve, the main factors influencing the pollution and health risk were high concentrations of As and Pb that exceed the corresponding standards. Measures must be adopted to control and remediate the pollutants. Furthermore, to protect the marine reserves, management policies must be implemented to improve water quality, which is an urgent task for both local and national governments. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram

    2013-01-01

    The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight, query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper addresses query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries, performing unique-value-based binning on the most significant bytes of double-precision data (sign, exponent, and most significant mantissa bits) and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both the reduced column and the index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof of concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
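
The significant-byte binning described above can be sketched with Python's struct module (a simplified stand-in for the paper's compression/index co-design; the 2-byte key width and the sample column are illustrative):

```python
import struct

def msb_key(x, nbytes=2):
    """Bin key: the most significant bytes of the big-endian IEEE-754 encoding
    (sign, exponent, and the top mantissa bits)."""
    return struct.pack(">d", x)[:nbytes]

def build_index(column):
    """Inverted index: bin key -> list of row ids whose value falls in that bin."""
    inv = {}
    for row, x in enumerate(column):
        inv.setdefault(msb_key(x), []).append(row)
    return inv

def range_query(inv, column, lo, hi):
    """For positive doubles the big-endian encoding is order-preserving, so bins
    whose keys lie between the keys of lo and hi cover the range; candidates are
    re-checked against raw values because edge bins may only partially overlap."""
    klo, khi = msb_key(lo), msb_key(hi)
    cand = (r for k, rows in inv.items() if klo <= k <= khi for r in rows)
    return sorted(r for r in cand if lo <= column[r] <= hi)

col = [1.5, 2.75, 1.5625, 300.0, 2.8]
inv = build_index(col)
hits = range_query(inv, col, 1.0, 3.0)  # → [0, 1, 2, 4]
```

Grouping rows by their leading bytes is also why the remaining low-order bytes compress well: values within a bin share a common prefix that the index already captures.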

  16. Diel Metagenomics and Metatranscriptomics of Elkhorn Slough Hypersaline Microbial Mat

    NASA Astrophysics Data System (ADS)

    Lee, J.; Detweiler, A. M.; Everroad, R. C.; Bebout, L. E.; Weber, P. K.; Pett-Ridge, J.; Bebout, B.

    2014-12-01

    To understand the variation in gene expression associated with the daytime oxygenic phototrophic and nighttime fermentation regimes seen in hypersaline microbial mats, a contiguous mat piece was subjected to sampling at regular intervals over a 24-hour diel period. Additionally, to understand the impact of sulfate reduction on biohydrogen consumption, molybdate was added to a parallel experiment in the same run. Four metagenome and 12 metatranscriptome Illumina HiSeq lanes were completed across the day/night and control/molybdate experiments. Preliminary comparative examination of noon and midnight metatranscriptomic samples, mapped using bowtie2 to reference genomes, has revealed several notable results about the dominant mat-building cyanobacterium Microcoleus chthonoplastes PCC 7420. M. chthonoplastes shows expression of several pathways for nitrogen scavenging, including nitrogen fixation. Reads mapped to M. chthonoplastes PCC 7420 show expression of two starch storage and utilization pathways: one via starch-trehalose-maltose-glucose, the other via UDP-glucose-cellulose-β-1,4-glucan-glucose. The overall trend of gene expression was primarily light-driven up-regulation followed by down-regulation in the dark, while much of the remaining expression profile appears to be constitutive. Co-assembly of quality-controlled reads from the 4 metagenomes was performed using Ray Meta with progressively smaller k-mer sizes, with bins identified and filtered using principal component analysis of coverages from all libraries and a %GC filter, followed by reassembly of the remaining co-assembly reads and binned reads. Despite relatively similar abundance profiles in each metagenome, this binning approach was able to distinctly resolve bins not only from dominant taxa, but also from the sulfate-reducing bacteria that are of interest for understanding molybdate inhibition. Bins generated from this iterative assembly process will be used for downstream mapping of transcriptomic reads as well as for isolation efforts for Cyanobacteria-associated bacteria.

  17. Measuring the Index of Refraction.

    ERIC Educational Resources Information Center

    Phelps, F. M., III; Jacobson, B. S.

    1980-01-01

    Presents two methods for measuring the index of refraction of glass or lucite. These two methods, used in the freshman laboratory, are based on the fact that a ray of light inside a block will be refracted parallel to the surface. (HM)

  18. Time-frequency analysis-based time-windowing algorithm for the inverse synthetic aperture radar imaging of ships

    NASA Astrophysics Data System (ADS)

    Zhou, Peng; Zhang, Xi; Sun, Weifeng; Dai, Yongshou; Wan, Yong

    2018-01-01

    An algorithm based on time-frequency analysis is proposed to select an imaging time window for the inverse synthetic aperture radar imaging of ships. An appropriate range bin is selected to perform the time-frequency analysis after radial motion compensation. The selected range bin is that with the maximum mean amplitude among the range bins whose echoes are confirmed to be contributed by a dominant scatter. The criterion for judging whether the echoes of a range bin are contributed by a dominant scatter is key to the proposed algorithm and is therefore described in detail. When the first range bin that satisfies the judgment criterion is found, a sequence composed of the frequencies that have the largest amplitudes in every moment's time-frequency spectrum corresponding to this range bin is employed to calculate the length and the center moment of the optimal imaging time window. Experiments performed with simulation data and real data show the effectiveness of the proposed algorithm, and comparisons between the proposed algorithm and the image contrast-based algorithm (ICBA) are provided. Similar image contrast and lower entropy are acquired using the proposed algorithm as compared with those values when using the ICBA.
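
The range-bin selection step above is simple to state: average the echo amplitude over slow time for each range bin and take the maximum. A minimal sketch (the echo matrix below is a toy stand-in; the dominant-scatterer judgment criterion is not reproduced here):

```python
def select_range_bin(echo):
    """echo: 2-D list indexed [range_bin][slow_time_sample]; return the range bin
    with the maximum mean amplitude, the candidate for time-frequency analysis."""
    means = [sum(abs(s) for s in rb) / len(rb) for rb in echo]
    best = max(range(len(means)), key=means.__getitem__)
    return best, means[best]

echo = [[0.1, 0.2, 0.1],   # weak clutter
        [2.0, 2.2, 1.8],   # dominant scatterer
        [0.5, 0.4, 0.6]]
bin_idx, amp = select_range_bin(echo)
```

In the full algorithm this candidate bin is only accepted after the dominant-scatterer criterion confirms that its echoes come from a single scatterer.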

  19. Study on parallel and distributed management of RS data based on spatial data base

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Liu, Shijin

    2006-12-01

    With the rapid development of current earth-observing technology, RS image data storage, management, and information publication have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, a background server can hardly handle the heavy processing load imposed by the great volume of RS data stored at different nodes in a distributed environment; a heavy burden is placed on the background server. Second, there is no unique, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. To address these two problems, this paper puts forward a framework for parallel and distributed management and storage of RS image data. The system aims at an RS data information system based on a parallel background server and a distributed data management system. Toward these two goals, this paper has studied the following key techniques and drawn some instructive conclusions. The paper puts forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For data storage, RS data is not divided into binary large objects stored in a conventional relational database system; instead, it is reconstructed through the above solid index mechanism, and a logical image database for the RS image data files is constructed. In terms of system architecture, this paper sets up a framework based on a parallel server composed of several commodity computers. Under this framework, the background process is divided into two parts: the common Web process and the parallel process.
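
The "Pyramid, Block, Layer, Epoch" organization can be sketched as a keyed tile store (the field names and sample values here are illustrative, not the paper's actual schema):

```python
class SolidIndex:
    """Toy 'Pyramid, Block, Layer, Epoch' index: each tile of RS imagery is
    addressed by (pyramid level, block id, band/layer, acquisition epoch)."""

    def __init__(self):
        self._tiles = {}

    def put(self, pyramid, block, layer, epoch, tile):
        self._tiles[(pyramid, block, layer, epoch)] = tile

    def get(self, pyramid, block, layer, epoch):
        return self._tiles.get((pyramid, block, layer, epoch))

idx = SolidIndex()
# pyramid level 2, block (5, 7), near-infrared band, June 2006 acquisition
idx.put(pyramid=2, block=(5, 7), layer="NIR", epoch="2006-06", tile=b"tile-bytes")
```

Because the key captures resolution, area, band, and period, a query for any combination of the four dimensions reduces to key lookups rather than scans over binary large objects.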

  20. A Nonlinearity Minimization-Oriented Resource-Saving Time-to-Digital Converter Implemented in a 28 nm Xilinx FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Liu, Chong

    2015-10-01

    Because large nonlinearity errors exist in current tapped-delay line (TDL) style field-programmable gate array (FPGA)-based time-to-digital converters (TDCs), bin-by-bin calibration techniques have to be resorted to in order to gain a high measurement resolution. If the TDL in the selected FPGA is significantly affected by changes in ambient temperature, the bin-by-bin calibration table has to be updated as frequently as possible. On-line calibration and calibration table updating increase the TDC design complexity and limit system performance to some extent. This paper proposes a method to minimize the nonlinearity errors of TDC bins, so that bin-by-bin calibration may not be needed while a reasonably high time resolution is maintained. The method is a two-pass approach: by bin realignment, the large number of wasted zero-width bins in the original TDL is reused and the granularity of the bins is improved; by bin decimation, the bin size and its uniformity are traded off, and time interpolation by the delay line becomes more precise, so that bin-by-bin calibration is not necessary. Using Xilinx 28 nm FPGAs, in which the TDL properties are not very sensitive to ambient temperature, the proposed TDC achieves approximately 15 ps root-mean-square (RMS) time resolution in dual-channel measurements of time intervals over the operating temperature range. Because calibration is removed and less logic is required for data post-processing, the method offers greater multi-channel capability.
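
The two passes above can be sketched on a list of measured bin widths (the picosecond values below are hypothetical; the actual FPGA realignment operates on carry-chain taps, which this host-side model abstracts away):

```python
def realign(raw_widths):
    """Realignment pass: zero-width bins carry no timing information,
    so fold them away, leaving only bins with nonzero width."""
    return [w for w in raw_widths if w > 0]

def decimate(widths, factor):
    """Decimation pass: merge groups of `factor` fine bins into coarser,
    more uniform bins, trading bin size for uniformity."""
    return [sum(widths[i:i + factor]) for i in range(0, len(widths), factor)]

raw = [5, 0, 12, 0, 0, 9, 14, 0, 10, 11]   # raw bin widths (ps) from one TDL segment
fine = realign(raw)           # → [5, 12, 9, 14, 10, 11]
coarse = decimate(fine, 2)    # → [17, 23, 21]
```

After decimation the bin widths cluster around a common value, which is what lets the converter skip bin-by-bin calibration while keeping the resolution acceptable.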

  1. Field evaluation of a prototype paper-based point-of-care fingerstick transaminase test.

    PubMed

    Pollock, Nira R; McGray, Sarah; Colby, Donn J; Noubary, Farzad; Nguyen, Huyen; Nguyen, The Anh; Khormaee, Sariah; Jain, Sidhartha; Hawkins, Kenneth; Kumar, Shailendra; Rolland, Jason P; Beattie, Patrick D; Chau, Nguyen V; Quang, Vo M; Barfield, Cori; Tietje, Kathy; Steele, Matt; Weigl, Bernhard H

    2013-01-01

    Monitoring for drug-induced liver injury (DILI) via serial transaminase measurements in patients on potentially hepatotoxic medications (e.g., for HIV and tuberculosis) is routine in resource-rich nations, but often unavailable in resource-limited settings. Towards enabling universal access to affordable point-of-care (POC) screening for DILI, we have performed the first field evaluation of a paper-based, microfluidic fingerstick test for rapid, semi-quantitative, visual measurement of blood alanine aminotransferase (ALT). Our objectives were to assess operational feasibility, inter-operator variability, lot variability, device failure rate, and accuracy, to inform device modification for further field testing. The paper-based ALT test was performed at POC on fingerstick samples from 600 outpatients receiving HIV treatment in Vietnam. Results, read independently by two clinic nurses, were compared with gold-standard automated (Roche Cobas) results from venipuncture samples obtained in parallel. Two device lots were used sequentially. We demonstrated high inter-operator agreement, with 96.3% (95% C.I., 94.3-97.7%) agreement in placing visual results into clinically-defined "bins" (<3x, 3-5x, and >5x upper limit of normal), >90% agreement in validity determination, and intraclass correlation coefficient of 0.89 (95% C.I., 0.87-0.91). Lot variability was observed in % invalids due to hemolysis (21.1% for Lot 1, 1.6% for Lot 2) and correlated with lots of incorporated plasma separation membranes. Invalid rates <1% were observed for all other device controls. Overall bin placement accuracy for the two readers was 84% (84.3%/83.6%). Our findings of extremely high inter-operator agreement for visual reading-obtained in a target clinical environment, as performed by local practitioners-indicate that the device operation and reading process is feasible and reproducible. 
Bin placement accuracy and lot-to-lot variability data identified specific targets for device optimization and material quality control. This is the first field study performed with a patterned paper-based microfluidic device and opens the door to development of similar assays for other important analytes.
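
The clinically defined bins and the inter-reader agreement metric above can be sketched as follows (the ULN of 40 U/L and the ALT readings are illustrative assumptions, not values from the study):

```python
def alt_bin(alt, uln=40.0):
    """Place an ALT reading into the trial's clinical bins, as multiples of the
    upper limit of normal (ULN; 40 U/L is assumed here for illustration)."""
    ratio = alt / uln
    if ratio < 3:
        return "<3x"
    if ratio <= 5:
        return "3-5x"
    return ">5x"

def agreement(reader_a, reader_b):
    """Fraction of specimens that both readers placed in the same bin."""
    same = sum(a == b for a, b in zip(reader_a, reader_b))
    return same / len(reader_a)

# Two readers' visual estimates (hypothetical ALT values in U/L):
a = [alt_bin(x) for x in [25, 130, 145, 300]]
b = [alt_bin(x) for x in [30, 128, 210, 310]]
rate = agreement(a, b)  # → 0.75: readers disagree only on the borderline sample
```

Note that semi-quantitative binning absorbs small numeric disagreements between readers, which is why the study reports bin-placement agreement rather than raw value agreement.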

  2. FastQuery: A Parallel Indexing System for Scientific Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chou, Jerry; Wu, Kesheng; Prabhat,

    2011-07-29

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies such as FastBit can significantly improve accesses to these datasets by augmenting the user data with indexes and other secondary information. However, a challenge is that the indexes assume the relational data model while scientific data generally follows the array data model. To match the two data models, we design a generic mapping mechanism and implement an efficient input and output interface for reading and writing the data and their corresponding indexes. To take advantage of the emerging many-core architectures, we also develop a parallel strategy for indexing using threading technology. This approach complements our on-going MPI-based parallelization efforts. We demonstrate the flexibility of our software by applying it to two of the most commonly used scientific data formats, HDF5 and NetCDF. We present two case studies using data from a particle accelerator model and a global climate model. We also conducted a detailed performance study using these scientific datasets. The results show that FastQuery speeds up the query time by a factor of 2.5x to 50x, and it reduces the indexing time by a factor of 16 on 24 cores.

  3. GSHR-Tree: a spatial index tree based on dynamic spatial slot and hash table in grid environments

    NASA Astrophysics Data System (ADS)

    Chen, Zhanlong; Wu, Xin-cai; Wu, Liang

    2008-12-01

    Computation Grids enable the coordinated sharing of large-scale distributed heterogeneous computing resources that can be used to solve computationally intensive problems in science, engineering, and commerce. Grid spatial applications are made possible by high-speed networks and a new generation of Grid middleware that resides between networks and traditional GIS applications. The integration of multi-source, heterogeneous spatial information, the management of distributed spatial resources, and the sharing of spatial data and Grid services are the key problems to solve in developing a Grid GIS. The performance of the spatial index is a key technology of Grid GIS and spatial databases, and it affects the holistic performance of a GIS in Grid environments. To improve the efficiency of parallel processing of massive spatial data in a distributed parallel grid computing environment, this paper presents GSHR-Tree, a new grid-slot hash parallel spatial index. Based on a hash table and dynamic spatial slots, it improves the structure of the classical parallel R-tree index, combining the strengths of the R-tree and the hash data structure, and meets the needs of parallel grid computing over massive spatial data in distributed networks. The algorithm splits space into multiple slots and maps these slots to sites in the distributed, parallel system. Each site organizes the spatial objects in its slot into an R-tree. On the basis of this tree structure, the index data are distributed among multiple nodes in the grid network using the large-node R-tree method. Load imbalance during processing can be quickly corrected by a dynamic adjustment algorithm.
This structure accounts for the distribution, replication, and transfer operations of a spatial index in the grid environment. The design of GSHR-Tree ensures load balance in parallel computation, making the structure well suited to parallel processing of spatial information in distributed network environments. Instead of the recursive comparison of spatial objects used in the original R-tree, the algorithm builds the spatial index with binary code operations, which computers execute more efficiently, and uses extended dynamic hash codes for bit comparison. In GSHR-Tree, a new server is assigned to the network whenever a full node must be split. We describe a more flexible allocation protocol that copes with a temporary shortage of storage resources. It uses a distributed, balanced binary spatial tree that scales with insertions to potentially any number of storage servers through splits of the overloaded ones. An application manipulates the GSHR-Tree structure from a node in the grid environment, addressing the tree through an image that splits can make outdated. This may generate addressing errors, which are resolved by forwarding among the servers. We also propose a spatial index data distribution algorithm that limits the number of servers, improving storage utilization at the cost of additional messages. 
The structure makes a compromise between updating the duplicated index and transferring spatial index data. GSHR-Tree has a flexible structure designed to satisfy the needs of grid computing and of future applications. It provides R-tree capabilities for large spatial datasets stored over interconnected servers. The analysis, including experiments, confirmed the efficiency of our design choices: the insertion policy can be tuned dynamically to cope with periods of storage shortage, in which case storage balancing is favored for better space utilization at the price of extra message exchanges between servers. Using the response time of the parallel spatial range query as the performance metric, the simulated experiments show that GSHR-Tree is a reasonable design with high indexing performance, and that the scheme should fit the needs of new applications using ever-larger spatial datasets.

  4. 14 CFR 1260.9 - Synopses requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... funding opportunities shall be synopsized. Synopses shall be prepared in the NASA Acquisition Internet Service (NAIS), located at: http://prod.nais.nasa.gov/cgi-bin/nais/index.cgi; by using the Electronic Posting System (EPS), and transmitted to http://www.Fedgrants.gov. Synopses shall be electronically posted...

  5. 14 CFR 1260.9 - Synopses requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... funding opportunities shall be synopsized. Synopses shall be prepared in the NASA Acquisition Internet Service (NAIS), located at: http://prod.nais.nasa.gov/cgi-bin/nais/index.cgi; by using the Electronic Posting System (EPS), and transmitted to http://www.Fedgrants.gov. Synopses shall be electronically posted...

  6. Composting of high moisture content swine manure with corncob in a pilot-scale aerated static bin system.

    PubMed

    Zhu, Nengwu

    2006-10-01

    Pilot composting experiments of swine manure with corncob were conducted to evaluate the performance of an aerated static bin composting system. Effects of temperature control (60 and 70 degrees C) and moisture content (70% and 80%) on the composting were monitored by measuring physical and chemical indexes. The results showed that (1) the composting system could destroy pathogens, convert nitrogen from unstable ammonia to stable organic forms, and reduce the volume of waste; (2) significant differences in NH(4)(+)-N (P(12) = 0.074) and (NO(3)(-) + NO(2)(-))-N (P(12) = 0.085) were found between the temperature control treatments; (3) anaerobic reaction in the treatment with 80% moisture content resulted in significant differences in pH (P(23) = 0.006), total organic matter (P(23) = 0.003), and germination index (P(23) = 0.040) between 70% and 80%. Therefore, the optimum initial moisture content for composting swine manure and corncob with this system was below 80%.

  7. Leveraging multi-layer imager detector design to improve low-dose performance for megavoltage cone-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Hu, Yue-Houng; Rottmann, Joerg; Fueglistaller, Rony; Myronakis, Marios; Wang, Adam; Huber, Pascal; Shedlock, Daniel; Morf, Daniel; Baturin, Paul; Star-Lack, Josh; Berbeco, Ross

    2018-02-01

    While megavoltage cone-beam computed tomography (CBCT) using an electronic portal imaging device (EPID) provides many advantages over kilovoltage (kV) CBCT, clinical adoption is limited by its high doses. Multi-layer imager (MLI) EPIDs increase DQE(0) while maintaining high resolution. However, even well-designed, high-performance MLIs suffer from increased electronic noise with each readout, degrading low-dose image quality. To improve low-dose performance, shift-and-bin addition (ShiBA) imaging is proposed, leveraging the unique architecture of the MLI. ShiBA combines hardware readout-binning and super-resolution concepts, reducing electronic noise while maintaining native image sampling. The imaging performance of full-resolution (FR), standard aligned-binned (BIN), and ShiBA images is compared in terms of noise power spectrum (NPS), electronic NPS, modulation transfer function (MTF), and the ideal-observer signal-to-noise ratio (SNR), i.e. the detectability index (d'). The FR 4-layer readout of the prototype MLI exhibits an electronic NPS magnitude 6 times higher than a state-of-the-art single-layer imager (SLI) EPID. Although the MLI is built on the same readout platform as the SLI, with each layer exhibiting equivalent electronic noise, the multi-stage readout of the MLI results in electronic noise 50% higher than simple summation would predict. Electronic noise is mitigated in both BIN and ShiBA imaging, reducing its total by a factor of ~12. ShiBA further reduces the NPS by effectively upsampling the image, which multiplies the NPS by a sinc^2 function. Normalized NPS show that neither ShiBA nor BIN otherwise affects image noise. The line spread function shows that ShiBA removes the pixelation artifact of BIN images and mitigates the effect of detector shift, but does not quantifiably improve the MTF. ShiBA provides a pre-sampled representation of the images, mitigating phase dependence. Hardware binning strategies lower the quantum noise floor, with the 2x2 implementation reducing the dose at which DQE(0) degrades by 10% from 0.01 MU to 0.004 MU, representing a 20% improvement in d'.
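
    The shift-and-bin concept can be illustrated as follows. This is a toy sketch, not the authors' implementation: the 2x2 binning factor, the per-layer shift pattern, and all names are assumptions, chosen only to show how binned layers with different offsets can be recombined at native sampling:

```python
import numpy as np

def bin2x2(img):
    """Aggregate 2x2 pixel blocks into single super-pixels (hardware-style binning)."""
    h, w = img.shape
    return img[:h // 2 * 2, :w // 2 * 2].reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

def shiba(layers):
    """Toy shift-and-bin: bin each layer with a one-native-pixel offset,
    replicate back to the native grid, and average the shifted results."""
    out = np.zeros_like(layers[0], dtype=float)
    for k, layer in enumerate(layers):
        dy, dx = k % 2, (k // 2) % 2                    # assumed per-layer shift pattern
        binned = bin2x2(np.roll(layer, (dy, dx), axis=(0, 1)))
        up = np.kron(binned, np.ones((2, 2)))           # replicate to native sampling
        out += np.roll(up, (-dy, -dx), axis=(0, 1))
    return out / len(layers)

rng = np.random.default_rng(0)
layers = [rng.poisson(100, (8, 8)).astype(float) for _ in range(4)]
print(shiba(layers).shape)  # (8, 8): output keeps the native grid
```

    The point of the sketch is structural: binning happens once per layer (reducing readout-noise contributions), while the differing offsets preserve output at the native pixel pitch.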

  8. Comparison of compostable bags and aerated bins with conventional storage systems to collect the organic fraction of municipal solid waste from homes. a Catalonia case study.

    PubMed

    Puyuelo, Belén; Colón, Joan; Martín, Patrícia; Sánchez, Antoni

    2013-06-01

    The separation of biowaste at home is key to improving and facilitating the treatment of organic municipal waste and to reducing its operational costs. The conventional method of collecting such waste and separating it at home uses a sealed bin with a plastic bag. The use of modern compostable bags, made of biodegradable polymers often derived from renewable sources, is starting to be implemented in some European countries. In addition to compostable bags, a new model of bin with a perforated surface is also promoted, which together with the compostable bag makes up the so-called "aerated system". In this study, different combinations of home collection systems were systematically studied in the laboratory and at home. The results obtained quantitatively demonstrate that the combination of aerated bin and compostable bag is effective at improving the collection of biowaste without significant gaseous emissions, while preparing the organic waste for further composting, as concluded from the respiration indices. In terms of weight loss, temperature, gas emissions, respiration index and organic matter reduction, the best results were achieved with the aerated system. At the same time, a qualitative study of bin and bag combinations was carried out in 100 homes, in which more than 80% of the participating families preferred the aerated system.

  9. SONG China project - participating in the global network

    NASA Astrophysics Data System (ADS)

    Deng, Licai; Xin, Yu; Zhang, Xiaobin; Li, Yan; Jiang, Xiaojun; Wang, Guomin; Wang, Kun; Zhou, Jilin; Yan, Zhengzhou; Luo, Zhiquan

    2013-01-01

    SONG (Stellar Observations Network Group) is a low-cost, ground-based international collaboration aimed at two cutting-edge problems in contemporary time-domain astrophysics: 1) direct diagnostics of the internal structure of stars, and 2) the search for and study of extrasolar planets, possibly in the habitable zone. The general plan is to set up a network of 1 m telescopes uniformly distributed geographically, in both hemispheres. China joined the collaboration (initiated by Danish astronomers) at the very beginning. In addition to SONG's original plan (http://song.phys.au.dk), the Chinese team proposed a parallel photometry subnetwork in the northern hemisphere, 50BiN (50 cm Binocular Network, previously known as mini-SONG), to add a wide-field photometric capability and thereby maximise the potential of the network platform. The network will be able to produce nearly continuous time-series observations of selected objects with high-resolution spectroscopy (SONG) and accurate photometry (50BiN), and to produce ultra-high-accuracy photometry in dense fields to look for microlensing events caused by planetary systems. This project has great synergy with Chinese astronomical activities in Antarctica (Dome A) and with other similar networks (e.g. LCOGT). The plan and current status of the project are overviewed in this poster.

  10. Artificial dielectric stepped-refractive-index lens for the terahertz region.

    PubMed

    Hernandez-Serrano, A I; Mendis, Rajind; Reichel, Kimberly S; Zhang, Wei; Castro-Camus, E; Mittleman, Daniel M

    2018-02-05

    In this paper we theoretically and experimentally demonstrate a stepped-refractive-index convergent lens for terahertz frequencies, made of a stack of parallel metallic plates and based on artificial dielectrics. The lens consists of a non-uniformly spaced stack of metallic plates, forming a mirror-symmetric array of parallel-plate waveguides (PPWGs). The operation of the device is based on the TE1 mode of the PPWG, whose effective refractive index is a function of the operating frequency and the spacing between the plates. By varying the spacing between the plates, we can modify the local refractive index of every individual PPWG that constitutes the lens, producing a stepped refractive-index profile across the multi-stack structure. The theoretical and experimental results show that this structure is capable of focusing a 1 cm diameter beam to a line focus of less than 4 mm at the design frequency of 0.18 THz. This demonstrates that the artificial-dielectric concept is an important technology for the fabrication of next-generation terahertz devices.
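
    The spacing dependence follows from the standard parallel-plate waveguide dispersion relation for the TE1 mode, n_eff = sqrt(1 - (c / (2 b f))^2) for plate spacing b and frequency f. A short sketch (the spacings chosen below are illustrative, not the lens design values):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def n_eff_te1(spacing_m: float, freq_hz: float) -> float:
    """Effective refractive index of the TE1 mode of a parallel-plate waveguide."""
    cutoff = C / (2.0 * spacing_m)  # TE1 cutoff frequency
    if freq_hz <= cutoff:
        raise ValueError("below TE1 cutoff: mode does not propagate")
    return math.sqrt(1.0 - (cutoff / freq_hz) ** 2)

# At the 0.18 THz design frequency: wider spacing -> index closer to 1,
# narrower spacing -> smaller index (n_eff < 1, an artificial dielectric).
# Grading the spacing across the stack therefore steps the index profile.
f = 0.18e12
for b_mm in (1.0, 1.5, 2.0):
    print(f"b = {b_mm} mm -> n_eff = {n_eff_te1(b_mm * 1e-3, f):.3f}")
```
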

  11. 40 CFR 1037.520 - Modeling CO2 emissions to show compliance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW HEAVY-DUTY MOTOR VEHICLES Test and Modeling..., expressed in m2 and rounded to two decimal places. Where we allow you to group multiple configurations... bin based on the drag area bin of an equivalent high-roof tractor. If the high-roof tractor is in Bin...

  12. Conservative Bin-to-Bin Fractional Collisions

    DTIC Science & Technology

    2016-06-28

    Briefing charts on conservative bin-to-bin fractional collision methods, presented by Robert Martin, ERC Inc., Spacecraft Propulsion Branch, Air Force Research Laboratory, Edwards Air Force Base, CA, USA. The charts motivate the importance of collision physics in spacecraft propulsion, including discharge and breakdown in FRCs and collisional radiative cooling/ionization. Distribution unlimited; PA #16326.

  13. Study on parallel and distributed management of RS data based on spatial database

    NASA Astrophysics Data System (ADS)

    Chen, Yingbiao; Qian, Qinglan; Wu, Hongqiao; Liu, Shijin

    2009-10-01

    With the rapid development of earth-observing technology, the storage, management, and publication of remote sensing (RS) image data have become a bottleneck for its application and popularization. There are two prominent problems in RS image data storage and management systems. First, a single background server can hardly handle the heavy processing load of the great volume of RS data stored at different nodes in a distributed environment, placing a heavy burden on that server. Second, there is no unified, standard, and rational organization of multi-sensor RS data for storage and management, and much information is lost or omitted at storage time. Facing these two problems, this paper puts forward a framework for parallel, distributed management and storage of RS image data, aiming at an RS data information system based on a parallel background server and a distributed data management system. Toward these goals, this paper studies the following key techniques and draws several instructive conclusions. We put forward a solid index of "Pyramid, Block, Layer, Epoch" according to the properties of RS image data. With this solid index mechanism, a rational organization of multi-sensor RS image data across different resolutions, areas, bands, and periods is achieved. For data storage, RS data are not divided into binary large objects stored in a conventional relational database; instead, they are reconstructed through the solid index mechanism, and a logical image database for the RS image data files is constructed. In system architecture, this paper sets up a framework based on a parallel server of several commodity computers, under which the background process is divided into two parts: the common web process and the parallel process.
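
    The "Pyramid, Block, Layer, Epoch" solid index can be pictured as a four-part tile key addressing resolution level, spatial block, band, and acquisition period. A minimal sketch; the field names and values are illustrative assumptions, not the paper's schema:

```python
from collections import namedtuple

# Illustrative tile key: resolution level, spatial block, band layer, time epoch.
TileKey = namedtuple("TileKey", ["pyramid", "block", "layer", "epoch"])

tiles = {}  # logical image database: key -> tile payload (bytes, file path, ...)

key = TileKey(pyramid=3, block=(12, 7), layer="B4", epoch="2009-06")
tiles[key] = b"...tile data..."

# Any tile of any sensor/resolution/band/period is retrievable by the same key shape.
print(tiles[TileKey(3, (12, 7), "B4", "2009-06")])  # b'...tile data...'
```
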

  14. Defense.gov Special Report: The Demise of Osama bin Laden

    Science.gov Websites


  15. 14 CFR § 1260.9 - Synopses requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... funding opportunities shall be synopsized. Synopses shall be prepared in the NASA Acquisition Internet Service (NAIS), located at: http://prod.nais.nasa.gov/cgi-bin/nais/index.cgi; by using the Electronic Posting System (EPS), and transmitted to http://www.Fedgrants.gov. Synopses shall be electronically posted...

  16. 14 CFR 1260.9 - Synopses requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... opportunities shall be synopsized. Synopses shall be prepared in the NASA Acquisition Internet Service (NAIS), located at: http://prod.nais.nasa.gov/cgi-bin/nais/index.cgi; by using the Electronic Posting System (EPS), and transmitted to http://www.Fedgrants.gov. Synopses shall be electronically posted to: http://www...

  17. Adjacent bin stability evaluating for feature description

    NASA Astrophysics Data System (ADS)

    Nie, Dongdong; Ma, Qinyong

    2018-04-01

    A recent study improves descriptor performance by accumulating stability votes over all scale pairs to compose a local descriptor. We argue that the stability of a bin depends more on the differences across adjacent scale pairs than on the differences across all scale pairs, and we compose a new local descriptor based on this hypothesis. First, a series of SIFT descriptors is extracted at multiple scales. Then the difference of each bin across adjacent scales is calculated, and the stability value of the bin is computed from these differences and accumulated to compose the final descriptor. The performance of the proposed method is evaluated on two popular matching datasets and compared with other state-of-the-art works. Experimental results show that the proposed method performs satisfactorily.
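
    A toy sketch of the adjacent-scale idea (the stability scoring and weighting below are assumptions for illustration, not the paper's exact formulation):

```python
import numpy as np

def adjacent_stability_descriptor(descs: np.ndarray) -> np.ndarray:
    """descs: (n_scales, 128) SIFT descriptors of the same keypoint at several scales.
    Score each bin's stability from differences across ADJACENT scale pairs only,
    then weight the mean descriptor by those stability values (assumed scheme)."""
    diffs = np.abs(np.diff(descs, axis=0))       # adjacent-pair differences per bin
    stability = 1.0 / (1.0 + diffs.sum(axis=0))  # small adjacent change -> high stability
    d = descs.mean(axis=0) * stability           # weight bins by stability
    n = np.linalg.norm(d)
    return d / n if n > 0 else d

rng = np.random.default_rng(1)
descs = rng.random((5, 128))                     # 5 scales, 128-bin SIFT
out = adjacent_stability_descriptor(descs)
print(out.shape)  # (128,)
```
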

  18. A general parallel sparse-blocked matrix multiply for linear scaling SCF theory

    NASA Astrophysics Data System (ADS)

    Challacombe, Matt

    2000-06-01

    A general approach to the parallel sparse-blocked matrix-matrix multiply is developed in the context of linear scaling self-consistent-field (SCF) theory. The data-parallel message passing method uses non-blocking communication to overlap computation and communication. The space filling curve heuristic is used to achieve data locality for sparse matrix elements that decay with "separation". Load balance is achieved by solving the bin packing problem for blocks with variable size. With this new method as the kernel, parallel performance of the simplified density matrix minimization (SDMM) for solution of the SCF equations is investigated for RHF/6-31G** water clusters and RHF/3-21G estane globules. Sustained rates above 5.7 GFLOPS for the SDMM have been achieved for (H2O)200 with 95 Origin 2000 processors. Scalability is found to be limited by load imbalance, which increases with decreasing granularity, due primarily to the inhomogeneous distribution of variable block sizes.
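
    The load-balance step amounts to a bin packing problem over variable-size blocks. A sketch using the classic first-fit-decreasing heuristic (an illustration of the problem, not the paper's implementation):

```python
def first_fit_decreasing(block_sizes, capacity):
    """Pack block sizes into as few bins of the given capacity as the
    first-fit-decreasing heuristic finds: sort descending, place each
    block in the first bin with room, open a new bin otherwise."""
    bins = []  # each bin: [remaining_capacity, [block sizes...]]
    for size in sorted(block_sizes, reverse=True):
        for b in bins:
            if b[0] >= size:
                b[0] -= size
                b[1].append(size)
                break
        else:
            bins.append([capacity - size, [size]])
    return [b[1] for b in bins]

packed = first_fit_decreasing([7, 5, 4, 3, 2, 2, 1], capacity=10)
print(packed)  # [[7, 3], [5, 4, 1], [2, 2]]
```

    In the matrix-multiply setting, "sizes" would be per-block work estimates and "bins" processors with a work budget.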

  19. A Hybrid Parallel Strategy Based on String Graph Theory to Improve De Novo DNA Assembly on the TianHe-2 Supercomputer.

    PubMed

    Zhang, Feng; Liao, Xiangke; Peng, Shaoliang; Cui, Yingbo; Wang, Bingqiang; Zhu, Xiaoqian; Liu, Jie

    2016-06-01

    The de novo assembly of DNA sequences is increasingly important for biological research in the genomic era. More than a decade after the Human Genome Project, some challenges still exist and new solutions are being explored to improve de novo assembly of genomes. String graph assembler (SGA), based on string graph theory, is a new method/tool developed to address these challenges. In this paper, based on an in-depth analysis of SGA, we prove that SGA-based sequence de novo assembly is an NP-complete problem. According to our analysis, SGA outperforms other similar methods/tools in memory consumption, but costs much more time, of which 60-70% is spent on index construction. Based on this analysis, we introduce a hybrid parallel optimization algorithm and implement it in TianHe-2's parallel framework. Simulations are performed with different datasets. For small datasets the optimized solution is 3.06 times faster than before, and for medium-sized datasets 1.60 times faster. The results demonstrate an evident performance improvement, with linear scalability for parallel FM-index construction. These results thus contribute significantly to improving the efficiency of de novo assembly of DNA sequences.

  20. VarBin, a novel method for classifying true and false positive variants in NGS data

    PubMed Central

    2013-01-01

    Background: Variant discovery for rare genetic diseases using Illumina genome or exome sequencing involves screening up to millions of variants to find only the one or few causative variant(s). Sequencing or alignment errors create "false positive" variants, which are often retained in the variant screening process, and existing methods to remove false positives still retain many of them. This report presents VarBin, a method to prioritize variants based on a predicted likelihood that a variant is a false positive. Methods: VarBin uses the Genome Analysis Toolkit variant calling software to calculate, at each variant change and position, the variant-to-wild-type genotype likelihood ratio divided by read depth. The resulting Phred-scaled likelihood ratio by depth (PLRD) was used to segregate variants into 4 bins, with Bin 1 variants most likely true and Bin 4 most likely false positive. PLRD values were calculated for a proband of interest and 41 additional Illumina HiSeq exome and whole-genome samples (the proband's family or unrelated samples). At variant sites without apparent sequencing or alignment error, wild-type/non-variant calls cluster near -3 PLRD and variant calls typically cluster above 10 PLRD. Sites with systematic variant-calling problems (evident from variant quality scores and biases, and as displayed in the IGV viewer) tend to have higher and more variable wild-type/non-variant PLRD values. Depending on the separation of a proband's variant PLRD value from the cluster of wild-type/non-variant PLRD values of the background samples at the same variant change and position, each proband variant is assigned a VarBin classification (Bin 1 to Bin 4). Results: To assess VarBin performance, Sanger sequencing was performed on 98 variants in the proband and background samples. True variants were confirmed in 97% of Bin 1 variants, 30% of Bin 2, and 0% of Bin 3/Bin 4. Conclusions: These data indicate that VarBin correctly classifies the majority of true variants as Bin 1, while Bins 3/4 contained only false positive variants. The "uncertain" Bin 2 contained both true and false positive variants; future work will further differentiate the variants in Bin 2. PMID:24266885
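
    The PLRD quantity and bin assignment can be sketched as follows. The separation thresholds and function names here are hypothetical illustrations, not the published cutoffs; the paper derives bins from the separation between the proband's PLRD and the background wild-type PLRD cluster:

```python
import math

def plrd(variant_likelihood: float, wildtype_likelihood: float, depth: int) -> float:
    """Phred-scaled variant-to-wild-type likelihood ratio divided by read depth."""
    return 10.0 * math.log10(variant_likelihood / wildtype_likelihood) / depth

def varbin_class(proband_plrd, background_mean, background_sd, z_cuts=(5.0, 3.0, 1.0)):
    """Assign Bin 1 (most likely true) .. Bin 4 (most likely false positive)
    from the z-separation of the proband PLRD vs background non-variant PLRDs.
    The z-score cutoffs are illustrative assumptions."""
    z = (proband_plrd - background_mean) / background_sd
    for bin_no, cut in enumerate(z_cuts, start=1):
        if z >= cut:
            return bin_no
    return 4

print(varbin_class(12.0, -3.0, 1.5))  # 1: well separated from background cluster
print(varbin_class(-2.5, -3.0, 1.5))  # 4: sits inside the background cluster
```
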

  1. Anisotropies in the diffuse gamma-ray background measured by the Fermi LAT

    DOE PAGES

    Ackermann, M.; Ajello, M.; Albert, A.; ...

    2012-04-23

    The contribution of unresolved sources to the diffuse gamma-ray background could induce anisotropies in this emission on small angular scales. Here, we analyze the angular power spectrum of the diffuse emission measured by the Fermi Large Area Telescope at Galactic latitudes |b| > 30° in four energy bins spanning 1-50 GeV. At multipoles ℓ ≥ 155, corresponding to angular scales ≲2°, angular power above the photon noise level is detected at >99.99% confidence level in the 1-2 GeV, 2-5 GeV, and 5-10 GeV energy bins, and at >99% confidence level at 10-50 GeV. Within each energy bin the measured angular power takes approximately the same value at all multipoles ℓ ≥ 155, suggesting that it originates from the contribution of one or more unclustered source populations. Furthermore, the amplitude of the angular power normalized to the mean intensity in each energy bin is consistent with a constant value at all energies, C_P/⟨I⟩² = (9.05 ± 0.84) × 10⁻⁶ sr, while the energy dependence of C_P is consistent with the anisotropy arising from one or more source populations with power-law photon spectra with spectral index Γ_s = 2.40 ± 0.07. We also discuss the implications of the measured angular power for gamma-ray source populations that may provide a contribution to the diffuse gamma-ray background.

  2. Anisotropies in the Diffuse Gamma-Ray Background Measured by the Fermi LAT

    NASA Technical Reports Server (NTRS)

    Ferrara, E. C.; McEnery, J. E.; Troja, E.

    2012-01-01

    The contribution of unresolved sources to the diffuse gamma-ray background could induce anisotropies in this emission on small angular scales. We analyze the angular power spectrum of the diffuse emission measured by the Fermi LAT at Galactic latitudes |b| > 30° in four energy bins spanning 1 to 50 GeV. At multipoles ℓ ≥ 155, corresponding to angular scales ≲2°, angular power above the photon noise level is detected at >99.99% CL in the 1-2 GeV, 2-5 GeV, and 5-10 GeV energy bins, and at >99% CL at 10-50 GeV. Within each energy bin the measured angular power takes approximately the same value at all multipoles ℓ ≥ 155, suggesting that it originates from the contribution of one or more unclustered source populations. The amplitude of the angular power normalized to the mean intensity in each energy bin is consistent with a constant value at all energies, C_P/⟨I⟩² = (9.05 ± 0.84) × 10⁻⁶ sr, while the energy dependence of C_P is consistent with the anisotropy arising from one or more source populations with power-law photon spectra with spectral index Γ_s = 2.40 ± 0.07. We discuss the implications of the measured angular power for gamma-ray source populations that may provide a contribution to the diffuse gamma-ray background.

  3. Method and apparatus for combinatorial logic signal processor in a digitally based high speed x-ray spectrometer

    DOEpatents

    Warburton, W.K.

    1999-02-16

    A high speed, digitally based signal processing system is disclosed which accepts a digitized input signal, detects the presence of step-like pulses in this data stream, extracts filtered estimates of their amplitudes, inspects for pulse pileup, and records input pulse rates and system live time. The system has two parallel processing channels: a slow channel, which filters the data stream with a long time-constant trapezoidal filter for good energy resolution; and a fast channel, which filters the data stream with a short time-constant trapezoidal filter, detects pulses, inspects for pileups, and captures peak values from the slow channel for good events. A simple digital interface allows the system to be easily integrated with a digital processor to produce accurate spectra at high count rates and allows all spectrometer functions to be fully automated. Because the method is digitally based, it also allows pulses to be binned based on time-related values, as well as on their amplitudes, if desired. 31 figs.
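
    The slow/fast two-channel idea can be sketched with a simple moving-sum trapezoidal filter. This is an illustrative implementation with made-up parameters, not the patented circuit; for a step of amplitude A, the filter's flat top equals rise × A, so the long-rise channel gives a larger, better-averaged energy estimate while the short-rise channel responds quickly for detection:

```python
import numpy as np

def trapezoidal_filter(x, rise, gap):
    """Difference of two length-`rise` moving sums separated by `gap` samples.
    For a step of amplitude A the output plateaus at rise * A."""
    kernel = np.zeros(2 * rise + gap)
    kernel[:rise] = 1.0          # leading summing window
    kernel[rise + gap:] = -1.0   # trailing window, subtracted
    return np.convolve(x, kernel, mode="full")[: len(x)]

# Step-like pulse: flat baseline, then a step of amplitude 5.
x = np.concatenate([np.zeros(50), 5.0 * np.ones(100)])
slow = trapezoidal_filter(x, rise=20, gap=10)  # long time constant: energy resolution
fast = trapezoidal_filter(x, rise=4, gap=2)    # short time constant: pulse detection
print(slow.max(), fast.max())  # 100.0 20.0 (rise * amplitude)
```
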

  4. Global Summary MGS TES Data and Mars-Gram Validation

    NASA Technical Reports Server (NTRS)

    Justus, C.; Johnson, D.; Parker, Nelson C. (Technical Monitor)

    2002-01-01

    Mars Global Reference Atmospheric Model (Mars-GRAM 2001) is an engineering-level Mars atmosphere model widely used for many Mars mission applications. From 0-80 km it is based on the NASA Ames Mars General Circulation Model (MGCM), while above 80 km it is based on the University of Arizona Mars Thermospheric General Circulation Model. Mars-GRAM 2001 and MGCM use surface topography from the Mars Global Surveyor Mars Orbiting Laser Altimeter (MOLA). Validation studies are described comparing Mars-GRAM with a global summary data set of Mars Global Surveyor Thermal Emission Spectrometer (TES) data. TES averages and standard deviations were assembled from binned TES data covering the surface to approx. 40 km, over more than a full Mars year (February 1999 - June 2001, just before the start of a Mars global dust storm). TES data were binned in 10-by-10 degree latitude-longitude bins (i.e. 36 longitude bins by 19 latitude bins) and 12 seasonal bins (based on 30-degree increments of Ls angle). Bin averages and standard deviations were assembled at 23 data levels (temperature at 21 pressure levels, plus surface temperature and surface pressure). Two time-of-day bins were used (local times near 2 and 14 hours), and two dust optical depth bins (infrared optical depth less than or greater than 0.25, corresponding to visible optical depth less than or greater than about 0.5). For interests in aerocapture and precision entry and landing, comparisons focused on atmospheric density. TES densities versus height were computed from TES temperature versus pressure, using the perfect gas law and hydrostatic balance. Mars-GRAM validation studies used the density ratio (TES/Mars-GRAM) evaluated at data bin center points in space and time. Observed average TES/Mars-GRAM density ratios were generally 1 +/- 0.05, except at high altitudes (15-30 km, depending on season) and high latitudes (> 45 deg N), or at most altitudes in the southern hemisphere at Ls approx. 90 and 180 deg. Compared to TES averages for a given latitude and season, TES data had an average density standard deviation about the mean of approx. 6.5-10.5% (varying with height) for all data, or approx. 5-12% depending on time of day and dust optical depth. The average standard deviation of the TES/Mars-GRAM density ratio was 8.9% for local time 2 hours and 7.1% for local time 14 hours. Thus the standard deviation of the observed TES/Mars-GRAM density ratio, evaluated at matching positions and times, is about the same as the standard deviation of TES data about the TES mean at a given position and season.
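
    The binning scheme described above can be sketched as an index computation; the bin-edge conventions below (e.g. clamping the +90° pole into the last of 19 latitude bins) are assumptions for illustration:

```python
def tes_bin(lat_deg: float, lon_deg: float, ls_deg: float):
    """Return (lat_bin, lon_bin, season_bin) indices for the 10x10-degree
    latitude-longitude bins and 30-degree-of-Ls seasonal bins:
    19 latitude bins x 36 longitude bins x 12 seasonal bins."""
    lat_bin = min(int((lat_deg + 90.0) // 10), 18)  # 0..18, pole clamped (assumed)
    lon_bin = int(lon_deg % 360.0 // 10)            # 0..35
    season_bin = int(ls_deg % 360.0 // 30)          # 0..11
    return lat_bin, lon_bin, season_bin

print(tes_bin(0.0, 180.0, 90.0))  # (9, 18, 3)
```

    Bin averages and standard deviations would then be accumulated per key, with the two time-of-day and two dust bins added as further key components.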

  5. DNA Barcode Analysis of Thrips (Thysanoptera) Diversity in Pakistan Reveals Cryptic Species Complexes.

    PubMed

    Iftikhar, Romana; Ashfaq, Muhammad; Rasool, Akhtar; Hebert, Paul D N

    2016-01-01

    Although thrips are globally important crop pests and vectors of viral disease, species identifications are difficult because of their small size and inconspicuous morphological differences. Sequence variation in the mitochondrial COI-5' (DNA barcode) region has proven effective for the identification of species in many groups of insect pests. We analyzed barcode sequence variation among 471 thrips from various plant hosts in north-central Pakistan. The Barcode Index Number (BIN) system assigned these sequences to 55 BINs, while the Automatic Barcode Gap Discovery detected 56 partitions, a count that coincided with the number of monophyletic lineages recognized by Neighbor-Joining analysis and Bayesian inference. Congeneric species showed an average of 19% sequence divergence (range = 5.6% - 27%) at COI, while intraspecific distances averaged 0.6% (range = 0.0% - 7.6%). BIN analysis suggested that all intraspecific divergence >3.0% actually involved a species complex. In fact, sequences for three major pest species (Haplothrips reuteri, Thrips palmi, Thrips tabaci), and one predatory thrips (Aeolothrips intermedius) showed deep intraspecific divergences, providing evidence that each is a cryptic species complex. The study compiles the first barcode reference library for the thrips of Pakistan, and examines global haplotype diversity in four important pest thrips.

  6. Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.

    PubMed

    Yin, Guosheng; Ma, Yanyuan

    2013-01-01

The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any other explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic that recovers the chi-squared distribution. We compute the observed and expected counts in the partitioned bins using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE injects exactly the right amount of randomness into the test statistic to recover the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting the same model to the bootstrap data to obtain the corresponding MLE and then constructing the bin counts from the original data. We examine the size and power of the new model diagnostic procedure in simulation studies and illustrate it with a real data set.
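The construction described above can be sketched numerically. This is a minimal illustration for a normal model with data-driven bins; the model choice, bin count, and all variable names are assumptions of this sketch, not the authors' implementation.

```python
import math

import numpy as np

rng = np.random.default_rng(0)

def norm_cdf(x, mu, sigma):
    # normal CDF via the error function
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pearson_stat(data, mu, sigma, edges):
    # observed counts per bin
    observed, _ = np.histogram(data, bins=edges)
    # expected counts in each bin under the fitted N(mu, sigma)
    cdf = np.array([norm_cdf(e, mu, sigma) for e in edges])
    expected = len(data) * np.diff(cdf)
    return float(np.sum((observed - expected) ** 2 / expected))

data = rng.normal(5.0, 2.0, size=500)
edges = np.quantile(data, np.linspace(0.0, 1.0, 9))  # 8 bins
edges[0], edges[-1] = -np.inf, np.inf                # cover the real line

# classical statistic: MLE fitted to the ORIGINAL data
t_classical = pearson_stat(data, data.mean(), data.std(), edges)

# bootstrap modification: refit the SAME model to a bootstrap resample,
# then form the bin counts on the original data with that MLE
boot = rng.choice(data, size=len(data), replace=True)
t_boot = pearson_stat(data, boot.mean(), boot.std(), edges)
```

Per the abstract, the bootstrap version of the statistic follows a chi-squared reference distribution, so its quantiles can be used directly to calibrate the test.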

  7. High-resolution empirical geomagnetic field model TS07D: Investigating run-on-request and forecasting modes of operation

    NASA Astrophysics Data System (ADS)

    Stephens, G. K.; Sitnov, M. I.; Ukhorskiy, A. Y.; Vandegriff, J. D.; Tsyganenko, N. A.

    2010-12-01

The dramatic increase in the volume of geomagnetic field data available from many recent missions, including GOES, Polar, Geotail, Cluster, and THEMIS, eventually required a qualitative transition in empirical modeling tools. Classical empirical models, such as T96 and T02, used a few custom-tailored modules to represent the major magnetospheric current systems, together with simple data binning or loading-unloading inputs for fitting the models to data and for subsequent applications. They have been replaced by more systematic expansions of the equatorial and field-aligned current contributions, and by advanced data-mining algorithms that search for events whose global activity parameters, such as the Sym-H index, are similar to those at the time of interest, as is done in the TS07D model (Tsyganenko and Sitnov, 2007; Sitnov et al., 2008). The need to mine and fit data dynamically, with an individual subset of the database used to reproduce the geomagnetic field pattern at every new moment in time, requires a corresponding transition in how the new empirical geomagnetic field models are used: operation becomes more similar to the runs-on-request offered by the Community Coordinated Modeling Center for many first-principles MHD and kinetic codes. To provide this mode of operation for the TS07D model, a new web-based modeling tool has been created and tested at JHU/APL (http://geomag_field.jhuapl.edu/model/), and we discuss the first results of its performance testing and validation, including in-sample and out-of-sample modeling of a number of CME- and CIR-driven magnetic storms. We also report on the first tests of the forecasting version of the TS07D model, in which the magnetospheric macro-parameters involved in the data-binning process (the Sym-H index and its trend parameter) are replaced by solar wind-based analogs obtained using the Burton-McPherron-Russell approach.

  8. Multiple-algorithm parallel fusion of infrared polarization and intensity images based on algorithmic complementarity and synergy

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Yang, Fengbao; Ji, Linna; Lv, Sheng

    2018-01-01

Diverse image fusion methods perform differently, and each has advantages and disadvantages relative to the others. One notion is that the advantages of different fusion methods can be effectively combined. A multiple-algorithm parallel fusion method based on algorithmic complementarity and synergy is proposed. First, in view of the characteristics of the different algorithms and the difference-features among images, a feature-similarity index vector is proposed to define the degree of complementarity and synergy. This index vector is a reliable evidence indicator for algorithm selection. Second, the algorithms with a high degree of complementarity and synergy are selected. Then, the different degrees of the various features and the infrared intensity images are used as the initial weights for nonnegative matrix factorization (NMF), which avoids the randomness of NMF initialization. Finally, the fused images of the different algorithms are integrated using NMF, owing to its excellent performance at fusing independent features. Experimental results demonstrate that both the visual effect and the objective evaluation indices of the fused images obtained using the proposed method are better than those obtained using traditional methods. The proposed method retains the advantages of the individual fusion algorithms.

  9. Image quality in thoracic 4D cone-beam CT: A sensitivity analysis of respiratory signal, binning method, reconstruction algorithm, and projection angular spacing

    PubMed Central

    Shieh, Chun-Chien; Kipritidis, John; O’Brien, Ricky T.; Kuncic, Zdenka; Keall, Paul J.

    2014-01-01

Purpose: Respiratory signal, binning method, and reconstruction algorithm are three major controllable factors affecting image quality in thoracic 4D cone-beam CT (4D-CBCT), which is widely used in image-guided radiotherapy (IGRT). Previous studies have investigated each of these factors individually, but no integrated sensitivity analysis has been performed. In addition, projection angular spacing is a key factor in reconstruction, but how it affects image quality is not obvious. An investigation of the impacts of these four factors on image quality can help determine the most effective strategy for improving 4D-CBCT for IGRT. Methods: Fourteen 4D-CBCT patient projection datasets with various respiratory motion features were reconstructed with the following controllable factors: (i) respiratory signal (real-time position management, projection image intensity analysis, or fiducial marker tracking), (ii) binning method (phase, displacement, or equal-projection-density displacement binning), and (iii) reconstruction algorithm [Feldkamp–Davis–Kress (FDK), McKinnon–Bates (MKB), or adaptive-steepest-descent projection-onto-convex-sets (ASD-POCS)]. Image quality was quantified using signal-to-noise ratio (SNR), contrast-to-noise ratio, and edge-response width to assess noise/streaking and blur. The SNR values were also analyzed with respect to the maximum, mean, and root-mean-squared-error (RMSE) projection angular spacing to investigate how projection angular spacing affects image quality. Results: The choice of respiratory signal was found to have no significant impact on image quality. Displacement-based binning was less prone to motion artifacts than phase binning in more than half of the cases, but suffered from large interbin image quality variation and large projection angular gaps. Both MKB and ASD-POCS resulted in noticeably improved image quality relative to FDK in almost all cases.
In addition, SNR values were found to increase with decreasing RMSE values of projection angular gaps with strong correlations (r ≈ −0.7) regardless of the reconstruction algorithm used. Conclusions: Based on the authors’ results, displacement-based binning methods, better reconstruction algorithms, and the acquisition of even projection angular views are the most important factors to consider for improving thoracic 4D-CBCT image quality. In view of the practical issues with displacement-based binning and the fact that projection angular spacing is not currently directly controllable, development of better reconstruction algorithms represents the most effective strategy for improving image quality in thoracic 4D-CBCT for IGRT applications at the current stage. PMID:24694143
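The angular-spacing statistics analyzed above can be computed directly from the sorted projection angles. Interpreting the RMSE statistic as the root-mean-square of the gaps over a full 360-degree rotation (including the wrap-around gap) is an assumption of this sketch, not necessarily the authors' exact definition.

```python
import numpy as np

def angular_gap_stats(angles_deg):
    """Maximum, mean, and root-mean-square gap between sorted projection
    angles, including the wrap-around gap over a full 360-degree rotation."""
    a = np.sort(np.asarray(angles_deg, dtype=float) % 360.0)
    gaps = np.diff(np.concatenate([a, [a[0] + 360.0]]))
    return float(gaps.max()), float(gaps.mean()), float(np.sqrt(np.mean(gaps ** 2)))

# perfectly even spacing: all three statistics coincide at 10 degrees
even = angular_gap_stats(np.arange(0.0, 360.0, 10.0))

# clustered projections: the RMS gap exceeds the mean gap, flagging the
# large angular holes associated with degraded image quality
uneven = angular_gap_stats([0.0, 10.0, 200.0])
```

Note that the mean gap is fixed at 360/N for N projections, so it is the maximum and RMS statistics that distinguish even from clustered angular sampling.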

  10. Surgical Technical Evidence Review of Hip Fracture Surgery Conducted for the AHRQ Safety Program for Improving Surgical Care and Recovery

    PubMed Central

    Siletz, Anaar; Faltermeier, Claire; Singer, Emily S.; Hu, Q. Lina; Ko, Clifford Y.; Kates, Stephen L.; Maggard-Gibbons, Melinda; Wick, Elizabeth

    2018-01-01

    Background: Enhanced recovery pathways (ERPs) have been shown to improve patient outcomes in a variety of contexts. This review summarizes the evidence and defines a protocol for perioperative care of patients with hip fracture and was conducted for the Agency for Healthcare Research and Quality safety program for improving surgical care and recovery. Study Design: Perioperative care was divided into components or “bins.” For each bin, a semisystematic review of the literature was conducted using MEDLINE with priority given to systematic reviews, meta-analyses, and randomized controlled trials. Observational studies were included when higher levels of evidence were not available. Existing guidelines for perioperative care were also incorporated. For convenience, the components of care that are under the auspices of anesthesia providers will be reported separately. Recommendations for an evidence-based protocol were synthesized based on review of this evidence. Results: Eleven bins were identified. Preoperative risk factor bins included nutrition, diabetes mellitus, tobacco use, and anemia. Perioperative management bins included thromboprophylaxis, timing of surgery, fluid management, drain placement, early mobilization, early alimentation, and discharge criteria/planning. Conclusions: This review provides the evidence basis for an ERP for perioperative care of patients with hip fracture. PMID:29844947

  11. Spectral binning for mitigation of polarization mode dispersion artifacts in catheter-based optical frequency domain imaging

    PubMed Central

    Villiger, Martin; Zhang, Ellen Ziyi; Nadkarni, Seemantini K.; Oh, Wang-Yuhl; Vakoc, Benjamin J.; Bouma, Brett E.

    2013-01-01

    Polarization mode dispersion (PMD) has been recognized as a significant barrier to sensitive and reproducible birefringence measurements with fiber-based, polarization-sensitive optical coherence tomography systems. Here, we present a signal processing strategy that reconstructs the local retardation robustly in the presence of system PMD. The algorithm uses a spectral binning approach to limit the detrimental impact of system PMD and benefits from the final averaging of the PMD-corrected retardation vectors of the spectral bins. The algorithm was validated with numerical simulations and experimental measurements of a rubber phantom. When applied to the imaging of human cadaveric coronary arteries, the algorithm was found to yield a substantial improvement in the reconstructed birefringence maps. PMID:23938487

  12. Curve Set Feature-Based Robust and Fast Pose Estimation Algorithm

    PubMed Central

    Hashimoto, Koichi

    2017-01-01

Bin picking refers to picking randomly-piled objects from a bin for industrial production purposes, and robotic bin picking is widely used in automated assembly lines. To achieve higher productivity, a fast and robust pose estimation algorithm is necessary to recognize and localize the randomly-piled parts. This paper proposes a pose estimation algorithm for bin picking tasks using point cloud data. A novel descriptor, the Curve Set Feature (CSF), is proposed to describe a point by the surface fluctuation around it; the descriptor is also capable of evaluating poses. The Rotation Match Feature (RMF) is proposed to match CSFs efficiently. The matching process combines the 2D-space matching idea of the original Point Pair Feature (PPF) algorithm with nearest neighbor search. A voxel-based pose verification method is introduced to evaluate the poses and proved to be more than 30 times faster than kd-tree-based verification. Our algorithm is evaluated on a large number of synthetic and real scenes and proven to be robust to noise, able to detect metal parts, and more accurate and more than 10 times faster than PPF and Oriented, Unique and Repeatable (OUR)-Clustered Viewpoint Feature Histogram (CVFH). PMID:28771216
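The voxel-based verification idea, scoring a pose hypothesis by O(1) membership tests in a hash set of occupied scene voxels rather than by kd-tree nearest-neighbor queries, can be sketched as follows. The voxel size, scoring rule, and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(5)

def voxel_set(points, voxel=0.01):
    """Occupied-voxel index set for a scene point cloud (a hash set,
    so each membership test is O(1) instead of a kd-tree query)."""
    return set(map(tuple, np.floor(points / voxel).astype(int)))

def verify_pose(model, scene_voxels, R, t, voxel=0.01):
    """Score a pose hypothesis (R, t) by the fraction of transformed
    model points that land in occupied scene voxels."""
    transformed = model @ R.T + t
    idx = np.floor(transformed / voxel).astype(int)
    hits = sum(tuple(row) in scene_voxels for row in idx)
    return hits / len(model)

model = rng.random((200, 3))          # synthetic model point cloud
scene_voxels = voxel_set(model)       # scene = model seen at identity pose

good = verify_pose(model, scene_voxels, np.eye(3), np.zeros(3))
bad = verify_pose(model, scene_voxels, np.eye(3), np.array([1.0, 0.0, 0.0]))
```

The correct pose scores near 1.0 while a grossly wrong translation scores near 0.0, which is what lets the verifier rank competing pose hypotheses cheaply.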

  13. SNR improvement for hyperspectral application using frame and pixel binning

    NASA Astrophysics Data System (ADS)

    Rehman, Sami Ur; Kumar, Ankush; Banerjee, Arup

    2016-05-01

Hyperspectral imaging spectrometer systems are increasingly being used in remote sensing for a variety of civilian and military applications. The ability of such instruments to discriminate fine spectral features, along with improved spatial and radiometric performance, has made them a powerful tool in the field. The design and development of spaceborne hyperspectral imaging spectrometers poses many technological challenges in terms of optics, dispersion elements, detectors, electronics, and mechanical systems. The main factors that determine the choice of detector are the spectral region, SNR, dynamic range, pixel size, number of pixels, frame rate, operating temperature, etc. Detectors with higher quantum efficiency and greater well depth are preferred for such applications. CCD-based Si detectors meet the high-well-depth requirement of VNIR-band spectrometers but suffer from smear. Smear can be controlled by using CMOS detectors, and Si CMOS detectors with large-format arrays are available; however, these detectors generally have a smaller pitch and lower well depth. Binning can be used with available CMOS detectors to meet large-swath, high-resolution, and high-SNR requirements: the relatively long dwell time of the satellite allows multiple frames to be binned, increasing signal collection even with lower-well-depth detectors and ultimately increasing the SNR. Lab measurements reveal that the SNR improvement from frame binning exceeds that from pixel binning. The effect of pixel binning compared with frame binning is discussed, and the degradation of SNR relative to the theoretical value for pixel binning is analyzed.
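For a shot-noise-limited pixel, co-adding N frames raises the SNR by roughly √N, which is the effect frame binning exploits. The Poisson-plus-read-noise model and the numbers below are illustrative assumptions, not measured values for any particular instrument.

```python
import numpy as np

rng = np.random.default_rng(1)

signal = 200.0      # mean photoelectrons per pixel per frame
read_noise = 10.0   # read noise in electrons RMS, added once per frame read

def measured_snr(n_frames, n_pixels=100_000):
    """Empirical SNR of the co-added (frame-binned) signal over many pixels."""
    # each frame carries Poisson shot noise plus Gaussian read noise
    frames = rng.poisson(signal, size=(n_frames, n_pixels)).astype(float)
    frames += rng.normal(0.0, read_noise, size=(n_frames, n_pixels))
    binned = frames.sum(axis=0)   # frame binning: co-add N frames
    return float(binned.mean() / binned.std())

snr_1 = measured_snr(1)
snr_16 = measured_snr(16)
# binning 16 frames should improve SNR by a factor close to sqrt(16) = 4
```

Because both the shot-noise and read-noise variances here accumulate once per frame, the √N scaling holds exactly in this model; a detector in which pixel binning also sums read noise per pixel would show a smaller improvement, consistent with the lab comparison above.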

  14. Sensitivity of alpine and subalpine lakes to acidification from atmospheric deposition in Grand Teton National Park and Yellowstone National Park, Wyoming

    USGS Publications Warehouse

    Nanus, Leora; Campbell, Donald H.; Williams, Mark W.

    2005-01-01

The sensitivity of 400 lakes in Grand Teton and Yellowstone National Parks to acidification from atmospheric deposition of nitrogen and sulfur was estimated based on statistical relations between acid-neutralizing capacity concentrations and basin characteristics, to aid in the design of a long-term monitoring plan for Outstanding Natural Resource Waters. Acid-neutralizing capacity concentrations measured at 52 lakes in Grand Teton and 23 lakes in Yellowstone during synoptic surveys were used to calibrate the statistical models. Three acid-neutralizing capacity concentration bins were selected, all within the U.S. Environmental Protection Agency criteria for sensitivity to acidification: less than 50 microequivalents per liter (µeq/L) (0-50), less than 100 µeq/L (0-100), and less than 200 µeq/L (0-200). The use of discrete bins enables resource managers to change criteria based on the focus of their study. Basin-characteristic information was derived from Geographic Information System data sets. The explanatory variables considered included bedrock type, basin slope, basin aspect, basin elevation, lake area, basin area, inorganic nitrogen deposition, sulfate deposition, hydrogen ion deposition, basin precipitation, soil type, and vegetation type. A logistic regression model was developed and applied to lake basins greater than 1 hectare in Grand Teton (n = 106) and Yellowstone (n = 294). A higher percentage of lakes in Grand Teton than in Yellowstone were predicted to be sensitive to atmospheric deposition in all three bins.
For Grand Teton, 7 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-50 bin, 36 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-100 bin, and 59 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-200 bin. The elevation of the lake outlet and the area of the basin with northeast aspects were determined to be statistically significant and were used as the explanatory variables in the multivariate logistic regression model for the 0-100 bin. For Yellowstone, results indicated that 13 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-100 bin, and 27 percent of lakes had a greater than 60-percent probability of having acid-neutralizing capacity concentrations in the 0-200 bin. Only the elevation of the lake outlet was determined to be statistically significant and was used as the explanatory variable for the 0-100 bin. The lakes that exceeded 60-percent probability of having an acid-neutralizing capacity concentration in the 0-100 bin, and therefore had the greatest sensitivity to acidification from atmospheric deposition, are located at elevations greater than 2,790 meters in Grand Teton, and greater than 2,590 meters in Yellowstone.
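The bin-membership test and the 60-percent probability criterion described above can be sketched as follows. The logistic coefficients, synthetic elevations, and function names are made-up illustrations, not the calibrated USGS model.

```python
import numpy as np

# sensitivity bins from the study: ANC below 50, 100, or 200 microeq/L
ANC_BINS = {"0-50": 50.0, "0-100": 100.0, "0-200": 200.0}

def bin_membership(anc):
    """Which sensitivity bins an ANC concentration (microeq/L) falls in."""
    return {name: (0.0 <= anc < upper) for name, upper in ANC_BINS.items()}

def prob_sensitive(elevation_m, b0=-15.0, b1=0.005):
    """Illustrative logistic model: probability that ANC < 100 microeq/L
    given lake-outlet elevation. The coefficients are invented."""
    return 1.0 / (1.0 + np.exp(-(b0 + b1 * np.asarray(elevation_m))))

lakes = np.array([2500.0, 2790.0, 3100.0])     # synthetic outlet elevations
flagged = prob_sensitive(lakes) > 0.60         # the >60-percent criterion
```

With these invented coefficients only the highest lake clears the 60-percent threshold, mirroring the study's finding that predicted sensitivity concentrates at high elevations.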

  15. Knowledge-driven binning approach for rare variant association analysis: application to neuroimaging biomarkers in Alzheimer's disease.

    PubMed

    Kim, Dokyoon; Basile, Anna O; Bang, Lisa; Horgusluoglu, Emrin; Lee, Seunggeun; Ritchie, Marylyn D; Saykin, Andrew J; Nho, Kwangsik

    2017-05-18

Rapid advancement of next-generation sequencing technologies such as whole-genome sequencing (WGS) has facilitated the search for genetic factors that influence disease risk in the field of human genetics. To identify rare variants associated with human diseases or traits, an efficient genome-wide binning approach is needed. In this study we developed a novel biological knowledge-based binning approach for rare-variant association analysis and applied it to structural neuroimaging endophenotypes related to late-onset Alzheimer's disease (LOAD). For rare-variant analysis, we used the knowledge-driven binning approach implemented in Bin-KAT, an automated tool that provides 1) binning/collapsing methods for multi-level variant aggregation with a flexible, biologically informed binning strategy and 2) the option of performing unified collapsing and statistical rare-variant analyses in one tool. A total of 750 non-Hispanic Caucasian participants from the Alzheimer's Disease Neuroimaging Initiative (ADNI) cohort who had both WGS data and magnetic resonance imaging (MRI) scans were included in this study. Mean bilateral cortical thickness of the entorhinal cortex extracted from MRI scans was used as an AD-related neuroimaging endophenotype. SKAT was used for a genome-wide gene- and region-based association analysis of rare variants (minor allele frequency (MAF) < 0.05), with potential confounding factors for entorhinal cortex thickness (age, gender, years of education, intracranial volume (ICV), and MRI field strength) as covariates. Significant associations were determined using FDR adjustment for multiple comparisons. Our knowledge-driven binning approach identified 16 functional exonic rare variants in FANCC significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05).
In addition, the approach identified 7 evolutionarily conserved regions, mapped to FAF1, RFX7, LYPLAL1, and GOLGA3, that were significantly associated with entorhinal cortex thickness (FDR-corrected p-value < 0.05). In further analysis, the functional exonic rare variants in FANCC were also significantly associated with hippocampal volume and cerebrospinal fluid (CSF) Aβ 1-42 (p-value < 0.05). Our novel binning approach thus identified rare variants in FANCC, as well as 7 evolutionarily conserved regions, significantly associated with a LOAD-related neuroimaging endophenotype. FANCC (Fanconi anemia complementation group C) has been shown to modulate TLR- and p38 MAPK-dependent expression of IL-1β in macrophages. Our results warrant further investigation in a larger independent cohort and demonstrate that the biological knowledge-driven binning approach is a powerful strategy for identifying rare variants associated with AD and other complex diseases.

  16. Hidden diversity revealed by genome-resolved metagenomics of iron-oxidizing microbial mats from Lō'ihi Seamount, Hawai'i.

    PubMed

    Fullerton, Heather; Hager, Kevin W; McAllister, Sean M; Moyer, Craig L

    2017-08-01

The Zetaproteobacteria are ubiquitous in marine environments, yet this class of Proteobacteria is represented by only a few closely related cultured isolates. In high-iron environments, such as diffuse hydrothermal vents, the Zetaproteobacteria are important members of the community driving its structure. Biogeography of Zetaproteobacteria has shown two ubiquitous operational taxonomic units (OTUs), yet much is unknown about their genomic diversity. Genome-resolved metagenomics allows the specific binning of microbial genomes based on genomic signatures present in composite metagenome assemblies. Applying this approach here resulted in the recovery of 93 genome bins, of which 34 were classified as Zetaproteobacteria. Form II ribulose 1,5-bisphosphate carboxylase genes were recovered from nearly all the Zetaproteobacteria genome bins. In addition, the Zetaproteobacteria genome bins contain genes for the uptake and utilization of bioavailable nitrogen, detoxification of arsenic, and a terminal electron acceptor adapted to low oxygen concentrations. Our results also support the hypothesis of a Cyc2-like protein as the site of iron oxidation, now detected across a majority of the Zetaproteobacteria genome bins. Whole-genome comparisons showed high genomic diversity across the Zetaproteobacteria OTUs and genome bins that had previously gone unrecognized by SSU rRNA gene analysis. A single lineage of cosmopolitan Zetaproteobacteria (zOTU 2) was found to be monophyletic, based on cluster analysis of average nucleotide identity and average amino acid identity comparisons. From these data, we can begin to pinpoint genomic adaptations of the more ecologically ubiquitous Zetaproteobacteria, and further understand their environmental constraints and metabolic potential.

  17. Toward zero waste: composting and recycling for sustainable venue based events.

    PubMed

    Hottle, Troy A; Bilec, Melissa M; Brown, Nicholas R; Landis, Amy E

    2015-04-01

This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven scenarios included traditional waste handling methods (e.g., recycling and landfill) and management of the waste stream via composting, including purchasing policies under which only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO2-equivalent emissions, energy use, and landfill diversion rates. Of the seven scenarios assessed, the recycling scenarios provide the greatest reductions in CO2 eq. emissions and energy use because of the retention of high-value materials, but are complicated by the difficulty of managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO2 eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates: the first game served as a baseline, the second game employed staffed bins, and the third game had unstaffed bins. Contamination rates in both the recycling and compost bins were tracked throughout the series, falling from 34% at the first game to 11% at the second (with staffed bins) and rising again to 23% at the third game. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. Establishing a community-wide DNA barcode library as a new tool for arctic research.

    PubMed

    Wirta, H; Várkonyi, G; Rasmussen, C; Kaartinen, R; Schmidt, N M; Hebert, P D N; Barták, M; Blagoev, G; Disney, H; Ertl, S; Gjelstrup, P; Gwiazdowicz, D J; Huldén, L; Ilmonen, J; Jakovlev, J; Jaschhof, M; Kahanpää, J; Kankaanpää, T; Krogh, P H; Labbee, R; Lettner, C; Michelsen, V; Nielsen, S A; Nielsen, T R; Paasivirta, L; Pedersen, S; Pohjoismäki, J; Salmela, J; Vilkamaa, P; Väre, H; von Tschirnhaus, M; Roslin, T

    2016-05-01

DNA sequences offer powerful tools for describing the members and interactions of natural communities. In this study, we establish the most comprehensive DNA barcode library to date for a terrestrial site, covering all known macroscopic animals and vascular plants of an intensively studied area of the High Arctic, the Zackenberg Valley in Northeast Greenland. To demonstrate its utility, we apply the library to identify nearly 20 000 arthropod individuals from two Malaise traps, each operated for two summers. Drawing on this material, we estimate the coverage of previous morphology-based species inventories, derive a snapshot of faunal turnover in space and time, and describe the abundance and phenology of species in the rapidly changing arctic environment. Overall, 403 terrestrial animal and 160 vascular plant species were recorded by morphology-based techniques. DNA barcodes (CO1) offered high resolution in discriminating among the local animal taxa, with 92% of morphologically distinguishable taxa assigned to unique Barcode Index Numbers (BINs) and 93% to monophyletic clusters. For vascular plants, resolution was lower, with 54% of species forming monophyletic clusters based on the barcode regions rbcLa and ITS2. Malaise catches revealed 122 BINs not detected by previous sampling and DNA barcoding. The insect community was dominated by a few highly abundant taxa. Even closely related taxa differed in phenology, emphasizing the need for species-level resolution when describing ongoing shifts in arctic communities and ecosystems. The DNA barcode library now established for Zackenberg offers new scope for such explorations, and for the detailed dissection of interspecific interactions throughout the community. © 2015 John Wiley & Sons Ltd.

  19. Sort-Seq Approach to Engineering a Formaldehyde-Inducible Promoter for Dynamically Regulated Escherichia coli Growth on Methanol

    PubMed Central

    2017-01-01

    Tight and tunable control of gene expression is a highly desirable goal in synthetic biology for constructing predictable gene circuits and achieving preferred phenotypes. Elucidating the sequence–function relationship of promoters is crucial for manipulating gene expression at the transcriptional level, particularly for inducible systems dependent on transcriptional regulators. Sort-seq methods employing fluorescence-activated cell sorting (FACS) and high-throughput sequencing allow for the quantitative analysis of sequence–function relationships in a robust and rapid way. Here we utilized a massively parallel sort-seq approach to analyze the formaldehyde-inducible Escherichia coli promoter (Pfrm) with single-nucleotide resolution. A library of mutated formaldehyde-inducible promoters was cloned upstream of gfp on a plasmid. The library was partitioned into bins via FACS on the basis of green fluorescent protein (GFP) expression level, and mutated promoters falling into each expression bin were identified with high-throughput sequencing. The resulting analysis identified two 19 base pair repressor binding sites, one upstream of the −35 RNA polymerase (RNAP) binding site and one overlapping with the −10 site, and assessed the relative importance of each position and base therein. Key mutations were identified for tuning expression levels and were used to engineer formaldehyde-inducible promoters with predictable activities. Engineered variants demonstrated up to 14-fold lower basal expression, 13-fold higher induced expression, and a 3.6-fold stronger response as indicated by relative dynamic range. Finally, an engineered formaldehyde-inducible promoter was employed to drive the expression of heterologous methanol assimilation genes and achieved increased biomass levels on methanol, a non-native substrate of E. coli. PMID:28463494
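The bin-partitioning step of a sort-seq experiment can be sketched with a simulated library. The lognormal fluorescence distribution and the quartile gates are assumptions of this sketch, not the authors' FACS gating scheme.

```python
import numpy as np

rng = np.random.default_rng(2)

# simulated GFP fluorescence for a library of promoter variants (arbitrary units)
fluorescence = rng.lognormal(mean=6.0, sigma=1.0, size=10_000)

# four sort bins split at the library's quartiles (an illustrative gate
# choice: real FACS gates are set on the cytometer)
edges = np.quantile(fluorescence, [0.25, 0.5, 0.75])
bin_index = np.digitize(fluorescence, edges)   # 0..3, low to high expression

counts = np.bincount(bin_index, minlength=4)   # cells sorted into each bin
```

In the real workflow each bin would then be sequenced, so that a variant's expression level can be inferred from how its reads distribute across the bins.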

  20. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    PubMed Central

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
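The intersection matrix at the core of ASSET can be illustrated on synthetic data: entry (j, k) counts the neurons active in both time bins j and k, so a repeated SSE appears as an off-diagonal band of high overlap. The firing rate and the embedded sequence below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# binary matrix: spikes[i, t] = 1 if neuron i fired in time bin t
n_neurons, n_bins = 50, 40
spikes = (rng.random((n_neurons, n_bins)) < 0.1).astype(int)

# embed a sequence of synchronous spike events (an SSE) twice, so the
# intersection matrix develops a diagonal structure linking the two epochs
sse_neurons = np.arange(10)
for start in (5, 25):                      # the SSE occurs at bins 5-9 and 25-29
    for step, t in enumerate(range(start, start + 5)):
        spikes[sse_neurons[2 * step: 2 * step + 2], t] = 1

# intersection matrix: entry (j, k) = number of neurons active in both bins
inter = spikes.T @ spikes

# the repeat shows up as elevated entries inter[5+k, 25+k] for k = 0..4
```

ASSET's contribution is to replace visual inspection of this matrix with a statistical procedure that flags such diagonal bands and recovers the participating neurons automatically.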

  1. An efficient computational approach to model statistical correlations in photon counting x-ray detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faby, Sebastian; Maier, Joscha; Sawall, Stefan

    2016-07-15

    Purpose: To introduce and evaluate an increment matrix approach (IMA) describing the signal statistics of energy-selective photon counting detectors including spatial–spectral correlations between energy bins of neighboring detector pixels. The importance of the occurring correlations for image-based material decomposition is studied. Methods: An IMA describing the counter increase patterns in a photon counting detector is proposed. This IMA has the potential to decrease the number of required random numbers compared to Monte Carlo simulations by pursuing an approach based on convolutions. To validate and demonstrate the IMA, an approximate semirealistic detector model is provided, simulating a photon counting detector inmore » a simplified manner, e.g., by neglecting count rate-dependent effects. In this way, the spatial–spectral correlations on the detector level are obtained and fed into the IMA. The importance of these correlations in reconstructed energy bin images and the corresponding detector performance in image-based material decomposition is evaluated using a statistically optimal decomposition algorithm. Results: The results of IMA together with the semirealistic detector model were compared to other models and measurements using the spectral response and the energy bin sensitivity, finding a good agreement. Correlations between the different reconstructed energy bin images could be observed, and turned out to be of weak nature. These correlations were found to be not relevant in image-based material decomposition. An even simpler simulation procedure based on the energy bin sensitivity was tested instead and yielded similar results for the image-based material decomposition task, as long as the fact that one incident photon can increase multiple counters across neighboring detector pixels is taken into account. 
    Conclusions: The IMA is computationally efficient, as it requires about 10² random numbers per ray incident on a detector pixel instead of the estimated 10⁸ random numbers per ray that Monte Carlo approaches would need. The spatial–spectral correlations described by the IMA are not important for the studied image-based material decomposition task. Respecting the absolute photon counts, and thus the multiple counter increases by a single x-ray photon, the same material decomposition performance could be obtained with a simpler detector description using the energy bin sensitivity.

  2. The Hemiptera (Insecta) of Canada: Constructing a Reference Library of DNA Barcodes

    PubMed Central

    Gwiazdowski, Rodger A.; Foottit, Robert G.; Maw, H. Eric L.; Hebert, Paul D. N.

    2015-01-01

    DNA barcode reference libraries linked to voucher specimens create new opportunities for high-throughput identification and taxonomic re-evaluations. This study provides a DNA barcode library for about 45% of the recognized species of Canadian Hemiptera, and the publicly available R workflow used for its generation. The current library is based on the analysis of 20,851 specimens including 1849 species belonging to 628 genera and 64 families. These individuals were assigned to 1867 Barcode Index Numbers (BINs), sequence clusters that often coincide with species recognized through prior taxonomy. Museum collections were a key source of identified specimens, but we also employed high-throughput collection methods that generated large numbers of unidentified specimens. Many of these specimens represented novel BINs that were subsequently identified by taxonomists, adding barcode coverage for additional species. Our analyses based on both approaches include 94 species not listed in the most recent Canadian checklist, representing a potential 3% increase in the fauna. We discuss the development of our workflow in the context of prior DNA barcode library construction projects, emphasizing the importance of delineating a set of reference specimens to aid investigations in cases of nomenclatural and DNA barcode discordance. The identification of each specimen in the reference set can be annotated on the Barcode of Life Data System (BOLD), allowing experts to highlight questionable identifications; annotations can be added by any registered user of BOLD, and instructions for this are provided. PMID:25923328

  3. Respiratory motion resolved, self-gated 4D-MRI using Rotating Cartesian K-space (ROCK)

    PubMed Central

    Han, Fei; Zhou, Ziwu; Cao, Minsong; Yang, Yingli; Sheng, Ke; Hu, Peng

    2017-01-01

    Purpose To propose and validate a respiratory motion resolved, self-gated (SG) 4D-MRI technique to assess patient-specific breathing motion of abdominal organs for radiation treatment planning. Methods The proposed 4D-MRI technique was based on the balanced steady-state free-precession (bSSFP) technique and 3D k-space encoding. A novel ROtating Cartesian K-space (ROCK) reordering method was designed that incorporates a repeatedly sampled k-space centerline as the SG motion surrogate and allows for retrospective k-space data binning into different respiratory positions based on the amplitude of the surrogate. The multiple respiratory-resolved 3D k-space datasets were subsequently reconstructed using a joint parallel imaging and compressed sensing method with spatial and temporal regularization. The proposed 4D-MRI technique was validated using a custom-made dynamic motion phantom and was tested in 6 healthy volunteers, in whom quantitative diaphragm and kidney motion measurements based on 4D-MRI images were compared with those based on 2D-CINE images. Results The 5-minute 4D-MRI scan offers high-quality volumetric images with 1.2×1.2×1.6 mm³ resolution and 8 respiratory positions, with good soft-tissue contrast. In phantom experiments with a triangular motion waveform, the motion amplitude measurements based on 4D-MRI were 11.89% smaller than the ground truth, whereas a −12.5% difference was expected due to data binning effects. In healthy volunteers, the differences between the measurements based on 4D-MRI and those based on 2D-CINE were 6.2±4.5% for the diaphragm, and 8.2±4.9% and 8.9±5.1% for the right and left kidneys, respectively. Conclusion The proposed 4D-MRI technique provides high-resolution, high-quality, respiratory motion resolved 4D images with good soft-tissue contrast that are free of the “stitching” artifacts usually seen on 4D-CT and on 4D-MRI based on resorting 2D-CINE. It could be used to visualize and quantify abdominal organ motion for MRI-based radiation treatment planning.
PMID:28133752
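The amplitude-based data binning described above can be illustrated with a short sketch (assumptions: a synthetic sinusoidal surrogate and equal-count bins; the actual ROCK reordering and surrogate extraction are more involved):

```python
import numpy as np

def amplitude_bin(surrogate, n_bins=8):
    """Assign each k-space readout to a respiratory bin by surrogate amplitude.

    Readouts are sorted by amplitude and split into bins holding equal
    numbers of readouts, so every respiratory position is reconstructed
    from the same amount of data.
    """
    order = np.argsort(surrogate)            # readouts sorted by amplitude
    bins = np.empty(len(surrogate), dtype=int)
    for b, chunk in enumerate(np.array_split(order, n_bins)):
        bins[chunk] = b                      # label each readout with its bin
    return bins

# Synthetic breathing trace, one surrogate sample per readout (assumption).
t = np.linspace(0, 30, 3000)
surrogate = np.sin(2 * np.pi * t / 4.0)      # ~4 s breathing period
labels = amplitude_bin(surrogate, n_bins=8)
```

Equal-count (rather than equal-width) amplitude bins are one common choice because they balance the number of readouts available to each respiratory position.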

  4. Respiratory motion-resolved, self-gated 4D-MRI using rotating cartesian k-space (ROCK).

    PubMed

    Han, Fei; Zhou, Ziwu; Cao, Minsong; Yang, Yingli; Sheng, Ke; Hu, Peng

    2017-04-01

    To propose and validate a respiratory motion resolved, self-gated (SG) 4D-MRI technique to assess patient-specific breathing motion of abdominal organs for radiation treatment planning. The proposed 4D-MRI technique was based on the balanced steady-state free-precession (bSSFP) technique and 3D k-space encoding. A novel rotating cartesian k-space (ROCK) reordering method was designed which incorporates a repeatedly sampled k-space centerline as the SG motion surrogate and allows for retrospective k-space data binning into different respiratory positions based on the amplitude of the surrogate. The multiple respiratory-resolved 3D k-space datasets were subsequently reconstructed using a joint parallel imaging and compressed sensing method with spatial and temporal regularization. The proposed 4D-MRI technique was validated using a custom-made dynamic motion phantom and was tested in six healthy volunteers, in whom quantitative diaphragm and kidney motion measurements based on 4D-MRI images were compared with those based on 2D-CINE images. The 5-minute 4D-MRI scan offers high-quality volumetric images with 1.2 × 1.2 × 1.6 mm³ resolution and eight respiratory positions, with good soft-tissue contrast. In phantom experiments with a triangular motion waveform, the motion amplitude measurements based on 4D-MRI were 11.89% smaller than the ground truth, whereas a -12.5% difference was expected due to data binning effects. In healthy volunteers, the differences between the measurements based on 4D-MRI and those based on 2D-CINE were 6.2 ± 4.5% for the diaphragm, and 8.2 ± 4.9% and 8.9 ± 5.1% for the right and left kidneys, respectively. The proposed 4D-MRI technique provides high-resolution, high-quality, respiratory motion-resolved 4D images with good soft-tissue contrast that are free of the "stitching" artifacts usually seen on 4D-CT and on 4D-MRI based on resorting 2D-CINE. It could be used to visualize and quantify abdominal organ motion for MRI-based radiation treatment planning.
© 2017 American Association of Physicists in Medicine.

  5. Trapped Atoms in One-Dimensional Photonic Crystals

    DTIC Science & Technology

    2013-08-09

    a single silicon-nitride nanobeam (refractive index n = 2) with a 1D array of filleted rectangular holes along the propagation direction; atoms are...trapped in the centers of the holes (figure 1(a)). The second waveguide consists of two parallel silicon nitride nanobeams, each with a periodic array...the refractive index of silicon nitride is approximately constant across the optical domain, we adopt the approximation based on a frequency

  6. Implementation of 3D spatial indexing and compression in a large-scale molecular dynamics simulation database for rapid atomic contact detection.

    PubMed

    Toofanny, Rudesh D; Simms, Andrew M; Beck, David A C; Daggett, Valerie

    2011-08-10

    Molecular dynamics (MD) simulations offer the ability to observe the dynamics and interactions of both whole macromolecules and individual atoms as a function of time. Taken in context with experimental data, atomic interactions from simulation provide insight into the mechanics of protein folding, dynamics, and function. The calculation of atomic interactions or contacts from an MD trajectory is computationally demanding, and the work required grows rapidly with the size of the simulation system. We describe the implementation of a spatial indexing algorithm in our multi-terabyte MD simulation database that significantly reduces the run-time required for discovery of contacts. The approach is applied to the Dynameomics project data. Spatial indexing, also known as spatial hashing, is a method that divides the simulation space into regular-sized bins and attributes an index to each bin. Since the calculation of contacts is widely employed in the simulation field, we also use it as the basis for testing compression of data tables. We investigate the effects of compression of the trajectory coordinate tables with different options of data and index compression within MS SQL Server 2008. Our implementation of spatial indexing speeds up the calculation of contacts over a 1 nanosecond (ns) simulation window by between 14% and 90% (i.e., 1.2 and 10.3 times faster). For a 'full' simulation trajectory (51 ns), spatial indexing reduces the calculation run-time by between 31% and 81% (between 1.4 and 5.3 times faster). Compression reduced table sizes but made no significant difference in the total execution time for neighbour discovery. The greatest compression (~36%) was achieved using page-level compression on both the data and indexes. The spatial indexing scheme significantly decreases the time taken to calculate atomic contacts and could be applied to other multidimensional neighbor discovery problems.
The speed-up enables on-the-fly calculation and visualization of contacts and rapid cross-simulation analysis for knowledge discovery. Using page compression for the atomic coordinate tables and indexes saves ~36% of disk space without any significant decrease in calculation time, and should be considered for other non-transactional databases in MS SQL Server 2008.
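The spatial hashing idea described in this record (divide space into regular-sized bins, index each bin, and compare each atom only against atoms in its own and neighboring bins) can be sketched in a minimal in-memory form; this is an illustration, not the authors' database implementation:

```python
import itertools
from collections import defaultdict

def find_contacts(coords, cutoff):
    """Find all atom pairs within `cutoff` using spatial hashing.

    Space is divided into cubic bins of side `cutoff`; each atom is only
    compared against atoms in its own and the 26 surrounding bins,
    avoiding the all-pairs scan over the full system.
    """
    bins = defaultdict(list)
    for i, (x, y, z) in enumerate(coords):
        bins[(int(x // cutoff), int(y // cutoff), int(z // cutoff))].append(i)

    contacts = set()
    c2 = cutoff * cutoff
    for (bx, by, bz), members in bins.items():
        # Gather candidate atoms from this bin and its neighbors.
        neighbors = []
        for dx, dy, dz in itertools.product((-1, 0, 1), repeat=3):
            neighbors.extend(bins.get((bx + dx, by + dy, bz + dz), ()))
        for i in members:
            xi, yi, zi = coords[i]
            for j in neighbors:
                if j <= i:
                    continue  # count each pair once
                xj, yj, zj = coords[j]
                if (xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2 <= c2:
                    contacts.add((i, j))
    return contacts

atoms = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (5.0, 5.0, 5.0)]
print(find_contacts(atoms, cutoff=1.5))  # {(0, 1)}
```

Choosing the bin side equal to the cutoff guarantees that every contact pair falls within neighboring bins, which is why only 27 bins per atom need to be inspected.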

  7. Implementation of 3D spatial indexing and compression in a large-scale molecular dynamics simulation database for rapid atomic contact detection

    PubMed Central

    2011-01-01

    Background Molecular dynamics (MD) simulations offer the ability to observe the dynamics and interactions of both whole macromolecules and individual atoms as a function of time. Taken in context with experimental data, atomic interactions from simulation provide insight into the mechanics of protein folding, dynamics, and function. The calculation of atomic interactions or contacts from an MD trajectory is computationally demanding, and the work required grows rapidly with the size of the simulation system. We describe the implementation of a spatial indexing algorithm in our multi-terabyte MD simulation database that significantly reduces the run-time required for discovery of contacts. The approach is applied to the Dynameomics project data. Spatial indexing, also known as spatial hashing, is a method that divides the simulation space into regular-sized bins and attributes an index to each bin. Since the calculation of contacts is widely employed in the simulation field, we also use it as the basis for testing compression of data tables. We investigate the effects of compression of the trajectory coordinate tables with different options of data and index compression within MS SQL Server 2008. Results Our implementation of spatial indexing speeds up the calculation of contacts over a 1 nanosecond (ns) simulation window by between 14% and 90% (i.e., 1.2 and 10.3 times faster). For a 'full' simulation trajectory (51 ns), spatial indexing reduces the calculation run-time by between 31% and 81% (between 1.4 and 5.3 times faster). Compression reduced table sizes but made no significant difference in the total execution time for neighbour discovery. The greatest compression (~36%) was achieved using page-level compression on both the data and indexes. Conclusions The spatial indexing scheme significantly decreases the time taken to calculate atomic contacts and could be applied to other multidimensional neighbor discovery problems.
The speed-up enables on-the-fly calculation and visualization of contacts and rapid cross-simulation analysis for knowledge discovery. Using page compression for the atomic coordinate tables and indexes saves ~36% of disk space without any significant decrease in calculation time, and should be considered for other non-transactional databases in MS SQL Server 2008. PMID:21831299

  8. Palm-Vein Classification Based on Principal Orientation Features

    PubMed Central

    Zhou, Yujia; Liu, Yaqin; Feng, Qianjin; Yang, Feng; Huang, Jing; Nie, Yixiao

    2014-01-01

    Personal recognition using palm-vein patterns has emerged as a promising alternative for human recognition because of its uniqueness, stability, live-body identification, flexibility, and difficulty to cheat. With the expanding application of palm-vein pattern recognition, the corresponding growth of the database has resulted in long response times. To shorten the response time of identification, this paper proposes a simple and useful classification for palm-vein identification based on principal direction features. In the registration process, the Gaussian-Radon transform is adopted to extract the orientation matrix, and the principal direction of a palm-vein image is then computed from the orientation matrix. The database can be classified into six bins based on the value of the principal direction. In the identification process, the principal direction of the test sample is first extracted to ascertain the corresponding bin. One-by-one matching with the training samples is then performed in that bin. To improve recognition efficiency while maintaining recognition accuracy, the two neighboring bins of the corresponding bin are also searched to identify the input palm-vein image. Evaluation experiments are conducted on three different databases, namely, PolyU, CASIA, and the database of this study. Experimental results show that the searching range of one test sample in the PolyU, CASIA, and our database by the proposed method for palm-vein identification can be reduced to 14.29%, 14.50%, and 14.28%, with retrieval accuracies of 96.67%, 96.00%, and 97.71%, respectively. With 10,000 training samples in the database, the execution time of the identification process by the traditional method is 18.56 s, while that of the proposed approach is 3.16 s. The experimental results confirm that the proposed approach is more efficient than the traditional method, especially for a large database. PMID:25383715
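The six-bin classification and two-neighbor search strategy can be illustrated with a toy sketch (the angle-to-bin mapping and function names are assumptions; the Gaussian-Radon orientation extraction itself is not shown):

```python
def direction_bin(angle_deg, n_bins=6):
    """Map a principal-direction angle (0-180 degrees) to one of n_bins.

    Orientations are periodic over 180 degrees, so the range is split
    into n_bins equal angular sectors.
    """
    width = 180.0 / n_bins
    return int(angle_deg % 180.0 // width)

def search_bins(angle_deg, n_bins=6):
    """Bins to search: the matching bin plus its two neighbors.

    Neighbors wrap around, since orientation bins form a cycle.
    """
    b = direction_bin(angle_deg, n_bins)
    return [(b - 1) % n_bins, b, (b + 1) % n_bins]

print(search_bins(100.0))  # angle 100 degrees falls in bin 3 -> [2, 3, 4]
```

Searching three of six bins is what reduces the candidate set to roughly 3/6 of the database in the worst case; the ~14% figures reported above reflect the uneven population of the bins in practice.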

  9. 54. VIEW OF ROASTER ADDITION FROM SOUTHEAST. SHOWS ELEVATOR/ORE BIN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    54. VIEW OF ROASTER ADDITION FROM SOUTHEAST. SHOWS ELEVATOR/ORE BIN ADDITION ON LEFT WITH BASE OF EXHAUST STACK, PORTION OF TOPPLED STACK ON LOWER RIGHT IN VIEW, AND UPPER TAILINGS POND BEYOND. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  10. Computation of reliable textural indices from multimodal brain MRI: suggestions based on a study of patients with diffuse intrinsic pontine glioma

    NASA Astrophysics Data System (ADS)

    Goya-Outi, Jessica; Orlhac, Fanny; Calmon, Raphael; Alentorn, Agusti; Nioche, Christophe; Philippe, Cathy; Puget, Stéphanie; Boddaert, Nathalie; Buvat, Irène; Grill, Jacques; Frouin, Vincent; Frouin, Frederique

    2018-05-01

    Few methodological studies regarding the robustness of widely used textural indices in MRI have been reported. In this context, this study aims to propose rules for computing reliable textural indices from multimodal 3D brain MRI. Diagnosis and post-biopsy MR scans including T1, post-contrast T1, T2 and FLAIR images from thirty children with diffuse intrinsic pontine glioma (DIPG) were considered. The hybrid white stripe method was adapted to standardize MR intensities. Sixty textural indices were then computed for each modality in different regions of interest (ROI), including tumor and white matter (WM). Three types of intensity binning were compared: constant bin width and relative bounds; constant number of bins and relative bounds; constant number of bins and absolute bounds. The impact of the volume of the region was also tested within the WM. First, the mean Hellinger distance between patient-based intensity distributions decreased by a factor greater than 10 in WM and greater than 2.5 in gray matter after standardization. Regarding the binning strategy, the ranking of patients was highly correlated (188 of 240 features) for one pair of binning strategies, but for only 20 features for a second pair and nine for the third. Furthermore, with two of the three binning strategies, texture indices reflected tumor heterogeneity as assessed visually by experts. Last, 41 features presented statistically significant differences between contralateral WM regions when ROI size varied slightly across patients, and none when using ROIs of the same size. For regions of similar size, 224 features were significantly different between WM and tumor. Valuable information from texture indices can be biased by methodological choices. Recommendations are to standardize intensities in MR brain volumes, to use intensity binning with constant bin width, and to define regions with the same volumes to obtain reliable textural indices.
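The first two binning strategies compared in this record can be sketched as follows (illustrative only; the function names and sample intensities are assumptions, not the study's code):

```python
import numpy as np

def bin_constant_width(values, width, lo=None):
    """Discretize with a fixed bin width; the number of bins adapts to
    the intensity range, so bin k always covers the same intensity span."""
    lo = values.min() if lo is None else lo
    return ((values - lo) // width).astype(int)

def bin_constant_number(values, n_bins):
    """Discretize into a fixed number of bins spanning this image's own
    range (relative bounds); the effective bin width then varies from
    patient to patient."""
    lo, hi = values.min(), values.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    # digitize against the inner edges yields labels 0..n_bins-1.
    return np.clip(np.digitize(values, edges[1:-1]), 0, n_bins - 1)

intensities = np.array([0.0, 10.0, 25.0, 60.0, 99.0])
print(bin_constant_width(intensities, width=25.0))   # [0 0 1 2 3]
print(bin_constant_number(intensities, n_bins=4))    # [0 0 1 2 3]
```

With constant bin width, a given bin index corresponds to the same absolute intensity interval across patients (once intensities are standardized), which is why the study recommends it for comparable texture indices.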

  11. Consistency Check for the Bin Packing Constraint Revisited

    NASA Astrophysics Data System (ADS)

    Dupuis, Julien; Schaus, Pierre; Deville, Yves

    The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C, which is common to all bins.
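As an illustration of the problem itself (not of the consistency check studied in this record), a simple first-fit decreasing heuristic packs items into bins of common capacity C and gives an upper bound on the minimum number of bins:

```python
def first_fit_decreasing(items, capacity):
    """Pack items with the first-fit decreasing heuristic.

    Items are sorted largest first; each item goes into the first open
    bin with enough remaining room, and a new bin is opened when none
    fits. The number of bins used upper-bounds the optimum.
    """
    free = []     # remaining capacity per open bin
    packing = []  # items placed in each bin
    for size in sorted(items, reverse=True):
        for i, room in enumerate(free):
            if size <= room:
                free[i] -= size
                packing[i].append(size)
                break
        else:
            free.append(capacity - size)  # open a new bin
            packing.append([size])
    return packing

print(first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10))
# [[8, 2], [4, 4, 1, 1]] -> 2 bins for a total item size of 20
```

A consistency check for the bin packing constraint prunes search states where even such bounds prove that the remaining items cannot fit in the allowed number of bins.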

  12. A Massively Parallel Computational Method of Reading Index Files for SOAPsnv.

    PubMed

    Zhu, Xiaoqian; Peng, Shaoliang; Liu, Shaojie; Cui, Yingbo; Gu, Xiang; Gao, Ming; Fang, Lin; Fang, Xiaodong

    2015-12-01

    SOAPsnv is software for identifying single nucleotide variations in cancer genes. However, its performance is yet to match the massive amount of data to be processed. Experiments reveal that the main performance bottleneck of SOAPsnv is its pileup algorithm: the original algorithm's I/O process is time-consuming and reads input files inefficiently, and its scalability is poor. We therefore designed a new algorithm, named BamPileup, to improve sequential read performance; the new pileup algorithm implements an index-based parallel read mode in which each thread can read data directly, starting from a specific position. Experiments on the Tianhe-2 supercomputer show that, when reading data with multi-threaded parallel I/O, the processing time of the algorithm is reduced to 3.9 s and the application can achieve a speedup of up to 100×. Moreover, the scalability of the new algorithm is also satisfactory.
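The index-based parallel read idea, where each thread seeks directly to a known file offset instead of scanning from the start, can be sketched generically (this is an illustration; BamPileup's actual index format and BAM handling are not shown):

```python
import os
import tempfile
from concurrent.futures import ThreadPoolExecutor

def read_chunk(path, offset, length):
    """Seek directly to an indexed offset and read one record region.

    Because each worker opens its own handle and seeks independently,
    threads read disjoint regions of the file in parallel.
    """
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(length)

# Build a small file of four 100-byte records and an (offset, length) index.
data = b"".join(bytes([65 + i]) * 100 for i in range(4))  # AAAA..BBBB..
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(data)
    path = tmp.name

index = [(i * 100, 100) for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    chunks = list(pool.map(lambda rec: read_chunk(path, *rec), index))
os.remove(path)
print([c[:1] for c in chunks])  # [b'A', b'B', b'C', b'D']
```

The index is what removes the sequential bottleneck: without it, every worker would have to parse the file from the beginning to locate its region.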

  13. Comparison of recycling outcomes in three types of recycling collection units.

    PubMed

    Andrews, Ashley; Gregoire, Mary; Rasmussen, Heather; Witowich, Gretchen

    2013-03-01

    Commercial institutions have many factors to consider when implementing an effective recycling program. This study examined the effectiveness of three different types of recycling bins on recycling accuracy by determining the percent weight of recyclable material placed in the recycling bins, comparing the percent weight of recyclable material by type of container used, and examining whether a change in signage increased recycling accuracy. Data were collected over 6 weeks, totaling 30 days, from 3 different recycling bin types at a Midwest university medical center. Five bin locations for each bin type were used. Bags from these bins were collected, sorted into recyclable and non-recyclable material, and weighed. The percent recyclable material was calculated using these weights. Common contaminants found in the bins were napkins and paper towels, plastic food wrapping, plastic bags, and coffee cups. The results showed a significant difference in percent recyclable material between bin types and bin locations. For bin type 2, one bin location was found to be statistically different (p=0.048), which may have been due to the lack of a trash bin next to the recycling bin in that location. Bin type 3 had significantly lower percent recyclable material (p<0.001), which may have been due to the lack of a trash bin next to the recycling bin and increased contamination from combining commingled and paper recycling into one bag. There was no significant change in percent recyclable material in recycling bins after the signage change. These results suggest a signage change, when used alone, may not be an effective way to increase recycling compliance and accuracy. This study showed that two- or three-compartment bins located next to a trash bin may be the best bin type for recycling accuracy. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. High precision refractometry based on Fresnel diffraction from phase plates.

    PubMed

    Tavassoly, M Taghi; Naraghi, Roxana Rezvani; Nahal, Arashmid; Hassani, Khosrow

    2012-05-01

    When a transparent plane-parallel plate is illuminated at a boundary region by a monochromatic parallel beam of light, Fresnel diffraction occurs because of the abrupt change in phase imposed by the finite change in refractive index at the plate boundary. The visibility of the diffraction fringes varies periodically with changes in incident angle. The visibility period depends on the plate thickness and the refractive indices of the plate and the surrounding medium. Plotting the phase change versus incident angle or counting the visibility repetition in an incident-angle interval provides, for a given plate thickness, the refractive index of the plate very accurately. It is shown here that the refractive index of a plate can be determined without knowing the plate thickness. Therefore, the technique can be utilized for measuring plate thickness with high precision. In addition, by installing a plate with known refractive index in a rectangular cell filled with a liquid and following the described procedures, the refractive index of the liquid is obtained. The technique is applied to measure the refractive indices of a glass slide, distilled water, and ethanol. The potential and merits of the technique are also discussed.

  15. Single-Cell-Genomics-Facilitated Read Binning of Candidate Phylum EM19 Genomes from Geothermal Spring Metagenomes

    PubMed Central

    Becraft, Eric D.; Dodsworth, Jeremy A.; Murugapiran, Senthil K.; Ohlsson, J. Ingemar; Briggs, Brandon R.; Kanbar, Jad; De Vlaminck, Iwijn; Quake, Stephen R.; Dong, Hailiang; Hedlund, Brian P.

    2015-01-01

    The vast majority of microbial life remains uncatalogued due to the inability to cultivate these organisms in the laboratory. This “microbial dark matter” represents a substantial portion of the tree of life and of the populations that contribute to chemical cycling in many ecosystems. In this work, we leveraged an existing single-cell genomic data set representing the candidate bacterial phylum “Calescamantes” (EM19) to calibrate machine learning algorithms and define metagenomic bins directly from pyrosequencing reads derived from Great Boiling Spring in the U.S. Great Basin. Compared to other assembly-based methods, taxonomic binning with a read-based machine learning approach yielded final assemblies with the highest predicted genome completeness of any method tested. Read-first binning subsequently was used to extract Calescamantes bins from all metagenomes with abundant Calescamantes populations, including metagenomes from Octopus Spring and Bison Pool in Yellowstone National Park and Gongxiaoshe Spring in Yunnan Province, China. Metabolic reconstruction suggests that Calescamantes are heterotrophic, facultative anaerobes that can utilize oxidized nitrogen sources as terminal electron acceptors for respiration in the absence of oxygen and use proteins as their primary carbon source. Despite their phylogenetic divergence, the geographically separate Calescamantes populations were highly similar in their predicted metabolic capabilities and core gene content, respiring O2 or oxidized nitrogen species for energy conservation in distant but chemically similar hot springs. PMID:26637598

  16. COCACOLA: binning metagenomic contigs using sequence COmposition, read CoverAge, CO-alignment and paired-end read LinkAge.

    PubMed

    Lu, Yang Young; Chen, Ting; Fuhrman, Jed A; Sun, Fengzhu

    2017-03-15

    The advent of next-generation sequencing technologies enables researchers to sequence complex microbial communities directly from the environment. Because assembly typically produces only genome fragments, also known as contigs, instead of an entire genome, it is crucial to group them into operational taxonomic units (OTUs) for further taxonomic profiling and downstream functional analysis. OTU clustering is also referred to as binning. We present COCACOLA, a general framework that automatically bins contigs into OTUs based on sequence composition and coverage across multiple samples. The effectiveness of COCACOLA is demonstrated on both simulated and real datasets in comparison with state-of-the-art binning approaches such as CONCOCT, GroopM, MaxBin and MetaBAT. The superior performance of COCACOLA relies on two aspects. One is using the L1 distance instead of Euclidean distance for better taxonomic identification during initialization. More importantly, COCACOLA takes advantage of both hard clustering and soft clustering through sparsity regularization. In addition, the COCACOLA framework seamlessly incorporates customized knowledge to improve binning accuracy. In our study, we have investigated two types of additional knowledge, the co-alignment to reference genomes and the linkage of contigs provided by paired-end reads, as well as the ensemble of both. We find that both co-alignment and linkage information further improve binning in the majority of cases. COCACOLA is scalable and faster than CONCOCT, GroopM, MaxBin and MetaBAT. The software is available at https://github.com/younglululu/COCACOLA . fsun@usc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  17. Image quality in thoracic 4D cone-beam CT: A sensitivity analysis of respiratory signal, binning method, reconstruction algorithm, and projection angular spacing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shieh, Chun-Chien; Kipritidis, John; O’Brien, Ricky T.

    Purpose: Respiratory signal, binning method, and reconstruction algorithm are three major controllable factors affecting image quality in thoracic 4D cone-beam CT (4D-CBCT), which is widely used in image guided radiotherapy (IGRT). Previous studies have investigated each of these factors individually, but no integrated sensitivity analysis has been performed. In addition, projection angular spacing is also a key factor in reconstruction, but how it affects image quality is not obvious. An investigation of the impacts of these four factors on image quality can help determine the most effective strategy in improving 4D-CBCT for IGRT. Methods: Fourteen 4D-CBCT patient projection datasets with various respiratory motion features were reconstructed with the following controllable factors: (i) respiratory signal (real-time position management, projection image intensity analysis, or fiducial marker tracking), (ii) binning method (phase, displacement, or equal-projection-density displacement binning), and (iii) reconstruction algorithm [Feldkamp–Davis–Kress (FDK), McKinnon–Bates (MKB), or adaptive-steepest-descent projection-onto-convex-sets (ASD-POCS)]. The image quality was quantified using signal-to-noise ratio (SNR), contrast-to-noise ratio, and edge-response width in order to assess noise/streaking and blur. The SNR values were also analyzed with respect to the maximum, mean, and root-mean-squared-error (RMSE) projection angular spacing to investigate how projection angular spacing affects image quality. Results: The choice of respiratory signals was found to have no significant impact on image quality. Displacement-based binning was found to be less prone to motion artifacts compared to phase binning in more than half of the cases, but was shown to suffer from large interbin image quality variation and large projection angular gaps. Both MKB and ASD-POCS resulted in noticeably improved image quality almost 100% of the time relative to FDK.
In addition, SNR values were found to increase with decreasing RMSE values of projection angular gaps, with strong correlations (r ≈ −0.7), regardless of the reconstruction algorithm used. Conclusions: Based on the authors’ results, displacement-based binning methods, better reconstruction algorithms, and the acquisition of even projection angular views are the most important factors to consider for improving thoracic 4D-CBCT image quality. In view of the practical issues with displacement-based binning and the fact that projection angular spacing is not currently directly controllable, development of better reconstruction algorithms represents the most effective strategy for improving image quality in thoracic 4D-CBCT for IGRT applications at the current stage.
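The difference between phase binning and displacement (amplitude) binning of a respiratory signal can be illustrated with a toy sketch (the synthetic trace and peak indices are assumptions; this is not the authors' pipeline):

```python
import numpy as np

def phase_bins(signal, peaks, n_bins):
    """Phase binning: a sample's bin follows its fractional position in
    time between consecutive breathing peaks, regardless of amplitude."""
    phase = np.interp(np.arange(len(signal)), peaks, np.arange(len(peaks)))
    return (np.mod(phase, 1.0) * n_bins).astype(int)

def displacement_bins(signal, n_bins):
    """Displacement binning: a sample's bin follows amplitude thresholds,
    so irregular breathing depths are binned by position, not timing."""
    edges = np.linspace(signal.min(), signal.max(), n_bins + 1)
    # digitize against the inner edges yields labels 0..n_bins-1.
    return np.clip(np.digitize(signal, edges[1:-1]), 0, n_bins - 1)

t = np.arange(200)
signal = np.sin(2 * np.pi * t / 50.0)  # synthetic breathing trace, period 50
peaks = np.array([12, 62, 112, 162])   # assumed inhale-peak sample indices
pb = phase_bins(signal, peaks, n_bins=4)
db = displacement_bins(signal, n_bins=4)
```

For perfectly regular breathing the two labelings largely agree; with irregular depths, displacement binning keeps each bin at a consistent anatomical position at the cost of uneven projection counts per bin, which is the trade-off discussed above.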

  18. Multisensor Parallel Largest Ellipsoid Distributed Data Fusion with Unknown Cross-Covariances

    PubMed Central

    Liu, Baoyu; Zhan, Xingqun; Zhu, Zheng H.

    2017-01-01

    As the largest ellipsoid (LE) data fusion algorithm can only be applied to two-sensor systems, in this contribution a parallel fusion structure is proposed to introduce the LE algorithm into a multisensor system with unknown cross-covariances, and three parallel fusion structures based on different estimate pairing methods are presented and analyzed. In order to assess the influence of fusion structure on fusion performance, two fusion performance assessment parameters are defined: Fusion Distance and Fusion Index. Moreover, the formula for calculating the upper bounds of the actual fused error covariances of the presented multisensor LE fusers is also provided. As demonstrated with simulation examples, the Fusion Index indicates a fuser’s actual fused accuracy and its sensitivity to sensor order, as well as its robustness to the accuracy of newly added sensors. Compared to the LE fuser with a sequential structure, the LE fusers with the proposed parallel structures not only significantly improve these properties but also achieve better consistency and computational efficiency. The presented multisensor LE fusers generally have better accuracy than the covariance intersection (CI) fusion algorithm and are consistent when the local estimates are weakly correlated. PMID:28661442

  19. BInGaN alloys nearly lattice-matched to GaN for high-power high-efficiency visible LEDs

    NASA Astrophysics Data System (ADS)

    Williams, Logan; Kioupakis, Emmanouil

    2017-11-01

    InGaN-based visible light-emitting diodes (LEDs) find commercial applications for solid-state lighting and displays, but lattice mismatch limits the thickness of InGaN quantum wells that can be grown on GaN with high crystalline quality. Since narrower wells operate at a higher carrier density for a given current density, they increase the fraction of carriers lost to Auger recombination and lower the efficiency. The incorporation of boron, a smaller group-III element, into InGaN alloys is a promising method to eliminate the lattice mismatch and realize high-power, high-efficiency visible LEDs with thick active regions. In this work, we apply predictive calculations based on hybrid density functional theory to investigate the thermodynamic, structural, and electronic properties of BInGaN alloys. Our results show that BInGaN alloys with a B:In ratio of 2:3 are better lattice matched to GaN compared to InGaN and, for indium fractions less than 0.2, nearly lattice matched. Deviations from Vegard's law appear as bowing of the in-plane lattice constant with respect to composition. Our thermodynamics calculations demonstrate that the solubility of boron is higher in InGaN than in pure GaN. Varying the Ga mole fraction while keeping the B:In ratio constant enables the adjustment of the (direct) gap in the 1.75-3.39 eV range, which covers the entire visible spectrum. Holes are strongly localized in non-bonded N 2p states caused by local bond planarization near boron atoms. Our results indicate that BInGaN alloys are promising for fabricating nitride heterostructures with thick active regions for high-power, high-efficiency LEDs.

  20. An Enhanced Differential Evolution Algorithm Based on Multiple Mutation Strategies.

    PubMed

    Xiang, Wan-li; Meng, Xue-lei; An, Mei-qing; Li, Yin-zhen; Gao, Ming-xia

    2015-01-01

    The differential evolution (DE) algorithm is a simple yet efficient metaheuristic for global optimization over continuous spaces. However, standard DE suffers from premature convergence, especially in the DE/best/1/bin variant. To exploit the direction guidance of the best individual in DE/best/1/bin while avoiding local traps, an enhanced differential evolution algorithm based on multiple mutation strategies, named EDE, is proposed in this paper. The EDE algorithm integrates an opposition-based learning initialization technique to improve initial solution quality; a combined mutation strategy, composed of DE/current/1/bin and DE/pbest/1/bin, to accelerate standard DE and prevent clustering around the global best individual; and a perturbation scheme to further avoid premature convergence. In addition, two linear time-varying functions are introduced to decide which solution search equation is chosen at the mutation and perturbation phases, respectively. Experimental results on twenty-five benchmark functions show that EDE is far better than standard DE. In further comparisons with five state-of-the-art approaches, EDE remains superior to or at least on par with these methods on most benchmark functions.
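    For readers unfamiliar with the DE/x/y/z notation, the baseline DE/best/1/bin strategy that EDE builds on mutates around the current best individual and then applies binomial ("bin") crossover. A minimal sketch (illustrative only; the function and parameter names are ours, and none of EDE's enhancements are included):

    ```python
    import random

    def de_best_1_bin(f, bounds, pop_size=20, F=0.5, CR=0.9, gens=200, seed=0):
        """Minimal DE/best/1/bin: mutate around the best individual, then
        binomial crossover and greedy selection. Illustrative sketch only."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        fit = [f(x) for x in pop]
        for _ in range(gens):
            best = pop[min(range(pop_size), key=lambda i: fit[i])]
            for i in range(pop_size):
                r1, r2 = rng.sample([j for j in range(pop_size) if j != i], 2)
                # mutation: v = x_best + F * (x_r1 - x_r2)
                v = [best[d] + F * (pop[r1][d] - pop[r2][d]) for d in range(dim)]
                jrand = rng.randrange(dim)  # guarantee at least one mutant gene
                # binomial ("bin") crossover
                u = [v[d] if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
                u = [min(max(g, lo), hi) for g, (lo, hi) in zip(u, bounds)]
                fu = f(u)
                if fu <= fit[i]:  # greedy one-to-one selection
                    pop[i], fit[i] = u, fu
        b = min(range(pop_size), key=lambda i: fit[i])
        return pop[b], fit[b]

    sphere = lambda x: sum(xi * xi for xi in x)
    x, fx = de_best_1_bin(sphere, [(-5.0, 5.0)] * 3)
    ```

    On a smooth unimodal function like the sphere this strategy converges quickly; its tendency to cluster around the best individual on multimodal landscapes is exactly the weakness EDE's combined mutation and perturbation schemes target.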

  1. Massively parallel support for a case-based planning system

    NASA Technical Reports Server (NTRS)

    Kettler, Brian P.; Hendler, James A.; Anderson, William A.

    1993-01-01

    Case-based planning (CBP), a kind of case-based reasoning, is a technique in which previously generated plans (cases) are stored in memory and can be reused to solve similar planning problems in the future. CBP can save considerable time over generative planning, in which a new plan is produced from scratch. CBP thus offers a potential (heuristic) mechanism for handling intractable problems. One drawback of CBP systems has been the need for a highly structured memory to reduce retrieval times. This approach requires significant domain engineering and complex memory indexing schemes to make these planners efficient. In contrast, our CBP system, CaPER, uses a massively parallel frame-based AI language (PARKA) and can do extremely fast retrieval of complex cases from a large, unindexed memory. The ability to do fast, frequent retrievals has many advantages: indexing is unnecessary; very large case bases can be used; memory can be probed in numerous alternate ways; and queries can be made at several levels, allowing more specific retrieval of stored plans that better fit the target problem with less adaptation. In this paper we describe CaPER's case retrieval techniques and some experimental results showing its good performance, even on large case bases.

  2. Measurement of seeing and the atmospheric time constant by differential scintillations.

    PubMed

    Tokovinin, Andrei

    2002-02-20

    A simple differential analysis of stellar scintillations measured simultaneously with two apertures makes it possible to estimate the seeing; moreover, some information on the vertical turbulence distribution can be obtained. A general expression for the differential scintillation index is derived for apertures of arbitrary shape and finite exposure time, and its applications are studied. Correction for exposure-time bias using the ratio of scintillation indices with and without time binning is studied. A bandpass-filtered scintillation signal in a small aperture (computed as the differential-exposure index) provides a reasonably good estimate of the atmospheric time constant for adaptive optics.

  3. A design study of a signal detection system. [for search of extraterrestrial radio sources

    NASA Technical Reports Server (NTRS)

    Healy, T. J.

    1980-01-01

    A system is described which can aid in the search for radio signals from extraterrestrial sources, or in other applications characterized by low signal-to-noise ratios and very high data rates. The system follows a multichannel (16-million-bin) spectrum analyzer and performs critical processing, system control, and memory functions. The design includes a moderately rich set of algorithms to be used in parallel to detect signals of unknown form. A multi-threshold approach is used to obtain both high and low signal sensitivities. Relatively compact and transportable memory systems are specified.

  4. A Framework to Develop Persuasive Smart Environments

    NASA Astrophysics Data System (ADS)

    Lobo, Pedro; Romão, Teresa; Dias, A. Eduardo; Danado, José Carlos

    This paper presents a framework for the creation of context-sensitive persuasive applications. The framework allows the authoring of new persuasive smart environments that produce appropriate feedback to users based on sensors spread throughout the environment to capture contextual information. Using this framework, we created an application, Smart Bins, aimed at promoting behavioural change regarding the recycling of waste materials. Furthermore, to evaluate the usability of our authoring tool, we performed user tests to analyze whether developers could successfully create the Smart Bins application using the framework. A description of the Smart Bins application, as well as the results of the user tests, is also presented in this paper.

  5. A Psychological Profile of Osama bin Laden.

    PubMed

    Ross, Colin A

    2015-01-01

    Understanding Osama bin Laden's personal history illuminates his motivation, inner conflicts, decisions and behaviors. His relationships with his mother, father, country and religion set the stage for his conflicted choices as an adolescent and then as an adult. Although only a cursory psychological profile is possible based on public domain information, the profile constructed here could be useful in setting future foreign policy. Perhaps the crucial mistake in U.S. foreign policy was abandoning bin Laden as an asset when Russian forces were expelled from Afghanistan in 1989: this act by the U.S. set the stage for the World Trade Center attacks on September 11, 2001.

  6. LROC Investigation of Three Strategies for Reducing the Impact of Respiratory Motion on the Detection of Solitary Pulmonary Nodules in SPECT

    NASA Astrophysics Data System (ADS)

    Smyczynski, Mark S.; Gifford, Howard C.; Dey, Joyoni; Lehovich, Andre; McNamara, Joseph E.; Segars, W. Paul; King, Michael A.

    2016-02-01

    The objective of this investigation was to determine the effectiveness of three motion-reducing strategies in diminishing the degrading impact of respiratory motion on the detection of small solitary pulmonary nodules (SPNs) in single-photon emission computed tomographic (SPECT) imaging, in comparison to a standard clinical acquisition and the ideal case of imaging in the absence of respiratory motion. To do this, nonuniform rational B-spline cardiac-torso (NCAT) phantoms based on human-volunteer CT studies were generated spanning the respiratory cycle for a normal background distribution of Tc-99 m NeoTect. Similarly, spherical phantoms of 1.0-cm diameter were generated to model small SPNs for each of 150 uniquely located sites within the lungs, whose respiratory motion was based on the motion of normal structures in the volunteer CT studies. The SIMIND Monte Carlo program was used to produce SPECT projection data from these phantoms. Normal and single-lesion-containing SPECT projection sets with a clinically realistic Poisson noise level were created for the cases of 1) the end-expiration (EE) frame with all counts, 2) respiration-averaged motion with all counts, 3) one fourth of the 32 frames centered around EE (Quarter Binning), 4) one half of the 32 frames centered around EE (Half Binning), and 5) eight temporally binned frames spanning the respiratory cycle. Each set of combined projection data was reconstructed with RBI-EM with system spatial-resolution compensation (RC). Based on the known motion for each of the 150 different lesions, the reconstructed volumes of respiratory bins were shifted so as to superimpose the location of the SPN onto that in the first bin (Reconstruct and Shift). Five human observers performed localization receiver operating characteristic (LROC) studies of SPN detection. 
The observer results were analyzed for statistically significant differences in SPN detection accuracy among the three correction strategies, the standard acquisition, and the ideal case of no respiratory motion. Our human-observer LROC study determined that the Quarter Binning and Half Binning strategies yielded SPN detection accuracy statistically significantly below that of the standard clinical acquisition, whereas the Reconstruct and Shift strategy yielded detection accuracy not statistically significantly different from that of the ideal case. This investigation demonstrates that tumor detection based on acquisitions using fewer than all of the available counts may be poorer despite limiting the motion of the lesion. The Reconstruct and Shift method results in tumor detection equivalent to ideal motion correction.

  7. Validating the operational bias and hypothesis of universal exponent in landslide frequency-area distribution.

    PubMed

    Huang, Jr-Chuan; Lee, Tsung-Yu; Teng, Tse-Yang; Chen, Yi-Chin; Huang, Cho-Ying; Lee, Cheing-Tung

    2014-01-01

    The power-law exponent of the landslide frequency-area distribution is widely used for assessing the consequences of landslides, and some studies argue that the exponent is universal, independent of mechanisms and environmental settings. However, the documented exponents are diverse, and data processing is hypothesized to be the cause of this inconsistency. An elaborate statistical experiment and two actual landslide inventories were used here to demonstrate the influence of data processing on the estimated exponent. Seven categories with different landslide numbers were generated from a predefined inverse-gamma distribution and then analyzed by three data-processing procedures: logarithmic binning (LB), normalized logarithmic binning (NLB), and the cumulative distribution function (CDF). Five bin widths were also considered when applying LB and NLB. Maximum likelihood estimation was then used to estimate the exponent. The results showed that the exponents estimated by CDF were unbiased, while LB and NLB performed poorly: the two binning-based methods led to considerable biases that increased with landslide number and bin width. The standard deviations of the estimated exponents depended not just on landslide number but also on binning method and bin width. Both extremely small and extremely large landslide inventories reduced the confidence of the estimated exponents, attributable to limited landslide numbers and considerable operational bias, respectively. The diverse exponents documented in the literature should therefore be adjusted accordingly. Our study strongly suggests that the considerable bias due to data processing, as well as the data quality, should be constrained in order to advance the understanding of landslide processes.
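    The binning-free route the study favors can be illustrated with the standard continuous power-law MLE (a Hill-type estimator operating directly on the sorted data, i.e., the CDF). This sketch uses a pure power-law tail rather than the paper's inverse-gamma model, so it shows the principle, not the paper's exact procedure:

    ```python
    import math
    import random

    def mle_powerlaw_exponent(xs, xmin):
        """Continuous power-law MLE for p(x) ~ x^(-alpha), x >= xmin:
        alpha = 1 + n / sum(ln(x_i / xmin)). No binning is involved, so the
        estimate avoids the bin-width bias discussed in the abstract."""
        tail = [x for x in xs if x >= xmin]
        n = len(tail)
        return 1.0 + n / sum(math.log(x / xmin) for x in tail)

    rng = random.Random(42)
    alpha_true, xmin = 2.4, 1.0
    # inverse-CDF sampling of a Pareto tail: x = xmin * u**(-1/(alpha-1)), u in (0, 1]
    xs = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
          for _ in range(50000)]
    alpha_hat = mle_powerlaw_exponent(xs, xmin)
    ```

    With 50,000 samples the MLE recovers the true exponent to within a few hundredths, whereas a least-squares fit to log-binned counts would, per the abstract's experiment, drift systematically with bin width.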

  8. A comprehensive DNA barcode database for Central European beetles with a focus on Germany: adding more than 3500 identified species to BOLD.

    PubMed

    Hendrich, Lars; Morinière, Jérôme; Haszprunar, Gerhard; Hebert, Paul D N; Hausmann, Axel; Köhler, Frank; Balke, Michael

    2015-07-01

    Beetles are the most diverse group of animals and are crucial for ecosystem functioning. In many countries, they are well established for environmental impact assessment, but even in the well-studied Central European fauna, species identification can be very difficult. A comprehensive and taxonomically well-curated DNA barcode library could remedy this deficit and could also link hundreds of years of traditional knowledge with next-generation sequencing technology. However, such a beetle library is missing to date. This study provides the globally largest DNA barcode reference library for Coleoptera, covering 15 948 individuals belonging to 3514 well-identified species (53% of the German fauna) with representatives from 97 of 103 families (94%). This study is the first comprehensive regional test of the efficiency of DNA barcoding for beetles with a focus on Germany. Sequences ≥500 bp were recovered from 63% of the specimens analysed (15 948 of 25 294), with short sequences from another 997 specimens. Whereas most specimens (92.2%) could be unambiguously assigned to a single known species by sequence diversity at CO1, 1089 specimens (6.8%) were assigned to more than one Barcode Index Number (BIN), creating 395 BINs which need further study to ascertain whether they represent cryptic species, mitochondrial introgression, or simply regional variation in widespread species. We found 409 specimens (2.6%) that shared a BIN assignment with another species; most cases involved a pair of closely allied species, with 43 BINs affected. Most of these taxa were separated by their barcodes, although sequence divergences were low. Only 155 specimens (0.97%) showed identical or overlapping clusters. © 2014 John Wiley & Sons Ltd.

  9. Terahertz microfluidic sensing using a parallel-plate waveguide sensor.

    PubMed

    Astley, Victoria; Reichel, Kimberly; Mendis, Rajind; Mittleman, Daniel M

    2012-08-30

    Refractive index (RI) sensing is a powerful noninvasive and label-free sensing technique for the identification, detection and monitoring of microfluidic samples with a wide range of possible sensor designs such as interferometers and resonators. Most of the existing RI sensing applications focus on biological materials in aqueous solutions in visible and IR frequencies, such as DNA hybridization and genome sequencing. At terahertz frequencies, applications include quality control, monitoring of industrial processes and sensing and detection applications involving nonpolar materials. Several potential designs for refractive index sensors in the terahertz regime exist, including photonic crystal waveguides, asymmetric split-ring resonators, and photonic band gap structures integrated into parallel-plate waveguides. Many of these designs are based on optical resonators such as rings or cavities. The resonant frequencies of these structures are dependent on the refractive index of the material in or around the resonator. By monitoring the shifts in resonant frequency the refractive index of a sample can be accurately measured and this in turn can be used to identify a material, monitor contamination or dilution, etc. The sensor design we use here is based on a simple parallel-plate waveguide. A rectangular groove machined into one face acts as a resonant cavity (Figures 1 and 2). When terahertz radiation is coupled into the waveguide and propagates in the lowest-order transverse-electric (TE1) mode, the result is a single strong resonant feature with a tunable resonant frequency that is dependent on the geometry of the groove. This groove can be filled with nonpolar liquid microfluidic samples which cause a shift in the observed resonant frequency that depends on the amount of liquid in the groove and its refractive index. 
Our technique has an advantage over other terahertz techniques in its simplicity, both in fabrication and implementation, since the procedure can be accomplished with standard laboratory equipment without the need for a clean room or any special fabrication or experimental techniques. It can also be easily expanded to multichannel operation by the incorporation of multiple grooves. In this video we will describe our complete experimental procedure, from the design of the sensor to the data analysis and determination of the sample refractive index.

  10. Single-Cell-Genomics-Facilitated Read Binning of Candidate Phylum EM19 Genomes from Geothermal Spring Metagenomes.

    PubMed

    Becraft, Eric D; Dodsworth, Jeremy A; Murugapiran, Senthil K; Ohlsson, J Ingemar; Briggs, Brandon R; Kanbar, Jad; De Vlaminck, Iwijn; Quake, Stephen R; Dong, Hailiang; Hedlund, Brian P; Swingley, Wesley D

    2016-02-15

    The vast majority of microbial life remains uncatalogued due to the inability to cultivate these organisms in the laboratory. This "microbial dark matter" represents a substantial portion of the tree of life and of the populations that contribute to chemical cycling in many ecosystems. In this work, we leveraged an existing single-cell genomic data set representing the candidate bacterial phylum "Calescamantes" (EM19) to calibrate machine learning algorithms and define metagenomic bins directly from pyrosequencing reads derived from Great Boiling Spring in the U.S. Great Basin. Compared to other assembly-based methods, taxonomic binning with a read-based machine learning approach yielded final assemblies with the highest predicted genome completeness of any method tested. Read-first binning was subsequently used to extract Calescamantes bins from all metagenomes with abundant Calescamantes populations, including metagenomes from Octopus Spring and Bison Pool in Yellowstone National Park and Gongxiaoshe Spring in Yunnan Province, China. Metabolic reconstruction suggests that Calescamantes are heterotrophic facultative anaerobes, which can utilize oxidized nitrogen sources as terminal electron acceptors for respiration in the absence of oxygen and use proteins as their primary carbon source. Despite their phylogenetic divergence, the geographically separate Calescamantes populations were highly similar in their predicted metabolic capabilities and core gene content, respiring O2 or oxidized nitrogen species for energy conservation in distant but chemically similar hot springs. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  11. Constrained sampling experiments reveal principles of detection in natural scenes.

    PubMed

    Sebastian, Stephen; Abrams, Jared; Geisler, Wilson S

    2017-07-11

    A fundamental everyday visual task is to detect target objects within a background scene. Using relatively simple stimuli, vision science has identified several major factors that affect detection thresholds, including the luminance of the background, the contrast of the background, the spatial similarity of the background to the target, and uncertainty due to random variations in the properties of the background and in the amplitude of the target. Here we use an experimental approach based on constrained sampling from multidimensional histograms of natural stimuli, together with a theoretical analysis based on signal detection theory, to discover how these factors affect detection in natural scenes. We sorted a large collection of natural image backgrounds into multidimensional histograms, where each bin corresponds to a particular luminance, contrast, and similarity. Detection thresholds were measured for a subset of bins spanning the space, where a natural background was randomly sampled from a bin on each trial. In low-uncertainty conditions, both the background bin and the amplitude of the target were fixed, and, in high-uncertainty conditions, they varied randomly on each trial. We found that thresholds increase approximately linearly along all three dimensions and that detection accuracy is unaffected by background bin and target amplitude uncertainty. The results are predicted from first principles by a normalized matched-template detector, where the dynamic normalizing gain factor follows directly from the statistical properties of the natural backgrounds. The results provide an explanation for classic laws of psychophysics and their underlying neural mechanisms.
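    A heavily simplified sketch of a normalized matched-template detector of the kind the abstract describes: the raw template correlation is divided by a gain term driven by the background's mean luminance and RMS contrast. The specific gain form here is an assumption for illustration; in the paper the normalizing factor follows from the statistical properties of natural backgrounds.

    ```python
    import math

    def normalized_template_response(patch, template):
        """Template correlation divided by a luminance/contrast gain term.
        The normalization form (mean luminance x RMS contrast) is an assumed
        stand-in for the paper's statistically derived gain."""
        n = len(patch)
        mean_lum = sum(patch) / n
        rms_contrast = math.sqrt(sum((p - mean_lum) ** 2 for p in patch) / n) / mean_lum
        dot = sum(p * t for p, t in zip(patch, template))
        return dot / (mean_lum * (rms_contrast + 1e-9))

    template = [1.0, -1.0, 1.0, -1.0]        # zero-mean target template
    background = [10.0, 10.0, 10.0, 10.0]    # toy uniform-luminance background
    amplitude = 0.5
    target_patch = [b + amplitude * t for b, t in zip(background, template)]

    resp_absent = normalized_template_response(background, template)
    resp_present = normalized_template_response(target_patch, template)
    ```

    Because the gain rises with background luminance and contrast, the decision variable stays comparable across background bins, which is the mechanism the abstract offers for the approximately linear threshold growth along those dimensions.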

  12. Octree Bin-to-Bin Fractional-NTC Collisions

    DTIC Science & Technology

    2015-09-17

    Briefing charts, 24 August 2015 – 17 September 2015. Octree bin-to-bin fractional-NTC collisions. Robert Martin (AFRL/RQRS), ERC Inc., Spacecraft Propulsion. Distribution A: approved for public release. Outline: 1. Background; 2. Fractional collisions; 3. Bin…

  13. BAC-end sequence-based SNPs and Bin mapping for rapid integration of physical and genetic maps in apple.

    PubMed

    Han, Yuepeng; Chagné, David; Gasic, Ksenija; Rikkerink, Erik H A; Beever, Jonathan E; Gardiner, Susan E; Korban, Schuyler S

    2009-03-01

    A genome-wide BAC physical map of the apple, Malus x domestica Borkh., has recently been developed. Here, we report on integrating the physical and genetic maps of apple using a SNP-based approach in conjunction with bin mapping. Briefly, BAC clones located at the ends of BAC contigs were selected and sequenced at both ends. The BAC end sequences (BESs) were used to identify candidate SNPs, which were then genetically mapped using a bin mapping strategy in order to anchor the physical map onto the genetic map. Using this approach, 52 (23%) of the 228 BESs tested were successfully exploited to develop SNPs. These SNPs anchored 51 contigs, spanning approximately 37 Mb in cumulative physical length, onto 14 linkage groups. The reliability of this SNP-based integration strategy is described, and the results confirm its feasibility for constructing an integrated physical and genetic map of apple.

  14. Very-High-Energy γ-Ray Observations of the Blazar 1ES 2344+514 with VERITAS

    NASA Astrophysics Data System (ADS)

    Allen, C.; Archambault, S.; Archer, A.; Benbow, W.; Bird, R.; Bourbeau, E.; Brose, R.; Buchovecky, M.; Buckley, J. H.; Bugaev, V.; Cardenzana, J. V.; Cerruti, M.; Chen, X.; Christiansen, J. L.; Connolly, M. P.; Cui, W.; Daniel, M. K.; Eisch, J. D.; Falcone, A.; Feng, Q.; Fernandez-Alonso, M.; Finley, J. P.; Fleischhack, H.; Flinders, A.; Fortson, L.; Furniss, A.; Gillanders, G. H.; Griffin, S.; Grube, J.; Hütten, M.; Håkansson, N.; Hanna, D.; Hervet, O.; Holder, J.; Hughes, G.; Humensky, T. B.; Johnson, C. A.; Kaaret, P.; Kar, P.; Kelley-Hoskins, N.; Kertzman, M.; Kieda, D.; Krause, M.; Krennrich, F.; Kumar, S.; Lang, M. J.; Maier, G.; McArthur, S.; McCann, A.; Meagher, K.; Moriarty, P.; Mukherjee, R.; Nguyen, T.; Nieto, D.; O'Brien, S.; de Bhróithe, A. O'Faoláin; Ong, R. A.; Otte, A. N.; Park, N.; Petrashyk, A.; Pichel, A.; Pohl, M.; Popkow, A.; Pueschel, E.; Quinn, J.; Ragan, K.; Reynolds, P. T.; Richards, G. T.; Roache, E.; Rovero, A. C.; Rulten, C.; Sadeh, I.; Santander, M.; Sembroski, G. H.; Shahinyan, K.; Telezhinsky, I.; Tucci, J. V.; Tyler, J.; Wakely, S. P.; Weinstein, A.; Wilhelm, A.; Williams, D. A.

    2017-10-01

    We present very-high-energy γ-ray observations of the BL Lac object 1ES 2344+514 taken by the Very Energetic Radiation Imaging Telescope Array System between 2007 and 2015. 1ES 2344+514 is detected with a statistical significance above the background of 20.8σ in 47.2 h (livetime) of observations, making this the most comprehensive very-high-energy study of 1ES 2344+514 to date. Using these observations, the temporal properties of 1ES 2344+514 are studied on short and long time-scales. We fit a constant-flux model to nightly and seasonally binned light curves and apply a fractional variability test to determine the stability of the source on different time-scales. We reject the constant-flux model for the 2007-2008 and 2014-2015 nightly binned light curves and for the long-term seasonally binned light curve at the >3σ level. The spectra of the time-averaged emission before and after correction for attenuation by the extragalactic background light are obtained. The observed time-averaged spectrum above 200 GeV is satisfactorily fitted (χ2/NDF = 7.89/6) by a power-law function with an index Γ = 2.46 ± 0.06stat ± 0.20sys and extends to at least 8 TeV. The extragalactic-background-light-deabsorbed spectrum is adequately fit (χ2/NDF = 6.73/6) by a power-law function with an index Γ = 2.15 ± 0.06stat ± 0.20sys while an F-test indicates that the power law with an exponential cut-off function provides a marginally better fit (χ2/NDF = 2.56/5) at the 2.1σ level. The source location is found to be consistent with the published radio location and its spatial extent is consistent with a point source.

  15. Limited-angle effect compensation for respiratory binned cardiac SPECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qi, Wenyuan; Yang, Yongyi, E-mail: yy@ece.iit.edu; Wernick, Miles N.

    Purpose: In cardiac single photon emission computed tomography (SPECT), respiratory-binned acquisition is used to combat the motion blur associated with respiratory motion. However, owing to the variability in respiratory patterns during data acquisition, the acquired data counts can vary significantly both among respiratory bins and among projection angles within individual bins. If not properly accounted for, such variation can lead to artifacts similar to the limited-angle effect in image reconstruction. In this work, the authors investigate several reconstruction strategies for compensating the limited-angle effect in respiratory-binned data for the purpose of reducing the image artifacts. Methods: The authors first consider a model based correction approach, in which the variation in acquisition time is directly incorporated into the imaging model, such that the data statistics are accurately described among both the projection angles and respiratory bins. Afterward, the authors consider an approximation approach, in which the acquired data are rescaled to accommodate the variation in acquisition time among different projection angles while the imaging model is kept unchanged. In addition, the authors also consider the use of a smoothing prior in reconstruction for suppressing the artifacts associated with the limited-angle effect. In our evaluation study, the authors first used Monte Carlo simulated imaging with the 4D NCAT phantom, wherein the ground truth is known for quantitative comparison. The authors evaluated the accuracy of the reconstructed myocardium using a number of metrics, including regional and overall accuracy of the myocardium, uniformity and spatial resolution of the left ventricle (LV) wall, and detectability of perfusion defects using a channelized Hotelling observer. As a preliminary demonstration, the authors also tested the different approaches on five sets of clinical acquisitions. 
Results: The quantitative evaluation results show that the three compensation methods could all, but to different extents, reduce the reconstruction artifacts over no compensation. In particular, the model based approach reduced the mean-squared-error of the reconstructed myocardium by as much as 40%. Compared to the approach of data rescaling, the model based approach further improved both the overall and regional accuracy of the myocardium; it also further improved the lesion detectability and the uniformity of the LV wall. When ML reconstruction was used, the model based approach was notably more effective for improving the LV wall; when MAP reconstruction was used, the smoothing prior could reduce the noise level and artifacts with little or no increase in bias, but at the cost of a slight resolution loss of the LV wall. The improvements in image quality by the different compensation methods were also observed in the clinical acquisitions. Conclusions: Compensating for the uneven distribution of acquisition time among both projection angles and respiratory bins can effectively reduce the limited-angle artifacts in respiratory-binned cardiac SPECT reconstruction. Direct incorporation of the time variation into the imaging model together with a smoothing prior in reconstruction can lead to the most improvement in the accuracy of the reconstructed myocardium.

  16. Accuracy analysis and design of A3 parallel spindle head

    NASA Astrophysics Data System (ADS)

    Ni, Yanbing; Zhang, Biao; Sun, Yupeng; Zhang, Yuan

    2016-03-01

    As functional components of machine tools, parallel mechanisms are widely used in high-efficiency machining of aviation components, and accuracy is one of their critical technical indices. Many researchers have focused on the accuracy of parallel mechanisms, but further effort is required to control errors and improve accuracy at the design and manufacturing stage. Aiming at the accuracy design of a 3-DOF parallel spindle head (A3 head), its error model, sensitivity analysis, and tolerance allocation are investigated. Based on inverse kinematic analysis, the error model of the A3 head is established using first-order perturbation theory and the vector-chain method. According to the mapping property of the motion and constraint Jacobian matrices, the compensatable and uncompensatable error sources affecting end-effector accuracy are separated. Sensitivity analysis is then performed on the uncompensatable error sources: a probabilistic sensitivity model is established, and a global sensitivity index is proposed to analyze the influence of these error sources on end-effector accuracy. The results show that orientation error sources have a larger effect on end-effector accuracy. Based on the sensitivity analysis results, tolerance design is cast as a nonlinearly constrained optimization problem with minimum manufacturing cost as the objective. Using a genetic algorithm, the tolerance allocated to each component is determined, yielding tolerance ranges for ten kinds of geometric error sources. These research achievements provide fundamental guidelines for component manufacturing and assembly of this kind of parallel mechanism.

  17. Computation of reliable textural indices from multimodal brain MRI: suggestions based on a study of patients with diffuse intrinsic pontine glioma.

    PubMed

    Goya-Outi, Jessica; Orlhac, Fanny; Calmon, Raphael; Alentorn, Agusti; Nioche, Christophe; Philippe, Cathy; Puget, Stéphanie; Boddaert, Nathalie; Buvat, Irène; Grill, Jacques; Frouin, Vincent; Frouin, Frederique

    2018-05-10

    Few methodological studies have addressed the robustness of widely used textural indices in MRI. In this context, this study aims to propose rules for computing reliable textural indices from multimodal 3D brain MRI. Diagnosis and post-biopsy MR scans, including T1, post-contrast T1, T2, and FLAIR images, from thirty children with diffuse intrinsic pontine glioma (DIPG) were considered. The hybrid white stripe method was adapted to standardize MR intensities. Sixty textural indices were then computed for each modality in different regions of interest (ROI), including tumor and white matter (WM). Three types of intensity binning were compared: (1) constant bin width and relative bounds; (2) constant number of bins and relative bounds; (3) constant number of bins and absolute bounds. The impact of region volume was also tested within the WM. First, the mean Hellinger distance between patient-based intensity distributions decreased by a factor greater than 10 in WM and greater than 2.5 in gray matter after standardization. Regarding the binning strategy, the ranking of patients was highly correlated for 188/240 features when comparing [Formula: see text] with [Formula: see text], but for only 20 when comparing [Formula: see text] with [Formula: see text], and nine when comparing [Formula: see text] with [Formula: see text]. Furthermore, when using [Formula: see text] or [Formula: see text], texture indices reflected tumor heterogeneity as assessed visually by experts. Last, 41 features presented statistically significant differences between contralateral WM regions when ROI size varied slightly across patients, and none when using ROIs of the same size. For regions of similar size, 224 features were significantly different between WM and tumor. Valuable information from texture indices can be biased by methodological choices. 
Recommendations are to standardize intensities in MR brain volumes, to use intensity binning with constant bin width, and to define regions with the same volumes to get reliable textural indices.
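    The three binning strategies compared above differ in whether the bin width or the bin count is held constant, and whether the intensity bounds come from each image (relative) or are fixed in advance (absolute). A minimal sketch of the distinction, with illustrative function and parameter names that are not taken from the paper:

    ```python
    def bin_indices(values, mode, bin_width=10.0, n_bins=64, bounds=None):
        """Discretize intensities under the three strategies compared above:
        'width'    - constant bin width, bounds relative to this image
        'rel_bins' - constant number of bins, relative (per-image) bounds
        'abs_bins' - constant number of bins, absolute (fixed) bounds
        """
        if mode == "abs_bins":
            lo, hi = bounds  # fixed bounds shared across patients
        else:
            lo, hi = min(values), max(values)  # bounds relative to this image
        if mode == "width":
            return [int((v - lo) // bin_width) for v in values]
        w = (hi - lo) / n_bins
        return [min(int((v - lo) / w), n_bins - 1) for v in values]
    ```

    With constant bin width, two images of different dynamic range end up with different numbers of occupied bins; with a constant number of bins, the effective bin width varies per image, which is one source of the ranking differences reported above.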

  18. Developing a climate-based risk map of fascioliasis outbreaks in Iran.

    PubMed

    Halimi, Mansour; Farajzadeh, Manuchehr; Delavari, Mahdi; Arbabi, Mohsen

    2015-01-01

    The strong relationship between climate and fascioliasis outbreaks enables the development of climate-based models to estimate the potential risk of fascioliasis outbreaks. This work aims to develop a climate-based risk map of fascioliasis outbreaks in Iran using Ollerenshaw's fascioliasis risk index within a geographical information system (GIS). Using this index, a risk map of fascioliasis outbreaks for the entire country was developed. We determined that the country can be divided into 4 fascioliasis outbreak risk categories. Class 1, in which the Mt value is less than 100, includes more than 91% of the country's area. The climate in this class is not conducive to fascioliasis outbreaks in any month. Dryness and low temperature in the wet season (December to April) are the key barriers against fascioliasis outbreaks in this class. The risk map developed based on climatic factors indicated that only 3% of the country's area, including Gilan province in the northern region of Iran, is highly suitable for fascioliasis outbreaks during September to January. The Mt value is greater than 500 in this class. Heavy rainfall in the summer and fall, especially in Rasht, Astara and Bandar Anzaly (≥ 1000 mm/year), creates more suitable breeding places for snail intermediate hosts. Copyright © 2015 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  19. optBINS: Optimal Binning for histograms

    NASA Astrophysics Data System (ADS)

    Knuth, Kevin H.

    2018-03-01

    optBINS (optimal binning) determines the optimal number of bins in a uniform bin-width histogram by deriving the posterior probability for the number of bins in a piecewise-constant density model after assigning a multinomial likelihood and a non-informative prior. The maximum of the posterior probability occurs at a point where the prior probability and the joint likelihood are balanced. The interplay between these opposing factors effectively implements Occam's razor by selecting the simplest model that best describes the data.
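    Knuth's relative log posterior for the number of bins M of an equal-width histogram of N points with bin counts n_k has a closed form, log p(M|d) = N log M + log Γ(M/2) − M log Γ(1/2) − log Γ(N + M/2) + Σ_k log Γ(n_k + 1/2), up to an additive constant. A minimal sketch of maximizing it (not the optBINS package itself; function names are illustrative):

    ```python
    import math
    import random

    def log_posterior(data, m, lo, hi):
        """Relative log posterior for an m-bin equal-width histogram (Knuth)."""
        n = len(data)
        counts = [0] * m
        width = (hi - lo) / m
        for x in data:
            counts[min(int((x - lo) / width), m - 1)] += 1
        return (n * math.log(m)
                + math.lgamma(m / 2.0)
                - m * math.lgamma(0.5)
                - math.lgamma(n + m / 2.0)
                + sum(math.lgamma(c + 0.5) for c in counts))

    def optbins(data, max_bins=50):
        """Number of bins maximizing the posterior over 1..max_bins."""
        lo, hi = min(data), max(data)
        return max(range(1, max_bins + 1),
                   key=lambda m: log_posterior(data, m, lo, hi))

    # demo: 1000 points from a standard normal (fixed seed for reproducibility)
    rng = random.Random(0)
    data = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    best_m = optbins(data)
    ```

    The N log M term grows with M while the Γ terms penalize many sparsely populated bins, which is the Occam's-razor balance the abstract describes.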

  1. Evaluation of respiratory and cardiac motion correction schemes in dual gated PET/CT cardiac imaging.

    PubMed

    Lamare, F; Le Maitre, A; Dawood, M; Schäfers, K P; Fernandez, P; Rimoldi, O E; Visvikis, D

    2014-07-01

    Cardiac imaging suffers from both respiratory and cardiac motion. One of the proposed solutions involves dual-gated acquisitions. Although such an approach may lead to both respiratory and cardiac motion compensation, there are issues associated with (a) the combination of data from cardiac and respiratory motion bins, and (b) poor statistical quality of images as a result of using only part of the acquired data. The main objective of this work was to evaluate different schemes of combining binned data in order to identify the best strategy to reconstruct motion-free cardiac images from dual-gated positron emission tomography (PET) acquisitions. A digital phantom study as well as seven human studies were used in this evaluation. PET data were acquired in list mode (LM). A real-time position management system and an electrocardiogram device were used to provide the respiratory and cardiac motion triggers registered within the LM file. Acquired data were subsequently binned considering four or six cardiac gates, or the diastole only, in combination with eight respiratory amplitude gates. PET images were corrected for attenuation, but no randoms or scatter corrections were applied. Reconstructed images from each of the bins considered above were subsequently used in combination with an affine or an elastic registration algorithm to derive transformation parameters allowing the combination of all acquired data in a particular position in the cardiac and respiratory cycles. Images were assessed in terms of signal-to-noise ratio (SNR), contrast, image profile, coefficient of variation (COV), and relative difference of the recovered activity concentration. Regardless of the considered motion compensation strategy, the nonrigid motion model performed better than the affine model, leading to higher SNR and contrast combined with a lower COV.
Nevertheless, when compensating for respiration only, no statistically significant differences were observed in the performance of the two motion models considered. Superior image SNR and contrast were seen using the affine respiratory motion model in combination with the diastole cardiac bin in comparison to the use of the whole cardiac cycle. In contrast, when simultaneously correcting for cardiac beating and respiration, the elastic respiratory motion model outperformed the affine model. In this context, four cardiac bins associated with eight respiratory amplitude bins seemed to be adequate. Considering the compensation of respiratory motion effects only, both affine and elastic based approaches led to an accurate resizing and positioning of the myocardium. The use of the diastolic phase combined with an affine model based respiratory motion correction may therefore be a simple approach leading to significant quality improvements in cardiac PET imaging. However, the best performance was obtained with the combined correction for both cardiac and respiratory movements considering all the dual-gated bins independently through the use of an elastic model based motion compensation.
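    Dual gating as described above amounts to assigning each list-mode event both a cardiac phase bin (from the surrounding ECG triggers) and a respiratory amplitude bin. A simplified sketch of that binning step only, with illustrative names and none of the authors' registration or reconstruction machinery:

    ```python
    import bisect

    def dual_gate(event_times, ecg_triggers, resp_amplitude,
                  n_cardiac=4, amp_edges=None):
        """Assign each list-mode event to a (cardiac, respiratory) bin.

        event_times    - event timestamps (s), sorted
        ecg_triggers   - R-peak times (s), sorted
        resp_amplitude - function mapping time -> respiratory amplitude
        amp_edges      - sorted inner edges of the respiratory amplitude bins
        """
        bins = {}
        for t in event_times:
            i = bisect.bisect_right(ecg_triggers, t) - 1
            if i < 0 or i + 1 >= len(ecg_triggers):
                continue  # event falls outside a complete cardiac cycle
            # cardiac phase = fraction of the current R-R interval
            phase = (t - ecg_triggers[i]) / (ecg_triggers[i + 1] - ecg_triggers[i])
            c = min(int(phase * n_cardiac), n_cardiac - 1)
            # respiratory bin from the amplitude signal at event time
            r = bisect.bisect_right(amp_edges, resp_amplitude(t))
            bins.setdefault((c, r), []).append(t)
        return bins
    ```

    With four cardiac bins and eight respiratory amplitude bins this yields up to 32 sub-datasets, which is exactly why each bin is statistically poor and why the registration-based recombination evaluated above is needed.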

  2. PhyloPythiaS+: a self-training method for the rapid reconstruction of low-ranking taxonomic bins from metagenomes.

    PubMed

    Gregor, Ivan; Dröge, Johannes; Schirmer, Melanie; Quince, Christopher; McHardy, Alice C

    2016-01-01

    Background. Metagenomics is an approach for characterizing environmental microbial communities in situ; it allows their functional and taxonomic characterization and the recovery of sequences from uncultured taxa. This is often achieved by a combination of sequence assembly and binning, where sequences are grouped into 'bins' representing taxa of the underlying microbial community. Assignment to low-ranking taxonomic bins is an important challenge for binning methods, as is scalability to Gb-sized datasets generated with deep sequencing techniques. One of the best available methods for recovering species-level bins from deep-branching phyla is the expert-trained PhyloPythiaS package, where a human expert decides on the taxa to incorporate in the model and identifies 'training' sequences based on marker genes directly from the sample. Due to the manual effort involved, this approach does not scale to multiple metagenome samples and requires substantial expertise, which researchers who are new to the area may lack. Results. We have developed PhyloPythiaS+, a successor to our PhyloPythia(S) software. The new (+) component performs the work previously done by the human expert. PhyloPythiaS+ also includes a new k-mer counting algorithm, which accelerated the simultaneous counting of 4-6-mers used for taxonomic binning 100-fold and reduced the overall execution time of the software by a factor of three. Our software allows the analysis of Gb-sized metagenomes with inexpensive hardware, and recovers species- or genus-level bins with low error rates in a fully automated fashion. PhyloPythiaS+ was compared to MEGAN, taxator-tk, Kraken and the generic PhyloPythiaS model. The results showed that PhyloPythiaS+ performs especially well for samples originating from novel environments in comparison to the other methods. Availability. PhyloPythiaS+ in a virtual machine is available for installation under Windows, Unix systems or OS X at: https://github.com/algbioi/ppsp/wiki.
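    The 4-6-mer features used for taxonomic binning can be gathered in a single pass over each sequence. The naive sketch below illustrates the feature extraction only, not the optimized counting algorithm shipped in PhyloPythiaS+:

    ```python
    from collections import Counter

    def count_kmers(seq, ks=(4, 5, 6)):
        """Count k-mers for every k in `ks` in one pass over the sequence."""
        counts = {k: Counter() for k in ks}
        for i in range(len(seq)):
            for k in ks:
                if i + k <= len(seq):  # k-mer must fit within the sequence
                    counts[k][seq[i:i + k]] += 1
        return counts
    ```

    The resulting per-k count vectors (usually normalized) form the composition features on which the taxonomic classifier operates; counting all three k values together avoids rescanning the sequence once per k.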

  3. Glioma survival prediction with the combined analysis of in vivo 11C-MET-PET, ex vivo and patient features by supervised machine learning.

    PubMed

    Papp, Laszlo; Poetsch, Nina; Grahovac, Marko; Schmidbauer, Victor; Woehrer, Adelheid; Preusser, Matthias; Mitterhauser, Markus; Kiesel, Barbara; Wadsak, Wolfgang; Beyer, Thomas; Hacker, Marcus; Traub-Weidinger, Tatjana

    2017-11-24

    Gliomas are the most common types of tumors in the brain. While the definite diagnosis is routinely made ex vivo by histopathologic and molecular examination, the diagnostic work-up of patients with suspected glioma is mainly done using magnetic resonance imaging (MRI). Nevertheless, L-[S-methyl-11C]methionine (11C-MET) positron emission tomography (PET) holds great potential for the characterization of gliomas. The aim of this study was to establish machine learning (ML) driven survival models for glioma built on 11C-MET PET, ex vivo and patient characteristics. Methods: 70 patients with a treatment-naïve glioma, a positive 11C-MET PET scan and histopathology-derived ex vivo features, such as World Health Organization (WHO) 2007 tumor grade, histology and isocitrate dehydrogenase (IDH1-R132H) mutation status, were included. The 11C-MET-positive primary tumors were delineated semi-automatically on PET images, followed by the extraction of tumor-to-background-ratio-based general and higher-order textural features using five different binning approaches. In vivo and ex vivo features, as well as patient characteristics (age, weight, height, body mass index, Karnofsky score), were merged to characterize the tumors. Machine learning approaches were utilized to identify relevant in vivo, ex vivo and patient features and their relative weights for 36-month survival prediction. The resulting feature weights were used to establish three predictive models per binning configuration based on a combination of: in vivo, ex vivo and clinical patient information (M36IEP); in vivo and patient-only information (M36IP); and in vivo only (M36I). In addition, a binning-independent ex vivo and patient-only (M36EP) model was created. The established models were validated in a Monte Carlo (MC) cross-validation scheme. Results: The most prominent ML-selected and -weighted features were patient- and ex vivo-based, followed by in vivo features. The highest area under the curve (AUC) values of our models, as revealed by the MC cross-validation, were: 0.9 (M36IEP), 0.87 (M36EP), 0.77 (M36IP) and 0.72 (M36I). Conclusion: Survival prediction of glioma patients based on amino acid PET using computer-supported predictive models built on in vivo, ex vivo and patient features is highly accurate. Copyright © 2017 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  4. Development of a Novel Therapeutic Paradigm Utilizing a Mammary Gland-Targeted, Bin1-Knockout Mouse Model

    DTIC Science & Technology

    2008-07-01

    Remodeling and Drives Cancer Progression. Mee Young Chang, Janette Boulden, Erika Sutanto-Ward, James B. Duhadaway, Alejandro Peralta Soler...proliferation by multiple mechanisms. Oncogene 1999;18:3564-73. 5. Pineda-Lucena A, Ho CS, Mao DY, et al. A structure-based model of the c-Myc/Bin1 protein

  5. Non-Invasive Cell-Based Therapy for Traumatic Optic Neuropathy

    DTIC Science & Technology

    2015-06-01

    Morphological and Functional Changes in an Animal Model of Retinitis Pigmentosa. Vis Neurosci, 2013: 1-13. Bin Lu, Catherine W. Morgans, Sergey Girman...of human retinal progenitor cells for treatment of retinitis pigmentosa 2013, ARVO, A0106. Benjamin Bakondi; YuChun Tsai; Bin Lu; Sergey...Systemic administration of MSCs significantly preserved retinal ganglion cell survival after TON. (d) Systemic administration of MSCs also promote limited

  6. Non-Invasive Cell-Based Therapy for Traumatic Optic Neuropathy

    DTIC Science & Technology

    2014-10-01

    Functional Changes in an Animal Model of Retinitis Pigmentosa. Vis Neurosci, 2013: 1-13. Bin Lu, Catherine W. Morgans, Sergey Girman, Jing Luo, Jiagang...human retinal progenitor cells for treatment of retinitis pigmentosa 2013, ARVO, A0106. Benjamin Bakondi; YuChun Tsai; Bin Lu; Sergey...degeneration. Pending NEI (R24) Wang (PI) Preclinical program for Treating Retinitis Pigmentosa by Neural Progenitor Cells

  7. Simple 2.5 GHz time-bin quantum key distribution

    NASA Astrophysics Data System (ADS)

    Boaron, Alberto; Korzh, Boris; Houlmann, Raphael; Boso, Gianluca; Rusca, Davide; Gray, Stuart; Li, Ming-Jun; Nolan, Daniel; Martin, Anthony; Zbinden, Hugo

    2018-04-01

    We present a 2.5 GHz quantum key distribution setup with the emphasis on a simple experimental realization. It features a three-state time-bin protocol based on a pulsed diode laser and a single intensity modulator. Implementing an efficient one-decoy scheme and finite-key analysis, we achieve record-breaking secret key rates of 1.5 kbps over 200 km of standard optical fiber.

  8. Validating the Operational Bias and Hypothesis of Universal Exponent in Landslide Frequency-Area Distribution

    PubMed Central

    Huang, Jr-Chuan; Lee, Tsung-Yu; Teng, Tse-Yang; Chen, Yi-Chin; Huang, Cho-Ying; Lee, Cheing-Tung

    2014-01-01

    The exponent decay in landslide frequency-area distribution is widely used for assessing the consequences of landslides, and some studies argue that the slope of the exponent decay is universal and independent of mechanisms and environmental settings. However, the documented exponent slopes are diverse, and data processing is hypothesized to account for this inconsistency. An elaborated statistical experiment and two actual landslide inventories were used here to demonstrate the influence of data processing on the determination of the exponent. Seven categories with different landslide numbers were generated from the predefined inverse-gamma distribution and then analyzed by three data processing procedures (logarithmic binning, LB; normalized logarithmic binning, NLB; and the cumulative distribution function, CDF). Five different bin widths were also considered while applying LB and NLB. Following that, maximum likelihood estimation was used to estimate the exponent slopes. The results showed that the exponents estimated by CDF were unbiased, while LB and NLB performed poorly. The two binning-based methods led to considerable biases that increased with landslide number and bin width. The standard deviations of the estimated exponents depended not just on the landslide number but also on the binning method and bin width. Both extremely few and extremely plentiful landslide numbers reduced the confidence of the estimated exponents, which could be attributed to limited landslide numbers and considerable operational bias, respectively. The diverse documented exponents in the literature should therefore be adjusted accordingly. Our study strongly suggests that the considerable bias due to data processing and the data quality should be constrained in order to advance the understanding of landslide processes. PMID:24852019
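    The bias reported above arises from fitting the exponent to binned counts, whereas the CDF-based approach applies maximum likelihood directly to the samples. For a continuous power law p(x) ∝ x^(−α) above x_min, the MLE is α̂ = 1 + n / Σ ln(x_i / x_min) (the standard Clauset-Shalizi-Newman estimator). A sketch on synthetic data; note the paper's experiment used an inverse-gamma distribution, while a pure power law is used here for simplicity:

    ```python
    import math
    import random

    def pareto_sample(alpha, xmin, n, seed=0):
        """Inverse-transform samples from p(x) ∝ x^(-alpha) for x >= xmin."""
        rng = random.Random(seed)
        # survival function S(x) = (x/xmin)^(-(alpha-1)) inverted at u ~ U(0,1)
        return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0))
                for _ in range(n)]

    def mle_exponent(xs, xmin):
        """Continuous power-law MLE: alpha_hat = 1 + n / sum(ln(x_i / xmin))."""
        return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)
    ```

    Unlike regression on log-binned histograms, this estimator has no bin width to choose, which is why its bias does not grow with sample size the way the LB and NLB procedures' does.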

  9. An optimal FFT-based anisotropic power spectrum estimator

    NASA Astrophysics Data System (ADS)

    Hand, Nick; Li, Yin; Slepian, Zachary; Seljak, Uroš

    2017-07-01

    Measurements of line-of-sight dependent clustering via the galaxy power spectrum's multipole moments constitute a powerful tool for testing theoretical models in large-scale structure. Recent work shows that this measurement, including a moving line-of-sight, can be accelerated using Fast Fourier Transforms (FFTs) by decomposing the Legendre polynomials into products of Cartesian vectors. Here, we present a faster, optimal means of using FFTs for this measurement. We avoid the redundancy present in the Cartesian decomposition by using a spherical harmonic decomposition of the Legendre polynomials. With this method, a given multipole of order l requires only 2l+1 FFTs rather than the (l+1)(l+2)/2 FFTs of the Cartesian approach. For the hexadecapole (l = 4), this translates to 40% fewer FFTs, with increased savings for higher l. The reduction in wall-clock time enables the calculation of finely-binned wedges in P(k,μ), obtained by computing multipoles up to a large lmax and combining them. This transformation has a number of advantages. We demonstrate that by using non-uniform bins in μ, we can isolate plane-of-sky (angular) systematics to a narrow bin at μ ≃ 0 while eliminating the contamination from all other bins. We also show that the covariance matrix of clustering wedges binned uniformly in μ becomes ill-conditioned when combining multipoles up to large values of lmax, but that the problem can be avoided with non-uniform binning. As an example, we present results using lmax = 16, for which our procedure requires a factor of 3.4 fewer FFTs than the Cartesian method, while removing the first μ bin leads to only a 7% increase in the statistical error on fσ8, as compared to a 54% increase with lmax = 4.
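    The FFT counts quoted above follow directly from the two decompositions: (l+1)(l+2)/2 Cartesian product terms versus 2l+1 spherical harmonics per multipole. A quick arithmetic check of the 40% saving at l = 4 and the factor of roughly 3.4 for even multipoles up to lmax = 16:

    ```python
    def fft_counts(l):
        """FFTs needed for the multipole of order l:
        (Cartesian decomposition, spherical-harmonic decomposition)."""
        return (l + 1) * (l + 2) // 2, 2 * l + 1

    cart, sph = fft_counts(4)          # 15 vs 9 for the hexadecapole
    saving = 1.0 - sph / cart          # 0.40, i.e. 40% fewer FFTs

    # cumulative counts over the even multipoles l = 0, 2, ..., 16
    cart_total = sum(fft_counts(l)[0] for l in range(0, 17, 2))  # 525
    sph_total = sum(fft_counts(l)[1] for l in range(0, 17, 2))   # 153
    ratio = cart_total / sph_total     # ≈ 3.43
    ```

    The quadratic-versus-linear growth in l is why the saving widens for higher multipoles.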

  10. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    ERIC Educational Resources Information Center

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  11. Comments on, Xuan Li, Shanghong Zhao, Zihang Zhu, Bing Gong, Xingchun Chu, Yongjun Li, Jing Zhao and Yun Liu `an optical millimeter-wave generation scheme based on two parallel dual-parallel Mach-Zehnder modulators and polarization multiplexing', Journal of Modern Optics, 2015

    NASA Astrophysics Data System (ADS)

    Hasan, Mehedi; Hall, Trevor

    2016-11-01

    In the title paper, Li et al. have presented a scheme for filter-less photonic millimetre-wave (mm-wave) generation based on two polarization-multiplexed parallel dual-parallel Mach-Zehnder modulators (DP-MZMs). For frequency octo-tupling, all harmonics are suppressed except those of order 4l, where l is an integer. The carrier is then suppressed by the polarization multiplexing technique, which is the principal innovative step in their design. Frequency 12-tupling and 16-tupling are also described following a similar method. The two DP-MZMs are similarly driven and provide identical outputs for the same RF modulation indices. Consequently, a demerit of their design is the requirement to apply two different RF signal modulation indices in a particular range and to set the polarizer to a precise angle, which depends on the pair of modulation indices used, in order to suppress the unwanted harmonics (e.g. the carrier) without simultaneously suppressing the wanted harmonics. The aim of this comment is to show that, with an adjustment of the RF drive phases and a fixed polarizer angle in the design presented by Li et al., all harmonics can be suppressed except those of order 4l, where l is an odd integer. Hence, filter-less frequency octo-tupling can be achieved whose performance is not limited by the careful adjustment of the RF drive signal; rather, it can be operated over a wide range of modulation indices (m = 2.5 to 7.5). If the modulation index is adjusted to suppress the 4th harmonics, then the design can be used to perform frequency 24-tupling. Since the carrier is suppressed by design in the modified architecture, the strict requirement to adjust the RF drive (and polarizer angle) can be avoided without any significant change to the circuit complexity.

  12. VizieR Online Data Catalog: BAL QSOs from SDSS DR3 (Trump+, 2006)

    NASA Astrophysics Data System (ADS)

    Trump, J. R.; Hall, P. B.; Reichard, T. A.; Richards, G. T.; Schneider, D. P.; vanden Berk, D. E.; Knapp, G. R.; Anderson, S. F.; Fan, X.; Brinkman, J.; Kleinman, S. J.; Nitta, A.

    2007-11-01

    We present a total of 4784 unique broad absorption line quasars from the Sloan Digital Sky Survey Third Data Release (Cat. ). An automated algorithm was used to match a continuum to each quasar and to identify regions of flux at least 10% below the continuum over a velocity range of at least 1000km/s in the CIV and MgII absorption regions. The model continuum was selected as the best-fit match from a set of template quasar spectra binned in luminosity, emission line width, and redshift, with the power-law spectral index and amount of dust reddening as additional free parameters. We characterize our sample through the traditional balnicity index and a revised absorption index, as well as through parameters such as the width, outflow velocity, fractional depth, and number of troughs. (1 data file).
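    The traditional balnicity index (Weymann et al. 1991) integrates continuum-normalized flux deficits deeper than 10% over outflow velocities of 3000-25000 km/s, counting only troughs that stay continuously below the threshold for more than 2000 km/s. A discrete sketch on a velocity grid, for illustration only (this is not the catalog's fitting pipeline, and the threshold handling at trough edges varies between implementations):

    ```python
    def balnicity_index(v, f, vmin=3000.0, vmax=25000.0, window=2000.0):
        """Discrete balnicity index.

        v : outflow velocities (km/s, increasing)
        f : continuum-normalized flux at each velocity
        """
        bi = 0.0
        run_start = None  # velocity where the current contiguous trough began
        for i in range(1, len(v)):
            if not (vmin <= v[i] <= vmax):
                run_start = None
                continue
            depth = 1.0 - f[i] / 0.9  # positive where flux is >10% below continuum
            if depth > 0.0:
                if run_start is None:
                    run_start = v[i]
                # C = 1 only once the trough has persisted for `window` km/s
                if v[i] - run_start > window:
                    bi += depth * (v[i] - v[i - 1])
            else:
                run_start = None
        return bi
    ```

    The 2000 km/s persistence requirement is what separates broad absorption line troughs from narrower intervening absorbers; the revised absorption index mentioned above relaxes these limits.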

  13. Disrupted Membrane Structure and Intracellular Ca2+ Signaling in Adult Skeletal Muscle with Acute Knockdown of Bin1

    PubMed Central

    Tjondrokoesoemo, Andoria; Park, Ki Ho; Ferrante, Christopher; Komazaki, Shinji; Lesniak, Sebastian; Brotto, Marco; Ko, Jae-Kyun; Zhou, Jingsong; Weisleder, Noah; Ma, Jianjie

    2011-01-01

    Efficient intracellular Ca2+ ([Ca2+]i) homeostasis in skeletal muscle requires intact triad junctional complexes comprised of t-tubule invaginations of the plasma membrane and terminal cisternae of the sarcoplasmic reticulum. Bin1 contains a specialized BAR domain that is associated with t-tubule development in skeletal muscle and is involved in tethering the dihydropyridine receptors (DHPR) to the t-tubule. Here, we show that Bin1 is important for Ca2+ homeostasis in adult skeletal muscle. Since systemic ablation of Bin1 in mice results in postnatal lethality, an in vivo electroporation-mediated transfection method was used to deliver an RFP-tagged plasmid producing short-hairpin (sh)RNA targeting Bin1 (shRNA-Bin1) to study the effect of Bin1 knockdown in adult mouse FDB skeletal muscle. Upon confirming the reduction of endogenous Bin1 expression, we showed that shRNA-Bin1 muscle displayed swollen t-tubule structures, indicating that Bin1 is required for the maintenance of intact membrane structure in adult skeletal muscle. Reduced Bin1 expression led to disruption of t-tubule structure that was linked with alterations to intracellular Ca2+ release. Voltage-induced Ca2+ release in isolated single muscle fibers of shRNA-Bin1 showed that both the mean amplitude of the Ca2+ current and the SR Ca2+ transient were reduced when compared to the shRNA-control, indicating compromised coupling between DHPR and ryanodine receptor 1. The mean frequency of osmotic-stress-induced Ca2+ sparks was reduced in shRNA-Bin1, indicating compromised DHPR activation. shRNA-Bin1 fibers also displayed reduced Ca2+ spark amplitude, which was attributed to decreased total Ca2+ stores in the shRNA-Bin1 fibers. Human mutation of Bin1 is associated with centronuclear myopathy, and the SH3 domain of Bin1 is important for sarcomeric protein organization in skeletal muscle. Our study, showing the importance of Bin1 in the maintenance of intact t-tubule structure and [Ca2+]i homeostasis in adult skeletal muscle, could provide mechanistic insight on the potential role of Bin1 in skeletal muscle contractility and the pathology of myopathy. PMID:21984944

  14. MaxBin 2.0: an automated binning algorithm to recover genomes from multiple metagenomic datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Yu-Wei; Simmons, Blake A.; Singer, Steven W.

    The recovery of genomes from metagenomic datasets is a critical step to defining the functional roles of the underlying uncultivated populations. We previously developed MaxBin, an automated binning approach for high-throughput recovery of microbial genomes from metagenomes. Here, we present an expanded binning algorithm, MaxBin 2.0, which recovers genomes from the co-assembly of a collection of metagenomic datasets. Tests on simulated datasets revealed that MaxBin 2.0 is highly accurate in recovering individual genomes, and the application of MaxBin 2.0 to several metagenomes from environmental samples demonstrated that it could achieve two complementary goals: recovering more bacterial genomes compared to binning a single sample, as well as comparing the microbial community composition between different sampling environments. Availability and implementation: MaxBin 2.0 is freely available at http://sourceforge.net/projects/maxbin/ under the BSD license. Supplementary information: Supplementary data are available at Bioinformatics online.

  15. The Impact of Aerosols on Cloud and Precipitation Processes: Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Li, X.; Khain, A.; Simpson, S.; Johnson, D.; Remer, L.

    2004-01-01

    Cloud microphysics is inevitably affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds. Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effects of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, two detailed spectral-bin microphysical schemes were implemented into the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral-bin microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops), and several types of ice particles [i.e. pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail]. Each type is described by a special size distribution function containing many categories (i.e. 33 bins). Atmospheric aerosols are also described using number density size distribution functions. A spectral-bin microphysical model is very expensive from a computational point of view and has only been implemented into the 2D version of the GCE at the present time. The model is tested by studying the evolution of deep tropical clouds in the west Pacific warm pool region and in the mid-latitude continent with different concentrations of CCN: a low "clean" concentration and a high "dirty" concentration. In addition, differences and similarities between bulk microphysics and spectral-bin microphysical schemes will be examined and discussed.
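    Spectral-bin schemes of this kind commonly discretize the drop size distribution on a mass-doubling grid, so that 33 bins span roughly ten orders of magnitude in particle mass. A sketch of such a grid, assuming a hypothetical smallest bin mass m0 and liquid water density (illustrative values, not the GCE model's configuration):

    ```python
    import math

    def mass_doubling_grid(m0=1e-12, nbins=33, rho=1000.0):
        """Bin-center masses (kg), doubling per bin, and equivalent drop radii (m)
        for spherical liquid drops of density rho (kg/m^3)."""
        masses = [m0 * 2.0 ** k for k in range(nbins)]
        radii = [(3.0 * m / (4.0 * math.pi * rho)) ** (1.0 / 3.0) for m in masses]
        return masses, radii
    ```

    Because mass doubles per bin, the equivalent radius doubles every three bins, which lets a modest number of bins cover everything from small cloud droplets to raindrops.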

  16. The Impact of Aerosols on Cloud and Precipitation Processes: Cloud-resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Li, X.; Khain, A.; Simpson, S.; Johnson, D.; Remer, L.

    2004-01-01

    Cloud microphysics is inevitably affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds. Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effects of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, two detailed spectral-bin microphysical schemes were implemented into the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral-bin microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops), and several types of ice particles [i.e. pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail]. Each type is described by a special size distribution function containing many categories (i.e. 33 bins). Atmospheric aerosols are also described using number density size-distribution functions. A spectral-bin microphysical model is very expensive from a computational point of view and has only been implemented into the 2D version of the GCE at the present time. The model is tested by studying the evolution of deep tropical clouds in the west Pacific warm pool region and in the mid-latitude continent with different concentrations of CCN: a low "clean" concentration and a high "dirty" concentration. In addition, differences and similarities between bulk microphysics and spectral-bin microphysical schemes will be examined and discussed.

  17. Entanglement between a Photonic Time-Bin Qubit and a Collective Atomic Spin Excitation.

    PubMed

    Farrera, Pau; Heinze, Georg; de Riedmatten, Hugues

    2018-03-09

    Entanglement between light and matter combines the advantage of long distance transmission of photonic qubits with the storage and processing capabilities of atomic qubits. To distribute photonic states efficiently over long distances several schemes to encode qubits have been investigated, with time-bin encoding being particularly promising due to its robustness against decoherence in optical fibers. Here, we demonstrate the generation of entanglement between a photonic time-bin qubit and a single collective atomic spin excitation (spin wave) in a cold atomic ensemble, followed by the mapping of the atomic qubit onto another photonic qubit. A magnetic field that induces a periodic dephasing and rephasing of the atomic excitation ensures the temporal distinguishability of the two time bins and plays a central role in the entanglement generation. To analyze the generated quantum state, we use largely imbalanced Mach-Zehnder interferometers to perform projective measurements in different qubit bases and verify the entanglement by violating a Clauser-Horne-Shimony-Holt Bell inequality.

  18. Entanglement between a Photonic Time-Bin Qubit and a Collective Atomic Spin Excitation

    NASA Astrophysics Data System (ADS)

    Farrera, Pau; Heinze, Georg; de Riedmatten, Hugues

    2018-03-01

    Entanglement between light and matter combines the advantage of long distance transmission of photonic qubits with the storage and processing capabilities of atomic qubits. To distribute photonic states efficiently over long distances several schemes to encode qubits have been investigated—time-bin encoding being particularly promising due to its robustness against decoherence in optical fibers. Here, we demonstrate the generation of entanglement between a photonic time-bin qubit and a single collective atomic spin excitation (spin wave) in a cold atomic ensemble, followed by the mapping of the atomic qubit onto another photonic qubit. A magnetic field that induces a periodic dephasing and rephasing of the atomic excitation ensures the temporal distinguishability of the two time bins and plays a central role in the entanglement generation. To analyze the generated quantum state, we use largely imbalanced Mach-Zehnder interferometers to perform projective measurements in different qubit bases and verify the entanglement by violating a Clauser-Horne-Shimony-Holt Bell inequality.

  19. Association between an endoglin gene polymorphism and systemic sclerosis-related pulmonary arterial hypertension.

    PubMed

    Wipff, J; Kahan, A; Hachulla, E; Sibilia, J; Cabane, J; Meyer, O; Mouthon, L; Guillevin, L; Junien, C; Boileau, C; Allanore, Y

    2007-04-01

    Systemic sclerosis (SSc) is a connective tissue disorder characterized by early generalized microangiopathy with disturbed angiogenesis. The endoglin gene (ENG) encodes a transmembrane glycoprotein which acts as an accessory receptor for the transforming growth factor-beta (TGF-beta) superfamily and is crucial for maintaining vascular integrity. A 6-base insertion in intron 7 (6bINS) of ENG has been reported to be associated with microvascular disturbance. Our objective was to investigate the relationship between 6bINS and the vascular complication pulmonary arterial hypertension (PAH) in SSc in a French Caucasian population. Two hundred and eighty SSc cases, 29 of whom had PAH diagnosed by catheterization, were compared with 140 patients with osteoarthritis. Genotyping was performed by polymerase-chain-reaction-based fluorescence and direct sequencing of genomic DNA. The polymorphism was in Hardy-Weinberg equilibrium. We observed a significantly lower frequency of the 6bINS allele in SSc patients with associated PAH compared with controls [10.3 vs 23.9%, P = 0.01; odds ratio (OR) 0.37, 95% confidence interval (CI) 0.15-0.89], and a trend in comparison with SSc patients without PAH (10.3 vs 20.3%, P = 0.05; OR 0.45, 95% CI 0.19-1.08). Genotypes carrying the 6bINS allele were also less frequent in SSc patients with PAH than in controls (20.7 vs 42.9%, P = 0.02). Thus the frequency of 6bINS differs between SSc patients with and without PAH, suggesting an involvement of ENG in this devastating vascular complication of SSc.

  20. Toward zero waste: Composting and recycling for sustainable venue based events

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hottle, Troy A., E-mail: troy.hottle@asu.edu; Bilec, Melissa M., E-mail: mbilec@pitt.edu; Brown, Nicholas R., E-mail: nick.brown@asu.edu

    Highlights: • Venues have billions of customers per year contributing to waste generation. • Waste audits of four university baseball games were conducted to assess venue waste. • Seven scenarios including composting were modeled using EPA’s WARM. • Findings demonstrate tradeoffs between emissions, energy, and landfill avoidance. • Sustainability of handling depends on efficacy of collection and treatment impacts. - Abstract: This study evaluated seven different waste management strategies for venue-based events and characterized the impacts of event waste management via waste audits and the Waste Reduction Model (WARM). The seven waste management scenarios included traditional waste handling methods (e.g., recycling and landfill) and management of the waste stream via composting, including a purchasing scenario in which only compostable food service items were used during the events. Waste audits were conducted at four Arizona State University (ASU) baseball games, including a three-game series. The findings demonstrate a tradeoff among CO2-equivalent (CO2 eq.) emissions, energy use, and landfill diversion rates. Of the seven waste management scenarios assessed, the recycling scenarios provide the greatest reductions in CO2 eq. emissions and energy use because of the retention of high-value materials, but they are complicated by the difficulty of managing a two- or three-bin collection system. The compost-only scenario achieves complete landfill diversion but does not perform as well with respect to CO2 eq. emissions or energy. The three-game series was used to test the impact of staffed bins on contamination rates: the first game served as a baseline, the second game employed staffed bins, and the third game had non-staffed bins. Contamination rates in both the recycling and compost bins were tracked throughout the series, falling from 34% at the baseline game to 11% with staffed bins and returning to 23% at the third, non-staffed game.

  1. Aircraft Measurements for Understanding Air-Sea Coupling and Improving Coupled Model Predictions Over the Indian Ocean

    DTIC Science & Technology

    2012-09-30

    December 2011 • Daily weather forecasts and briefing for aircraft operations in Diego Garcia, reports posted on EOL field catalog in realtime (http...summary of aircraft missions posted on EOL website (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report/index) 3. Post-field campaign (including on...going data analysis into FY13): • Dropsonde data analysis, worked with EOL on data quality control (QC), participated in the DYNAMO Sounding Workshop

  2. The lick-index calibration of the Gemini multi-object spectrographs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puzia, Thomas H.; Miller, Bryan W.; Trancho, Gelys

    2013-06-01

    We present the calibration of the spectroscopic Lick/IDS standard line-index system for measurements obtained with the Gemini Multi-Object Spectrographs known as GMOS-North and GMOS-South. We provide linear correction functions for each of the 25 standard Lick line indices for the B600 grism and two instrumental setups, one with a 0.5'' slit width and 1 × 1 CCD pixel binning (corresponding to ∼2.5 Å spectral resolution) and the other with a 0.75'' slit width and 2 × 2 binning (∼4 Å). We find small and well-defined correction terms for the set of Balmer indices Hβ, Hγ_A, and Hδ_A, along with the metallicity-sensitive indices Fe5015, Fe5270, Fe5335, Fe5406, Mg_2, and Mg b that are widely used for stellar population diagnostics of distant stellar systems. Other indices are less robustly calibrated: those sampling molecular absorption bands with very wide wavelength coverage, such as TiO_1 and TiO_2; those sampling very weak molecular and atomic absorption features, such as Mg_1; and those with particularly narrow passband definitions, such as Fe4384, Ca4455, Fe4531, Ca4227, and Fe5782. These indices should be used with caution.

  3. Conceptual design and kinematic analysis of a novel parallel robot for high-speed pick-and-place operations

    NASA Astrophysics Data System (ADS)

    Meng, Qizhi; Xie, Fugui; Liu, Xin-Jun

    2018-06-01

    This paper deals with the conceptual design, kinematic analysis and workspace identification of a novel four degrees-of-freedom (DOF) high-speed spatial parallel robot for pick-and-place operations. The proposed spatial parallel robot consists of a base, four arms and a 1½ mobile platform. The mobile platform is a major innovation that avoids output singularity and offers the advantages of both single and double platforms. To investigate the characteristics of the robot's DOFs, a line-graph method based on Grassmann line geometry is adopted in the mobility analysis. In addition, the inverse kinematics is derived, and the constraint conditions used to identify the correct solution are also provided. On the basis of the proposed concept, the workspace of the robot is identified using a set of presupposed parameters, taking the input and output transmission indices as the performance evaluation criteria.

  4. DNA Barcoding of genus Hexacentrus in China reveals cryptic diversity within Hexacentrus japonicus (Orthoptera, Tettigoniidae).

    PubMed

    Guo, Hui-Fang; Guan, Bei; Shi, Fu-Ming; Zhou, Zhi-Jun

    2016-01-01

    DNA barcoding has proven successful in providing resolution beyond the boundaries of morphological information. Hence, a study was undertaken to establish DNA barcodes for all morphologically determined Hexacentrus species in Chinese collections. In total, 83 specimens of five Hexacentrus species were barcoded using the standard mitochondrial cytochrome c oxidase subunit I (COI) gene. Except for Hexacentrus japonicus, barcode gaps were present in the remaining Hexacentrus species. The taxon ID tree generated seven BOLD barcode index numbers (BINs), four of which were in agreement with the morphological species. For Hexacentrus japonicus, the maximum intraspecific divergence (4.43%) produced a minimal overlap (0.64%), and 19 specimens were divided into three different BINs. There may therefore be cryptic species within the current Hexacentrus japonicus. This study adds to the growing body of DNA barcodes available for katydids, and shows that a DNA barcoding approach enables the identification of known Hexacentrus species with very high resolution.
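The barcode-gap criterion mentioned here compares each species' maximum intraspecific divergence against its distance to the nearest other species. A small sketch (the distance matrix and labels are hypothetical, not the study's COI data):

```python
def barcode_gap(dist, labels):
    """For each species, report (max intraspecific distance,
    min distance to any other species, gap present?). A barcode gap
    exists when the nearest-neighbour distance exceeds the maximum
    intraspecific divergence."""
    out = {}
    for s in set(labels):
        idx = [i for i, l in enumerate(labels) if l == s]
        other = [i for i, l in enumerate(labels) if l != s]
        intra = max((dist[i][j] for i in idx for j in idx if i < j),
                    default=0.0)
        inter = min(dist[i][j] for i in idx for j in other)
        out[s] = (intra, inter, inter > intra)
    return out

# Hypothetical pairwise divergences for three specimens, two species:
dist = [[0.00, 0.01, 0.10],
        [0.01, 0.00, 0.12],
        [0.10, 0.12, 0.00]]
result = barcode_gap(dist, ["A", "A", "B"])
```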

  5. DNA Barcoding of genus Hexacentrus in China reveals cryptic diversity within Hexacentrus japonicus (Orthoptera, Tettigoniidae)

    PubMed Central

    Guo, Hui-Fang; Guan, Bei; Shi, Fu-Ming; Zhou, Zhi-Jun

    2016-01-01

    DNA barcoding has proven successful in providing resolution beyond the boundaries of morphological information. Hence, a study was undertaken to establish DNA barcodes for all morphologically determined Hexacentrus species in Chinese collections. In total, 83 specimens of five Hexacentrus species were barcoded using the standard mitochondrial cytochrome c oxidase subunit I (COI) gene. Except for Hexacentrus japonicus, barcode gaps were present in the remaining Hexacentrus species. The taxon ID tree generated seven BOLD barcode index numbers (BINs), four of which were in agreement with the morphological species. For Hexacentrus japonicus, the maximum intraspecific divergence (4.43%) produced a minimal overlap (0.64%), and 19 specimens were divided into three different BINs. There may therefore be cryptic species within the current Hexacentrus japonicus. This study adds to the growing body of DNA barcodes available for katydids, and shows that a DNA barcoding approach enables the identification of known Hexacentrus species with very high resolution. PMID:27408576

  6. Barrier island vulnerability to breaching: a case study on Dauphin Island, Alabama

    USGS Publications Warehouse

    Hansen, Mark; Sallenger, Asbury H.

    2007-01-01

    Breaching of barrier islands can adversely impact society by severing infrastructure, destroying private property, and altering water quality in back bays and estuaries. This study provides a scheme that assesses the relative vulnerability of a barrier island to breaching during storms. Dauphin Island, Alabama was selected for this study because it has a well-documented history of island breaches and extensive geological and geomorphic data. To assess the vulnerability of the island, we defined several variables contributing to the risk of breaching: island geology, breaching history, and island topography and geomorphology. These variables were combined to form a breaching index (BI) value for cross-island computational bins, with one bin every 50 m in the alongshore direction. Results suggest the eastern section of Dauphin Island has the lowest risk of breaching, with the remaining portion of the island having a moderate to high risk. Two reaches in the western section of the island were found to be particularly vulnerable, primarily due to their minimal cross-sectional dimensions.
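Combining per-bin risk variables into a single index can be sketched as a weighted sum; the variable names follow the abstract, but the [0, 1] scaling and equal weights are illustrative assumptions, not the paper's exact scheme:

```python
import numpy as np

def breaching_index(geology, history, topography, weights=(1/3, 1/3, 1/3)):
    """Combine per-bin risk variables, each pre-scaled to [0, 1], into
    one BI score per 50-m alongshore bin (higher = more vulnerable)."""
    v = np.vstack([geology, history, topography]).astype(float)
    w = np.asarray(weights)[:, None]
    return (w * v).sum(axis=0)

# Two hypothetical bins: one low-risk, one high-risk on every variable
bi = breaching_index([0.0, 1.0], [0.0, 1.0], [0.0, 1.0])
```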

  7. Brassinosteroids regulate pavement cell growth by mediating BIN2-induced microtubule stabilization.

    PubMed

    Liu, Xiaolei; Yang, Qin; Wang, Yuan; Wang, Linhai; Fu, Ying; Wang, Xuelu

    2018-02-23

    Brassinosteroids (BRs), a group of plant steroid hormones, play important roles in regulating plant development. The cytoskeleton also affects key developmental processes and a deficiency in BR biosynthesis or signaling leads to abnormal phenotypes similar to those of microtubule-defective mutants. However, how BRs regulate microtubule and cell morphology remains unknown. Here, using liquid chromatography-tandem mass spectrometry, we identified tubulin proteins that interact with Arabidopsis BRASSINOSTEROID INSENSITIVE2 (BIN2), a negative regulator of BR responses in plants. In vitro and in vivo pull-down assays confirmed that BIN2 interacts with tubulin proteins. High-speed co-sedimentation assays demonstrated that BIN2 also binds microtubules. The Arabidopsis genome also encodes two BIN2 homologs, BIN2-LIKE 1 (BIL1) and BIL2, which function redundantly with BIN2. In the bin2-3 bil1 bil2 triple mutant, cortical microtubules were more sensitive to treatment with the microtubule-disrupting drug oryzalin than in wild-type, whereas in the BIN2 gain-of-function mutant bin2-1, cortical microtubules were insensitive to oryzalin treatment. These results provide important insight into how BR regulates plant pavement cell and leaf growth by mediating the stabilization of microtubules by BIN2.

  8. The ESCRT-III pathway facilitates cardiomyocyte release of cBIN1-containing microparticles

    PubMed Central

    Xu, Bing; Fu, Ying; Liu, Yan; Agvanian, Sosse; Wirka, Robert C.; Baum, Rachel; Zhou, Kang; Shaw, Robin M.

    2017-01-01

    Microparticles (MPs) are cell–cell communication vesicles derived from the cell surface plasma membrane, although they are not known to originate from cardiac ventricular muscle. In ventricular cardiomyocytes, the membrane deformation protein cardiac bridging integrator 1 (cBIN1 or BIN1+13+17) creates transverse-tubule (t-tubule) membrane microfolds, which facilitate ion channel trafficking and modulate local ionic concentrations. The microfold-generated microdomains continuously reorganize, adapting in response to stress to modulate the calcium signaling apparatus. We explored the possibility that cBIN1-microfolds are externally released from cardiomyocytes. Using electron microscopy imaging with immunogold labeling, we found in mouse plasma that cBIN1 exists in membrane vesicles about 200 nm in size, which is consistent with the size of MPs. In mice with cardiac-specific heterozygous Bin1 deletion, flow cytometry identified 47% less cBIN1-MPs in plasma, supporting cardiac origin. Cardiac release was also evidenced by the detection of cBIN1-MPs in medium bathing a pure population of isolated adult mouse cardiomyocytes. In human plasma, osmotic shock increased cBIN1 detection by enzyme-linked immunosorbent assay (ELISA), and cBIN1 level decreased in humans with heart failure, a condition with reduced cardiac muscle cBIN1, both of which support cBIN1 release in MPs from human hearts. Exploring putative mechanisms of MP release, we found that the membrane fission complex endosomal sorting complexes required for transport (ESCRT)-III subunit charged multivesicular body protein 4B (CHMP4B) colocalizes and coimmunoprecipitates with cBIN1, an interaction enhanced by actin stabilization. In HeLa cells with cBIN1 overexpression, knockdown of CHMP4B reduced the release of cBIN1-MPs. Using truncation mutants, we identified that the N-terminal BAR (N-BAR) domain in cBIN1 is required for CHMP4B binding and MP release. 
This study links the BAR protein superfamily to the ESCRT pathway for MP biogenesis in mammalian cardiac ventricular cells, identifying elements of a pathway by which cytoplasmic cBIN1 is released into blood. PMID:28806752

  9. The ESCRT-III pathway facilitates cardiomyocyte release of cBIN1-containing microparticles.

    PubMed

    Xu, Bing; Fu, Ying; Liu, Yan; Agvanian, Sosse; Wirka, Robert C; Baum, Rachel; Zhou, Kang; Shaw, Robin M; Hong, TingTing

    2017-08-01

    Microparticles (MPs) are cell-cell communication vesicles derived from the cell surface plasma membrane, although they are not known to originate from cardiac ventricular muscle. In ventricular cardiomyocytes, the membrane deformation protein cardiac bridging integrator 1 (cBIN1 or BIN1+13+17) creates transverse-tubule (t-tubule) membrane microfolds, which facilitate ion channel trafficking and modulate local ionic concentrations. The microfold-generated microdomains continuously reorganize, adapting in response to stress to modulate the calcium signaling apparatus. We explored the possibility that cBIN1-microfolds are externally released from cardiomyocytes. Using electron microscopy imaging with immunogold labeling, we found in mouse plasma that cBIN1 exists in membrane vesicles about 200 nm in size, which is consistent with the size of MPs. In mice with cardiac-specific heterozygous Bin1 deletion, flow cytometry identified 47% less cBIN1-MPs in plasma, supporting cardiac origin. Cardiac release was also evidenced by the detection of cBIN1-MPs in medium bathing a pure population of isolated adult mouse cardiomyocytes. In human plasma, osmotic shock increased cBIN1 detection by enzyme-linked immunosorbent assay (ELISA), and cBIN1 level decreased in humans with heart failure, a condition with reduced cardiac muscle cBIN1, both of which support cBIN1 release in MPs from human hearts. Exploring putative mechanisms of MP release, we found that the membrane fission complex endosomal sorting complexes required for transport (ESCRT)-III subunit charged multivesicular body protein 4B (CHMP4B) colocalizes and coimmunoprecipitates with cBIN1, an interaction enhanced by actin stabilization. In HeLa cells with cBIN1 overexpression, knockdown of CHMP4B reduced the release of cBIN1-MPs. Using truncation mutants, we identified that the N-terminal BAR (N-BAR) domain in cBIN1 is required for CHMP4B binding and MP release. 
This study links the BAR protein superfamily to the ESCRT pathway for MP biogenesis in mammalian cardiac ventricular cells, identifying elements of a pathway by which cytoplasmic cBIN1 is released into blood.

  10. Utilization of Satellite Data to Identify and Monitor Changes in Frequency of Meteorological Events

    NASA Astrophysics Data System (ADS)

    Mast, J. C.; Dessler, A. E.

    2017-12-01

    Increases in temperature and climate variability due to human-induced climate change are increasing the frequency and magnitude of extreme heat events (i.e., heatwaves). This will have a detrimental impact on the health of human populations and the habitability of certain land locations. Here we seek to utilize satellite data records to identify and monitor extreme heat events. We analyze satellite data sets (MODIS and AIRS land surface temperatures (LST) and water vapor profiles (WV)) because of their global coverage and stable calibration. Heat waves are identified based on the frequency of maximum daily temperatures above a threshold, determined as follows. Land surface temperatures are gridded into uniform latitude/longitude bins. Maximum daily temperatures per bin are determined, and probability density functions (PDF) of these maxima are constructed monthly and seasonally. For each bin, a threshold is calculated at the 95th percentile of the PDF of maximum temperatures. For each bin, an extreme heat event is defined based on the frequency of monthly and seasonal days exceeding the threshold. To account for the decreased ability of the human body to thermoregulate with increasing moisture, and to assess the lethality of the heat events, we determine the wet-bulb temperature at the locations of extreme heat events. Preliminary results will be presented.
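The per-bin thresholding described above can be sketched with NumPy on synthetic data (the real analysis uses gridded MODIS/AIRS retrievals; the values below are random stand-ins):

```python
import numpy as np

# Synthetic stand-in for one grid bin's daily-maximum LSTs over a season:
rng = np.random.default_rng(0)
tmax = rng.normal(35.0, 3.0, size=90)  # 90 days of daily maxima, deg C

# Per-bin threshold at the 95th percentile of the PDF of daily maxima
threshold = np.percentile(tmax, 95)

# Days exceeding the threshold flag candidate extreme-heat events
exceed_days = int((tmax > threshold).sum())
```

By construction roughly 5% of days land above the 95th-percentile threshold; in the real analysis the count per month or season defines the event frequency.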

  11. A 4.2 ps Time-Interval RMS Resolution Time-to-Digital Converter Using a Bin Decimation Method in an UltraScale FPGA

    NASA Astrophysics Data System (ADS)

    Wang, Yonggang; Liu, Chong

    2016-10-01

    The common solution for a field-programmable gate array (FPGA)-based time-to-digital converter (TDC) is to construct a tapped delay line (TDL) for time interpolation to yield sub-clock time resolution. The granularity and uniformity of the delay elements of the TDL determine the TDC time resolution. In this paper, we propose a dual-sampling TDL architecture and a bin decimation method that make the delay elements as small and uniform as possible, so that the implemented TDCs can achieve a time resolution beyond the intrinsic cell delay. Two identical, fully hardware-based TDCs were implemented in a Xilinx UltraScale FPGA for performance evaluation. For fixed time intervals in the range from 0 to 440 ns, the average time-interval RMS resolution measured by the two TDCs is 4.2 ps, from which the timestamp resolution of a single TDC is derived as 2.97 ps. The maximum hit rate of the TDC is as high as half the FPGA system clock rate, namely 250 MHz in our demo prototype. Because conventional online bin-by-bin calibration is not needed, the implementation of the proposed TDC is straightforward and relatively resource-saving.

  12. High figure of merit ultra-compact 3-channel parallel-connected photonic crystal mini-hexagonal-H1 defect microcavity sensor array

    NASA Astrophysics Data System (ADS)

    Wang, Chunhong; Sun, Fujun; Fu, Zhongyuan; Ding, Zhaoxiang; Wang, Chao; Zhou, Jian; Wang, Jiawen; Tian, Huiping

    2017-08-01

    In this paper, a photonic crystal (PhC) butt-coupled mini-hexagonal-H1 defect (MHHD) microcavity sensor is proposed. The MHHD microcavity is designed by introducing six mini-holes into the initial H1 defect region. Further, based on a well-designed 1 × 3 PhC beam splitter and three optimal MHHD microcavity sensors with different lattice constants (a), a 3-channel parallel-connected PhC sensor array on monolithic silicon-on-insulator (SOI) is proposed. Finite-difference time-domain (FDTD) simulations are performed to demonstrate the high performance of our structures. The results show that the quality factor (Q) of our optimal MHHD microcavity exceeds 7 × 10^4, while the sensitivity (S) reaches up to 233 nm/RIU (RIU = refractive index unit). Thus, a figure of merit (FOM) above 10^4 is obtained for the sensor, which is enhanced by two orders of magnitude compared to previous butt-coupled sensors [1-4]. As for the 3-channel parallel-connected PhC MHHD microcavity sensor array, the FOMs of the three independent MHHD microcavity sensors are 8071, 8250 and 8250, respectively. In addition, the total footprint of the proposed 3-channel parallel-connected PhC sensor array is an ultra-compact 12.5 μm × 31 μm (width × length). Therefore, the proposed high-FOM sensor array is an ideal platform for realizing ultra-compact, highly parallel refractive index (RI) sensing.
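The quoted FOM is consistent with the common definition FOM = S·Q/λ_res for resonant refractive-index sensors (sensitivity divided by the resonance linewidth λ/Q). A quick check, where the ~1550 nm resonance wavelength is an assumption of this sketch, not stated in the abstract:

```python
def sensor_fom(sensitivity_nm_per_riu, q_factor, resonance_nm):
    """FOM = S * Q / lambda_res for a resonant refractive-index sensor,
    i.e. sensitivity divided by the resonance linewidth lambda/Q."""
    return sensitivity_nm_per_riu * q_factor / resonance_nm

# Reported S = 233 nm/RIU and Q = 7e4, assumed telecom-band resonance:
fom = sensor_fom(233, 7e4, 1550)
```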

  13. Novel features and enhancements in BioBin, a tool for the biologically inspired binning and association analysis of rare variants

    PubMed Central

    Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D

    2018-01-01

    Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, and incorporating novel analysis features, providing a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757

  14. Elemental analysis using temporal gating of a pulsed neutron generator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitra, Sudeep

    Technologies related to determining the elemental composition of a sample that comprises fissile material are described herein. In a general embodiment, a pulsed neutron generator periodically emits bursts of neutrons and is synchronized with an analyzer circuit. The bursts of neutrons are used to interrogate the sample, and the sample outputs gamma rays based upon the neutrons impacting the sample. A detector outputs pulses based upon the gamma rays impinging upon the material of the detector, and the analyzer circuit assigns the pulses to temporally-based bins based upon the analyzer circuit being synchronized with the pulsed neutron generator. A computing device outputs data that is indicative of the elemental composition of the sample based upon the binned pulses.
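Because the analyzer is synchronized with the generator, the temporal binning reduces to modulo arithmetic on pulse timestamps relative to the burst period. A simplified sketch (function name and microsecond units are illustrative, not from the record):

```python
def bin_pulses(pulse_times_us, period_us, n_bins):
    """Assign detector pulse timestamps (microseconds) to temporal bins
    measured from the start of each neutron-burst period; the generator
    and analyzer clocks are assumed synchronized, as in the embodiment."""
    width = period_us / n_bins
    counts = [0] * n_bins
    for t in pulse_times_us:
        counts[int((t % period_us) // width)] += 1
    return counts

# Four pulses, 100-us burst period, ten 10-us bins:
counts = bin_pulses([0.0, 10.0, 110.0, 250.0], 100.0, 10)
```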

  15. The generalized accessibility and spectral gap of lower hybrid waves in tokamaks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takahashi, Hironori

    1994-03-01

    The generalized accessibility of lower hybrid waves, primarily in the current drive regime of tokamak plasmas, which may include shifting, either upward or downward, of the parallel refractive index (n∥), is investigated, based upon a cold plasma dispersion relation and various geometrical constraint (G.C.) relations imposed on the behavior of n∥. It is shown that n∥ upshifting can be bounded and insufficient to bridge a large spectral gap to cause wave damping, depending upon whether the G.C. relation allows the oblique resonance to occur. The traditional n∥ upshifting mechanism caused by the pitch angle of magnetic field lines is shown to lead to contradictions with experimental observations. An upshifting mechanism brought about by the density gradient along field lines is proposed, which is not inconsistent with experimental observations, and provides plausible explanations to some unresolved issues of lower hybrid wave theory, including generation of "seed electrons."
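For context, the textbook cold-plasma accessibility bound on the parallel refractive index, which the generalized treatment above extends, reads (standard form; the abstract's generalized geometrical-constraint conditions differ in detail):

```latex
% A lower hybrid wave propagates to the plasma core only if
n_{\parallel} \;\ge\; n_{\parallel,\mathrm{acc}}
  \;=\; \frac{\omega_{pe}}{\omega_{ce}}
  \;+\; \sqrt{\,1 + \frac{\omega_{pe}^{2}}{\omega_{ce}^{2}}\,},
```

where ω_pe and ω_ce are the local electron plasma and cyclotron frequencies; waves launched below this bound encounter mode conversion before reaching the core.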

  16. Direct Comparison of Respiration-Correlated Four-Dimensional Magnetic Resonance Imaging Reconstructed Using Concurrent Internal Navigator and External Bellows.

    PubMed

    Li, Guang; Wei, Jie; Olek, Devin; Kadbi, Mo; Tyagi, Neelam; Zakian, Kristen; Mechalakos, James; Deasy, Joseph O; Hunt, Margie

    2017-03-01

    To compare the image quality of amplitude-binned 4-dimensional magnetic resonance imaging (4DMRI) reconstructed using 2 concurrent respiratory (navigator and bellows) waveforms. A prospective, respiratory-correlated 4DMRI scanning program was used to acquire T2-weighted single-breath 4DMRI images with an internal navigator and external bellows. After a 10-second training waveform of a surrogate signal, 2-dimensional MRI acquisition was triggered at a level (bin) and anatomic location (slice) until the bin-slice table was completed for 4DMRI reconstruction. The bellows signal was always collected, even when the navigator trigger was used, to retrospectively reconstruct a bellows-rebinned 4DMRI. Ten volunteers participated in this institutional review board-approved 4DMRI study. Four scans were acquired for each subject, including coronal and sagittal scans triggered by either navigator or bellows, and 6 4DMRI images (navigator-triggered, bellows-rebinned, and bellows-triggered) were reconstructed. The simultaneously acquired waveforms and resulting 4DMRI quality were compared using signal correlation, bin/phase shift, and binning motion artifacts. The consecutive bellows-triggered 4DMRI scan was used for indirect comparison. Correlation coefficients between the navigator and bellows signals were found to be patient-specific and inhalation-/exhalation-dependent, ranging from 0.1 to 0.9 because of breathing irregularities (>50% of scans) and commonly observed bin/phase shifts (-1.1 ± 0.6 bin) in both 1-dimensional waveforms and diaphragm motion extracted from 4D images. Navigator-triggered 4DMRI contained far fewer binning motion artifacts at the diaphragm than did the bellows-rebinned and bellows-triggered 4DMRI scans. Coronal scans were faster than sagittal scans because of fewer slices and higher achievable acceleration factors. 
Navigator-triggered 4DMRI contains substantially fewer binning motion artifacts than bellows-rebinned and bellows-triggered 4DMRI, primarily owing to the deviation of the external surrogate from the internal one. The present study compared 2 concurrent surrogates during the same 4DMRI scan and their resulting 4DMRI quality. The navigator-triggered 4DMRI scanning protocol should be preferred to the bellows-based protocol, especially for coronal scans, for clinical respiratory motion simulation. Copyright © 2016 Elsevier Inc. All rights reserved.
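Amplitude binning of a respiratory surrogate waveform, as used for triggering here, can be sketched as follows (a simplified offline stand-in for the scanner's online binning; function name and bin convention are assumptions):

```python
import numpy as np

def amplitude_bins(signal, n_bins):
    """Assign each surrogate-waveform sample to one of n_bins amplitude
    bins spanning the signal's min-max range; the maximum falls in the
    last bin. Samples in the same bin represent the same respiratory
    state for 4DMRI sorting."""
    s = np.asarray(signal, dtype=float)
    edges = np.linspace(s.min(), s.max(), n_bins + 1)
    # digitize against interior edges so results lie in 0..n_bins-1
    return np.clip(np.digitize(s, edges[1:-1]), 0, n_bins - 1)

bins4 = amplitude_bins([0.0, 1.0, 2.0, 3.0], 4)
bins2 = amplitude_bins([0.0, 0.2, 0.9, 1.0], 2)
```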

  17. Evaluation of respiratory and cardiac motion correction schemes in dual gated PET/CT cardiac imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lamare, F., E-mail: frederic.lamare@chu-bordeaux.fr; Fernandez, P.; CNRS, INCIA, UMR 5287, F-33400 Talence

    Purpose: Cardiac imaging suffers from both respiratory and cardiac motion. One of the proposed solutions involves double-gated acquisitions. Although such an approach may lead to both respiratory and cardiac motion compensation, there are issues associated with (a) the combination of data from cardiac and respiratory motion bins, and (b) poor statistical quality of images as a result of using only part of the acquired data. The main objective of this work was to evaluate different schemes of combining binned data in order to identify the best strategy to reconstruct motion-free cardiac images from dual-gated positron emission tomography (PET) acquisitions. Methods: A digital phantom study as well as seven human studies were used in this evaluation. PET data were acquired in list mode (LM). A real-time position management system and an electrocardiogram device were used to provide the respiratory and cardiac motion triggers registered within the LM file. Acquired data were subsequently binned considering four and six cardiac gates, or the diastole only, in combination with eight respiratory amplitude gates. PET images were corrected for attenuation, but neither randoms nor scatter corrections were included. Reconstructed images from each of the bins considered above were subsequently used in combination with an affine or an elastic registration algorithm to derive transformation parameters allowing the combination of all acquired data in a particular position in the cardiac and respiratory cycles. Images were assessed in terms of signal-to-noise ratio (SNR), contrast, image profile, coefficient of variation (COV), and relative difference of the recovered activity concentration. Results: Regardless of the considered motion compensation strategy, the nonrigid motion model performed better than the affine model, leading to higher SNR and contrast combined with a lower COV.
Nevertheless, when compensating for respiration only, no statistically significant differences were observed in the performance of the two motion models considered. Superior image SNR and contrast were seen using the affine respiratory motion model in combination with the diastole cardiac bin in comparison to the use of the whole cardiac cycle. In contrast, when simultaneously correcting for cardiac beating and respiration, the elastic respiratory motion model outperformed the affine model. In this context, four cardiac bins associated with eight respiratory amplitude bins seemed to be adequate. Conclusions: Considering the compensation of respiratory motion effects only, both affine and elastic based approaches led to an accurate resizing and positioning of the myocardium. The use of the diastolic phase combined with an affine model based respiratory motion correction may therefore be a simple approach leading to significant quality improvements in cardiac PET imaging. However, the best performance was obtained with the combined correction for both cardiac and respiratory movements considering all the dual-gated bins independently through the use of an elastic model based motion compensation.
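The dual gating described in the Methods amounts to assigning each list-mode event a (cardiac, respiratory) bin pair. A simplified sketch, assuming a cardiac phase in [0, 1) and a respiratory amplitude per event have already been extracted from the LM triggers:

```python
import numpy as np

def dual_gate(phase, amp, n_cardiac=4, n_resp=8):
    """Assign each list-mode event to a (cardiac, respiratory) bin pair:
    cardiac phase in [0, 1) is split into n_cardiac phase gates, and the
    respiratory amplitude range into n_resp amplitude gates."""
    phase = np.asarray(phase, dtype=float)
    amp = np.asarray(amp, dtype=float)
    c = np.minimum((phase * n_cardiac).astype(int), n_cardiac - 1)
    edges = np.linspace(amp.min(), amp.max(), n_resp + 1)
    r = np.clip(np.digitize(amp, edges[1:-1]), 0, n_resp - 1)
    return c, r

# Three events: early/late cardiac phase, low/high/middle amplitude
c, r = dual_gate([0.0, 0.99, 0.5], [0.0, 7.0, 3.5])
```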

  18. Parallel SOR methods with a parabolic-diffusion acceleration technique for solving an unstructured-grid Poisson equation on 3D arbitrary geometries

    NASA Astrophysics Data System (ADS)

    Zapata, M. A. Uh; Van Bang, D. Pham; Nguyen, K. D.

    2016-05-01

    This paper presents a parallel algorithm for the finite-volume discretisation of the Poisson equation on three-dimensional arbitrary geometries. The proposed method is formulated using a 2D horizontal block domain decomposition and interprocessor data communication techniques with the Message Passing Interface (MPI). The horizontal unstructured-grid cells are reordered according to the neighbouring relations and decomposed into blocks using a load-balanced distribution to give all processors an equal number of elements. In this algorithm, two parallel successive over-relaxation (SOR) methods are presented: a multi-colour ordering technique for unstructured grids based on distributed memory, and a block method using a reordering index following ideas similar to the partitioning for structured grids. In both cases, the parallel algorithms are combined with an accelerated iterative solver based on a parabolic-diffusion equation, introduced to obtain faster solutions of the linear systems arising from the discretisation. Numerical results are given to evaluate the performance of the methods, showing speedups better than linear.
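The multi-colour ordering is what makes the SOR sweeps parallel: cells of one colour have only other-colour neighbours, so each half-sweep updates independently. A structured-grid red-black analogue of the idea (an illustration, not the paper's unstructured 3D implementation):

```python
import numpy as np

def redblack_sor(f, h, omega=1.7, iters=2000):
    """Red-black SOR for the 2D Poisson problem -lap(u) = f with zero
    Dirichlet boundaries. Each colour's points depend only on the other
    colour, so every half-sweep is an independent, parallelizable update."""
    u = np.zeros_like(f, dtype=float)
    ii, jj = np.indices(f.shape)
    interior = np.zeros(f.shape, dtype=bool)
    interior[1:-1, 1:-1] = True
    for _ in range(iters):
        for color in (0, 1):
            mask = interior & ((ii + jj) % 2 == color)
            nb = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                  + np.roll(u, 1, 1) + np.roll(u, -1, 1))
            gauss = 0.25 * (nb + h * h * f)  # Gauss-Seidel target value
            u[mask] += omega * (gauss[mask] - u[mask])  # over-relaxation
    return u

# Small demo: constant source on a 17 x 17 unit-square grid
n = 17
h = 1.0 / (n - 1)
f = np.ones((n, n))
u = redblack_sor(f, h)
```

At convergence the discrete residual f + lap(u) vanishes on the interior, which is easy to verify numerically.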

  19. SU-F-303-11: Implementation and Applications of Rapid, SIFT-Based Cine MR Image Binning and Region Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mazur, T; Wang, Y; Fischer-Valuck, B

    2015-06-15

    Purpose: To develop a novel and rapid, SIFT-based algorithm for assessing feature motion on cine MR images acquired during MRI-guided radiotherapy treatments. In particular, we apply SIFT descriptors toward both partitioning cine images into respiratory states and tracking regions across frames. Methods: Among a training set of images acquired during a fraction, we densely assign SIFT descriptors to pixels within the images. We cluster these descriptors across all frames in order to produce a dictionary of trackable features. Associating the best-matching descriptors at every frame among the training images to these features, we construct motion traces for the features. We use these traces to define respiratory bins for sorting images in order to facilitate robust pixel-by-pixel tracking. Instead of applying conventional methods for identifying pixel correspondences across frames, we utilize a recently developed algorithm that derives correspondences via a matching objective for SIFT descriptors. Results: We apply these methods to a collection of lung, abdominal, and breast patients. We evaluate the procedure for respiratory binning using target sites exhibiting high-amplitude motion among 20 lung and abdominal patients. In particular, we investigate whether these methods yield minimal variation between images within a bin by perturbing the resulting image distributions among bins. Moreover, we compare the motion between averaged images across respiratory states to 4DCT data for these patients. We evaluate the algorithm for obtaining pixel correspondences between frames by tracking contours among a set of breast patients. As an initial case, we track easily identifiable edges of lumpectomy cavities that show minimal motion over treatment. Conclusions: These SIFT-based methods reliably extract motion information from cine MR images acquired during patient treatments.
While we performed our analysis retrospectively, the algorithm lends itself to prospective motion assessment. Applications of these methods include motion assessment, identifying treatment windows for gating, and determining optimal margins for treatment.« less

  20. SU-F-I-08: CT Image Ring Artifact Reduction Based On Prior Image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, C; Qi, H; Chen, Z

    Purpose: In a computed tomography (CT) system, CT images with ring artifacts are reconstructed when some adjacent bins of the detector don’t work. The ring artifacts severely degrade CT image quality. We present a CT ring artifact reduction method based on projection data correction, aiming at estimating the missing projection data accurately and thus removing the ring artifacts from CT images. Methods: The method consists of ten steps: 1) identification of the abnormal pixel line in the projection sinogram; 2) linear interpolation within the pixel line of the projection sinogram; 3) FBP reconstruction using the interpolated projection data; 4) filtering of the FBP image using a mean filter; 5) forward projection of the filtered FBP image; 6) subtraction of the forwarded projection from the original projection; 7) linear interpolation of the abnormal pixel line area in the subtraction projection; 8) addition of the interpolated subtraction projection to the forwarded projection; 9) FBP reconstruction using the corrected projection data; 10) return to step 4 until the pre-set iteration number is reached. The method is validated on simulated and real data to restore missing projection data and reconstruct ring artifact-free CT images. Results: We studied the impact of the number of dead detector bins on the accuracy of missing data estimation in the projection sinogram. For a simulated 256-by-256 Shepp-Logan phantom, three iterations are sufficient to restore the projection data and reconstruct ring artifact-free images when the fraction of dead bins is under 30%. The dead-bin-induced artifacts are substantially reduced. More iterations are needed to reconstruct satisfactory images as the fraction of dead bins increases. Similar results were found for a real head phantom case. Conclusion: A practical CT image ring artifact correction scheme based on projection data is developed. This method can produce ring artifact-free CT images feasibly and effectively.
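    Step 2 of the scheme above, linear interpolation across the dead detector bins of the sinogram, can be sketched in Python with NumPy. The function name and the toy sinogram are illustrative, not the authors' implementation; the full method then refines this initial estimate iteratively:

    ```python
    import numpy as np

    def interpolate_dead_bins(sinogram, dead_bins):
        """Estimate missing detector-bin columns of a sinogram by linear
        interpolation from the nearest working bins (step 2 of the scheme)."""
        sino = sinogram.astype(float).copy()
        good = np.setdiff1d(np.arange(sino.shape[1]), dead_bins)
        for row in sino:  # each projection angle is interpolated independently
            row[dead_bins] = np.interp(dead_bins, good, row[good])
        return sino

    # A sinogram that varies linearly across bins is restored exactly.
    truth = np.tile(np.linspace(0.0, 9.0, 10), (5, 1))
    corrupted = truth.copy()
    corrupted[:, [3, 4]] = 0.0
    restored = interpolate_dead_bins(corrupted, np.array([3, 4]))
    ```

    Interpolation alone only approximates structured projections, which is why the abstract's remaining steps iterate between image-domain filtering and re-projection.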

  1. Aircraft Measurements for Understanding Air-Sea Coupling and Improving Coupled Model Predictions Over the Indian Ocean

    DTIC Science & Technology

    2012-09-30

    briefing for aircraft operations in Diego Garcia, reports posted on EOL field catalog in realtime (http://catalog.eol.ucar.edu/cgi-bin/dynamo/report...index); • Dropsonde data processing on all P3 flights and realtime QC/reporting to GTS; and • Science summary of aircraft missions posted on EOL ... data analysis, worked with EOL on data quality control (QC), participated in the DYNAMO Sounding Workshop at EOL/NCAR from 6-7 February 2012

  2. Bright high z SnIa: A challenge for ΛCDM

    NASA Astrophysics Data System (ADS)

    Perivolaropoulos, L.; Shafieloo, A.

    2009-06-01

    It has recently been pointed out by Kowalski et al. [Astrophys. J. 686, 749 (2008)] that there is “an unexpected brightness of the SnIa data at z>1.” We quantify this statement by constructing a new statistic which is applicable directly to the type Ia supernova (SnIa) distance moduli. This statistic is designed to pick up systematic brightness trends of SnIa data points with respect to a best fit cosmological model at high redshifts. It is based on binning the normalized differences between the SnIa distance moduli and the corresponding best fit values in the context of a specific cosmological model (e.g. ΛCDM). These differences are normalized by the standard errors of the observed distance moduli. We then focus on the highest redshift bin and extend its size toward lower redshifts until the binned normalized difference (BND) changes sign (crosses 0) at a redshift zc (bin size Nc). The bin size Nc of this crossing (the statistical variable) is then compared with the corresponding crossing bin size Nmc for Monte Carlo data realizations based on the best fit model. We find that the crossing bin size Nc obtained from the Union08 and Gold06 data with respect to the best fit ΛCDM model is anomalously large compared to Nmc of the corresponding Monte Carlo data sets obtained from the best fit ΛCDM in each case. In particular, only 2.2% of the Monte Carlo ΛCDM data sets are consistent with the Gold06 value of Nc while the corresponding probability for the Union08 value of Nc is 5.3%. Thus, according to this statistic, the probability that the high redshift brightness bias of the Union08 and Gold06 data sets is realized in the context of a (w0,w1)=(-1,0) model (ΛCDM cosmology) is less than 6%. The corresponding realization probability in the context of a (w0,w1)=(-1.4,2) model is more than 30% for both the Union08 and the Gold06 data sets indicating a much better consistency for this model with respect to the BND statistic.
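    The BND crossing statistic described above can be sketched as follows. The function and input layout are hypothetical: it assumes the moduli differences are already normalized by their standard errors and sorted by increasing redshift:

    ```python
    import numpy as np

    def crossing_bin_size(normalized_diffs):
        """Grow the highest-redshift bin toward lower redshifts until the
        mean of the binned normalized differences (BND) changes sign;
        return the bin size Nc at the crossing (len(data) if no crossing)."""
        d = np.asarray(normalized_diffs, dtype=float)
        start_sign = np.sign(d[-1:].mean())
        for n in range(2, len(d) + 1):
            if np.sign(d[-n:].mean()) != start_sign:
                return n
        return len(d)

    # The top-bin mean stays positive until the -2.0 point is included.
    nc = crossing_bin_size([0.5, -2.0, 0.1, 0.3, 0.4])
    ```

    In the paper, this Nc is then compared against the distribution of crossing bin sizes Nmc obtained from Monte Carlo realizations of the best-fit model.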

  3. Synergistic Effects of Age on Patterns of White and Gray Matter Volume across Childhood and Adolescence1,2,3

    PubMed Central

    Krongold, Mark; Cooper, Cassandra; Lebel, Catherine

    2015-01-01

    Abstract The human brain develops with a nonlinear contraction of gray matter across late childhood and adolescence with a concomitant increase in white matter volume. Across the adult population, properties of cortical gray matter covary within networks that may represent organizational units for development and degeneration. Although gray matter covariance may be strongest within structurally connected networks, the relationship to volume changes in white matter remains poorly characterized. In the present study we examined age-related trends in white and gray matter volume using T1-weighted MR images from 360 human participants from the NIH MRI study of Normal Brain Development. Images were processed through a voxel-based morphometry pipeline. Linear effects of age on white and gray matter volume were modeled within four age bins, spanning 4-18 years, each including 90 participants (45 male). White and gray matter age-slope maps were separately entered into k-means clustering to identify regions with similar age-related variability across the four age bins. Four white matter clusters were identified, each with a dominant direction of underlying fibers: anterior–posterior, left–right, and two clusters with superior–inferior directions. Corresponding, spatially proximal, gray matter clusters encompassed largely cerebellar, fronto-insular, posterior, and sensorimotor regions, respectively. Pairs of gray and white matter clusters followed parallel slope trajectories, with white matter changes generally positive from 8 years onward (indicating volume increases) and gray matter negative (decreases). As developmental disorders likely target networks rather than individual regions, characterizing typical coordination of white and gray matter development can provide a normative benchmark for understanding atypical development. PMID:26464999

  4. Shuttle car loading system

    NASA Technical Reports Server (NTRS)

    Collins, E. R., Jr. (Inventor)

    1985-01-01

    A system is described for loading newly mined material, such as coal, into a shuttle car at a location near the mine face where only a limited height is available for a loading system. The system includes a storage bin having several telescoping bin sections and a shuttle car having a bottom wall that can move under the bin. With the bin in an extended position and filled with coal, the bin sections can be telescoped to allow the coal to drop out of the bin sections and into the shuttle car, quickly loading the car. The bin sections can then be extended so they can be slowly filled with more coal while awaiting another shuttle car.

  5. Dual-point reflective refractometer based on parallel no-core fiber/FBG structure

    NASA Astrophysics Data System (ADS)

    Guo, Cuijuan; Niu, Panpan; Wang, Juan; Zhao, Junfa; Zhang, Cheng

    2018-01-01

    A novel dual-point reflective fiber-optic refractometer based on the multimode interference (MMI) effect and fiber Bragg grating (FBG) reflection is proposed and experimentally demonstrated, adopting a parallel structure. Each point of the refractometer consists of a single mode-no core-single mode fiber (SNS) structure cascaded with an FBG. Assisted by the reflection of the FBG, refractive index (RI) measurement can be achieved by monitoring the peak power variation of the reflected FBG spectrum. By selecting different lengths of the no-core fiber and center wavelengths of the FBG, an independent dual-point refractometer is easily realized. Experimental results show that the refractometer has a nonlinear relationship between the surrounding refractive index (SRI) and the peak power of the reflected FBG spectrum in the RI range of 1.3330-1.4086. An approximately linear relationship can be obtained by dividing the measuring range into 1.3330-1.3611 and 1.3764-1.4086. In the RI range of 1.3764-1.4086, the two sensing points have higher RI sensitivities of 319.34 dB/RIU and 211.84 dB/RIU, respectively.

  6. Characteristic features of a high-energy x-ray spectra estimation method based on the Waggener iterative perturbation principle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iwasaki, Akira; Kubota, Mamoru; Hirota, Junichi

    2006-11-15

    We have redeveloped a high-energy x-ray spectra estimation method reported by Iwasaki et al. [A. Iwasaki, H. Matsutani, M. Kubota, A. Fujimori, K. Suzaki, and Y. Abe, Radiat. Phys. Chem. 67, 81-91 (2003)]. The method is based on the iterative perturbation principle to minimize differences between measured and calculated transmission curves, originally proposed by Waggener et al. [R. G. Waggener, M. M. Blough, J. A. Terry, D. Chen, N. E. Lee, S. Zhang, and W. D. McDavid, Med. Phys. 26, 1269-1278 (1999)]. The method can estimate spectra applicable for media at least from water to lead using only about ten energy bins. Estimating spectra of 4-15 MV x-ray beams from a linear accelerator, we describe characteristic features of the method with regard to parameters including the prespectrum, number of transmission measurements, number of energy bins, energy bin widths, and artifactual bipeaked spectrum production.
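    The iterative perturbation idea can be illustrated with a toy two-bin example: bin weights are nudged up or down, and a change is kept only if it reduces the mismatch between calculated and measured transmission. Everything here (function names, flat prespectrum, step size, attenuation coefficients) is an illustrative assumption, not the authors' implementation:

    ```python
    import numpy as np

    def transmission(weights, mu, thickness):
        """Calculated transmission through attenuators of given thicknesses,
        for a spectrum of per-bin weights with attenuation coefficients mu."""
        return np.array([np.sum(weights * np.exp(-mu * t)) for t in thickness])

    def estimate_spectrum(measured, mu, thickness, n_bins, iters=300, step=0.01):
        """Waggener-style iterative perturbation: start from a flat
        prespectrum and accept per-bin perturbations only when they reduce
        the squared error against the measured transmission curve."""
        w = np.full(n_bins, 1.0 / n_bins)
        err = lambda v: np.sum((transmission(v, mu, thickness) - measured) ** 2)
        best = err(w)
        for _ in range(iters):
            for i in range(n_bins):
                for delta in (step, -step):
                    trial = w.copy()
                    trial[i] = max(trial[i] + delta, 0.0)
                    trial /= trial.sum()
                    e = err(trial)
                    if e < best:
                        w, best = trial, e
        return w

    # Synthetic two-bin beam: recover weights from a noiseless curve.
    mu = np.array([0.8, 0.2])        # cm^-1, illustrative values
    t = np.linspace(0.0, 10.0, 8)    # attenuator thicknesses in cm
    true_w = np.array([0.3, 0.7])
    w_est = estimate_spectrum(transmission(true_w, mu, t), mu, t, n_bins=2)
    ```

    With realistic data the abstract notes that the choice of prespectrum, number of measurements, and bin widths all affect the result, including the possibility of artifactual bipeaked spectra.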

  7. Terahertz Microfluidic Sensing Using a Parallel-plate Waveguide Sensor

    PubMed Central

    Astley, Victoria; Reichel, Kimberly; Mendis, Rajind; Mittleman, Daniel M.

    2012-01-01

    Refractive index (RI) sensing is a powerful noninvasive and label-free sensing technique for the identification, detection and monitoring of microfluidic samples, with a wide range of possible sensor designs such as interferometers and resonators [1,2]. Most existing RI sensing applications focus on biological materials in aqueous solutions at visible and IR frequencies, such as DNA hybridization and genome sequencing. At terahertz frequencies, applications include quality control, monitoring of industrial processes, and sensing and detection applications involving nonpolar materials. Several potential designs for refractive index sensors in the terahertz regime exist, including photonic crystal waveguides [3], asymmetric split-ring resonators [4], and photonic band gap structures integrated into parallel-plate waveguides [5]. Many of these designs are based on optical resonators such as rings or cavities. The resonant frequencies of these structures depend on the refractive index of the material in or around the resonator. By monitoring shifts in resonant frequency, the refractive index of a sample can be accurately measured, and this in turn can be used to identify a material, monitor contamination or dilution, etc. The sensor design we use here is based on a simple parallel-plate waveguide [6,7]. A rectangular groove machined into one face acts as a resonant cavity (Figures 1 and 2). When terahertz radiation is coupled into the waveguide and propagates in the lowest-order transverse-electric (TE1) mode, the result is a single strong resonant feature with a tunable resonant frequency that is dependent on the geometry of the groove [6,8]. This groove can be filled with nonpolar liquid microfluidic samples, which cause a shift in the observed resonant frequency that depends on the amount of liquid in the groove and its refractive index [9]. Our technique has an advantage over other terahertz techniques in its simplicity, both in fabrication and implementation, since the procedure can be accomplished with standard laboratory equipment without the need for a clean room or any special fabrication or experimental techniques. It can also be easily expanded to multichannel operation by the incorporation of multiple grooves [10]. In this video we describe our complete experimental procedure, from the design of the sensor to the data analysis and determination of the sample refractive index. PMID:22951593

  8. Observer model optimization of a spectral mammography system

    NASA Astrophysics Data System (ADS)

    Fredenberg, Erik; Åslund, Magnus; Cederström, Björn; Lundqvist, Mats; Danielsson, Mats

    2010-04-01

    Spectral imaging is a method in medical x-ray imaging to extract information about the object constituents by the material-specific energy dependence of x-ray attenuation. Contrast-enhanced spectral imaging has been thoroughly investigated, but unenhanced imaging may be more useful because it comes as a bonus to the conventional non-energy-resolved absorption image at screening; there is no additional radiation dose and no need for contrast medium. We have used a previously developed theoretical framework and system model that include quantum and anatomical noise to characterize the performance of a photon-counting spectral mammography system with two energy bins for unenhanced imaging. The theoretical framework was validated with synthesized images. Optimal combination of the energy-resolved images for detecting large unenhanced tumors corresponded closely, but not exactly, to minimization of the anatomical noise, which is commonly referred to as energy subtraction. In that case, an ideal-observer detectability index could be improved close to 50% compared to absorption imaging. Optimization with respect to the signal-to-quantum-noise ratio, commonly referred to as energy weighting, deteriorated detectability. For small microcalcifications or tumors on uniform backgrounds, however, energy subtraction was suboptimal whereas energy weighting provided a minute improvement. The performance was largely independent of beam quality, detector energy resolution, and bin count fraction. It is clear that inclusion of anatomical noise and imaging task in spectral optimization may yield completely different results than an analysis based solely on quantum noise.

  9. Forecast errors in dust vertical distributions over Rome (Italy): Multiple particle size representation and cloud contributions

    NASA Astrophysics Data System (ADS)

    Kishcha, P.; Alpert, P.; Shtivelman, A.; Krichak, S. O.; Joseph, J. H.; Kallos, G.; Katsafados, P.; Spyrou, C.; Gobbi, G. P.; Barnaba, F.; Nickovic, S.; PéRez, C.; Baldasano, J. M.

    2007-08-01

    In this study, forecast errors in dust vertical distributions were analyzed. This was carried out by using quantitative comparisons between dust vertical profiles retrieved from lidar measurements over Rome, Italy, performed from 2001 to 2003, and those predicted by models. Three models were used: the four-particle-size Dust Regional Atmospheric Model (DREAM), the older one-particle-size version of the SKIRON model from the University of Athens (UOA), and the pre-2006 one-particle-size Tel Aviv University (TAU) model. SKIRON and DREAM are initialized on a daily basis using the dust concentration from the previous forecast cycle, while the TAU model initialization is based on the Total Ozone Mapping Spectrometer aerosol index (TOMS AI). The quantitative comparison shows that (1) the use of four-particle-size bins in the dust modeling instead of only one-particle-size bins improves dust forecasts; (2) cloud presence could contribute to noticeable dust forecast errors in SKIRON and DREAM; and (3) as far as the TAU model is concerned, its forecast errors were mainly caused by technical problems with TOMS measurements from the Earth Probe satellite. As a result, dust forecast errors in the TAU model could be significant even under cloudless conditions. The DREAM versus lidar quantitative comparisons at different altitudes show that the model predictions are more accurate in the middle part of dust layers than in the top and bottom parts of dust layers.

  10. High sensitivity and high Q-factor nanoslotted parallel quadrabeam photonic crystal cavity for real-time and label-free sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Daquan; State Key Laboratory of Information Photonics and Optical Communications, School of Information and Communication Engineering, Beijing University of Posts and Telecommunications, Beijing 100876; School of Engineering and Applied Sciences, Harvard University, Cambridge, Massachusetts 02138

    We experimentally demonstrate a label-free sensor based on nanoslotted parallel quadrabeam photonic crystal cavity (NPQC). The NPQC possesses both high sensitivity and high Q-factor. We achieved sensitivity (S) of 451 nm/refractive index unit and Q-factor >7000 in water at telecom wavelength range, featuring a sensor figure of merit >2000, an order of magnitude improvement over the previous photonic crystal sensors. In addition, we measured the streptavidin-biotin binding affinity and detected 10 ag/mL concentrated streptavidin in the phosphate buffered saline solution.

  11. Fast Census of Moth Diversity in the Neotropics: A Comparison of Field-Assigned Morphospecies and DNA Barcoding in Tiger Moths

    PubMed Central

    Zenker, Mauricio M.; Rougerie, Rodolphe; Teston, José A.; Laguerre, Michel; Pie, Marcio R.; Freitas, André V. L.

    2016-01-01

    The morphological species delimitations (i.e. morphospecies) have long been the best way to avoid the taxonomic impediment and compare insect taxa biodiversity in highly diverse tropical and subtropical regions. The development of DNA barcoding, however, has shown great potential to replace (or at least complement) the morphospecies approach, with the advantage of relying on automated methods implemented in computer programs or even online rather than in often subjective morphological features. We sampled moths extensively for two years using light traps in a patch of the highly endangered Atlantic Forest of Brazil to produce a nearly complete census of arctiines (Noctuoidea: Erebidae), whose species richness was compared using different morphological and molecular approaches (DNA barcoding). A total of 1,075 barcode sequences of 286 morphospecies were analyzed. Based on the clustering method Barcode Index Number (BIN) we found a taxonomic bias of approximately 30% in our initial morphological assessment. However, a morphological reassessment revealed that the correspondence between morphospecies and molecular operational taxonomic units (MOTUs) can be up to 94% if differences in genitalia morphology are evaluated in individuals of different MOTUs originated from the same morphospecies (putative cases of cryptic species), and by recording if individuals of different genders in different morphospecies merge together in the same MOTU (putative cases of sexual dimorphism). The results of two other clustering methods (i.e. Automatic Barcode Gap Discovery and 2% threshold) were very similar to those of the BIN approach. Using empirical data we have shown that DNA barcoding performed substantially better than the morphospecies approach, based on superficial morphology, to delimit species of a highly diverse moth taxon, and thus should be used in species inventories. PMID:26859488
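    The "2% threshold" clustering mentioned above can be illustrated with a generic single-linkage sketch over a pairwise distance matrix. This is a simplified stand-in, not the BIN algorithm itself, which uses a more refined clustering procedure:

    ```python
    import numpy as np

    def threshold_clusters(dist, threshold=0.02):
        """Union-find single-linkage clustering: sequences whose pairwise
        distance falls below the threshold end up in the same cluster
        (a generic sketch of '2% rule' MOTU delimitation)."""
        n = dist.shape[0]
        parent = list(range(n))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path compression
                i = parent[i]
            return i
        for i in range(n):
            for j in range(i + 1, n):
                if dist[i, j] < threshold:
                    parent[find(i)] = find(j)
        return [find(i) for i in range(n)]

    # Two pairs of near-identical barcodes separated by ~10% divergence.
    d = np.array([[0.00, 0.01, 0.10, 0.10],
                  [0.01, 0.00, 0.10, 0.10],
                  [0.10, 0.10, 0.00, 0.015],
                  [0.10, 0.10, 0.015, 0.00]])
    motus = threshold_clusters(d)
    ```

    With this toy matrix the four sequences collapse into two MOTUs, mirroring how a fixed divergence threshold groups barcode sequences into putative species.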

  12. Fast Census of Moth Diversity in the Neotropics: A Comparison of Field-Assigned Morphospecies and DNA Barcoding in Tiger Moths.

    PubMed

    Zenker, Mauricio M; Rougerie, Rodolphe; Teston, José A; Laguerre, Michel; Pie, Marcio R; Freitas, André V L

    2016-01-01

    The morphological species delimitations (i.e. morphospecies) have long been the best way to avoid the taxonomic impediment and compare insect taxa biodiversity in highly diverse tropical and subtropical regions. The development of DNA barcoding, however, has shown great potential to replace (or at least complement) the morphospecies approach, with the advantage of relying on automated methods implemented in computer programs or even online rather than in often subjective morphological features. We sampled moths extensively for two years using light traps in a patch of the highly endangered Atlantic Forest of Brazil to produce a nearly complete census of arctiines (Noctuoidea: Erebidae), whose species richness was compared using different morphological and molecular approaches (DNA barcoding). A total of 1,075 barcode sequences of 286 morphospecies were analyzed. Based on the clustering method Barcode Index Number (BIN) we found a taxonomic bias of approximately 30% in our initial morphological assessment. However, a morphological reassessment revealed that the correspondence between morphospecies and molecular operational taxonomic units (MOTUs) can be up to 94% if differences in genitalia morphology are evaluated in individuals of different MOTUs originated from the same morphospecies (putative cases of cryptic species), and by recording if individuals of different genders in different morphospecies merge together in the same MOTU (putative cases of sexual dimorphism). The results of two other clustering methods (i.e. Automatic Barcode Gap Discovery and 2% threshold) were very similar to those of the BIN approach. Using empirical data we have shown that DNA barcoding performed substantially better than the morphospecies approach, based on superficial morphology, to delimit species of a highly diverse moth taxon, and thus should be used in species inventories.

  13. Haystack, a web-based tool for metabolomics research

    PubMed Central

    2014-01-01

    Background: Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. Results: To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Conclusion: Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data, is described. It offers users a range of data visualization options and supports non-biased differential profiling studies through a unique and flexible binning function that provides an alternative to conventional peak deconvolution analysis methods. PMID:25350247
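    The core binning idea, collapsing the mass dimension into fixed-width interval variables, can be sketched as follows. The bin edges and widths here are illustrative assumptions, not Haystack's actual defaults:

    ```python
    import numpy as np

    def bin_mass_spectrum(mz, intensity, mz_min=100.0, mz_max=1000.0, width=1.0):
        """Sum intensity within fixed-width m/z intervals, turning a spectrum
        into a vector of interval variables usable as PCA/clustering input."""
        edges = np.arange(mz_min, mz_max + width, width)
        binned, _ = np.histogram(mz, bins=edges, weights=intensity)
        return binned

    # Three peaks collapse into two occupied 1-Da bins.
    mz = np.array([100.2, 100.7, 250.5])
    intensity = np.array([1.0, 2.0, 5.0])
    vec = bin_mass_spectrum(mz, intensity, mz_min=100.0, mz_max=300.0)
    ```

    Stacking one such vector per sample yields the sample-by-bin matrix on which exploratory methods like PCA can separate classes and flag discriminatory bins.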

  14. Haystack, a web-based tool for metabolomics research.

    PubMed

    Grace, Stephen C; Embry, Stephen; Luo, Heng

    2014-01-01

    Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. 
It offers users a range of data visualization options and supports non-biased differential profiling studies through a unique and flexible binning function that provides an alternative to conventional peak deconvolution analysis methods.

  15. 50-GHz-spaced comb of high-dimensional frequency-bin entangled photons from an on-chip silicon nitride microresonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imany, Poolad; Jaramillo-Villegas, Jose A.; Odele, Ogaga D.

    Quantum frequency combs from chip-scale integrated sources are promising candidates for scalable and robust quantum information processing (QIP). However, to use these quantum combs for frequency domain QIP, demonstration of entanglement in the frequency basis, showing that the entangled photons are in a coherent superposition of multiple frequency bins, is required. We present a verification of qubit and qutrit frequency-bin entanglement using an on-chip quantum frequency comb with 40 mode pairs, through a two-photon interference measurement that is based on electro-optic phase modulation. Our demonstrations provide an important contribution in establishing integrated optical microresonators as a source for high-dimensional frequency-bin encoded quantum computing, as well as dense quantum key distribution.

  16. 50-GHz-spaced comb of high-dimensional frequency-bin entangled photons from an on-chip silicon nitride microresonator

    DOE PAGES

    Imany, Poolad; Jaramillo-Villegas, Jose A.; Odele, Ogaga D.; ...

    2018-01-18

    Quantum frequency combs from chip-scale integrated sources are promising candidates for scalable and robust quantum information processing (QIP). However, to use these quantum combs for frequency domain QIP, demonstration of entanglement in the frequency basis, showing that the entangled photons are in a coherent superposition of multiple frequency bins, is required. We present a verification of qubit and qutrit frequency-bin entanglement using an on-chip quantum frequency comb with 40 mode pairs, through a two-photon interference measurement that is based on electro-optic phase modulation. Our demonstrations provide an important contribution in establishing integrated optical microresonators as a source for high-dimensional frequency-bin encoded quantum computing, as well as dense quantum key distribution.

  17. Integrated technologies for solid waste bin monitoring system.

    PubMed

    Arebey, Maher; Hannan, M A; Basri, Hassan; Begum, R A; Abdullah, Huda

    2011-06-01

    Communication technologies, namely radio frequency identification (RFID), the global positioning system (GPS), the general packet radio service (GPRS), and a geographic information system (GIS), are integrated with a camera to construct a solid waste bin monitoring system. The aim is to improve the way customer inquiries and emergency cases are responded to, and to estimate the solid waste amount without any involvement of the truck driver. The proposed system consists of an RFID tag mounted on the bin, an RFID reader in the truck, GPRS/GSM linking to a web server, and GIS providing the map server, database server, and control server. The tracking devices mounted in the trucks collect location information in real time via GPS. This information is transferred continuously through GPRS to a central database. Users are able to view the current location of each truck during the collection stage via a web-based application and thereby manage the fleet. The trucks' positions and trash bin information are displayed on a digital map, which is made available by a map server. Thus, the solid waste bins and the trucks are monitored using the developed system.

  18. Adjustments for the display of quantized ion channel dwell times in histograms with logarithmic bins.

    PubMed

    Stark, J A; Hladky, S B

    2000-02-01

    Dwell-time histograms are often plotted as part of patch-clamp investigations of ion channel currents. The advantages of plotting these histograms with a logarithmic time axis have been demonstrated previously (J. Physiol. (Lond.) 378:141-174; Pflügers Arch. 410:530-553; Biophys. J. 52:1047-1054). Sigworth and Sine argued that the interpretation of such histograms is simplified if the counts are presented in a manner similar to that of a probability density function. However, when ion channel records are recorded as a discrete time series, the dwell times are quantized. As a result, the mapping of dwell times to logarithmically spaced bins is highly irregular; bins may be empty, and significant irregularities may extend beyond the duration of 100 samples. Using simple approximations based on the nature of the binning process and the transformation rules for probability density functions, we develop adjustments for the display of the counts to compensate for this effect. Tests with simulated data suggest that this procedure provides a faithful representation of the data.
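    A simplified stand-in for the adjustment can be sketched as follows: counts in each logarithmic bin are divided by the number of quantized dwell-time values that can fall in that bin, compensating for empty and irregularly filled bins. This is only an illustration of the binning irregularity and a crude correction; the paper derives its adjustments more carefully:

    ```python
    import numpy as np

    def log_binned_density(dwell_samples, dt, bins_per_decade=10, n_decades=4):
        """Histogram dwell times that are integer multiples of the sample
        interval dt into logarithmic bins, dividing each count by the bin's
        'capacity' (how many quantized values map into it)."""
        edges = dt * np.logspace(0, n_decades, bins_per_decade * n_decades + 1)
        counts, _ = np.histogram(np.asarray(dwell_samples) * dt, bins=edges)
        k = np.arange(1, 10 ** n_decades + 1)           # representable multiples
        capacity, _ = np.histogram(k * dt, bins=edges)  # quantized values per bin
        with np.errstate(divide="ignore", invalid="ignore"):
            density = np.where(capacity > 0, counts / capacity, 0.0)
        return edges, density

    # Two one-sample dwells and one two-sample dwell at dt = 0.1 ms.
    edges, density = log_binned_density([1, 1, 2], dt=1e-4)
    ```

    Note that at short times many logarithmic bins contain no representable dwell time at all (capacity zero), which is exactly the irregularity the abstract describes.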

  19. Effect of various binning methods and ROI sizes on the accuracy of the automatic classification system for differentiation between diffuse infiltrative lung diseases on the basis of texture features at HRCT

    NASA Astrophysics Data System (ADS)

    Kim, Namkug; Seo, Joon Beom; Sung, Yu Sub; Park, Bum-Woo; Lee, Youngjoo; Park, Seong Hoon; Lee, Young Kyung; Kang, Suk-Ho

    2008-03-01

    To determine the optimal binning method and ROI size for an automatic classification system that differentiates diffuse infiltrative lung diseases on the basis of texture features at HRCT, six hundred circular regions of interest (ROIs) with 10-, 20-, and 30-pixel diameters, 100 ROIs for each of six regional disease patterns (normal, NL; ground-glass opacity, GGO; reticular opacity, RO; honeycombing, HC; emphysema, EMPH; and consolidation, CONS), were marked by an experienced radiologist on HRCT images. Histogram (mean) and co-occurrence matrix (mean and SD of angular second moment, contrast, correlation, entropy, and inverse difference moment) features were employed to test binning and ROI effects. To find the optimal binning, variable-bin-size linear binning (LB; bin size Q: 4~30, 32, 64, 128, 144, 196, 256, 384) and non-linear binning (NLB; Q: 4~30) methods (K-means and Fuzzy C-means clustering) were tested. For automated classification, an SVM classifier was implemented. To assess cross-validation of the system, a five-fold method was used, and each test was repeated twenty times. Overall accuracies for every combination of ROI and binning sizes were statistically compared. For small binning sizes (Q <= 10), NLB showed significantly better accuracy than LB, and K-means NLB (Q = 26) was statistically significantly better than every LB. For the 30-pixel ROI size and most binning sizes, the K-means method performed better than the other NLB and LB methods. With the optimal binning and other parameters set, the overall sensitivity of the classifier was 92.85%. The sensitivity and specificity of the system for each class were as follows: NL, 95%, 97.9%; GGO, 80%, 98.9%; RO, 85%, 96.9%; HC, 94.7%, 97%; EMPH, 100%, 100%; and CONS, 100%, 100%, respectively. We determined the optimal binning method and ROI size for the automatic classification of diffuse infiltrative lung diseases on the basis of texture features at HRCT.
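The two binning schemes compared above can be sketched on simple 1-D intensity data. This is an illustrative toy, not the paper's HRCT pipeline; the data and bin count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# bimodal intensities, e.g. air-like and tissue-like attenuation values
values = np.concatenate([rng.normal(-800.0, 50.0, 500),
                         rng.normal(-50.0, 30.0, 500)])
Q = 8  # number of bins

# LB: Q equal-width bins spanning the observed range.
lb_edges = np.linspace(values.min(), values.max(), Q + 1)
lb_labels = np.clip(np.digitize(values, lb_edges[1:-1]), 0, Q - 1)

# NLB: 1-D K-means (Lloyd's algorithm); centers migrate to dense regions,
# so bins are narrow where the data are concentrated.
centers = np.quantile(values, (np.arange(Q) + 0.5) / Q)  # quantile init
for _ in range(50):
    nlb_labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
    for k in range(Q):
        if np.any(nlb_labels == k):
            centers[k] = values[nlb_labels == k].mean()
```

For small Q this adaptivity is the plausible reason NLB preserves more texture information than equal-width LB.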

  20. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    PubMed Central

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A.D.; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J.; Saltz, Joel H.

    2013-01-01

    Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. 
The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we have implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we have employed a shared-nothing parallel database architecture, which distributes data homogeneously across multiple database partitions to take advantage of parallel computation power, and implements spatial indexing to achieve high I/O throughput. Results: Our work proposes a high-performance parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole slide images. The tools we developed are open source and available for download. Conclusions: Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is support for queries involving spatial predicates and comparisons. In our work, we developed an efficient data model and parallel database approach to model, normalize, manage, and query large volumes of analytical image result data. Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provides a full pipeline to normalize, load, manage, and query analytical results for algorithm evaluation. PMID:23599905
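The grid-based indexing idea behind the spatial join speedup can be sketched as follows. This is a hedged sketch, not the platform's implementation: objects are hashed into fixed-size grid cells so a join only compares objects sharing a cell (a full implementation would also probe neighboring cells and use real geometries); the names and cell size are assumptions.

```python
from collections import defaultdict

CELL = 100.0  # assumed grid cell size

def cell_of(x, y):
    # map a coordinate to its integer grid cell
    return (int(x // CELL), int(y // CELL))

def build_grid_index(objects):
    # bucket object IDs by grid cell
    index = defaultdict(list)
    for oid, (x, y) in objects.items():
        index[cell_of(x, y)].append(oid)
    return index

# e.g. centroids of boundaries from two analysis result sets
result_a = {1: (10.0, 12.0), 2: (250.0, 40.0)}
result_b = {"p": (15.0, 18.0), "q": (900.0, 900.0)}

index = build_grid_index(result_a)
# spatial-join candidates: only objects sharing a grid cell are compared
candidates = [(a, b) for b, (x, y) in result_b.items()
              for a in index.get(cell_of(x, y), [])]
```

Only the co-located pair survives the grid filter, which is how such indexing cuts I/O in join queries.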

  1. A DNA Barcode Library for North American Pyraustinae (Lepidoptera: Pyraloidea: Crambidae).

    PubMed

    Yang, Zhaofu; Landry, Jean-François; Hebert, Paul D N

    2016-01-01

    Although members of the crambid subfamily Pyraustinae are frequently important crop pests, their identification is often difficult because many species lack conspicuous diagnostic morphological characters. DNA barcoding employs sequence diversity in a short standardized gene region to facilitate specimen identification and species discovery. This study provides a DNA barcode reference library for North American pyraustines based upon the analysis of 1589 sequences recovered from 137 nominal species, 87% of the fauna. Data from 125 species were barcode compliant (>500 bp, <1% N), and 99 of these taxa formed a distinct cluster that was assigned to a single BIN. The other 26 species were assigned to 56 BINs, reflecting frequent cases of deep intraspecific sequence divergence and a few instances of barcode sharing, for a total of 155 BINs. Two systems for OTU designation, ABGD and BIN, were examined to check the correspondence between current taxonomy and sequence clusters. The BIN system performed better than ABGD in delimiting closely related species, while OTU counts with ABGD were influenced by the value employed for relative gap width. Different species with low or no interspecific divergence may represent cases of unrecognized synonymy, whereas those with high intraspecific divergence require further taxonomic scrutiny as they may involve cryptic diversity. The barcode library developed in this study will also help to advance understanding of relationships among species of Pyraustinae.

  2. Genetic dissection and fine mapping of a novel dt gene associated with determinate growth habit in sesame.

    PubMed

    Zhang, Yanxin; Wang, Linhai; Gao, Yuan; Li, Donghua; Yu, Jingyin; Zhou, Rong; Zhang, Xiurong

    2018-06-14

    As an important oil crop, sesame (Sesamum indicum L.) naturally has an indeterminate growth habit, which brings about asynchronous maturity of capsules and causes yield loss. The genetic basis of the determinate growth habit in sesame was investigated by classical genetic analysis of multiple populations; the results revealed that it is controlled by a unique recessive gene. The genotyping-by-sequencing (GBS) approach was employed for high-throughput SNP identification and genotyping in the F2 population, and a high-density bin map was then constructed. The map was 1086.403 cM in length and consisted of 1184 bins (13,679 SNPs), with an average of 0.918 cM between adjacent bins. Based on bin mapping in conjunction with SSR marker analysis in the targeted region, the novel sesame determinacy gene was mapped to a 41-kb genomic region on LG09. This study dissected the genetic basis of the determinate growth habit in sesame, constructed a new high-density bin map, and mapped a novel determinacy gene. The results demonstrate an optimized approach to obtain accurate, high-resolution, and efficient mapping results in sesame. The findings provide an important foundation for cloning the sesame determinacy gene and are expected to be applied in breeding cultivars suited to mechanized production.

  3. The Impact of Aerosols on Cloud and Precipitation Processes: Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Li, X.; Khain, A.; Simpson, S.

    2004-01-01

    Cloud microphysics are inevitably affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds. Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effects of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, two detailed spectral-bin microphysical schemes were implemented into the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral-bin microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops), and several types of ice particles (i.e., pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail). Each type is described by a special size distribution function containing many categories (i.e. 33 bins). Atmospheric aerosols are also described using number density size-distribution functions. A spectral-bin microphysical model is very expensive from a computational point of view and has only been implemented into the 2D version of the GCE at the present time. The model is tested by studying the evolution of deep cloud systems in the west Pacific warm pool region, in the sub-tropics (Florida) and in the mid-latitude using identical thermodynamic conditions but with different concentrations of CCN: a low 'clean' concentration and a high 'dirty' concentration.
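The "33 bins" discretization above can be sketched briefly. Spectral-bin schemes of this kind commonly use a mass-doubling grid, m_k = m_0 · 2^k, with the size distribution carried as one number density per bin; the smallest mass below is an assumed illustrative value, not the GCE scheme's.

```python
import numpy as np

m0 = 1.0e-12                          # assumed smallest bin mass (g)
masses = m0 * 2.0 ** np.arange(33)    # 33 mass-doubling bin centers

# the size distribution function is then carried as number densities n_k,
# one value per bin, rather than as a fixed functional form
n_k = np.zeros(33)
```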

  4. KiDS-450: the tomographic weak lensing power spectrum and constraints on cosmological parameters

    NASA Astrophysics Data System (ADS)

    Köhlinger, F.; Viola, M.; Joachimi, B.; Hoekstra, H.; van Uitert, E.; Hildebrandt, H.; Choi, A.; Erben, T.; Heymans, C.; Joudaki, S.; Klaes, D.; Kuijken, K.; Merten, J.; Miller, L.; Schneider, P.; Valentijn, E. A.

    2017-11-01

    We present measurements of the weak gravitational lensing shear power spectrum based on 450 deg^2 of imaging data from the Kilo Degree Survey. We employ a quadratic estimator in two and three redshift bins and extract band powers of redshift autocorrelation and cross-correlation spectra in the multipole range 76 ≤ ℓ ≤ 1310. The cosmological interpretation of the measured shear power spectra is performed in a Bayesian framework assuming a ΛCDM model with spatially flat geometry, while accounting for small residual uncertainties in the shear calibration and redshift distributions as well as marginalizing over intrinsic alignments, baryon feedback, and an excess-noise power model. Moreover, massive neutrinos are included in the modelling. The main cosmological result is expressed in terms of the parameter combination S_8 ≡ σ_8 √(Ω_m/0.3), yielding S_8 = 0.651 ± 0.058 (three z-bins), confirming the recently reported tension in this parameter with constraints from Planck at 3.2σ (three z-bins). We cross-check the results of the three z-bin analysis against the weaker constraints from the two z-bin analysis and find them to be consistent. The high-level data products of this analysis, such as the band power measurements, covariance matrices, redshift distributions, and likelihood evaluation chains, are available at http://kids.strw.leidenuniv.nl.
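The parameter combination above is a one-line computation; the sketch below only illustrates the definition, with input values taken as illustrative rather than from the survey's chains.

```python
import math

def s8(sigma8, omega_m):
    # S_8 = sigma_8 * sqrt(Omega_m / 0.3)
    return sigma8 * math.sqrt(omega_m / 0.3)

# when Omega_m = 0.3 the prefactor is 1, so S_8 reduces to sigma_8
```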

  5. Subcellular Changes in Bridging Integrator 1 Protein Expression in the Cerebral Cortex During the Progression of Alzheimer Disease Pathology.

    PubMed

    Adams, Stephanie L; Tilton, Kathy; Kozubek, James A; Seshadri, Sudha; Delalle, Ivana

    2016-08-01

    Genome-wide association studies have established BIN1 (Bridging Integrator 1) as the most significant late-onset Alzheimer disease (AD) susceptibility locus after APOE. We analyzed BIN1 protein expression using automated immunohistochemistry on the hippocampal CA1 region in 19 patients with no, mild, or moderate-to-marked AD pathology who had been assessed by Clinical Dementia Rating and CERAD scores. We also examined the amygdala and the prefrontal, temporal, and occipital regions in a subset of these patients. In non-demented controls without AD pathology, BIN1 protein was expressed in white matter, in glia, particularly oligodendrocytes, and in the neuropil, in which the BIN1 signal decorated axons. With increasing severity of AD, BIN1 in the CA1 region showed: 1) sustained expression in glial cells, 2) decreased areas of neuropil expression, and 3) increased cytoplasmic neuronal expression that did not correlate with neurofibrillary tangle load. In patients with AD, both the prefrontal cortex and CA1 showed a decrease in BIN1-immunoreactive (BIN1-ir) neuropil areas and increases in numbers of BIN1-ir neurons. The numbers of CA1 BIN1-ir pyramidal neurons correlated with hippocampal CERAD neuritic plaque scores; the BIN1 neuropil signal was absent in neuritic plaques. Our data provide novel insight into the relationship between BIN1 protein expression and the progression of AD-associated pathology and its diagnostic hallmarks. © 2016 American Association of Neuropathologists, Inc. All rights reserved.

  6. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods.

    PubMed

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle.
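The information loss from coarse binning can be illustrated with a toy estimator. This sketch is not either paper's analysis: it contrasts the unbinned maximum-likelihood estimate of an exponential mean with an estimate recovered from a single "within 14 days" bin, which keeps only one bit of information per event.

```python
import numpy as np

rng = np.random.default_rng(42)
true_mean = 10.0
gaps = rng.exponential(true_mean, size=2000)  # simulated inter-event gaps

# Unbinned MLE of an exponential mean is simply the sample mean.
mle_mean = gaps.mean()

# Binned alternative: count events within a 14-day window and invert
# P(X <= 14) = 1 - exp(-14 / mean) to recover a mean estimate.
p_within = (gaps <= 14.0).mean()
binned_mean = -14.0 / np.log1p(-p_within)  # log1p(-p) = log(1 - p)
```

Both estimators are consistent here, but the binned one has a larger variance; with subtler effects than a simple mean shift, the coarse bin can miss the signal entirely.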

  7. Detecting the contagion effect in mass killings; a constructive example of the statistical advantages of unbinned likelihood methods

    PubMed Central

    Towers, Sherry; Mubayi, Anuj; Castillo-Chavez, Carlos

    2018-01-01

    Background When attempting to statistically distinguish between a null and an alternative hypothesis, many researchers in the life and social sciences turn to binned statistical analysis methods, or methods that are simply based on the moments of a distribution (such as the mean, and variance). These methods have the advantage of simplicity of implementation, and simplicity of explanation. However, when null and alternative hypotheses manifest themselves in subtle differences in patterns in the data, binned analysis methods may be insensitive to these differences, and researchers may erroneously fail to reject the null hypothesis when in fact more sensitive statistical analysis methods might produce a different result when the null hypothesis is actually false. Here, with a focus on two recent conflicting studies of contagion in mass killings as instructive examples, we discuss how the use of unbinned likelihood methods makes optimal use of the information in the data; a fact that has been long known in statistical theory, but perhaps is not as widely appreciated amongst general researchers in the life and social sciences. Methods In 2015, Towers et al published a paper that quantified the long-suspected contagion effect in mass killings. However, in 2017, Lankford & Tomek subsequently published a paper, based upon the same data, that claimed to contradict the results of the earlier study. The former used unbinned likelihood methods, and the latter used binned methods, and comparison of distribution moments. Using these analyses, we also discuss how visualization of the data can aid in determination of the most appropriate statistical analysis methods to distinguish between a null and alternate hypothesis. 
We also discuss the importance of assessment of the robustness of analysis results to methodological assumptions made (for example, arbitrary choices of number of bins and bin widths when using binned methods); an issue that is widely overlooked in the literature, but is critical to analysis reproducibility and robustness. Conclusions When an analysis cannot distinguish between a null and alternate hypothesis, care must be taken to ensure that the analysis methodology itself maximizes the use of information in the data that can distinguish between the two hypotheses. The use of binned methods by Lankford & Tomek (2017), that examined how many mass killings fell within a 14 day window from a previous mass killing, substantially reduced the sensitivity of their analysis to contagion effects. The unbinned likelihood methods used by Towers et al (2015) did not suffer from this problem. While a binned analysis might be favorable for simplicity and clarity of presentation, unbinned likelihood methods are preferable when effects might be somewhat subtle. PMID:29742115

  8. Large Scale Document Inversion using a Multi-threaded Computing System

    PubMed Central

    Jung, Sungbo; Chang, Dar-Jen; Park, Juw Won

    2018-01-01

    Current microprocessor architecture is moving toward multi-core/multi-threaded systems. This trend has led to a surge of interest in using multi-threaded computing devices, such as the graphics processing unit (GPU), for general-purpose computing. Because the GPU consists of multiple cores, it can be used as a massively parallel coprocessor; it is also an affordable, attractive, and user-programmable commodity. Enormous volumes of data, such as digital libraries, social networking services, e-commerce product data, and reviews, are produced or collected every moment, with dramatic growth in size. Although the inverted index is a useful data structure for full-text search and document retrieval, indexing a large number of documents takes a tremendous amount of time. The performance of document inversion can be improved with a multi-threaded, multi-core GPU. Our approach is to implement a linear-time, hash-based, single program multiple data (SPMD) document inversion algorithm on the NVIDIA GPU/CUDA programming platform, utilizing the huge computational power of the GPU to develop high-performance solutions for document indexing. Our proposed parallel document inversion system shows 2-3 times faster performance than a sequential system on two different test datasets drawn from PubMed abstracts and e-commerce product reviews. CCS Concepts: • Information systems → Information retrieval • Computing methodologies → Massively parallel and high-performance simulations. PMID:29861701
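Document inversion itself is easy to sketch. This hedged toy approximates the paper's SPMD idea with a thread pool instead of a GPU kernel: documents are partitioned across workers, each builds a partial term-to-document-ID index, and the partials are merged; all data and names below are illustrative.

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

docs = {0: "gpu parallel index", 1: "parallel inverted index", 2: "gpu cuda"}

def invert(chunk):
    # build a partial inverted index over one chunk of documents
    partial = defaultdict(set)
    for doc_id, text in chunk:
        for term in text.split():
            partial[term].add(doc_id)
    return partial

items = list(docs.items())
chunks = [items[i::2] for i in range(2)]          # 2 workers, strided split
with ThreadPoolExecutor(max_workers=2) as pool:
    partials = list(pool.map(invert, chunks))

inverted = defaultdict(set)                        # merge step
for partial in partials:
    for term, ids in partial.items():
        inverted[term] |= ids
```

Each document is processed independently, which is what makes the algorithm amenable to SPMD execution on many GPU threads.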

  9. Large Scale Document Inversion using a Multi-threaded Computing System.

    PubMed

    Jung, Sungbo; Chang, Dar-Jen; Park, Juw Won

    2017-06-01

    Current microprocessor architecture is moving toward multi-core/multi-threaded systems. This trend has led to a surge of interest in using multi-threaded computing devices, such as the graphics processing unit (GPU), for general-purpose computing. Because the GPU consists of multiple cores, it can be used as a massively parallel coprocessor; it is also an affordable, attractive, and user-programmable commodity. Enormous volumes of data, such as digital libraries, social networking services, e-commerce product data, and reviews, are produced or collected every moment, with dramatic growth in size. Although the inverted index is a useful data structure for full-text search and document retrieval, indexing a large number of documents takes a tremendous amount of time. The performance of document inversion can be improved with a multi-threaded, multi-core GPU. Our approach is to implement a linear-time, hash-based, single program multiple data (SPMD) document inversion algorithm on the NVIDIA GPU/CUDA programming platform, utilizing the huge computational power of the GPU to develop high-performance solutions for document indexing. Our proposed parallel document inversion system shows 2-3 times faster performance than a sequential system on two different test datasets drawn from PubMed abstracts and e-commerce product reviews. CCS Concepts: • Information systems → Information retrieval • Computing methodologies → Massively parallel and high-performance simulations.

  10. Potential fitting biases resulting from grouping data into variable width bins

    NASA Astrophysics Data System (ADS)

    Towers, S.

    2014-07-01

    When reading peer-reviewed scientific literature describing any analysis of empirical data, it is natural and correct to proceed with the underlying assumption that the experimenters have made good-faith efforts to ensure that their analyses yield unbiased results. However, particle physics experiments are expensive and time-consuming to carry out; if an analysis has inherent (even if unintentional) bias, much money and effort can be wasted trying to replicate or understand the results, particularly if the analysis is fundamental to our understanding of the universe. In this note we discuss the significant biases that can result from data binning schemes. As we show, if data are binned such that they provide the best comparison to a particular (but incorrect) model, the resulting parameter estimates from fits to the binned data can be significantly biased, leading us to accept the model hypothesis too often when it is not in fact true. When using binned likelihood or least squares methods there is of course no a priori requirement that bin sizes be constant, but we show that fitting data grouped into variable-width bins is particularly prone to produce biased results if the bin boundaries are chosen to optimize the comparison of the binned data to a wrong model. The degree of bias achievable simply with variable binning can be surprisingly large. Fitting the data with an unbinned likelihood method, when possible, is the best way for researchers to show that their analyses are not biased by binning effects. Failing that, equal bin widths should be employed as a cross-check of the fitting analysis whenever possible.
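The effect can be demonstrated on a toy sample. This sketch is illustrative, not the note's analysis: the same data, grouped into equal-width versus deliberately cherry-picked variable-width bins, give very different chi-square-style statistics against a (wrong) flat model.

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.exponential(1.0, size=2000)
data = data[data < 3.0]  # restrict to [0, 3); clearly non-flat data

def chi2_vs_flat(sample, edges):
    # chi-square-style statistic comparing binned counts to a flat model
    counts, _ = np.histogram(sample, bins=edges)
    expected = sample.size * np.diff(edges) / (edges[-1] - edges[0])
    return float(np.sum((counts - expected) ** 2 / expected))

equal_edges = np.linspace(0.0, 3.0, 7)                          # six equal bins
tuned_edges = np.array([0.0, 0.02, 0.05, 0.1, 2.0, 2.9, 3.0])   # cherry-picked

chi2_equal = chi2_vs_flat(data, equal_edges)
chi2_tuned = chi2_vs_flat(data, tuned_edges)
# the tuned variable-width edges make the wrong flat model look better
# (smaller statistic) on exactly the same sample
```

An unbinned fit, or an equal-width cross-check as the note recommends, removes this degree of freedom from the analyst.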

  11. Electron imaging with an EBSD detector.

    PubMed

    Wright, Stuart I; Nowell, Matthew M; de Kloe, René; Camus, Patrick; Rampton, Travis

    2015-01-01

    Electron backscatter diffraction (EBSD) has proven to be a useful tool for characterizing the crystallographic orientation aspects of microstructures at length scales ranging from tens of nanometers to millimeters in the scanning electron microscope (SEM). With the advent of high-speed digital cameras for EBSD use, it has become practical to use the EBSD detector as an imaging device similar to a backscatter (or forward-scatter) detector. Using the EBSD detector in this manner enables images exhibiting topographic, atomic density, and orientation contrast to be obtained at rates similar to slow scanning in the conventional SEM manner. The high-speed acquisition is achieved through extreme binning of the camera, enough to reduce each captured pattern to 5 × 5 pixels. At such high binning, the captured patterns are not suitable for indexing. However, no indexing is required for using the detector as an imaging device. Rather, a 5 × 5 array of images is formed by essentially using each pixel in the 5 × 5 pixel pattern as an individual scattered-electron detector. The images can be formed at traditional EBSD scanning rates by recording the image data during a scan, or through post-processing of patterns recorded at each point in the scan. Such images lend themselves to correlative analysis of image data with the usual orientation data provided by EBSD and with chemical data obtained simultaneously via X-ray energy-dispersive spectroscopy (XEDS). Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
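The image-forming step described above is a simple axis rearrangement. In this hedged sketch (scan dimensions and data are illustrative), each of the 25 pixels of the binned pattern acts as a separate scattered-electron detector, so an H × W scan of 5 × 5 patterns yields a 5 × 5 array of H × W images.

```python
import numpy as np

H, W = 64, 48                         # assumed scan dimensions
rng = np.random.default_rng(3)
patterns = rng.random((H, W, 5, 5))   # one 5 x 5 binned pattern per scan point

# reorder axes so images[i, j] is the image seen by detector pixel (i, j)
images = patterns.transpose(2, 3, 0, 1)
```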

  12. 45. VIEW OF UPPER LEVEL CRUSHER ADDITION FROM CRUSHED OXIDIZED ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    45. VIEW OF UPPER LEVEL CRUSHER ADDITION FROM CRUSHED OXIDIZED ORE BIN. 18 INCH BELT CONVEYOR BIN FEED, LOWER CENTER, WITH STEPHENS-ADAMSON 25 TON/HR ELEVATOR SPLIT DISCHARGE (OXIDIZED/UNOXIDIZED) IN CENTER. CRUDE ORE BINS AND MACHINE SHOP BEYOND. NOTE TOP OF CRUSHED OXIDIZED ORE BIN IS BELOW TOP OF CRUDE ORE BINS. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  13. Effects of Mixtures on Liquid and Solid Fragment Size Distributions

    DTIC Science & Technology

    2016-05-01

    …bins, too few size bins, fixed bin widths, or inadequately-varying bin widths. Overpopulated bins, which typically occur for smaller fragments…

  14. United States Department of Agriculture-Agricultural Research Service stored-grain areawide integrated pest management program.

    PubMed

    Flinn, Paul W; Hagstrum, David W; Reed, Carl; Phillips, Tom W

    2003-01-01

    The USDA Agricultural Research Service (ARS) funded a demonstration project (1998-2002) for areawide IPM for stored wheat in Kansas and Oklahoma. This project was a collaboration of researchers at the ARS Grain Marketing and Production Research Center in Manhattan, Kansas, Kansas State University, and Oklahoma State University. The project utilized two elevator networks, one in each state, for a total of 28 grain elevators. These elevators stored approximately 31 million bushels of wheat, which is approximately 1.2% of the annual national production. Stored wheat was followed as it moved from farm to the country elevator and finally to the terminal elevator. During this study, thousands of grain samples were taken in concrete elevator silos. Wheat stored at elevators was frequently infested by several insect species, which sometimes reached high numbers and damaged the grain. Fumigation using aluminum phosphide pellets was the main method for managing these insect pests in elevators in the USA. Fumigation decisions tended to be based on past experience with controlling stored-grain insects, or were calendar based. Integrated pest management (IPM) requires sampling and risk benefit analysis. We found that the best sampling method for estimating insect density, without turning the grain from one bin to another, was the vacuum probe sampler. Decision support software, Stored Grain Advisor Pro (SGA Pro) was developed that interprets insect sampling data, and provides grain managers with a risk analysis report detailing which bins are at low, moderate or high risk for insect-caused economic losses. Insect density was predicted up to three months in the future based on current insect density, grain temperature and moisture. Because sampling costs money, there is a trade-off between frequency of sampling and the cost of fumigation. The insect growth model in SGA Pro reduces the need to sample as often, thereby making the program more cost-effective. 
SGA Pro was validated during the final year of the areawide program. Based on data from 533 bins, SGA Pro accurately predicted which bins were at low, moderate, or high risk. In only two of the 533 bins did SGA Pro incorrectly predict low risk, and in both cases insect density was high (>2 insects kg^-1) only at the surface, which suggested recent immigration. SGA Pro is superior to calendar-based management because it ensures that grain is treated only when insect densities exceed the economic threshold (2 insects kg^-1). This approach will reduce the frequency of fumigation while maintaining high grain quality. Minimizing the use of fumigant improves worker safety and reduces both control costs and harm to the environment.
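The risk classification described above amounts to a threshold rule. The sketch below is hypothetical, not SGA Pro's logic: it classifies a bin from its sampled insect density against the 2 insects/kg economic threshold, with the moderate-risk boundary an assumed value.

```python
def risk_category(insects_per_kg):
    # classify a grain bin's pest risk from sampled insect density
    if insects_per_kg >= 2.0:     # economic threshold cited in the program
        return "high"
    if insects_per_kg >= 1.0:     # assumed moderate-risk boundary
        return "moderate"
    return "low"
```

In the real system this rule is combined with a growth model that projects density forward from current temperature and moisture before choosing a category.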

  15. BIN1 is reduced and Cav1.2 trafficking is impaired in human failing cardiomyocytes.

    PubMed

    Hong, Ting-Ting; Smyth, James W; Chu, Kevin Y; Vogan, Jacob M; Fong, Tina S; Jensen, Brian C; Fang, Kun; Halushka, Marc K; Russell, Stuart D; Colecraft, Henry; Hoopes, Charles W; Ocorr, Karen; Chi, Neil C; Shaw, Robin M

    2012-05-01

    Heart failure is a growing epidemic, and a typical aspect of heart failure pathophysiology is altered calcium transients. Normal cardiac calcium transients are initiated by Cav1.2 channels at cardiac T tubules. Bridging integrator 1 (BIN1) is a membrane scaffolding protein that causes Cav1.2 to traffic to T tubules in healthy hearts. The mechanisms of Cav1.2 trafficking in heart failure are not known. The objective was to study BIN1 expression and its effect on Cav1.2 trafficking in failing hearts. Intact myocardium and freshly isolated cardiomyocytes from nonfailing and end-stage failing human hearts were used to study BIN1 expression and Cav1.2 localization. To confirm the dependence of Cav1.2 surface expression on BIN1, patch-clamp recordings of Cav1.2 current were performed in cell lines with and without trafficking-competent BIN1. Also, in adult mouse cardiomyocytes, surface Cav1.2 and calcium transients were studied after small hairpin RNA-mediated knockdown of BIN1. For a functional readout in intact heart, calcium transients and cardiac contractility were analyzed in a zebrafish model with morpholino-mediated knockdown of BIN1. BIN1 expression is significantly decreased in failing cardiomyocytes at both the mRNA (30% down) and protein (36% down) levels. Peripheral Cav1.2 is reduced to 42% by imaging, and the biochemical T-tubule fraction of Cav1.2 is reduced to 68%. The total calcium current is reduced to 41% in a cell line expressing a nontrafficking BIN1 mutant. In mouse cardiomyocytes, BIN1 knockdown decreases surface Cav1.2 and impairs calcium transients. In zebrafish hearts, BIN1 knockdown causes a 75% reduction in calcium transients and severe ventricular contractile dysfunction. The data indicate that BIN1 is significantly reduced in human heart failure, and this reduction impairs Cav1.2 trafficking, calcium transients, and contractility. Copyright © 2012 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.

  16. BIN1 is Reduced and Cav1.2 Trafficking is Impaired in Human Failing Cardiomyocytes

    PubMed Central

    Hong, Ting-Ting; Smyth, James W.; Chu, Kevin Y.; Vogan, Jacob M.; Fong, Tina S.; Jensen, Brian C.; Fang, Kun; Halushka, Marc K.; Russell, Stuart D.; Colecraft, Henry; Hoopes, Charles W.; Ocorr, Karen; Chi, Neil C.; Shaw, Robin M.

    2011-01-01

    Background Heart failure is a growing epidemic and a typical aspect of heart failure pathophysiology is altered calcium transients. Normal cardiac calcium transients are initiated by Cav1.2 channels at cardiac T-tubules. BIN1 is a membrane scaffolding protein that causes Cav1.2 to traffic to T-tubules in healthy hearts. The mechanisms of Cav1.2 trafficking in heart failure are not known. Objective To study BIN1 expression and its effect on Cav1.2 trafficking in failing hearts. Methods Intact myocardium and freshly isolated cardiomyocytes from non-failing and end-stage failing human hearts were used to study BIN1 expression and Cav1.2 localization. To confirm Cav1.2 surface expression dependence on BIN1, patch clamp recordings were performed of Cav1.2 current in cell lines with and without trafficking competent BIN1. Also, in adult mouse cardiomyocytes, surface Cav1.2 and calcium transients were studied after shRNA mediated knockdown of BIN1. For a functional readout in intact heart, calcium transients and cardiac contractility were analyzed in a zebrafish model with morpholino mediated knockdown of BIN1. Results BIN1 expression is significantly decreased in failing cardiomyocytes at both mRNA (30% down) and protein (36% down) levels. Peripheral Cav1.2 is reduced 42% by imaging and biochemical T-tubule fraction of Cav1.2 is reduced 68%. Total calcium current is reduced 41% in a cell line expressing non-trafficking BIN1 mutant. In mouse cardiomyocytes, BIN1 knockdown decreases surface Cav1.2 and impairs calcium transients. In zebrafish hearts, BIN1 knockdown causes a 75% reduction in calcium transients and severe ventricular contractile dysfunction. Conclusions The data indicate that BIN1 is significantly reduced in human heart failure, and this reduction impairs Cav1.2 trafficking, calcium transients, and contractility. PMID:22138472

  17. RELEASE OF DRIED RADIOACTIVE WASTE MATERIALS TECHNICAL BASIS DOCUMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    KOZLOWSKI, S.D.

    2007-05-30

    This technical basis document was developed to support RPP-23429, Preliminary Documented Safety Analysis for the Demonstration Bulk Vitrification System (PDSA), and RPP-23479, Preliminary Documented Safety Analysis for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Facility. The main document describes the risk binning process and the technical basis for assigning risk bins to the representative accidents involving the release of dried radioactive waste materials from the Demonstration Bulk Vitrification System (DBVS) and to the associated represented hazardous conditions. Appendices D through F provide the technical basis for assigning risk bins to the representative dried waste release accident and associated represented hazardous conditions for the Contact-Handled Transuranic Mixed (CH-TRUM) Waste Packaging Unit (WPU). The risk binning process uses an evaluation of the frequency and consequence of a given representative accident or represented hazardous condition to determine the need for safety structures, systems, and components (SSC) and technical safety requirement (TSR)-level controls. A representative accident or a represented hazardous condition is assigned to a risk bin based on the potential radiological and toxicological consequences to the public and the collocated worker. Note that the risk binning process is not applied to facility workers because credible hazardous conditions with the potential for significant facility worker consequences are considered for safety-significant SSCs and/or TSR-level controls regardless of their estimated frequency. The controls for protection of the facility workers are described in RPP-23429 and RPP-23479. Determination of the need for safety-class SSCs was performed in accordance with DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses, as described below.

  18. 15. NORTH ELEVATION OF UPPER ORE BIN, CHUTE, AND JAW ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. NORTH ELEVATION OF UPPER ORE BIN, CHUTE, AND JAW CRUSHER, LOOKING SOUTH FROM END OF CONVEYOR PLATFORM. NOTICE THE THREE ORE BIN CONTROL DOORS, CORRESPONDING TO SEPARATE COMPARTMENTS OF THE BIN. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  19. Reply-frequency interference/jamming detector

    NASA Astrophysics Data System (ADS)

    Bishop, Walton B.

    1995-01-01

    Received IFF reply-frequency signals are examined to determine whether they are being interfered with by enemy sources, and an indication of the extent of detected interference is provided. The number of correct replies received from selected range bins surrounding and including the center bin, in which a target leading edge is first declared, is counted and compared with the number of friend-accept decisions made based on replies from the selected range bins. The level of interference is then indicated by the ratio between the two counts.
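    The detector's final step, the ratio of the two counts described above, can be sketched minimally. The direction of the ratio and the function name are assumptions: the abstract only states that the interference level is indicated by the ratio between the two counts.

```python
def interference_level(correct_replies, friend_accepts):
    """Ratio of friend-accept decisions to correct replies received from
    the selected range bins. A low ratio suggests interference/jamming.
    (Which count forms the numerator is an assumption; the abstract only
    says the level is indicated by the ratio of the two counts.)
    """
    if correct_replies == 0:
        return 0.0  # no correct replies at all: treat as fully jammed
    return friend_accepts / correct_replies
```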

  20. Unsupervised discovery of microbial population structure within metagenomes using nucleotide base composition

    PubMed Central

    Saeed, Isaam; Tang, Sen-Lin; Halgamuge, Saman K.

    2012-01-01

    An approach to infer the unknown microbial population structure within a metagenome is to cluster nucleotide sequences based on common patterns in base composition, otherwise referred to as binning. When functional roles are assigned to the identified populations, a deeper understanding of microbial communities can be attained, more so than gene-centric approaches that explore overall functionality. In this study, we propose an unsupervised, model-based binning method with two clustering tiers, which uses a novel transformation of the oligonucleotide frequency-derived error gradient and GC content to generate coarse groups at the first tier of clustering; and tetranucleotide frequency to refine these groups at the secondary clustering tier. The proposed method has a demonstrated improvement over PhyloPythia, S-GSOM, TACOA and TaxSOM on all three benchmarks that were used for evaluation in this study. The proposed method is then applied to a pyrosequenced metagenomic library of mud volcano sediment sampled in southwestern Taiwan, with the inferred population structure validated against complementary sequencing of 16S ribosomal RNA marker genes. Finally, the proposed method was further validated against four publicly available metagenomes, including a highly complex Antarctic whale-fall bone sample, which was previously assumed to be too complex for binning prior to functional analysis. PMID:22180538

  1. Role of HPC in Advancing Computational Aeroelasticity

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2004-01-01

    On behalf of the High Performance Computing and Modernization Program (HPCMP) and the NASA Advanced Supercomputing Division (NAS), a study was conducted to assess the role of supercomputers in the computational aeroelasticity of aerospace vehicles. The study is mostly based on responses to a web-based questionnaire designed to capture the nuances of high performance computational aeroelasticity, particularly on parallel computers. A procedure is presented to assign a fidelity-complexity index to each application. Case studies based on major applications using HPCMP resources are presented.

  2. Evaluation of French and English MeSH Indexing Systems with a Parallel Corpus

    PubMed Central

    Névéol, Aurélie; Mork, James G.; Aronson, Alan R.; Darmoni, Stefan J.

    2005-01-01

    Objective This paper presents the evaluation of two MeSH® indexing systems for French and English on a parallel corpus. Material and methods We describe two automatic MeSH indexing systems - MTI for English, and MAIF for French. The French version of the evaluation resources has been manually indexed with MeSH keyword/qualifier pairs. This professional indexing is used as our gold standard in the evaluation of both systems on keyword retrieval. Results The English system (MTI) obtains significantly better precision and recall (78% precision and 21% recall at rank 1, vs. 37% precision and 6% recall for MAIF). Moreover, the performance of both systems can be optimised by the breakage function used by the French system (MAIF), which selects an adaptive number of descriptors for each resource indexed. Conclusion MTI achieves better performance. However, both systems have features that can benefit each other. PMID:16779103

  3. Improved image retrieval based on fuzzy colour feature vector

    NASA Astrophysics Data System (ADS)

    Ben-Ahmeida, Ahlam M.; Ben Sasi, Ahmed Y.

    2013-03-01

    Content-based image retrieval (CBIR) is an image indexing technique for retrieving images from an image database automatically based on their visual content, such as colour, texture, and shape. This paper discusses a CBIR method based on colour feature extraction and similarity checking: the query image and every image in the database are divided into pieces, the features of each piece are extracted separately, and corresponding portions are compared in order to increase retrieval accuracy. The proposed approach is based on fuzzy sets, to overcome the curse of dimensionality. The contribution of each pixel's colour is associated with all the bins in the histogram using fuzzy-set membership functions. As a result, the Fuzzy Colour Histogram (FCH) outperformed the Conventional Colour Histogram (CCH) in image retrieval: it returned results faster, because images are represented as signatures that occupy less memory, depending on the number of divisions. The results also showed that FCH is less sensitive and more robust to brightness changes than CCH, with better retrieval recall values.
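    The fuzzy-membership idea described in this abstract, each pixel contributing to several histogram bins rather than incrementing a single bin, can be sketched as below. Triangular membership functions and the equal-width bin layout are illustrative assumptions; the abstract does not specify the paper's exact membership functions.

```python
def fuzzy_histogram(values, n_bins, vmax=255.0):
    """Build a fuzzy histogram over [0, vmax]: each value contributes to
    the nearest bin centres with triangular membership weights, instead
    of incrementing a single bin as a conventional histogram does.
    Triangular memberships are an illustrative assumption."""
    centres = [(i + 0.5) * vmax / n_bins for i in range(n_bins)]
    width = vmax / n_bins
    hist = [0.0] * n_bins
    for v in values:
        for i, c in enumerate(centres):
            # Weight falls off linearly with distance from the bin centre.
            w = max(0.0, 1.0 - abs(v - c) / width)
            hist[i] += w
    return hist
```

    A value sitting exactly on a bin centre contributes wholly to that bin; a value between two centres splits its unit contribution between them, which is what makes the histogram robust to small shifts such as brightness changes.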

  4. Identification of QTL for Early Vigor and Stay-Green Conferring Tolerance to Drought in Two Connected Advanced Backcross Populations in Tropical Maize (Zea mays L.)

    PubMed Central

    Trachsel, Samuel; Sun, Dapeng; SanVicente, Felix M.; Zheng, Hongjian; Atlin, Gary N.; Suarez, Edgar Antonio; Babu, Raman; Zhang, Xuecai

    2016-01-01

    We aimed to identify quantitative trait loci (QTL) for secondary traits related to grain yield (GY) in two BC1F2:3 backcross populations (LPSpop and DTPpop) under well-watered (4 environments; WW) and drought-stressed (6; DS) conditions to facilitate breeding efforts towards drought-tolerant maize. GY reached 5.6 and 5.8 t/ha under WW in the LPSpop and the DTPpop, respectively. Under DS, grain yield was reduced by 65% (LPSpop) and 59% (DTPpop) relative to WW. GY was strongly associated with the normalized difference vegetation index (NDVI; r ranging from 0.61 to 0.96) across environmental conditions and with early flowering under drought-stressed conditions (r ranging from -0.18 to -0.25), indicative of the importance of early vigor and drought escape for GY. Of the 105 detected QTL, 53 were overdominant, indicative of strong heterosis. For 14 of the 18 detected vigor QTL, as well as for eight flowering time QTL, the trait-increasing allele was derived from CML491. Collocations of early vigor QTL with QTL for stay-green (bin 2.02, WW, LPSpop; 2.07, DS, DTPpop), the number of ears per plant (bins 2.02, 2.05, WW, LPSpop; 5.02, DS, LPSpop) and GY (bin 2.07, WW, DTPpop; 5.04, WW, LPSpop) reinforce the importance of the observed correlations. LOD scores for early vigor QTL in these bins ranged from 2.2 to 11.25, explaining 4.6% (additivity: +0.28) to 19.9% (additivity: +0.49) of the observed phenotypic variance. A strong flowering QTL was detected in bin 2.06 across populations and environmental conditions, explaining 26–31.3% of the observed phenotypic variation (LOD: 13–17; additivity: 0.1–0.6 d). Improving drought tolerance while maintaining yield potential could be achieved by combining alleles conferring early vigor from the recurrent parent with alleles advancing flowering from the donor. Additionally, bin 8.06 (DTPpop) harbored a QTL for GY under WW (additivity: 0.27 t/ha) and DS (additivity: 0.58 t/ha).
R2 ranged from 0 (DTPpop, WW) to 26.54% (LPSpop, DS) for NDVI, from 18.6% (LPSpop, WW) to 42.45% (LPSpop, DS) for anthesis, and from 0 (DTPpop, DS) to 24.83% (LPSpop, WW) for GY. Lines identified for all population-by-irrigation treatment combinations (except LPSpop, WW) out-yielded the best check by 32.5% (DTPpop, WW) to 60% (DTPpop, DS) and are immediately available for use by breeders. PMID:26999525

  5. Exact simulation of polarized light reflectance by particle deposits

    NASA Astrophysics Data System (ADS)

    Ramezan Pour, B.; Mackowski, D. W.

    2015-12-01

    The use of polarimetric light reflection measurements as a means of identifying the physical and chemical characteristics of particulate materials obviously relies on an accurate model of predicting the effects of particle size, shape, concentration, and refractive index on polarized reflection. The research examines two methods for prediction of reflection from plane-parallel layers of wavelength-sized particles. The first method is based on an exact superposition solution to Maxwell's time-harmonic wave equations for a deposit of spherical particles that are exposed to a plane incident wave. We use a FORTRAN-90 implementation of this solution (the Multiple Sphere T Matrix (MSTM) code), coupled with parallel computational platforms, to directly simulate the reflection from particle layers. The second method examined is based upon the vector radiative transport equation (RTE). Mie theory is used in our RTE model to predict the extinction coefficient, albedo, and scattering phase function of the particles, and the solution of the RTE is obtained from the adding-doubling method applied to a plane-parallel configuration. Our results show that the MSTM and RTE predictions of the Mueller matrix elements converge when the particle volume fraction in the particle layer decreases below around five percent. At higher volume fractions the RTE can yield results that, depending on the particle size and refractive index, significantly depart from the exact predictions. The particle regimes which lead to dependent scattering effects, and the application of methods to correct the vector RTE for particle interaction, will be discussed.

  6. A GIS-based vulnerability assessment of brine contamination to aquatic resources from oil and gas development in eastern Sheridan County, Montana.

    PubMed

    Preston, Todd M; Chesley-Preston, Tara L; Thamke, Joanna N

    2014-02-15

    Water (brine) co-produced with oil in the Williston Basin is some of the most saline in the nation. The Prairie Pothole Region (PPR), characterized by glacial sediments and numerous wetlands, covers the northern and eastern portion of the Williston Basin. Sheridan County, Montana, lies within the PPR and has a documented history of brine contamination. Surface water and shallow groundwater in the PPR are saline and sulfate dominated while the deeper brines are much more saline and chloride dominated. A Contamination Index (CI), defined as the ratio of chloride concentration to specific conductance in a water sample, was developed by the Montana Bureau of Mines and Geology to delineate the magnitude of brine contamination in Sheridan County. Values >0.035 indicate contamination. Recently, the U.S. Geological Survey completed a county level geographic information system (GIS)-based vulnerability assessment of brine contamination to aquatic resources in the PPR of the Williston Basin based on the age and density of oil wells, number of wetlands, and stream length per county. To validate and better define this assessment, a similar approach was applied in eastern Sheridan County at a greater level of detail (the 2.59 km² Public Land Survey System section grid) and included surficial geology. Vulnerability assessment scores were calculated for the 780 modeled sections and these scores were divided into ten equal interval bins representing similar probabilities of contamination. Two surface water and two groundwater samples were collected from the section with the greatest acreage of Federal land in each bin. Nineteen of the forty water samples, and at least one water sample from seven of the ten selected sections, had CI values indicating contamination. Additionally, CI values generally increased with increasing vulnerability assessment score, with a stronger correlation for groundwater samples (R² = 0.78) than surface water samples (R² = 0.53). 
Copyright © 2013 Elsevier B.V. All rights reserved.
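    The Contamination Index described above is a simple ratio with a fixed threshold. A minimal sketch, assuming chloride in mg/L and specific conductance in µS/cm (the abstract does not state the units, so the unit convention here is an assumption):

```python
def contamination_index(chloride, specific_conductance):
    """Contamination Index (CI): ratio of chloride concentration to
    specific conductance in a water sample. Values > 0.035 indicate
    brine contamination (threshold from the Montana Bureau of Mines
    and Geology, per the abstract)."""
    return chloride / specific_conductance

def is_contaminated(ci, threshold=0.035):
    """Apply the published CI threshold to a computed index value."""
    return ci > threshold

# Example: a chloride-rich sample versus a dilute one.
high = contamination_index(120.0, 2000.0)  # CI = 0.06
low = contamination_index(20.0, 2000.0)    # CI = 0.01
```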

  7. A GIS-based vulnerability assessment of brine contamination to aquatic resources from oil and gas development in eastern Sheridan County, Montana

    USGS Publications Warehouse

    Preston, Todd M.; Chesley-Preston, Tara L.; Thamke, Joanna N.

    2014-01-01

    Water (brine) co-produced with oil in the Williston Basin is some of the most saline in the nation. The Prairie Pothole Region (PPR), characterized by glacial sediments and numerous wetlands, covers the northern and eastern portion of the Williston Basin. Sheridan County, Montana, lies within the PPR and has a documented history of brine contamination. Surface water and shallow groundwater in the PPR are saline and sulfate dominated while the deeper brines are much more saline and chloride dominated. A Contamination Index (CI), defined as the ratio of chloride concentration to specific conductance in a water sample, was developed by the Montana Bureau of Mines and Geology to delineate the magnitude of brine contamination in Sheridan County. Values > 0.035 indicate contamination. Recently, the U.S. Geological Survey completed a county level geographic information system (GIS)-based vulnerability assessment of brine contamination to aquatic resources in the PPR of the Williston Basin based on the age and density of oil wells, number of wetlands, and stream length per county. To validate and better define this assessment, a similar approach was applied in eastern Sheridan County at a greater level of detail (the 2.59 km2 Public Land Survey System section grid) and included surficial geology. Vulnerability assessment scores were calculated for the 780 modeled sections and these scores were divided into ten equal interval bins representing similar probabilities of contamination. Two surface water and two groundwater samples were collected from the section with the greatest acreage of Federal land in each bin. Nineteen of the forty water samples, and at least one water sample from seven of the ten selected sections, had CI values indicating contamination. Additionally, CI values generally increased with increasing vulnerability assessment score, with a stronger correlation for groundwater samples (R2 = 0.78) than surface water samples (R2 = 0.53).

  8. Streamlined method for parallel identification of single domain antibodies to membrane receptors on whole cells

    PubMed Central

    Rossotti, Martín; Tabares, Sofía; Alfaya, Lucía; Leizagoyen, Carmen; Moron, Gabriel; González-Sapienza, Gualberto

    2015-01-01

    BACKGROUND Owing to their minimal size, high production yield, versatility and robustness, the recombinant variable domains (nanobodies) of camelid single-chain antibodies are valued affinity reagents for research, diagnostic, and therapeutic applications. While their preparation against purified antigens is straightforward, the generation of nanobodies to difficult targets such as multi-pass or complex membrane cell receptors remains challenging. Here we devised a platform for high-throughput identification of nanobodies to cell receptors based on the use of a biotin handle. METHODS Using a biotin-acceptor peptide tag, the in vivo biotinylation of nanobodies in 96-well culture blocks was optimized, allowing their parallel analysis by flow cytometry and ELISA, and their direct use for pull-down/MS target identification. RESULTS The potential of this strategy was demonstrated by the selection and characterization of panels of nanobodies to the Mac-1 (CD11b/CD18), MHC II, and mouse Ly-5 leukocyte common antigen (CD45) receptors, from a VHH library obtained from a llama immunized with mouse bone marrow-derived dendritic cells. By switching the addition of biotin on and off, the method also allowed epitope binning of the selected nanobodies directly on cells. CONCLUSIONS This strategy streamlines the selection of potent nanobodies to complex antigens, and the selected nanobodies constitute ready-to-use biotinylated reagents. GENERAL SIGNIFICANCE This method will accelerate the discovery of nanobodies to cell membrane receptors, which comprise the largest group of drug and analytical targets. PMID:25819371

  9. 30 CFR 57.16002 - Bins, hoppers, silos, tanks, and surge piles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bins, hoppers, silos, tanks, and surge piles... NONMETAL MINES Materials Storage and Handling § 57.16002 Bins, hoppers, silos, tanks, and surge piles. (a) Bins, hoppers, silos, tanks, and surge piles, where loose unconsolidated materials are stored, handled...

  10. 30 CFR 56.16002 - Bins, hoppers, silos, tanks, and surge piles.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bins, hoppers, silos, tanks, and surge piles... MINES Materials Storage and Handling § 56.16002 Bins, hoppers, silos, tanks, and surge piles. (a) Bins, hoppers, silos, tanks, and surge piles, where loose unconsolidated materials are stored, handled or...

  11. Pack Factor Measurements for Corn in Grain Storage Bins

    USDA-ARS?s Scientific Manuscript database

    Grain is commonly stored commercially in tall bins, which often are as deep as 35 m (114.8 ft) for tall and narrow concrete bins and about 32 m (105 ft) in diameter for large corrugated steel bins. Grain can support great pressure without crushing, but it yields somewhat to compaction under its ...

  12. 19. VIEW OF CRUDE ORE BINS FROM EAST. EAST CRUDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    19. VIEW OF CRUDE ORE BINS FROM EAST. EAST CRUDE ORE BIN IN FOREGROUND WITH DISCHARGE TO GRIZZLY AT BOTTOM OF VIEW. CONCRETE RETAINING WALL TO LEFT (SOUTH) AND BOTTOM (EAST EDGE OF EAST BIN). - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  13. Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring system.

    PubMed

    Hannan, M A; Arebey, Maher; Begum, R A; Basri, Hassan

    2011-12-01

    This paper presents an integrated system of Radio Frequency Identification (RFID) and communication technologies for solid waste bin and truck monitoring. RFID, GPS, GPRS, and GIS, along with camera technologies, have been integrated to develop an intelligent bin and truck monitoring system. A new kind of integrated theoretical framework, hardware architecture, and interface algorithm is introduced between the technologies for the successful implementation of the proposed system. In this system, the bin and truck database is developed in such a way that information such as bin and truck ID, date and time of waste collection, bin status, amount of waste, and bin and truck GPS coordinates is compiled and stored for monitoring and management activities. The results showed that real-time image processing, histogram analysis, waste estimation, and other bin information are displayed in the GUI of the monitoring system. Real-time tests and experiments showed that the developed system was stable and performed monitoring with high practicability and validity. Copyright © 2011 Elsevier Ltd. All rights reserved.
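    The database fields enumerated in this abstract (IDs, collection time, bin status, waste amount, GPS coordinates) can be sketched as a record type. The field names, types, and example values below are illustrative assumptions based only on the fields the abstract lists.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class BinCollectionRecord:
    """One monitoring record, following the fields the abstract
    enumerates; names and types are illustrative assumptions."""
    bin_id: str
    truck_id: str
    collected_at: datetime   # date and time of waste collection
    bin_status: str          # e.g. "empty", "partial", "full"
    waste_kg: float          # estimated amount of waste
    bin_lat: float           # bin GPS coordinates
    bin_lon: float
    truck_lat: float         # truck GPS coordinates at collection
    truck_lon: float

# Hypothetical record as it might be compiled and stored for monitoring.
rec = BinCollectionRecord("BIN-042", "TRK-07", datetime(2011, 6, 1, 9, 30),
                          "full", 118.5, 2.92, 101.78, 2.93, 101.77)
```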

  14. Evaluation of amplitude-based sorting algorithm to reduce lung tumor blurring in PET images using 4D NCAT phantom.

    PubMed

    Wang, Jiali; Byrne, James; Franquiz, Juan; McGoron, Anthony

    2007-08-01

    Objective: to develop and validate a PET sorting algorithm based on respiratory amplitude to correct for abnormal respiratory cycles. Methods: using the 4D NCAT phantom model, 3D PET images were simulated in lung and other structures at different times within a respiratory cycle, and noise was added. To validate the amplitude-binning algorithm, the NCAT phantom was used to simulate one case with five different respiratory periods and another case with five respiratory periods along with five respiratory amplitudes. Gated and un-gated images were compared, and the new amplitude-binning algorithm was compared with the time-binning algorithm by calculating the mean number of counts in the ROI (region of interest). Results: an average improvement of 8.87 ± 5.10% was reported for a total of 16 tumors with different sizes and different T/B (tumor-to-background) ratios using the new sorting algorithm. As both the T/B ratio and tumor size decrease, image degradation due to respiration increases. The greater benefit for smaller-diameter tumors and lower T/B ratios indicates a potential improvement in detecting more problematic tumors.
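    The amplitude-binning idea, gating PET events by respiratory amplitude rather than by time within the breathing cycle, can be sketched as follows. Equal-width amplitude bins over the observed range are an illustrative assumption; the abstract does not specify the paper's exact binning scheme.

```python
def amplitude_bins(amplitudes, n_bins):
    """Assign respiratory-amplitude samples to gate bins by amplitude
    value (equal-width bins over the observed range), rather than by
    phase/time within the cycle. This makes gating robust to irregular
    cycle lengths, since samples at the same amplitude share a bin
    regardless of when they occur."""
    lo, hi = min(amplitudes), max(amplitudes)
    width = (hi - lo) / n_bins or 1.0  # guard against a flat signal
    out = []
    for a in amplitudes:
        idx = min(int((a - lo) / width), n_bins - 1)  # clamp top edge
        out.append(idx)
    return out
```

    With amplitude binning, two breaths of different duration still place their end-inspiration samples in the same bin, which is the property that reduces tumor blurring for abnormal respiratory cycles.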

  15. Intrication temporelle et communication quantique

    NASA Astrophysics Data System (ADS)

    Bussieres, Felix

    Quantum communication is the art of transferring a quantum state from one place to another and the study of tasks that can be accomplished with it. This thesis is devoted to the development of tools and tasks for quantum communication in a real-world setting. These were implemented using an underground optical fibre link deployed in an urban environment. The technological and theoretical innovations presented here broaden the range of applications of time-bin entanglement through new methods of manipulating time-bin qubits, a novel model for characterizing sources of photon pairs, new ways of testing non-locality and the design and the first implementation of a new loss-tolerant quantum coin-flipping protocol. Manipulating time-bin qubits. A single photon is an excellent vehicle in which a qubit, the fundamental unit of quantum information, can be encoded. In particular, the time-bin encoding of photonic qubits is well suited for optical fibre transmission. Before this thesis, the applications of quantum communication based on the time-bin encoding were limited due to the lack of methods to implement arbitrary operations and measurements. We have removed this restriction by proposing the first methods to realize arbitrary deterministic operations on time-bin qubits as well as single qubit measurements in an arbitrary basis. We applied these propositions to the specific case of optical measurement-based quantum computing and showed how to implement the feedforward operations, which are essential to this model. This therefore opens new possibilities for creating an optical quantum computer, but also for other quantum communication tasks. Characterizing sources of photon pairs. Experimental quantum communication requires the creation of single photons and entangled photons. These two ingredients can be obtained from a source of photon pairs based on non-linear spontaneous processes. 
Several tasks in quantum communication require a precise knowledge of the properties of the source being used. We developed and implemented a fast and simple method to characterize a source of photon pairs. This method is well suited for a realistic setting where experimental conditions, such as channel transmittance, may fluctuate, and for which the characterization of the source has to be done in real time. Testing the non-locality of time-bin entanglement. Entanglement is a resource needed for the realization of many important tasks in quantum communication. It also allows two physical systems to be correlated in a way that cannot be explained by classical physics; this manifestation of entanglement is called non-locality. We built a source of time-bin entangled photonic qubits and characterized it with the new methods implementing arbitrary single qubit measurements that we developed. This allowed us to reveal the non-local nature of our source of entanglement in ways that were never implemented before. It also opens the door to study previously untested features of non-locality using this source. These experiments were performed in a realistic setting where quantum (non-local) correlations were observed even after transmission of one of the entangled qubits over 12.4 km of an underground optical fibre. Flipping quantum coins. Quantum coin-flipping is a quantum cryptographic primitive proposed in 1984, when the very first steps of quantum communication were being taken, in which two players alternate in sending classical and quantum information in order to generate a shared random bit. The use of quantum information is such that a potential cheater cannot force the outcome to his choice with certainty. Classically, however, one of the players can always deterministically choose the outcome. 
Unfortunately, the security of all previous quantum coin-flipping protocols is seriously compromised in the presence of losses on the transmission channel, thereby making this task impractical. We found a solution to this problem and obtained the first loss-tolerant quantum coin-flipping protocol whose security is independent of the amount of the losses. We have also experimentally demonstrated our loss-tolerant protocol using our source of time-bin entanglement combined with our arbitrary single qubit measurement methods. This experiment took place in a realistic setting where qubits travelled over an underground optical fibre link. This new task thus joins quantum key distribution as a practical application of quantum communication. Keywords. quantum communication, photonics, time-bin encoding, source of photon pairs, heralded single photon source, entanglement, non-locality, time-bin entanglement, hybrid entanglement, quantum network, quantum cryptography, quantum coin-flipping, measurement-based quantum computation, telecommunication, optical fibre, nonlinear optics.

  16. MBMC: An Effective Markov Chain Approach for Binning Metagenomic Reads from Environmental Shotgun Sequencing Projects.

    PubMed

    Wang, Ying; Hu, Haiyan; Li, Xiaoman

    2016-08-01

    Metagenomics is a next-generation omics field currently impacting postgenomic life sciences and medicine. Binning metagenomic reads is essential for the understanding of microbial function, compositions, and interactions in given environments. Despite the existence of dozens of computational methods for metagenomic read binning, it is still very challenging to bin reads. This is especially true for reads from unknown species, from species with similar abundance, and/or from low-abundance species in environmental samples. In this study, we developed a novel taxonomy-dependent and alignment-free approach called MBMC (Metagenomic Binning by Markov Chains). Different from all existing methods, MBMC bins reads by measuring the similarity of reads to the trained Markov chains for different taxa instead of directly comparing reads with known genomic sequences. By testing on more than 24 simulated and experimental datasets with species of similar abundance, species of low abundance, and/or unknown species, we report here that MBMC reliably grouped reads from different species into separate bins. Compared with four existing approaches, we demonstrated that the performance of MBMC was comparable with existing approaches when binning reads from sequenced species, and superior to existing approaches when binning reads from unknown species. MBMC is a pivotal tool for binning metagenomic reads in the current era of Big Data and postgenomic integrative biology. The MBMC software can be freely downloaded at http://hulab.ucf.edu/research/projects/metagenomics/MBMC.html .

  17. Reducing 4D CT artifacts using optimized sorting based on anatomic similarity.

    PubMed

    Johnston, Eric; Diehn, Maximilian; Murphy, James D; Loo, Billy W; Maxim, Peter G

    2011-05-01

    Four-dimensional (4D) computed tomography (CT) has been widely used as a tool to characterize respiratory motion in radiotherapy. The two most commonly used 4D CT algorithms sort images by the associated respiratory phase or displacement into a predefined number of bins, and are prone to image artifacts at transitions between bed positions. The purpose of this work is to demonstrate a method of reducing motion artifacts in 4D CT by incorporating anatomic similarity into phase or displacement based sorting protocols. Ten patient datasets were retrospectively sorted using both the displacement and phase based sorting algorithms. Conventional sorting methods allow selection of only the nearest-neighbor image in time or displacement within each bin. In our method, for each bed position either the displacement or the phase defines the center of a bin range about which several candidate images are selected. The two dimensional correlation coefficients between slices bordering the interface between adjacent couch positions are then calculated for all candidate pairings. Two slices have a high correlation if they are anatomically similar. Candidates from each bin are then selected to maximize the slice correlation over the entire data set using the Dijkstra's shortest path algorithm. To assess the reduction of artifacts, two thoracic radiation oncologists independently compared the resorted 4D datasets pairwise with conventionally sorted datasets, blinded to the sorting method, to choose which had the least motion artifacts. Agreement between reviewers was evaluated using the weighted kappa score. Anatomically based image selection resulted in 4D CT datasets with significantly reduced motion artifacts with both displacement (P = 0.0063) and phase sorting (P = 0.00022). There was good agreement between the two reviewers, with complete agreement 34 times and complete disagreement 6 times. 
Optimized sorting using anatomic similarity significantly reduces 4D CT motion artifacts compared to conventional phase or displacement based sorting. This improved sorting algorithm is a straightforward extension of the two most common 4D CT sorting algorithms.

  18. Use of RORA for Complex Ground-Water Flow Conditions

    USGS Publications Warehouse

    Rutledge, A.T.

    2004-01-01

    The RORA computer program for estimating recharge is based on a condition in which ground water flows perpendicular to the nearest stream that receives ground-water discharge. The method, therefore, does not explicitly account for the ground-water-flow component that is parallel to the stream. Hypothetical finite-difference simulations are used to demonstrate effects of complex flow conditions that consist of two components: one that is perpendicular to the stream and one that is parallel to the stream. Results of the simulations indicate that the RORA program can be used if certain constraints are applied in the estimation of the recession index, an input variable to the program. These constraints apply to a mathematical formulation based on aquifer properties, recession of ground-water levels, and recession of streamflow.

  19. An efficient indexing scheme for binary feature based biometric database

    NASA Astrophysics Data System (ADS)

    Gupta, P.; Sana, A.; Mehrotra, H.; Hwang, C. Jinshong

    2007-04-01

    The paper proposes an efficient indexing scheme for binary feature template using B+ tree. In this scheme the input image is decomposed into approximation, vertical, horizontal and diagonal coefficients using the discrete wavelet transform. The binarized approximation coefficient at second level is divided into four quadrants of equal size and Hamming distance (HD) for each quadrant with respect to sample template of all ones is measured. This HD value of each quadrant is used to generate upper and lower range values which are inserted into B+ tree. The nodes of tree at first level contain the lower and upper range values generated from HD of first quadrant. Similarly, lower and upper range values for the three quadrants are stored in the second, third and fourth level respectively. Finally leaf node contains the set of identifiers. At the time of identification, the test image is used to generate HD for four quadrants. Then the B+ tree is traversed based on the value of HD at every node and terminates to leaf nodes with set of identifiers. The feature vector for each identifier is retrieved from the particular bin of secondary memory and matched with test feature template to get top matches. The proposed scheme is implemented on ear biometric database collected at IIT Kanpur. The system is giving an overall accuracy of 95.8% at penetration rate of 34%.

  20. Data-driven optimal binning for respiratory motion management in PET.

    PubMed

    Kesner, Adam L; Meier, Joseph G; Burckhardt, Darrell D; Schwartz, Jazmin; Lynch, David A

    2018-01-01

    Respiratory gating has been used in PET imaging to reduce the amount of image blurring caused by patient motion. Optimal binning is an approach for using the motion-characterized data by binning it into a single, easy to understand/use, optimal bin. To date, optimal binning protocols have utilized externally driven motion characterization strategies that have been tuned with population-derived assumptions and parameters. In this work, we are proposing a new strategy with which to characterize motion directly from a patient's gated scan, and use that signal to create a patient/instance-specific optimal bin image. Two hundred and nineteen phase-gated FDG PET scans, acquired using data-driven gating as described previously, were used as the input for this study. For each scan, a phase-amplitude motion characterization was generated and normalized using principle component analysis. A patient-specific "optimal bin" window was derived using this characterization, via methods that mirror traditional optimal window binning strategies. The resulting optimal bin images were validated by correlating quantitative and qualitative measurements in the population of PET scans. In 53% (n = 115) of the image population, the optimal bin was determined to include 100% of the image statistics. In the remaining images, the optimal binning windows averaged 60% of the statistics and ranged between 20% and 90%. Tuning the algorithm, through a single acceptance window parameter, allowed for adjustments of the algorithm's performance in the population toward conservation of motion or reduced noise-enabling users to incorporate their definition of optimal. In the population of images that were deemed appropriate for segregation, average lesion SUV max were 7.9, 8.5, and 9.0 for nongated images, optimal bin, and gated images, respectively. 
The Pearson correlation of FWHM measurements between optimal bin images and gated images were better than with nongated images, 0.89 and 0.85, respectively. Generally, optimal bin images had better resolution than the nongated images and better noise characteristics than the gated images. We extended the concept of optimal binning to a data-driven form, updating a traditionally one-size-fits-all approach to a conformal one that supports adaptive imaging. This automated strategy was implemented easily within a large population and encapsulated motion information in an easy to use 3D image. Its simplicity and practicality may make this, or similar approaches ideal for use in clinical settings. © 2017 American Association of Physicists in Medicine.

  1. Comammox in drinking water systems.

    PubMed

    Wang, Yulin; Ma, Liping; Mao, Yanping; Jiang, Xiaotao; Xia, Yu; Yu, Ke; Li, Bing; Zhang, Tong

    2017-06-01

    The discovery of complete ammonia oxidizer (comammox) has fundamentally upended our perception of the global nitrogen cycle. Here, we reported four metagenome assembled genomes (MAGs) of comammox Nitrospira that were retrieved from metagenome datasets of tap water in Singapore (SG-bin1 and SG-bin2), Hainan province, China (HN-bin3) and Stanford, CA, USA (ST-bin4). Genes of phylogenetically distinct ammonia monooxygenase subunit A (amoA) and hydroxylamine dehydrogenase (hao) were identified in these four MAGs. Phylogenetic analysis based on ribosomal proteins, AmoA, hao and nitrite oxidoreductase (subunits nxrA and nxrB) sequences indicated their close relationships with published comammox Nitrospira. Canonical ammonia-oxidizing microbes (AOM) were also identified in the three tap water samples, ammonia-oxidizing bacteria (AOB) in Singapore's and Stanford's samples and ammonia-oxidizing archaea (AOA) in Hainan's sample. The comammox amoA-like sequences were also detected from some other drinking water systems, and even outnumbered the AOA and AOB amoA-like sequences. The findings of MAGs and the occurrences of AOM in different drinking water systems provided a significant clue that comammox are widely distributed in drinking water systems. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Spectral Prior Image Constrained Compressed Sensing (Spectral PICCS) for Photon-Counting Computed Tomography

    PubMed Central

    Yu, Zhicong; Leng, Shuai; Li, Zhoubo; McCollough, Cynthia H.

    2016-01-01

    Photon-counting computed tomography (PCCT) is an emerging imaging technique that enables multi-energy imaging with only a single scan acquisition. To enable multi-energy imaging, the detected photons corresponding to the full x-ray spectrum are divided into several subgroups of bin data that correspond to narrower energy windows. Consequently, noise in each energy bin increases compared to the full-spectrum data. This work proposes an iterative reconstruction algorithm for noise suppression in the narrower energy bins used in PCCT imaging. The algorithm is based on the framework of prior image constrained compressed sensing (PICCS) and is called spectral PICCS; it uses the full-spectrum image reconstructed using conventional filtered back-projection as the prior image. The spectral PICCS algorithm is implemented using a constrained optimization scheme with adaptive iterative step sizes such that only two tuning parameters are required in most cases. The algorithm was first evaluated using computer simulations, and then validated by both physical phantoms and in-vivo swine studies using a research PCCT system. Results from both computer-simulation and experimental studies showed substantial image noise reduction in narrow energy bins (43~73%) without sacrificing CT number accuracy or spatial resolution. PMID:27551878

  3. Voigt equivalent widths and spectral-bin single-line transmittances: Exact expansions and the MODTRAN®5 implementation

    NASA Astrophysics Data System (ADS)

    Berk, Alexander

    2013-03-01

    Exact expansions for Voigt line-shape total, line-tail and spectral bin equivalent widths and for Voigt finite spectral bin single-line transmittances have been derived in terms of optical depth dependent exponentially-scaled modified Bessel functions of integer order and optical depth independent Fourier integral coefficients. The series are convergent for the full range of Voigt line-shapes, from pure Doppler to pure Lorentzian. In the Lorentz limit, the expansion reduces to the Ladenburg and Reiche function for the total equivalent width. Analytic expressions are derived for the first 8 Fourier coefficients for pure Lorentzian lines, for pure Doppler lines and for Voigt lines with at most moderate Doppler dependence. A strong-line limit sum rule on the Fourier coefficients is enforced to define an additional Fourier coefficient and to optimize convergence of the truncated expansion. The moderate Doppler dependence scenario is applicable to and has been implemented in the MODTRAN5 atmospheric band model radiative transfer software. Finite-bin transmittances computed with the truncated expansions reduce transmittance residuals compared to the former Rodgers-Williams equivalent width based approach by ∼2 orders of magnitude.

  4. Identification of Intensity Ratio Break Points from Photon Arrival Trajectories in Ratiometric Single Molecule Spectroscopy

    PubMed Central

    Bingemann, Dieter; Allen, Rachel M.

    2012-01-01

    We describe a statistical method to analyze dual-channel photon arrival trajectories from single molecule spectroscopy model-free to identify break points in the intensity ratio. Photons are binned with a short bin size to calculate the logarithm of the intensity ratio for each bin. Stochastic photon counting noise leads to a near-normal distribution of this logarithm and the standard student t-test is used to find statistically significant changes in this quantity. In stochastic simulations we determine the significance threshold for the t-test’s p-value at a given level of confidence. We test the method’s sensitivity and accuracy indicating that the analysis reliably locates break points with significant changes in the intensity ratio with little or no error in realistic trajectories with large numbers of small change points, while still identifying a large fraction of the frequent break points with small intensity changes. Based on these results we present an approach to estimate confidence intervals for the identified break point locations and recommend a bin size to choose for the analysis. The method proves powerful and reliable in the analysis of simulated and actual data of single molecule reorientation in a glassy matrix. PMID:22837704

  5. Spectral prior image constrained compressed sensing (spectral PICCS) for photon-counting computed tomography

    NASA Astrophysics Data System (ADS)

    Yu, Zhicong; Leng, Shuai; Li, Zhoubo; McCollough, Cynthia H.

    2016-09-01

    Photon-counting computed tomography (PCCT) is an emerging imaging technique that enables multi-energy imaging with only a single scan acquisition. To enable multi-energy imaging, the detected photons corresponding to the full x-ray spectrum are divided into several subgroups of bin data that correspond to narrower energy windows. Consequently, noise in each energy bin increases compared to the full-spectrum data. This work proposes an iterative reconstruction algorithm for noise suppression in the narrower energy bins used in PCCT imaging. The algorithm is based on the framework of prior image constrained compressed sensing (PICCS) and is called spectral PICCS; it uses the full-spectrum image reconstructed using conventional filtered back-projection as the prior image. The spectral PICCS algorithm is implemented using a constrained optimization scheme with adaptive iterative step sizes such that only two tuning parameters are required in most cases. The algorithm was first evaluated using computer simulations, and then validated by both physical phantoms and in vivo swine studies using a research PCCT system. Results from both computer-simulation and experimental studies showed substantial image noise reduction in narrow energy bins (43-73%) without sacrificing CT number accuracy or spatial resolution.

  6. A 1.5-Mb cosmid contig of the CMT1A duplication/HNPP deletion critical region in 17p11.2-p12

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murakami, Tatsufumi; Lupski, J.R.

    1996-05-15

    Charcot-Marie-Tooth disease type 1A (CMT1A) is associated with a 1.5-Mb tandem duplication in chromosome 17p11.2-p12, and hereditary neuropathy with liability to pressure palsies (HNPP) is associated with a 1.5-Mb deletion at this locus. Both diseases appear to result from an altered copy number of the peripheral myelin protein-22 gene, PMP22, which maps within the critical region. To identify additional genes and characterize chromosomal elements, a 1.5-Mb cosmid contig of the CMT1A duplication/HNPP deletion critical region was assembled using a yeast artificial chromosome (YAC)-based isolation and binning strategy. Whole YAC probes were used for screening a high-density arrayed chromosome 17-specific cosmidmore » library. Selected cosmids were spotted on dot blots and assigned to bins defined by YACs. This binning of cosmids facilitated the subsequent fingerprint analysis. The 1.5-Mb region was covered by 137 cosmids with a minimum overlap set of 52 cosmids assigned to 17 bins and 9 contigs. 20 refs., 2 figs.« less

  7. Dissecting genomic hotspots underlying seed protein, oil, and sucrose content in an interspecific mapping population of soybean using high-density linkage mapping.

    PubMed

    Patil, Gunvant; Vuong, Tri D; Kale, Sandip; Valliyodan, Babu; Deshmukh, Rupesh; Zhu, Chengsong; Wu, Xiaolei; Bai, Yonghe; Yungbluth, Dennis; Lu, Fang; Kumpatla, Siva; Shannon, J Grover; Varshney, Rajeev K; Nguyen, Henry T

    2018-04-04

    The cultivated [Glycine max (L) Merr.] and wild [Glycine soja Siebold & Zucc.] soybean species comprise wide variation in seed composition traits. Compared to wild soybean, cultivated soybean contains low protein, high oil, and high sucrose. In this study, an interspecific population was derived from a cross between G. max (Williams 82) and G. soja (PI 483460B). This recombinant inbred line (RIL) population of 188 lines was sequenced at 0.3× depth. Based on 91 342 single nucleotide polymorphisms (SNPs), recombination events in RILs were defined, and a high-resolution bin map was developed (4070 bins). In addition to bin mapping, quantitative trait loci (QTL) analysis for protein, oil, and sucrose was performed using 3343 polymorphic SNPs (3K-SNP), derived from Illumina Infinium BeadChip sequencing platform. The QTL regions from both platforms were compared, and a significant concordance was observed between bin and 3K-SNP markers. Importantly, the bin map derived from next-generation sequencing technology enhanced mapping resolution (from 1325 to 50 Kb). A total of five, nine, and four QTLs were identified for protein, oil, and sucrose content, respectively, and some of the QTLs coincided with soybean domestication-related genomic loci. The major QTL for protein and oil were mapped on Chr. 20 (qPro_20) and suggested negative correlation between oil and protein. In terms of sucrose content, a novel and major QTL were identified on Chr. 8 (qSuc_08) and harbours putative genes involved in sugar transport. In addition, genome-wide association using 91 342 SNPs confirmed the genomic loci derived from QTL mapping. A QTL-based haplotype using whole-genome resequencing of 106 diverse soybean lines identified unique allelic variation in wild soybean that could be utilized to widen the genetic base in cultivated soybean. © 2018 The Authors. 
Plant Biotechnology Journal published by Society for Experimental Biology and The Association of Applied Biologists and John Wiley & Sons Ltd.

  8. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services

    PubMed Central

    Castaño-Díez, Daniel

    2017-01-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance. 
PMID:28580909

  9. The Dynamo package for tomography and subtomogram averaging: components for MATLAB, GPU computing and EC2 Amazon Web Services.

    PubMed

    Castaño-Díez, Daniel

    2017-06-01

    Dynamo is a package for the processing of tomographic data. As a tool for subtomogram averaging, it includes different alignment and classification strategies. Furthermore, its data-management module allows experiments to be organized in groups of tomograms, while offering specialized three-dimensional tomographic browsers that facilitate visualization, location of regions of interest, modelling and particle extraction in complex geometries. Here, a technical description of the package is presented, focusing on its diverse strategies for optimizing computing performance. Dynamo is built upon mbtools (middle layer toolbox), a general-purpose MATLAB library for object-oriented scientific programming specifically developed to underpin Dynamo but usable as an independent tool. Its structure intertwines a flexible MATLAB codebase with precompiled C++ functions that carry the burden of numerically intensive operations. The package can be delivered as a precompiled standalone ready for execution without a MATLAB license. Multicore parallelization on a single node is directly inherited from the high-level parallelization engine provided for MATLAB, automatically imparting a balanced workload among the threads in computationally intense tasks such as alignment and classification, but also in logistic-oriented tasks such as tomogram binning and particle extraction. Dynamo supports the use of graphical processing units (GPUs), yielding considerable speedup factors both for native Dynamo procedures (such as the numerically intensive subtomogram alignment) and procedures defined by the user through its MATLAB-based GPU library for three-dimensional operations. Cloud-based virtual computing environments supplied with a pre-installed version of Dynamo can be publicly accessed through the Amazon Elastic Compute Cloud (EC2), enabling users to rent GPU computing time on a pay-as-you-go basis, thus avoiding upfront investments in hardware and longterm software maintenance.

  10. Creating aperiodic photonic structures by synthesized Mathieu-Gauss beams

    NASA Astrophysics Data System (ADS)

    Vasiljević, Jadranka M.; Zannotti, Alessandro; Timotijević, Dejan V.; Denz, Cornelia; Savić, Dragana M. Jović

    2017-08-01

    We demonstrate a kind of aperiodic photonic structure realized using the interference of multiple Mathieu-Gauss beams. Depending on the beam configurations, their mutual distances, angles of rotation, or phase relations we are able to observe different classes of such aperiodic optically induced refractive index structures. Our experimental approach is based on the optical induction in a single parallel writing process.

  11. Two-dimensional imaging via a narrowband MIMO radar system with two perpendicular linear arrays.

    PubMed

    Wang, Dang-wei; Ma, Xiao-yan; Su, Yi

    2010-05-01

    This paper presents a system model and method for the 2-D imaging application via a narrowband multiple-input multiple-output (MIMO) radar system with two perpendicular linear arrays. Furthermore, the imaging formulation for our method is developed through a Fourier integral processing, and the parameters of antenna array including the cross-range resolution, required size, and sampling interval are also examined. Different from the spatial sequential procedure sampling the scattered echoes during multiple snapshot illuminations in inverse synthetic aperture radar (ISAR) imaging, the proposed method utilizes a spatial parallel procedure to sample the scattered echoes during a single snapshot illumination. Consequently, the complex motion compensation in ISAR imaging can be avoided. Moreover, in our array configuration, multiple narrowband spectrum-shared waveforms coded with orthogonal polyphase sequences are employed. The mainlobes of the compressed echoes from the different filter band could be located in the same range bin, and thus, the range alignment in classical ISAR imaging is not necessary. Numerical simulations based on synthetic data are provided for testing our proposed method.

  12. 14 CFR 125.183 - Carriage of cargo in passenger compartments.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... emergency landing conditions applicable to the passenger seats of the airplane in which the bin is installed... bin. (3) The bin may not impose any load on the floor or other structure of the airplane that exceeds the load limitations of that structure. (4) The bin must be attached to the seat tracks or to the...

  13. 18. VIEW OF CRUDE ORE BINS FROM WEST. WEST CRUDE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. VIEW OF CRUDE ORE BINS FROM WEST. WEST CRUDE ORE BIN AND TRESTLE FROM TWO JOHNS TRAMLINE TO SOUTH, CRUDE ORE BIN IN FOREGROUND. MACHINE SHOP IN BACKGROUND. THE TRAM TO PORTLAND PASSED TO NORTH OF MACHINE SHOP. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  14. 4. TROJAN MILL, DETAIL OF CRUDE ORE BINS FROM NORTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. TROJAN MILL, DETAIL OF CRUDE ORE BINS FROM NORTH, c. 1912. SHOWS TIMBER FRAMING UNDER CONSTRUCTION FOR EAST AND WEST CRUDE ORE BINS AT PREVIOUS LOCATION OF CRUSHER HOUSE, AND SNOW SHED PRESENT OVER SOUTH CRUDE ORE BIN WITH PHASE CHANGE IN SNOW SHED CONSTRUCTION INDICATED AT EAST END OF EAST CRUDE ORE BIN. THIS PHOTOGRAPH IS THE FIRST IMAGE OF THE MACHINE SHOP, UPPER LEFT CORNER. CREDIT JW. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  15. DNA barcoding for species identification in deep-sea clams (Mollusca: Bivalvia: Vesicomyidae).

    PubMed

    Liu, Jun; Zhang, Haibin

    2018-01-15

    Deep-sea clams (Bivalvia: Vesicomyidae) have been found in reduced environments over the world oceans, but taxonomy of this group remains confusing at species and supraspecific levels due to their high-morphological similarity and plasticity. In the present study, we collected mitochondrial COI sequences to evaluate the utility of DNA barcoding on identifying vesicomyid species. COI dataset identified 56 well-supported putative species/operational taxonomic units (OTUs), approximately covering half of the extant vesicomyid species. One species (OTU2) was first detected, and may represent a new species. Average distances between species ranged from 1.65 to 29.64%, generally higher than average intraspecific distances (0-1.41%) when excluding Pliocardia sp.10 cf. venusta (average intraspecific distance 1.91%). Local barcoding gap existed in 33 of the 35 species when comparing distances of maximum interspecific and minimum interspecific distances with two exceptions (Abyssogena southwardae and Calyptogena rectimargo-starobogatovi). The barcode index number (BIN) system determined 41 of the 56 species/OTUs, each with a unique BIN, indicating their validity. Three species were found to have two BINs, together with their high level of intraspecific variation, implying cryptic diversity within them. Although fewer 16 S sequences were collected, similar results were obtained. Nineteen putative species were determined and no overlap observed between intra- and inter-specific variation. Implications of DNA barcoding for the Vesicomyidae taxonomy were then discussed. Findings of this study will provide important evidence for taxonomic revision in this problematic clam group, and accelerate the discovery of new vesicomyid species in the future.

  16. Explicit Cloud Nucleation from Arbitrary Mixtures of Aerosol Types and Sizes Using an Ultra-Efficient In-Line Aerosol Bin Model in High-Resolution Simulations of Hurricanes

    NASA Astrophysics Data System (ADS)

    Walko, R. L.; Ashby, T.; Cotton, W. R.

    2017-12-01

    The fundamental role of atmospheric aerosols in the process of cloud droplet nucleation is well known, and there is ample evidence that the concentration, size, and chemistry of aerosols can strongly influence microphysical, thermodynamic, and ultimately dynamic properties and evolution of clouds and convective systems. With the increasing availability of observation- and model-based environmental representations of different types of anthropogenic and natural aerosols, there is increasing need for models to be able to represent which aerosols nucleate and which do not in supersaturated conditions. However, this is a very complex process that involves competition for water vapor between multiple aerosol species (chemistries) and different aerosol sizes within each species. Attempts have been made to parameterize the nucleation properties of mixtures of different aerosol species, but it is very difficult or impossible to represent all possible mixtures that may occur in practice. As part of a modeling study of the impact of anthropogenic and natural aerosols on hurricanes, we developed an ultra-efficient aerosol bin model to represent nucleation in a high-resolution atmospheric model that explicitly represents cloud- and subcloud-scale vertical motion. The bin model is activated at any time and location in a simulation where supersaturation occurs and is potentially capable of activating new cloud droplets. The bins are populated from the aerosol species that are present at the given time and location and by multiple sizes from each aerosol species according to a characteristic size distribution, and the chemistry of each species is represented by its absorption or adsorption characteristics. The bin model is integrated in time increments that are smaller than that of the atmospheric model in order to temporally resolve the peak supersaturation, which determines the total nucleated number. 
Even though on the order of 100 bins are typically utilized, this leads only to a 10 or 20% increase in overall computational cost due to the efficiency of the bin model. This method is highly versatile in that it automatically accommodates any possible number and mixture of different aerosol species. Applications of this model to simulations of Typhoon Nuri will be presented.

  17. Automating the selection of standard parallels for conic map projections

    NASA Astrophysics Data System (ADS)

    Šavriǒ, Bojan; Jenny, Bernhard

    2016-05-01

    Conic map projections are appropriate for mapping regions at medium and large scales with east-west extents at intermediate latitudes. Conic projections are appropriate for these cases because they show the mapped area with less distortion than other projections. In order to minimize the distortion of the mapped area, the two standard parallels of conic projections need to be selected carefully. Rules of thumb exist for placing the standard parallels based on the width-to-height ratio of the map. These rules of thumb are simple to apply, but do not result in maps with minimum distortion. There also exist more sophisticated methods that determine standard parallels such that distortion in the mapped area is minimized. These methods are computationally expensive and cannot be used for real-time web mapping and GIS applications where the projection is adjusted automatically to the displayed area. This article presents a polynomial model that quickly provides the standard parallels for the three most common conic map projections: the Albers equal-area, the Lambert conformal, and the equidistant conic projection. The model defines the standard parallels with polynomial expressions based on the spatial extent of the mapped area. The spatial extent is defined by the length of the mapped central meridian segment, the central latitude of the displayed area, and the width-to-height ratio of the map. The polynomial model was derived from 3825 maps-each with a different spatial extent and computationally determined standard parallels that minimize the mean scale distortion index. The resulting model is computationally simple and can be used for the automatic selection of the standard parallels of conic map projections in GIS software and web mapping applications.

  18. Design of modular control system for grain dryers

    NASA Astrophysics Data System (ADS)

    He, Gaoqing; Liu, Yanhua; Zu, Yuan

    In order to effectively control the temperature of grain drying bin, grain ,air outlet as well as the grain moisture, it designed the control system of 5HCY-35 which is based on MCU to adapt to all grains drying conditions, high drying efficiency, long life usage and less manually. The system includes: the control module of the constant temperature and the temperature difference control in drying bin, the constant temperature control of heating furnace, on-line testing of moisture, variety of grain-circulation speed control and human-computer interaction interface. Spatial curve simulation, which takes moisture as control objectives, controls the constant temperature and the temperature difference in drying bin according to preset parameter by the user or a list to reduce the grains explosive to ensure the seed germination percentage. The system can realize the intelligent control of high efficiency and various drying, the good scalability and the high quality.

  19. The Impact of Aerosols on Cloud and Precipitation Processes: Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Li, X.; Khain, A.; Simpson, S.

    2005-01-01

    Cloud microphysics are inevitable affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds, Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effect of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, a detailed spectral-bin microphysical scheme was implemented into the the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral-bim microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops), and several types of ice particles [i.e., pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail]. Each type is described by a special size distribution function containing many categories (i.e., 33 bins). Atmospheric aerosols are also described using number density size-distribution functions.

  20. 9. 5TH FLOOR, INTERIOR DETAIL TO EAST OF SOAP BIN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. 5TH FLOOR, INTERIOR DETAIL TO EAST OF SOAP BIN No. 4: UPPER SCREWS MOVED SOAP CHIPS HORIZONTALLY FROM BIN TO BIN; LOWER LEFT-AND RIGHT-HAND SCREWS MOVED CHIPS TO CHUTE LEADING TO 3RD FLOOR SOAP MILLS - Colgate & Company Jersey City Plant, Building No. B-14, 54-58 Grand Street, Jersey City, Hudson County, NJ

  1. 13. OBLIQUE VIEW OF UPPER ORE BIN, LOOKING WEST NORTHWEST. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. OBLIQUE VIEW OF UPPER ORE BIN, LOOKING WEST NORTHWEST. THIS ORE BIN WAS ADDED IN THE LATE 1930'S. IT IS TRAPAZOIDAL IN SHAPE, WIDER AT THE REAR THAN THE FRONT, AND DIVIDED INTO THREE BINS, EACH WITH ITS OWN CONTROL DOOR (SEE CA-290-15). - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  2. 3. EAGLE MILL, DETAIL OF CRUDE ORE BIN FROM NORTH, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. EAGLE MILL, DETAIL OF CRUDE ORE BIN FROM NORTH, c. 1908-10. SHOWS EXPOSED CRUSHER HOUSE IN FRONT OF (SOUTH) CRUDE ORE BIN AND SNOW SHED ADDED OVER TRAM TRACKS. NOTE LACK OF EAST OR WEST CRUDE ORE BINS. CREDIT JW. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  3. Loss of Bin1 Promotes the Propagation of Tau Pathology.

    PubMed

    Calafate, Sara; Flavin, William; Verstreken, Patrik; Moechars, Diederik

    2016-10-18

    Tau pathology propagates within synaptically connected neuronal circuits, but the underlying mechanisms are unclear. BIN1-amphiphysin2 is the second most prevalent genetic risk factor for late-onset Alzheimer's disease. In diseased brains, the BIN1-amphiphysin2 neuronal isoform is downregulated. Here, we show that lowering BIN1-amphiphysin2 levels in neurons promotes Tau pathology propagation whereas overexpression of neuronal BIN1-amphiphysin2 inhibits the process in two in vitro models. Increased Tau propagation is caused by increased endocytosis, given our finding that BIN1-amphiphysin2 negatively regulates endocytic flux. Furthermore, blocking endocytosis by inhibiting dynamin also reduces Tau pathology propagation. Using a galectin-3-binding assay, we show that internalized Tau aggregates damage the endosomal membrane, allowing internalized aggregates to leak into the cytoplasm to propagate pathology. Our work indicates that lower BIN1 levels promote the propagation of Tau pathology by efficiently increasing aggregate internalization by endocytosis and endosomal trafficking. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Mosquito larvicide BinAB revealed by de novo phasing with an X-ray laser

    PubMed Central

    Colletier, Jacques-Philippe; Sawaya, Michael R.; Gingery, Mari; Rodriguez, Jose A.; Cascio, Duilio; Brewster, Aaron S.; Michels-Clark, Tara; Hice, Robert H.; Coquelle, Nicolas; Boutet, Sébastien; Williams, Garth J.; Messerschmidt, Marc; DePonte, Daniel P.; Sierra, Raymond G.; Laksmono, Hartawan; Koglin, Jason E.; Hunter, Mark S.; Park, Hyun-Woo; Uervirojnangkoorn, Monarin; Bideshi, Dennis K.; Brunger, Axel T.; Federici, Brian A.; Sauter, Nicholas K.; Eisenberg, David S.

    2016-01-01

    Summary BinAB is a naturally occurring paracrystalline larvicide distributed worldwide to combat the devastating diseases borne by mosquitoes. These crystals are composed of homologous molecules, BinA and BinB, which play distinct roles in the multi-step intoxication process, transforming from harmless, robust crystals, to soluble protoxin heterodimers, to internalized mature toxin, and finally toxic oligomeric pores. The small size of the crystals, 50 unit cells per edge, on average, has impeded structural characterization by conventional means. Here, we report the structure of BinAB solved de novo by serial-femtosecond crystallography at an X-ray free-electron laser (XFEL). The structure reveals tyrosine and carboxylate-mediated contacts acting as pH switches to release soluble protoxin in the alkaline larval midgut. An enormous heterodimeric interface appears responsible for anchoring BinA to receptor-bound BinB for co-internalization. Remarkably, this interface is largely composed of propeptides, suggesting that proteolytic maturation would trigger dissociation of the heterodimer and progression to pore formation. PMID:27680699

  5. On the Mathematical Consequences of Binning Spike Trains.

    PubMed

    Cessac, Bruno; Le Ny, Arnaud; Löcherbach, Eva

    2017-01-01

    We initiate a mathematical analysis of hidden effects induced by binning spike trains of neurons. Assuming that the original spike train has been generated by a discrete Markov process, we show that binning generates a stochastic process that is no longer Markov but is instead a variable-length Markov chain (VLMC) with unbounded memory. We also show that the law of the binned raster is a Gibbs measure in the DLR (Dobrushin-Lanford-Ruelle) sense coined in mathematical statistical mechanics. This allows the derivation of several important consequences on statistical properties of binned spike trains. In particular, we introduce the DLR framework as a natural setting to mathematically formalize anticipation, that is, to tell "how good" our nervous system is at making predictions. In a probabilistic sense, this corresponds to condition a process by its future, and we discuss how binning may affect our conclusions on this ability. We finally comment on the possible consequences of binning in the detection of spurious phase transitions or in the detection of incorrect evidence of criticality.

  6. Kmerind: A Flexible Parallel Library for K-mer Indexing of Biological Sequences on Distributed Memory Systems.

    PubMed

    Pan, Tony; Flick, Patrick; Jain, Chirag; Liu, Yongchao; Aluru, Srinivas

    2017-10-09

    Counting and indexing fixed length substrings, or k-mers, in biological sequences is a key step in many bioinformatics tasks including genome alignment and mapping, genome assembly, and error correction. While advances in next generation sequencing technologies have dramatically reduced the cost and improved latency and throughput, few bioinformatics tools can efficiently process the datasets at the current generation rate of 1.8 terabases every 3 days. We present Kmerind, a high performance parallel k-mer indexing library for distributed memory environments. The Kmerind library provides a set of simple and consistent APIs with sequential semantics and parallel implementations that are designed to be flexible and extensible. Kmerind's k-mer counter performs similarly or better than the best existing k-mer counting tools even on shared memory systems. In a distributed memory environment, Kmerind counts k-mers in a 120 GB sequence read dataset in less than 13 seconds on 1024 Xeon CPU cores, and fully indexes their positions in approximately 17 seconds. Querying for 1% of the k-mers in these indices can be completed in 0.23 seconds and 28 seconds, respectively. Kmerind is the first k-mer indexing library for distributed memory environments, and the first extensible library for general k-mer indexing and counting. Kmerind is available at https://github.com/ParBLiSS/kmerind.

  7. 7. TROJAN MILL, EXTERIOR FROM NORTHWEST, c. 191828. ADDITIONS FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    7. TROJAN MILL, EXTERIOR FROM NORTHWEST, c. 1918-28. ADDITIONS FOR PRIMARY THICKENERS No. 1 AND No. 2, SECONDARY THICKENERS No. 1, No. 2, AND No. 3, AGITATORS, AIR COMPRESSOR, AND PORTLAND FILTERS ARE SHOWN COMPLETE. STAIR ON NORTH SIDE OF CRUDE ORE BINS IS PRESENT AS IS THE LIME BIN ADJACENT TO THE WEST CRUDE ORE BIN, AND THE SNOW SHED ADDED OVER THE TRAMLINE SERVING THE EAST AND WEST CRUDE ORE BINS. ALSO PRESENT IS THE BABBITT HOUSE AND ROCK BIN. CREDIT JW. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  8. 4D CT amplitude binning for the generation of a time-averaged 3D mid-position CT scan

    NASA Astrophysics Data System (ADS)

    Kruis, Matthijs F.; van de Kamer, Jeroen B.; Belderbos, José S. A.; Sonke, Jan-Jakob; van Herk, Marcel

    2014-09-01

    The purpose of this study was to develop a method to use amplitude binned 4D-CT (A-4D-CT) data for the construction of mid-position CT data and to compare the results with data created from phase-binned 4D-CT (P-4D-CT) data. For the latter purpose we have developed two measures which describe the regularity of the 4D data and we have tried to correlate these measures with the regularity of the external respiration signal. 4D-CT data was acquired for 27 patients on a combined PET-CT scanner. The 4D data were reconstructed twice, using phase and amplitude binning. The 4D frames of each dataset were registered using a quadrature-based optical flow method. After registration the deformation vector field was repositioned to the mid-position. Since amplitude-binned 4D data does not provide temporal information, we corrected the mid-position for the occupancy of the bins. We quantified the differences between the two mid-position datasets in terms of tumour offset and amplitude differences. Furthermore, we measured the standard deviation of the image intensity over the respiration after registration (σregistration) and the regularity of the deformation vector field (\\overline{\\Delta |J|} ) to quantify the quality of the 4D-CT data. These measures were correlated to the regularity of the external respiration signal (σsignal). The two irregularity measures, \\overline{\\Delta |J|} and σregistration, were dependent on each other (p < 0.0001, R2 = 0.80 for P-4D-CT, R2 = 0.74 for A-4D-CT). For all datasets amplitude binning resulted in lower \\overline{\\Delta |J|} and σregistration and large decreases led to visible quality improvements in the mid-position data. The quantity of artefact decrease was correlated to the irregularity of the external respiratory signal. The average tumour offset between the phase and amplitude binned mid-position without occupancy correction was 0.42 mm in the caudal direction (10.6% of the amplitude). 
After correction this was reduced to 0.16 mm in caudal direction (4.1% of the amplitude). Similar relative offsets were found at the diaphragm. We have devised a method to use amplitude binned 4D-CT to construct motion model and generate a mid-position planning CT for radiotherapy treatment purposes. We have decimated the systematic offset of this mid-position model with a motion model derived from P-4D-CT. We found that the A-4D-CT led to a decrease of local artefacts and that this decrease was correlated to the irregularity of the external respiration signal.

  9. Dark Energy Survey Year 1 Results: Weak Lensing Mass Calibration of redMaPPer Galaxy Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McClintock, T.; et al.

    We constrain the mass--richness scaling relation of redMaPPer galaxy clusters identified in the Dark Energy Survey Year 1 data using weak gravitational lensing. We split clusters intomore » $$4\\times3$$ bins of richness $$\\lambda$$ and redshift $z$ for $$\\lambda\\geq20$$ and $$0.2 \\leq z \\leq 0.65$$ and measure the mean masses of these bins using their stacked weak lensing signal. By modeling the scaling relation as $$\\langle M_{\\rm 200m}|\\lambda,z\\rangle = M_0 (\\lambda/40)^F ((1+z)/1.35)^G$$, we constrain the normalization of the scaling relation at the 5.0 per cent level as $$M_0 = [3.081 \\pm 0.075 ({\\rm stat}) \\pm 0.133 ({\\rm sys})] \\cdot 10^{14}\\ {\\rm M}_\\odot$$ at $$\\lambda=40$$ and $z=0.35$. The richness scaling index is constrained to be $$F=1.356 \\pm 0.051\\ ({\\rm stat})\\pm 0.008\\ ({\\rm sys})$$ and the redshift scaling index $$G=-0.30\\pm 0.30\\ ({\\rm stat})\\pm 0.06\\ ({\\rm sys})$$. These are the tightest measurements of the normalization and richness scaling index made to date. We use a semi-analytic covariance matrix to characterize the statistical errors in the recovered weak lensing profiles. Our analysis accounts for the following sources of systematic error: shear and photometric redshift errors, cluster miscentering, cluster member dilution of the source sample, systematic uncertainties in the modeling of the halo--mass correlation function, halo triaxiality, and projection effects. We discuss prospects for reducing this systematic error budget, which dominates the uncertainty on $$M_0$$. Our result is in excellent agreement with, but has significantly smaller uncertainties than, previous measurements in the literature, and augurs well for the power of the DES cluster survey as a tool for precision cosmology and upcoming galaxy surveys such as LSST, Euclid and WFIRST.« less

  10. SU-F-T-253: Volumetric Comparison Between 4D CT Amplitude and Phase Binning Mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, G; Ma, R; Reyngold, M

    2016-06-15

    Purpose: Motion artifact in 4DCT images can affect radiation treatment quality. To identify the most robust and accurate binning method, we compare the volume difference between targets delineated on amplitude and phase binned 4DCT scans. Methods: Varian RPM system and CT scanner were used to acquire 4DCTs of a Quasar phantom with embedded cubic and spherical objects having superior-inferior motion. Eight patients’ respiration waveforms were used to drive the phantom. The 4DCT scan was reconstructed into 10 phase and 10 amplitude bins (2 mm slices). A scan of the static phantom was also acquired. For each waveform, sphere and cubemore » volumes were generated automatically on each phase using HU thresholding. Phase (amplitude) ITVs were the union of object volumes over all phase (amplitude) binned images. The sphere and cube volumes measured in the static phantom scan were V{sub sphere}=4.19cc and V{sub cube}=27.0cc. Volume difference (VD) and dice similarity coefficient (DSC) of the ITVs, and mean volume error (MVE) defined as the average target volume percentage difference between each phase image and the static image, were used to evaluate the performance of amplitude and phase binning. Results: Averaged over the eight breathing traces, the VD and DSC of the internal target volume (ITV) between amplitude and phase binning were 3.4%±3.2% (mean ± std) and 95.9%±2.1% for sphere; 2.1%±3.3% and 98.0% ±1.5% for cube, respectively.For all waveforms, the average sphere MVE of amplitude and phase binning was 6.5% ± 5.0% and 8.2%±6.3%, respectively; and the average cube MVE of amplitude and phase binning was 5.7%±3.5%and 12.9%±8.9%, respectively. Conclusion: ITV volume and spatial overlap as assessed by VD and DSC are similar between amplitude and phase binning. Compared to phase binning, amplitude binning results in lower MVE suggesting it is less susceptible to motion artifact.« less

  11. Surface contamination of hazardous drug pharmacy storage bins and pharmacy distributor shipping containers.

    PubMed

    Redic, Kimberly A; Fang, Kayleen; Christen, Catherine; Chaffee, Bruce W

    2018-03-01

    Purpose This study was conducted to determine whether there is contamination on exterior drug packaging using shipping totes from the distributor and carousel storage bins as surrogate markers of external packaging contamination. Methods A two-part study was conducted to measure the presence of 5-fluorouracil, ifosfamide, cyclophosphamide, docetaxel and paclitaxel using surrogate markers for external drug packaging. In Part I, 10 drug distributor shipping totes designated for transport of hazardous drugs provided a snapshot view of contamination from regular use and transit in and out of the pharmacy. An additional two totes designated for transport of non-hazardous drugs served as controls. In Part II, old carousel storage bins (i.e. those in use pre-study) were wiped for snapshot view of hazardous drug contamination on storage bins. New carousel storage bins were then put into use for storage of the five tested drugs and used for routine storage and inventory maintenance activities. Carousel bins were wiped at time intervals 0, 8, 16 and 52 weeks to measure surface contamination. Results Two of the 10 hazardous shipping totes were contaminated. Three of the five-old carousel bins were contaminated with cyclophosphamide. One of the old carousel bins was also contaminated with ifosfamide. There were no detectable levels of hazardous drugs on any of the new storage bins at time 0, 8 or 16 weeks. However, at the Week 52, there was a detectable level of 5-FU present in the 5-FU carousel bin. Conclusions Contamination of the surrogate markers suggests that external packaging for hazardous drugs is contaminated, either during the manufacturing process or during routine chain of custody activities. These results demonstrate that occupational exposure may occur due to contamination from shipping totes and storage bins, and that handling practices including use of personal protective equipment is warranted.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arrangedmore » into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. 
In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.« less

  13. Fully integrated sub 100ps photon counting platform

    NASA Astrophysics Data System (ADS)

    Buckley, S. J.; Bellis, S. J.; Rosinger, P.; Jackson, J. C.

    2007-02-01

    Current state of the art high resolution counting modules, specifically designed for high timing resolution applications, are largely based on a computer card format. This has tended to result in a costly solution that is restricted to the computer it resides in. We describe a four channel timing module that interfaces to a computer via a USB port and operates with a resolution of less than 100 picoseconds. The core design of the system is an advanced field programmable gate array (FPGA) interfacing to a precision time interval measurement module, mass memory block and a high speed USB 2.0 serial data port. The FPGA design allows the module to operate in a number of modes allowing both continuous recording of photon events (time-tagging) and repetitive time binning. In time-tag mode the system reports, for each photon event, the high resolution time along with the chronological time (macro time) and the channel ID. The time-tags are uploaded in real time to a host computer via a high speed USB port allowing continuous storage to computer memory of up to 4 millions photons per second. In time-bin mode, binning is carried out with count rates up to 10 million photons per second. Each curve resides in a block of 128,000 time-bins each with a resolution programmable down to less than 100 picoseconds. Each bin has a limit of 65535 hits allowing autonomous curve recording until a bin reaches the maximum count or the system is commanded to halt. Due to the large memory storage, several curves/experiments can be stored in the system prior to uploading to the host computer for analysis. This makes this module ideal for integration into high timing resolution specific applications such as laser ranging and fluorescence lifetime imaging using techniques such as time correlated single photon counting (TCSPC).

  14. The Impact of Aerosols on Cloud and Precipitation Processes: Cloud-Resolving Model Simulations

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Khain, A.; Simpson, S.; Johnson, D.; Li, X.; Remer, L.

    2003-01-01

    Cloud microphysics are inevitably affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds. Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effects of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, two detailed spectral-bin microphysical schemes were implemented into the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral-bin microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops), and several types of ice particles [i.e.,pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail]. Each type is described by a special size distribution function containing many categories (i.e. 33 bins). Atmospheric aerosols are also described using number density size-distribution functions.A spectral-bin microphysical model is very expensive from a from a computational point of view and has only been implemented into the 2D version of the GCE at the present time. The model is tested by studying the evolution of deep tropical clouds in the west Pacific warm pool region using identical thermodynamic conditions but with different concentrations of CCN: a low "clean" concentration and a high "dirty" concentration. Besides the initial differences in aerosol concentration, preliminary results indicate that the low CCN concentration case produces rainfall at the surface sooner than the high CCN case but has less cloud water mass aloft. Because the spectral-bin model explicitly calculates and allows for the examination of both the mass and number concentration of species in each size categor, a detailed analysis of the instantaneous size spectrum can be obtained for the two cases. 
It is shown that since the low CCN case produces fewer droplets, larger sized develop due to the greater condensational and collectional growth, leading to a broader size spectrum in comparison to the high CCN case.

  15. Characterization of a hybrid energy-resolving photon-counting detector

    NASA Astrophysics Data System (ADS)

    Zang, A.; Pelzer, G.; Anton, G.; Ballabriga Sune, R.; Bisello, F.; Campbell, M.; Fauler, A.; Fiederle, M.; Llopart Cudie, X.; Ritter, I.; Tennert, F.; Wölfel, S.; Wong, W. S.; Michel, T.

    2014-03-01

    Photon-counting detectors in medical x-ray imaging provide a higher dose efficiency than integrating detectors. Even further possibilities for imaging applications arise, if the energy of each photon counted is measured, as for example K-edge-imaging or optimizing image quality by applying energy weighting factors. In this contribution, we show results of the characterization of the Dosepix detector. This hybrid photon- counting pixel detector allows energy resolved measurements with a novel concept of energy binning included in the pixel electronics. Based on ideas of the Medipix detector family, it provides three different modes of operation: An integration mode, a photon-counting mode, and an energy-binning mode. In energy-binning mode, it is possible to set 16 energy thresholds in each pixel individually to derive a binned energy spectrum in every pixel in one acquisition. The hybrid setup allows using different sensor materials. For the measurements 300 μm Si and 1 mm CdTe were used. The detector matrix consists of 16 x 16 square pixels for CdTe (16 x 12 for Si) with a pixel pitch of 220 μm. The Dosepix was originally intended for applications in the field of radiation measurement. Therefore it is not optimized towards medical imaging. The detector concept itself still promises potential as an imaging detector. We present spectra measured in one single pixel as well as in the whole pixel matrix in energy-binning mode with a conventional x-ray tube. In addition, results concerning the count rate linearity for the different sensor materials are shown as well as measurements regarding energy resolution.

  16. Recipe creation for automated defect classification with a 450mm surface scanning inspection system based on the bidirectional reflectance distribution function of native defects

    NASA Astrophysics Data System (ADS)

    Yathapu, Nithin; McGarvey, Steve; Brown, Justin; Zhivotovsky, Alexander

    2016-03-01

    This study explores the feasibility of Automated Defect Classification (ADC) with a Surface Scanning Inspection System (SSIS). The defect classification was based upon scattering sensitivity sizing curves created via modeling of the Bidirectional Reflectance Distribution Function (BRDF). The BRDF allowed for the creation of SSIS sensitivity/sizing curves based upon the optical properties of both the filmed wafer samples and the optical architecture of the SSIS. The elimination of Polystyrene Latex Sphere (PSL) and Silica deposition on both filmed and bare Silicon wafers prior to SSIS recipe creation and ADC creates a challenge for light-scattering surface-intensity-based defect binning. This study explored the theoretical maximal SSIS sensitivity based on native defect recipe creation in conjunction with the maximal sensitivity derived from BRDF modeling recipe creation. Single film and film stack wafers were inspected with recipes based upon BRDF modeling. Following SSIS recipe creation, initially targeting maximal sensitivity, selected recipes were optimized to classify defects commonly found on non-patterned wafers. The results were utilized to determine the ADC binning accuracy of the native defects and evaluate the SSIS recipe creation methodology. A statistically valid sample of defects from the final inspection results of each SSIS recipe and filmed substrate was reviewed post SSIS ADC processing on a Defect Review Scanning Electron Microscope (SEM). Native defect images were collected from each statistically valid defect bin category/size for SEM Review. The data collected from the Defect Review SEM was utilized to determine the statistical purity and accuracy of each SSIS defect classification bin. This paper explores both commercial and technical considerations of the elimination of PSL and Silica deposition as a precursor to SSIS recipe creation targeted towards ADC.
Successful integration of SSIS ADC in conjunction with recipes created via BRDF modeling has the potential to dramatically reduce the workload requirements of a Defect Review SEM and save a significant amount of capital expenditure for 450mm SSIS recipe creation.

  17. Scale-dependent Normalized Amplitude and Weak Spectral Anisotropy of Magnetic Field Fluctuations in the Solar Wind Turbulence

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Tu, Chuanyi; Marsch, Eckart; He, Jiansen; Wang, Linghua

    2016-01-01

    Turbulence in the solar wind was recently reported to be anisotropic, with the average power spectral index close to -2 when sampling parallel to the local mean magnetic field B0 and close to -5/3 when sampling perpendicular to the local B0. This result was widely considered to be observational evidence for the critical balance theory (CBT), which is derived by making the assumption that the turbulence strength is close to one. However, this basic assumption has not yet been checked carefully with observational data. Here we present for the first time the scale-dependent magnetic-field fluctuation amplitude, which is normalized by the local B0 and evaluated for both parallel and perpendicular sampling directions, using two 30-day intervals of Ulysses data. From our results, the turbulence strength is evaluated as much less than one at small scales in the parallel direction. An even stricter criterion is imposed when selecting the wavelet coefficients for a given sampling direction, so that the time stationarity of the local B0 is better ensured during the local sampling interval. The spectral index for the parallel direction is then found to be -1.75, whereas the spectral index in the perpendicular direction remains close to -1.65. These two new results, namely that the value of the turbulence strength is much less than one in the parallel direction and that the angle dependence of the spectral index is weak, cannot be explained by existing turbulence theories, like CBT, and thus will require new theoretical considerations and promote further observations of solar-wind turbulence.
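
    The spectral indices above come from fitting power-law slopes to fluctuation spectra; a minimal sketch of such a fit, with a synthetic spectrum and log-log least squares as assumptions of this illustration:

```python
import numpy as np

# Fit a power spectral index as the slope of log10 P(f) vs. log10 f;
# the frequency grid, noise level, and -5/3 input index are synthetic.
rng = np.random.default_rng(1)

f = np.logspace(-3, -1, 200)                     # frequencies (Hz)
P = f ** (-5.0 / 3.0) * np.exp(rng.normal(0, 0.05, f.size))

slope, intercept = np.polyfit(np.log10(f), np.log10(P), 1)
print(f"fitted spectral index: {slope:.3f}")     # close to -5/3
```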

  18. 16. Coke 'fines' bin at Furnace D. After delivery to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. Coke 'fines' bin at Furnace D. After delivery to the trestle bins, the coke was screened and the coke 'fines', or breeze, were transported by conveyor to the coke fines bins, where they were collected and loaded into dump trucks. The coke fines were then sold for fuel to a sinter plant in Lorain, Ohio. - Central Furnaces, 2650 Broadway, east bank of Cuyahoga River, Cleveland, Cuyahoga County, OH

  19. Qatar: Background and U.S. Relations

    DTIC Science & Technology

    2014-11-04

    Ahmed bin Abdullah bin Ziad Al Mahmoud Foreign Minister Khalid Bin Mohammed Al Attiyah Minister of Energy and Industry Mohammed bin Saleh al Sada...about voter franchise extension were resolved.5 The Advisory Council would have oversight authority over the Council of Ministers and would be able...Sunni armed groups in Syria has the potential to have a more lasting impact on the region, but has challenged the traditional Qatari preference for

  20. Toward zero waste events: Reducing contamination in waste streams with volunteer assistance.

    PubMed

    Zelenika, Ivana; Moreau, Tara; Zhao, Jiaying

    2018-06-01

    Public festivals and events generate a tremendous amount of waste, especially when they involve food and drink. To reduce contamination across waste streams, we evaluated three types of interventions at a public event. In a randomized controlled trial, we examined the impact of volunteer staff assistance, bin tops, and sample 3D items with bin tops on the amount of contamination and the weight of the organics, recyclable containers, paper, and garbage bins at a public event. The event was the annual Apple Festival held at the University of British Columbia, which was attended by around 10,000 visitors. We found that contamination was the lowest in the volunteer staff condition among all conditions. Specifically, volunteer staff reduced contamination by 96.1% on average in the organics bin, 96.9% in the recyclable containers bin, 97.0% in the paper bin, and 84.9% in the garbage bin. Our interventions did not influence the weight of the materials in the bins. This finding highlights the impact of volunteers on reducing contamination in waste streams at events, and provides suggestions and implications for waste management for event organizers to minimize contamination in all waste streams to achieve zero waste goals. Copyright © 2018. Published by Elsevier Ltd.

  1. A broadband polarization-insensitive cloak based on mode conversion

    PubMed Central

    Gu, Chendong; Xu, Yadong; Li, Sucheng; Lu, Weixin; Li, Jensen; Chen, Huanyang; Hou, Bo

    2015-01-01

    In this work, we demonstrate a one-dimensional cloak consisting of a parallel-plate waveguide with two slabs of gradient-index metamaterials attached to its metallic walls. In it, objects are hidden regardless of polarization, and good performance is observed over a broad band of frequencies. Experiments at microwave frequencies were carried out, supporting the theoretical results very well. The essential principle behind the proposed cloaking device is mode conversion, which provides a new strategy to manipulate wave propagation. PMID:26175114

  2. Evaluation of a simple method for the automatic assignment of MeSH descriptors to health resources in a French online catalogue.

    PubMed

    Névéol, Aurélie; Pereira, Suzanne; Kerdelhué, Gaetan; Dahamna, Badisse; Joubert, Michel; Darmoni, Stéfan J

    2007-01-01

    The growing number of resources to be indexed in the catalogue of online health resources in French (CISMeF) calls for curating strategies involving automatic indexing tools while maintaining the catalogue's high indexing quality standards. Our aim was to develop a simple automatic tool that retrieves MeSH descriptors from document titles. In parallel to research on advanced indexing methods, a bag-of-words tool was developed for timely inclusion in CISMeF's maintenance system. An evaluation was carried out on a corpus of 99 documents. The indexing sets retrieved by the automatic tool were compared to manual indexing based on the title and on the full text of resources. 58% of the major main headings were retrieved by the bag-of-words algorithm, and the precision on main heading retrieval was 69%. Bag-of-words indexing has effectively been used on selected resources to be included in CISMeF since August 2006. Meanwhile, ongoing work aims at improving the current version of the tool.
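
    The bag-of-words idea can be sketched as follows; the descriptor list and matching rule are illustrative assumptions, not CISMeF's actual implementation.

```python
# Match MeSH descriptors whose words all occur in a lowercased title.
mesh_terms = ["asthma", "child", "air pollution", "respiratory tract diseases"]

def index_title(title):
    tokens = set(title.lower().replace(",", " ").split())
    return [term for term in mesh_terms
            if all(word in tokens for word in term.split())]

# no stemming, so "children" does not match the descriptor "child"
print(index_title("Air pollution and asthma in children"))
# → ['asthma', 'air pollution']
```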

  3. Histogram-based adaptive gray level scaling for texture feature classification of colorectal polyps

    NASA Astrophysics Data System (ADS)

    Pomeroy, Marc; Lu, Hongbing; Pickhardt, Perry J.; Liang, Zhengrong

    2018-02-01

    Texture features have played an ever-increasing role in computer aided detection (CADe) and diagnosis (CADx) methods since their inception. Texture features are often used as a method of false positive reduction for CADe packages, especially for detecting colorectal polyps and distinguishing them from falsely tagged residual stool and healthy colon wall folds. While texture features have shown great success there, the performance of texture features for CADx has lagged behind, primarily because of the more similar features among different polyp types. In this paper, we present an adaptive gray level scaling and compare it to the conventional equal spacing of gray level bins. We use a dataset taken from computed tomography colonography patients, with 392 polyp regions of interest (ROIs) identified, each with a diagnosis confirmed through pathology. Using the histogram information from the entire ROI dataset, we generate the gray level bins such that each bin contains roughly the same number of voxels. Each image ROI is then scaled down to two different numbers of gray levels, using both an equal spacing of Hounsfield units for each bin and our adaptive method. We compute a set of texture features from the scaled images, including 30 gray level co-occurrence matrix (GLCM) features and 11 gray level run length matrix (GLRLM) features. Using a random forest classifier to distinguish between hyperplastic polyps and all others (adenomas and adenocarcinomas), we find that the adaptive gray level scaling can improve performance based on the area under the receiver operating characteristic curve by up to 4.6%.
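
    The adaptive scaling can be sketched with quantile-based bin edges; the synthetic Hounsfield-unit values and the 32-level choice are assumptions of this illustration, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(2)
voxels = rng.normal(40.0, 15.0, size=100_000)   # pooled HU values (synthetic)
n_levels = 32

# conventional equal-spacing edges vs. equal-population (quantile) edges
equal_edges = np.linspace(voxels.min(), voxels.max(), n_levels + 1)
adaptive_edges = np.quantile(voxels, np.linspace(0.0, 1.0, n_levels + 1))

# rescale one ROI to n_levels gray levels using the adaptive edges
roi = rng.normal(40.0, 15.0, size=(64, 64))
scaled = np.clip(np.searchsorted(adaptive_edges, roi, side="right") - 1,
                 0, n_levels - 1)
```

    With the adaptive edges, each bin of the pooled histogram holds roughly the same number of voxels, whereas equally spaced edges leave the tail bins nearly empty.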

  4. Improved Taxation Rate for Bin Packing Games

    NASA Astrophysics Data System (ADS)

    Kern, Walter; Qiu, Xian

    A cooperative bin packing game is an N-person game, where the player set N consists of k bins of capacity 1 each and n items of sizes a_1, …, a_n. The value of a coalition of players is defined to be the maximum total size of items in the coalition that can be packed into the bins of the coalition. We present an alternative proof of the non-emptiness of the 1/3-core for all bin packing games and show how to (slightly) improve this bound of ε = 1/3. We conjecture that the true best possible value is ε = 1/7.
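
    The coalition value defined above can be computed by brute force for tiny instances; this exponential sketch is for illustration only and is not the paper's method.

```python
from itertools import combinations

def packs(items, n_bins, bins=None):
    # exact check by backtracking: can these items fit into n_bins bins
    # of capacity 1 each?
    if bins is None:
        bins = [0.0] * n_bins
    if not items:
        return True
    first, rest = items[0], items[1:]
    for i in range(n_bins):
        if bins[i] + first <= 1.0 + 1e-9:
            bins[i] += first
            if packs(rest, n_bins, bins):
                return True
            bins[i] -= first
    return False

def coalition_value(sizes, n_bins):
    # maximum total size over all packable subsets of the items
    best = 0.0
    for r in range(len(sizes) + 1):
        for subset in combinations(sizes, r):
            if sum(subset) > best and packs(sorted(subset, reverse=True), n_bins):
                best = sum(subset)
    return best

# one bin, four items: the best packing is 0.6 + 0.4 = 1.0
print(coalition_value([0.6, 0.5, 0.4, 0.3], 1))
```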

  5. G-Bean: an ontology-graph based web tool for biomedical literature retrieval

    PubMed Central

    2014-01-01

    Background Currently, most people use NCBI's PubMed to search the MEDLINE database, an important bibliographical information source for life science and biomedical information. However, PubMed has some drawbacks that make it difficult to find relevant publications pertaining to users' individual intentions, especially for non-expert users. To ameliorate the disadvantages of PubMed, we developed G-Bean, a graph based biomedical search engine, to search biomedical articles in MEDLINE database more efficiently. Methods G-Bean addresses PubMed's limitations with three innovations: (1) Parallel document index creation: a multithreaded index creation strategy is employed to generate the document index for G-Bean in parallel; (2) Ontology-graph based query expansion: an ontology graph is constructed by merging four major UMLS (Version 2013AA) vocabularies, MeSH, SNOMEDCT, CSP and AOD, to cover all concepts in National Library of Medicine (NLM) database; a Personalized PageRank algorithm is used to compute concept relevance in this ontology graph and the Term Frequency - Inverse Document Frequency (TF-IDF) weighting scheme is used to re-rank the concepts. The top 500 ranked concepts are selected for expanding the initial query to retrieve more accurate and relevant information; (3) Retrieval and re-ranking of documents based on user's search intention: after the user selects any article from the existing search results, G-Bean analyzes user's selections to determine his/her true search intention and then uses more relevant and more specific terms to retrieve additional related articles. The new articles are presented to the user in the order of their relevance to the already selected articles. Results Performance evaluation with 106 OHSUMED benchmark queries shows that G-Bean returns more relevant results than PubMed does when using these queries to search the MEDLINE database. 
PubMed could not even return any search result for some OHSUMED queries because it failed to form the appropriate Boolean query statement automatically from the natural language query strings. G-Bean is available at http://bioinformatics.clemson.edu/G-Bean/index.php. Conclusions G-Bean addresses PubMed's limitations with ontology-graph based query expansion, automatic document indexing, and user search intention discovery. It shows significant advantages in finding relevant articles from the MEDLINE database to meet the information need of the user. PMID:25474588
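
    The TF-IDF weighting used in the concept re-ranking step can be sketched as follows; the toy corpus and the smoothed-IDF variant are illustrative assumptions, not G-Bean's actual formula.

```python
import math

corpus = [
    "heart attack myocardial infarction treatment",
    "aspirin dosage heart disease prevention",
    "influenza vaccine efficacy trial",
]
docs = [doc.split() for doc in corpus]

def tf_idf(term, doc):
    tf = doc.count(term) / len(doc)             # term frequency
    df = sum(term in d for d in docs)           # document frequency
    idf = math.log(len(docs) / (1 + df)) + 1.0  # smoothed inverse doc freq
    return tf * idf

# at equal term frequency, the rarer term gets the larger weight
print(tf_idf("myocardial", docs[0]), tf_idf("heart", docs[0]))
```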

  6. G-Bean: an ontology-graph based web tool for biomedical literature retrieval.

    PubMed

    Wang, James Z; Zhang, Yuanyuan; Dong, Liang; Li, Lin; Srimani, Pradip K; Yu, Philip S

    2014-01-01

    Currently, most people use NCBI's PubMed to search the MEDLINE database, an important bibliographical information source for life science and biomedical information. However, PubMed has some drawbacks that make it difficult to find relevant publications pertaining to users' individual intentions, especially for non-expert users. To ameliorate the disadvantages of PubMed, we developed G-Bean, a graph based biomedical search engine, to search biomedical articles in MEDLINE database more efficiently. G-Bean addresses PubMed's limitations with three innovations: (1) Parallel document index creation: a multithreaded index creation strategy is employed to generate the document index for G-Bean in parallel; (2) Ontology-graph based query expansion: an ontology graph is constructed by merging four major UMLS (Version 2013AA) vocabularies, MeSH, SNOMEDCT, CSP and AOD, to cover all concepts in National Library of Medicine (NLM) database; a Personalized PageRank algorithm is used to compute concept relevance in this ontology graph and the Term Frequency - Inverse Document Frequency (TF-IDF) weighting scheme is used to re-rank the concepts. The top 500 ranked concepts are selected for expanding the initial query to retrieve more accurate and relevant information; (3) Retrieval and re-ranking of documents based on user's search intention: after the user selects any article from the existing search results, G-Bean analyzes user's selections to determine his/her true search intention and then uses more relevant and more specific terms to retrieve additional related articles. The new articles are presented to the user in the order of their relevance to the already selected articles. Performance evaluation with 106 OHSUMED benchmark queries shows that G-Bean returns more relevant results than PubMed does when using these queries to search the MEDLINE database. 
PubMed could not even return any search result for some OHSUMED queries because it failed to form the appropriate Boolean query statement automatically from the natural language query strings. G-Bean is available at http://bioinformatics.clemson.edu/G-Bean/index.php. G-Bean addresses PubMed's limitations with ontology-graph based query expansion, automatic document indexing, and user search intention discovery. It shows significant advantages in finding relevant articles from the MEDLINE database to meet the information need of the user.

  7. An application of an optimal statistic for characterizing relative orientations

    NASA Astrophysics Data System (ADS)

    Jow, Dylan L.; Hill, Ryley; Scott, Douglas; Soler, J. D.; Martin, P. G.; Devlin, M. J.; Fissel, L. M.; Poidevin, F.

    2018-02-01

    We present the projected Rayleigh statistic (PRS), a modification of the classic Rayleigh statistic, as a test for non-uniform relative orientation between two pseudo-vector fields. In the application here, this gives an effective way of investigating whether polarization pseudo-vectors (spin-2 quantities) are preferentially parallel or perpendicular to filaments in the interstellar medium. There are other potential applications in astrophysics, for example when comparing small-scale orientations with larger-scale shear patterns. We compare the efficiency of the PRS against histogram-binning methods that have previously been used for characterizing the relative orientations of gas column density structures with the magnetic field projected on the plane of the sky. We examine data for the Vela C molecular cloud, where the column density is inferred from Herschel submillimetre observations, and the magnetic field from observations by the Balloon-borne Large-Aperture Submillimetre Telescope in the 250-, 350- and 500-μm wavelength bands. We find that the PRS has greater statistical power than approaches that bin the relative orientation angles, as it makes more efficient use of the information contained in the data. In particular, the use of the PRS to test for preferential alignment results in a higher statistical significance in each of the four Vela C regions, with the greatest increase being by a factor of 1.3 in the South-Nest region in the 250-μm band.
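
    Under its common definition, the PRS can be computed in a few lines; the synthetic angle samples below are assumptions of this sketch, not the Vela C data.

```python
import numpy as np

# Z = sum(cos 2θ_i) / sqrt(n/2) for relative-orientation angles θ_i:
# Z >> 0 indicates preferentially parallel alignment, Z << 0 perpendicular.
rng = np.random.default_rng(3)

def prs(theta):
    return np.sum(np.cos(2.0 * theta)) / np.sqrt(theta.size / 2.0)

aligned = rng.normal(0.0, 0.2, 5_000)                # clustered near parallel
uniform = rng.uniform(-np.pi / 2, np.pi / 2, 5_000)  # no preferred orientation

print(prs(aligned), prs(uniform))   # large positive vs. order unity
```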

  8. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinez-Rovira, I.; Sempau, J.; Prezado, Y.

    Purpose: Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. Methods: The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. 
    Results: Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two different media. Optimization of the simulation parameters and the use of VR techniques saved a significant amount of computation time. Finally, parallelization of the simulations improved even further the calculation time, which reached 1 day for a typical irradiation case envisaged in the forthcoming clinical trials in MRT. An example of MRT treatment in a dog's head is presented, showing the performance of the calculation engine. Conclusions: The development of the first MC-based calculation engine for the future TPS devoted to MRT has been accomplished. This will constitute an essential tool for the future clinical trials on pets at the ESRF. The MC engine is able to calculate dose distributions in micrometer-sized bins in complex voxelized CT structures in a reasonable amount of time. Minimization of the computation time by using several approaches has led to timings that are adequate for pet radiotherapy at synchrotron facilities. The next step will consist in its integration into a user-friendly graphical front-end.
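
    The voxel-grid/dose-grid decoupling can be sketched as a simple index lookup; the 1 mm CT pitch and 5 μm transverse dose-bin pitch below are illustrative assumptions, not the engine's parameters.

```python
import numpy as np

ct_voxel_mm = 1.0                                  # assumed CT voxel pitch
dose_bin_um = 5.0                                  # assumed dose-bin pitch

x_dose = np.arange(0.0, 10_000.0, dose_bin_um) / 1000.0   # bin positions (mm)
ct_index = np.floor(x_dose / ct_voxel_mm).astype(int)     # voxel per dose bin

# many micrometer dose bins read material data from one CT voxel
bins_per_voxel = np.bincount(ct_index)
print(bins_per_voxel)   # 200 dose bins per 1-mm voxel
```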

  9. Monte Carlo-based treatment planning system calculation engine for microbeam radiation therapy.

    PubMed

    Martinez-Rovira, I; Sempau, J; Prezado, Y

    2012-05-01

    Microbeam radiation therapy (MRT) is a synchrotron radiotherapy technique that explores the limits of the dose-volume effect. Preclinical studies have shown that MRT irradiations (arrays of 25-75-μm-wide microbeams spaced by 200-400 μm) are able to eradicate highly aggressive animal tumor models while healthy tissue is preserved. These promising results have provided the basis for the forthcoming clinical trials at the ID17 Biomedical Beamline of the European Synchrotron Radiation Facility (ESRF). The first step includes irradiation of pets (cats and dogs) as a milestone before treatment of human patients. Within this context, accurate dose calculations are required. The distinct features of both beam generation and irradiation geometry in MRT with respect to conventional techniques require the development of a specific MRT treatment planning system (TPS). In particular, a Monte Carlo (MC)-based calculation engine for the MRT TPS has been developed in this work. Experimental verification in heterogeneous phantoms and optimization of the computation time have also been performed. The penelope/penEasy MC code was used to compute dose distributions from a realistic beam source model. Experimental verification was carried out by means of radiochromic films placed within heterogeneous slab-phantoms. Once validation was completed, dose computations in a virtual model of a patient, reconstructed from computed tomography (CT) images, were performed. To this end, decoupling of the CT image voxel grid (a few cubic millimeter volume) to the dose bin grid, which has micrometer dimensions in the transversal direction of the microbeams, was performed. Optimization of the simulation parameters, the use of variance-reduction (VR) techniques, and other methods, such as the parallelization of the simulations, were applied in order to speed up the dose computation. 
Good agreement between MC simulations and experimental results was achieved, even at the interfaces between two different media. Optimization of the simulation parameters and the use of VR techniques saved a significant amount of computation time. Finally, parallelization of the simulations improved even further the calculation time, which reached 1 day for a typical irradiation case envisaged in the forthcoming clinical trials in MRT. An example of MRT treatment in a dog's head is presented, showing the performance of the calculation engine. The development of the first MC-based calculation engine for the future TPS devoted to MRT has been accomplished. This will constitute an essential tool for the future clinical trials on pets at the ESRF. The MC engine is able to calculate dose distributions in micrometer-sized bins in complex voxelized CT structures in a reasonable amount of time. Minimization of the computation time by using several approaches has led to timings that are adequate for pet radiotherapy at synchrotron facilities. The next step will consist in its integration into a user-friendly graphical front-end.

  10. Spectra-first feature analysis in clinical proteomics - A case study in renal cancer.

    PubMed

    Goh, Wilson Wen Bin; Wong, Limsoon

    2016-10-01

    In proteomics, useful signal may be unobserved or lost due to the lack of confident peptide-spectral matches. Selection of differential spectra, followed by associative peptide/protein mapping, may be a complementary strategy for improving sensitivity and comprehensiveness of analysis (spectra-first paradigm). This approach is complementary to the standard approach where functional analysis is performed only on the finalized protein list assembled from identified peptides from the spectra (protein-first paradigm). Based on a case study of renal cancer, we introduce a simple spectra-binning approach, MZ-bin. We demonstrate that differential spectra feature selection using MZ-bin is class-discriminative and can trace relevant proteins via spectra associative mapping. Moreover, proteins identified in this manner are more biologically coherent than those selected directly from the finalized protein list. Analysis of constituent peptides per protein reveals high expression inconsistency, suggesting that the measured protein expressions are, in fact, poor approximations of true protein levels. Furthermore, analysis at the level of constituent peptides may provide higher-resolution insight into the underlying biology: Via MZ-bin, we identified for the first time differential splice forms for the known renal cancer marker MAPT. We conclude that the spectra-first analysis paradigm is a complementary strategy to the traditional protein-first paradigm and can provide deeper level insight.
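
    The spectra-binning step can be sketched as pooling peak intensities into fixed-width m/z bins; the bin width and the toy spectrum are assumptions of this illustration, not the published MZ-bin algorithm.

```python
import numpy as np

edges = np.arange(300.0, 310.0, 1.0)      # 1-Da m/z bins (assumed width)

mz        = np.array([300.4, 300.9, 303.2, 307.7])   # peak positions
intensity = np.array([10.0,   5.0,   8.0,   2.0])    # peak intensities

# pool intensities into bins; the two peaks near 300 share the first bin
binned, _ = np.histogram(mz, bins=edges, weights=intensity)
print(binned)
```

    Binned vectors like this can be compared directly across samples to select class-discriminative bins before any peptide-spectral matching.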

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madduri, Kamesh; Wu, Kesheng

    The Resource Description Framework (RDF) is a popular data model for representing linked data sets arising from the web, as well as large scientific data repositories such as UniProt. RDF data intrinsically represents a labeled and directed multi-graph. SPARQL is a query language for RDF that expresses subgraph pattern-finding queries on this implicit multigraph in a SQL-like syntax. SPARQL queries generate complex intermediate join queries; to compute these joins efficiently, we propose a new strategy based on bitmap indexes. We store the RDF data in column-oriented structures as compressed bitmaps along with two dictionaries. This paper makes three new contributions. (i) We present an efficient parallel strategy for parsing the raw RDF data, building dictionaries of unique entities, and creating compressed bitmap indexes of the data. (ii) We utilize the constructed bitmap indexes to efficiently answer SPARQL queries, simplifying the join evaluations. (iii) To quantify the performance impact of using bitmap indexes, we compare our approach to the state-of-the-art triple-store RDF-3X. We find that our bitmap index-based approach to answering queries is up to an order of magnitude faster for a variety of SPARQL queries, on gigascale RDF data sets.
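
    The bitmap-index idea can be sketched on a toy RDF graph; the triples, the uncompressed boolean vectors, and the single-pattern query are illustrative assumptions (the paper uses compressed bitmaps and full SPARQL joins).

```python
import numpy as np

triples = [
    ("alice", "knows", "bob"),
    ("alice", "likes", "carol"),
    ("bob",   "knows", "carol"),
    ("carol", "knows", "alice"),
]

def build_bitmaps(column):
    # one bitvector per distinct value: bit r is set iff row r holds it
    bitmaps = {}
    for row, value in enumerate(column):
        bitmaps.setdefault(value, np.zeros(len(column), dtype=bool))[row] = True
    return bitmaps

subj = build_bitmaps([t[0] for t in triples])
pred = build_bitmaps([t[1] for t in triples])

# SPARQL-like pattern { alice knows ?x } answered by one bitwise AND
match = subj["alice"] & pred["knows"]
print([triples[i][2] for i in np.flatnonzero(match)])   # ['bob']
```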

  12. Frequency-bin entanglement of ultra-narrow band non-degenerate photon pairs

    NASA Astrophysics Data System (ADS)

    Rieländer, Daniel; Lenhard, Andreas; Jiménez Farías, Osvaldo; Máttar, Alejandro; Cavalcanti, Daniel; Mazzera, Margherita; Acín, Antonio; de Riedmatten, Hugues

    2018-01-01

    We demonstrate frequency-bin entanglement between ultra-narrowband photons generated by cavity enhanced spontaneous parametric down conversion. Our source generates photon pairs in widely non-degenerate discrete frequency modes, with one photon resonant with a quantum memory material based on praseodymium doped crystals and the other photon at telecom wavelengths. Correlations between the frequency modes are analyzed using phase modulators and narrowband filters before detection. We show high-visibility two photon interference between the frequency modes, allowing us to infer a coherent superposition of the modes. We develop a model describing the state that we create and use it to estimate optimal measurements to achieve a violation of the Clauser-Horne (CH) Bell inequality under realistic assumptions. With these settings we perform a Bell test and show a significant violation of the CH inequality, thus proving the entanglement of the photons. Finally we demonstrate the compatibility with a quantum memory material by using a spectral hole in the praseodymium (Pr) doped crystal as spectral filter for measuring high-visibility two-photon interference. This demonstrates the feasibility of combining frequency-bin entangled photon pairs with Pr-based solid state quantum memories.
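
    The Bell-test logic can be illustrated with a small numerical sketch; note the paper tests the CH inequality, while this sketch uses the closely related CHSH combination, and the visibility values are assumptions.

```python
import numpy as np

# Assume two-photon interference correlations E(a, b) = V * cos(a - b)
# at visibility V; at the standard settings the CHSH value is 2*sqrt(2)*V,
# violating the classical bound |S| <= 2 whenever V > 1/sqrt(2) ≈ 0.71.
def E(a, b, V):
    return V * np.cos(a - b)

def chsh(V):
    a, a2 = 0.0, np.pi / 2
    b, b2 = np.pi / 4, 3 * np.pi / 4
    return E(a, b, V) - E(a, b2, V) + E(a2, b, V) + E(a2, b2, V)

print(chsh(1.0))    # ideal: 2*sqrt(2) ≈ 2.828
print(chsh(0.85))   # still above the classical bound of 2
```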

  13. cljam: a library for handling DNA sequence alignment/map (SAM) with parallel processing.

    PubMed

    Takeuchi, Toshiki; Yamada, Atsuo; Aoki, Takashi; Nishimura, Kunihiro

    2016-01-01

    Next-generation sequencing determines DNA bases, and the results of sequence alignments are generally stored in files in the Sequence Alignment/Map (SAM) format or its compressed binary version (BAM). SAMtools is a typical tool for dealing with SAM/BAM files. It has various functions, including detection of variants, visualization of alignments, indexing, extraction of parts of the data and loci, and conversion of file formats. It is written in C and executes quickly. However, SAMtools requires additional implementation to be run in parallel, for example with OpenMP (Open Multi-Processing) libraries. Given the accumulation of next-generation sequencing data, a simple parallelization program that can support cloud and PC cluster environments is required. We have developed cljam, using the Clojure programming language, which simplifies parallel programming, to handle SAM/BAM data. Cljam runs in a Java runtime environment (e.g., Windows, Linux, Mac OS X) with Clojure, and can process and analyze SAM/BAM files in parallel and at high speed. The execution time with cljam is almost the same as with SAMtools, and the cljam code has fewer lines than other similar tools.
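
    The chunk-and-merge parallel pattern the abstract describes can be sketched as follows; the record tuples are a simplified stand-in for parsed SAM records, not cljam's or SAMtools' actual data structures.

```python
from concurrent.futures import ThreadPoolExecutor

def count_mapped(chunk):
    # SAM FLAG bit 0x4 marks a read as unmapped
    return sum(1 for _, flag in chunk if not flag & 0x4)

def parallel_count(records, n_workers=4, chunk_size=2):
    # split records into chunks, process them on a worker pool, merge
    chunks = [records[i:i + chunk_size]
              for i in range(0, len(records), chunk_size)]
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return sum(pool.map(count_mapped, chunks))

records = [("r1", 0), ("r2", 4), ("r3", 16), ("r4", 4), ("r5", 0)]
print(parallel_count(records))   # 3 mapped reads
```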

  14. The light wave flow effect in a plane-parallel layer with a quasi-zero refractive index under the action of bounded light beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gadomsky, O. N., E-mail: gadomsky@mail.ru; Shchukarev, I. A., E-mail: blacxpress@gmail.com

    2016-08-15

    It is shown that external optical radiation in the 450–1200 nm range can be efficiently transformed under the action of bounded light beams to a surface wave that propagates along the external and internal boundaries of a plane-parallel layer with a quasi-zero refractive index. Reflection regimes with complex and real angles of refraction in the layer are considered. The layer with a quasi-zero refractive index in this boundary problem is located on a highly reflective metal substrate; it is shown that the uniform low reflection of light is achieved in the wavelength range under study.

  15. 8. EAST ELEVATION OF SKIDOO MILL AND UPPER ORE BIN, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. EAST ELEVATION OF SKIDOO MILL AND UPPER ORE BIN, LOOKING WEST FROM ACCESS ROAD. THE ROADWAY ON THIS LEVEL (CENTER) WAS USED FOR UNLOADING ORE BROUGHT ON BURROWS INTO THE ORE BIN AT THE TOP LEVEL OF THE MILL. THE ORE BIN IN THE UPPER LEFT WAS ADDED LATER WHEN ORE WAS BROUGHT TO THE MILL BY TRUCKS. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  16. Effect of horizontal pick and place locations on shoulder kinematics.

    PubMed

    Könemann, R; Bosch, T; Kingma, I; Van Dieën, J H; De Looze, M P

    2015-01-01

    In this study the effects of horizontal bin locations in an order picking workstation on upper arm elevation, trunk inclination and hand use were investigated. Eight subjects moved (self-paced) light or heavy products (0.2 and 3.0 kg) from a central product bin to an inner or outer order bin (at 60 or 150 cm) on the left or right side of the workstation, while movements were recorded. The outer compared to inner bin location resulted in more upper arm elevation and trunk inclination per work cycle, both in terms of number of peak values and in terms of time integrals of angles (which is a dose measure over time). Considering the peak values and time integrals per minute (instead of per work cycle), these effects are reduced, due to the higher cycle times for outer bins. Hand use (left, right or both) was not affected by order bin locations.

  17. Distribution of hybrid entanglement and hyperentanglement with time-bin for secure quantum channel under noise via weak cross-Kerr nonlinearity.

    PubMed

    Heo, Jino; Kang, Min-Sung; Hong, Chang-Ho; Yang, Hyung-Jin; Choi, Seong-Gon; Hong, Jong-Phil

    2017-08-31

    We design schemes to generate and distribute hybrid entanglement and hyperentanglement correlated with degrees of freedom (polarization and time-bin) via weak cross-Kerr nonlinearities (XKNLs) and linear optical devices (including time-bin encoders). In our scheme, the multi-photon gates (which consist of XKNLs, quantum bus [qubus] beams, and photon-number-resolving [PNR] measurement) with time-bin encoders can generate hyperentanglement or hybrid entanglement. We can also purify the entangled (polarization) state of two photons using only linear optical devices and time-bin encoders under a noisy (bit-flip) channel. Subsequently, through local operations (using a multi-photon gate via XKNLs) and classical communications, it is possible to generate a four-qubit hybrid entangled state (polarization and time-bin). Finally, we discuss how the multi-photon gate using XKNLs, qubus beams, and PNR measurement can be reliably performed under the decoherence effect.

  18. Ropes: Support for collective operations among distributed threads

    NASA Technical Reports Server (NTRS)

    Haines, Matthew; Mehrotra, Piyush; Cronk, David

    1995-01-01

    Lightweight threads are becoming increasingly useful in supporting parallelism and asynchronous control structures in applications and language implementations. Recently, systems have been designed and implemented to support interprocessor communication between lightweight threads so that threads can be exploited in a distributed memory system. Their use, in this setting, has been largely restricted to supporting latency hiding techniques and functional parallelism within a single application. However, to execute data parallel codes independent of other threads in the system, collective operations and relative indexing among threads are required. This paper describes the design of ropes: a scoping mechanism for collective operations and relative indexing among threads. We present the design of ropes in the context of the Chant system, and provide performance results evaluating our initial design decisions.
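The scoping idea described above can be sketched in a few lines. This is an illustrative Python analogue, not the Chant API: the `Rope` class and its `reduce` method are hypothetical names, showing how a scoped group of threads with relative indexing can perform a collective reduction.

```python
# Hypothetical sketch of a "rope": a scoped group of threads supporting
# relative indexing and a collective reduction (names are illustrative).
import threading

class Rope:
    def __init__(self, size):
        self.barrier = threading.Barrier(size)
        self.slots = [None] * size

    def reduce(self, rel_index, value):
        # Each member deposits its contribution at its relative index...
        self.slots[rel_index] = value
        # ...then waits until every member of the rope has contributed.
        self.barrier.wait()
        return sum(self.slots)

def worker(rope, rel_index, results):
    # Relative indexing: each thread knows only its rank within the rope.
    results[rel_index] = rope.reduce(rel_index, rel_index + 1)

rope = Rope(4)
results = [None] * 4
threads = [threading.Thread(target=worker, args=(rope, i, results))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(results)  # every member sees the same reduced value
```

Each of the four members contributes `rel_index + 1`, so all of them observe the collective sum 1+2+3+4 = 10.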

  19. The nuisance due to the noise of automobile traffic: An investigation in the neighborhoods of freeways

    NASA Technical Reports Server (NTRS)

    Lamure, C.; Bacelon, M.

    1980-01-01

    An inquiry was held among 400 people living near freeways in an attempt to determine the characteristics of traffic noise nuisance. A nuisance index was compiled, based on the answers to a questionnaire. Nuisance expressed in these terms was then compared with the noise level measured on the most exposed side of each building. Correlation between the nuisance indexes and the average noise levels is quite good for dwellings with facades parallel to the freeway. At equal noise levels on the most exposed side, the nuisance given for these latter dwellings is lower than for others.

  20. Lithography hotspot discovery at 70nm DRAM 300mm fab: process window qualification using design base binning

    NASA Astrophysics Data System (ADS)

    Chen, Daniel; Chen, Damian; Yen, Ray; Cheng, Mingjen; Lan, Andy; Ghaskadvi, Rajesh

    2008-11-01

    Identifying hotspots--structures that limit the lithography process window--becomes increasingly important as the industry relies heavily on RET to print sub-wavelength designs. KLA-Tencor's patented Process Window Qualification (PWQ) methodology has been used for this purpose in various fabs. The PWQ methodology has three key advantages: (a) PWQ layout, to obtain the best sensitivity; (b) design-based binning, for pattern-repeater analysis; and (c) intelligent sampling, for the best DOI sampling rate. This paper evaluates two different analysis strategies for SEM review sampling successfully deployed at Inotera Memories, Inc. We propose a new approach combining the location repeater and pattern repeaters. Based on a recent case study, the new sampling flow reduces the data analysis and sampling time from 6 hours to 1.5 hours while maintaining the maximum DOI sample rate.

  1. Binning in Gaussian Kernel Regularization

    DTIC Science & Technology

    2005-04-01

    Using the OSU-SVM Matlab package, the SVM trained on 966 bins has a test classification rate comparable to the SVM trained on all 27,179 samples (versus 71.40% on 966 randomly sampled data points), while substantially reducing the computation required.
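The compression step implied above (replacing many samples with far fewer bins before training) can be sketched as follows. This is a minimal illustrative Python sketch, not the report's code: `bin_samples` is a hypothetical helper that groups 1-D samples into fixed-width bins and keeps one weighted representative (the bin mean) per bin.

```python
# Illustrative sketch (not the report's implementation): compress a training
# set by binning feature space and keeping one weighted representative per bin.
from collections import defaultdict

def bin_samples(samples, bin_width):
    """Group 1-D samples into bins; return (representative, weight) pairs."""
    bins = defaultdict(list)
    for x in samples:
        bins[int(x // bin_width)].append(x)
    # Representative = bin mean; weight = number of samples merged into it.
    return [(sum(v) / len(v), len(v)) for v in bins.values()]

samples = [0.1, 0.2, 0.3, 1.1, 1.2, 2.5]
reps = bin_samples(samples, bin_width=1.0)
print(sorted(reps))  # three weighted bins instead of six samples
```

A weighted learner trained on the `(representative, weight)` pairs then stands in for one trained on the full sample set, which is the trade-off the report quantifies.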

  2. General Khalid Bin Waleed: Understanding the 7th Century Campaign against Sassanid Persian Empire from the Perspective of Operational Art

    DTIC Science & Technology

    2012-12-06

    This monograph investigates Khalid Bin Waleed's seventh-century (AD 633-634) campaign against the Sassanid Persian Empire in Mesopotamia to trace the evidence that Bin Waleed employed characteristics and elements of operational art, establishing operational objectives, to defeat the Persian forces in Mesopotamia.

  3. Statistical power analysis of cardiovascular safety pharmacology studies in conscious rats.

    PubMed

    Bhatt, Siddhartha; Li, Dingzhou; Flynn, Declan; Wisialowski, Todd; Hemkens, Michelle; Steidl-Nichols, Jill

    2016-01-01

    Cardiovascular (CV) toxicity and related attrition are a major challenge for novel therapeutic entities, and identifying CV liability early is critical for effective derisking. CV safety pharmacology studies in rats are a valuable tool for early investigation of CV risk. A thorough understanding of the data analysis techniques and statistical power of these studies is currently lacking and is imperative for sound decision-making. Data from 24 crossover and 12 parallel-design CV telemetry rat studies were used for statistical power calculations. Average values of telemetry parameters (heart rate, blood pressure, body temperature, and activity) were logged every 60 s (from 1 h predose to 24 h post-dose) and reduced to 15 min mean values. These data were subsequently binned into super intervals for statistical analysis. A repeated-measures analysis of variance was used for statistical analysis of crossover studies, and a repeated-measures analysis of covariance was used for parallel studies. Statistical power analysis was performed to generate power curves and establish relationships between detectable CV (blood pressure and heart rate) changes and statistical power. Additionally, data from a crossover CV study with phentolamine at 4, 20 and 100 mg/kg are reported as a representative example of the data analysis methods. Phentolamine produced a CV profile characteristic of alpha-adrenergic receptor antagonism, evidenced by a dose-dependent decrease in blood pressure and reflex tachycardia. Detectable blood pressure changes at 80% statistical power for crossover studies (n=8) were 4-5 mmHg. For parallel studies (n=8), detectable changes at 80% power were 6-7 mmHg. Detectable heart rate changes for both study designs were 20-22 bpm. Based on our results, the conscious rat CV model is a sensitive tool to detect and mitigate CV risk in early safety studies. Furthermore, these results will enable informed selection of appropriate models and study designs for early-stage CV studies.
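The data-reduction pipeline described above (per-minute means reduced to 15 min means, then binned into wider super intervals) can be sketched as follows. This is an illustrative Python sketch; the function name and toy interval sizes are assumptions, not the study's code.

```python
# Minimal sketch of the telemetry reduction described above: per-minute
# means -> 15 min means -> wider "super intervals" for statistical analysis.

def reduce_means(values, group_size):
    """Average consecutive, non-overlapping groups of `group_size` samples."""
    return [sum(values[i:i + group_size]) / group_size
            for i in range(0, len(values) - group_size + 1, group_size)]

# One hour of per-minute heart-rate means (toy data: 300, 301, ..., 359).
per_minute = [300 + i for i in range(60)]
per_15min = reduce_means(per_minute, 15)      # four 15 min means
super_intervals = reduce_means(per_15min, 2)  # two 30 min super intervals
print(per_15min, super_intervals)
```

The repeated-measures analysis then operates on the super-interval means rather than the raw 60 s samples, which is what makes the power calculations tractable.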

  4. Investigating the effect of forestry on leaf-litter arthropods (Algonquin Park, Ontario, Canada)

    PubMed Central

    Boyd, Amanda; Chan, Amelia; Clout, Simonne; des Brisay, Paulson; Dolson, Sarah; Eagalle, Thanushi; Espinola, Sean; Fairweather, Aaron; Frank, Sydney; Fruetel, Christopher; Garrido Cortes, Cristina; Hall, James; Ho, Chris; Matczak, Eryk; McCubbin, Sandra; McPhee, Megan; Pare, Kate A.; Paris, Kelsie; Richard, Ellen; Roblin, Morgan; Russell, Cassandra; Snyder, Ryan; Trombley, Carolyn; Schmitt, Tyler; Vandermeer, Caitlin; Warne, Connor; Welch, Natasha; Xavier-Blower, Chelsie

    2017-01-01

    Arthropods are the most diverse taxonomic group of terrestrial eukaryotes and are sensitive to physical alterations in their environment such as those caused by forestry. With their enormous diversity and physical omnipresence, arthropods could be powerful indicators of the effects of disturbance following forestry. When arthropods have been used to measure the effects of disturbance, the total diversity of some groups is often found to increase following forestry. However, these findings are frequently derived using a coarse taxonomic grain (family or order) to accommodate various taxonomic impediments (including cryptic diversity and poorly resourced taxonomists). Our intent with this work was to determine the diversity of arthropods in and around Algonquin Park, and how this diversity was influenced by disturbance (in this case, forestry within the past 25 years). We used DNA barcode-derived diversity estimates (Barcode Index Number (BIN) richness) to avoid taxonomic impediments and as a source of genetic information with which we could conduct phylogenetic estimates of diversity (PD). Diversity patterns elucidated with PD are often, but not always, congruent with taxonomic estimates, and departures from these expectations can help clarify disturbance effects that are hidden from richness studies alone. We found that BIN richness and PD were greater in disturbed (forested) areas; however, when we controlled for the expected relationship between PD and BIN richness, we found that cut sites contained less PD than expected and that this diversity was more phylogenetically clustered than would be predicted by taxonomic richness. While disturbance may cause an evident increase in diversity, this diversity may not reflect the full evolutionary history of the assemblage within that area, and thus a subtle effect of disturbance can be found decades following forestry. PMID:28575022

  5. Noise reduction in spectral CT: reducing dose and breaking the trade-off between image noise and energy bin selection.

    PubMed

    Leng, Shuai; Yu, Lifeng; Wang, Jia; Fletcher, Joel G; Mistretta, Charles A; McCollough, Cynthia H

    2011-09-01

    Our purpose was to reduce image noise in spectral CT by exploiting data redundancies in the energy domain to allow flexible selection of the number, width, and location of the energy bins. Using a variety of spectral CT imaging methods, conventional filtered backprojection (FBP) reconstructions were performed and the resulting images were compared to those processed using a Local HighlY constrained backPRojection Reconstruction (HYPR-LR) algorithm. The mean and standard deviation of CT numbers were measured within regions of interest (ROIs), and results were compared between FBP and HYPR-LR. For these comparisons, the following spectral CT imaging methods were used: (i) numerical simulations based on a photon-counting, detector-based CT system, (ii) a photon-counting, detector-based micro CT system using rubidium and potassium chloride solutions, (iii) a commercial CT system equipped with integrating detectors utilizing tube potentials of 80, 100, 120, and 140 kV, and (iv) a clinical dual-energy CT examination. The effects of tube energy and energy bin width were evaluated as appropriate to each CT system. The mean CT number in each ROI was unchanged between FBP and HYPR-LR images for each of the spectral CT imaging scenarios, irrespective of bin width or tube potential. However, image noise, as represented by the standard deviation of CT numbers in each ROI, was reduced by 36%-76%. In all scenarios, image noise after the HYPR-LR algorithm was similar to that of composite images, which used all available photons. No difference in spatial resolution was observed between HYPR-LR processing and FBP. Dual-energy patient data processed using HYPR-LR demonstrated reduced noise in the individual low- and high-energy images, as well as in the material-specific basis images. Noise reduction can be accomplished for spectral CT by exploiting data redundancies in the energy domain. 
HYPR-LR is a robust method for reducing image noise in a variety of spectral CT imaging systems without losing spatial resolution or CT number accuracy. This method improves the flexibility to select energy bins in the manner that optimizes material identification and separation without paying the penalty of increased image noise or its corollary, increased patient dose.
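The core idea of borrowing low-noise structure from the composite image can be sketched in one dimension. This is a heavily simplified, assumed form of the HYPR-LR principle (noisy bin image multiplied by a ratio of low-pass-filtered images), not the authors' reconstruction code; `box_smooth` and `hypr_lr` are illustrative names.

```python
# Simplified 1-D sketch of the HYPR-LR idea (assumed form, not the authors'
# implementation): a noisy energy-bin image borrows low-noise structure from
# the composite image built from all photons, via a ratio of smoothed images:
#   denoised = composite * smooth(bin) / smooth(composite)

def box_smooth(signal, radius=1):
    """Simple moving-average low-pass filter with edge clamping."""
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def hypr_lr(bin_img, composite_img, radius=1):
    sb = box_smooth(bin_img, radius)
    sc = box_smooth(composite_img, radius)
    return [c * b / s for c, b, s in zip(composite_img, sb, sc)]

composite = [4.0, 4.0, 4.0, 4.0]   # all-photon image: low noise
noisy_bin = [1.5, 2.5, 2.5, 1.5]   # one energy bin: mean 2.0 plus noise
denoised = hypr_lr(noisy_bin, composite)
print(denoised)  # values pulled toward the bin mean of 2.0
```

The smoothing shrinks the spread of the bin image toward its local mean while the ratio preserves the composite's sharp structure, which is the trade-off HYPR-LR exploits.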

  6. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    DOE PAGES

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; ...

    2016-11-25

    Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography-mass spectrometry methods. We specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Two-dimensional PMF can then effectively perform three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single-compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular-level resolution on other bulk aerosol components commonly observed by the AMS.
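The binning step described above (summing mass spectra within evenly spaced retention-time bins, then flattening the bins into one row of the PMF input matrix) can be sketched as follows. This is an illustrative Python sketch; `bin_chromatogram` and its toy data are assumptions, not the TAG pipeline code.

```python
# Hedged sketch of the chromatogram binning described above: each
# chromatogram is a list of (retention_time, mass_spectrum) scans; scans are
# summed within evenly spaced retention-time bins, and the bins are flattened
# into one row of the PMF input matrix (samples x mass-spectra-resolved bins).

def bin_chromatogram(scans, n_bins, t_max, n_mz):
    width = t_max / n_bins
    bins = [[0.0] * n_mz for _ in range(n_bins)]
    for t, spectrum in scans:
        b = min(int(t // width), n_bins - 1)
        for i, intensity in enumerate(spectrum):
            bins[b][i] += intensity   # sum mass spectra within the bin
    # Flatten: the row is one sample; columns are (bin, m/z) pairs.
    return [x for spec in bins for x in spec]

# Toy chromatogram: three scans, two m/z channels, binned into two bins.
scans = [(0.5, [1.0, 0.0]), (0.9, [1.0, 2.0]), (1.5, [0.0, 3.0])]
row = bin_chromatogram(scans, n_bins=2, t_max=2.0, n_mz=2)
print(row)  # [2.0, 2.0, 0.0, 3.0]
```

Stacking one such row per sample yields the two-dimensional matrix that ordinary PMF factorizes, which is how the three-dimensional TAG data are handled without a three-way factorization.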

  7. A technique for rapid source apportionment applied to ambient organic aerosol measurements from a thermal desorption aerosol gas chromatograph (TAG)

    NASA Astrophysics Data System (ADS)

    Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; Docherty, Kenneth S.; Jimenez, Jose L.

    2016-11-01

    We present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography-mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. 
In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.

  8. De novo phasing with X-ray laser reveals mosquito larvicide BinAB structure [A potent binary mosquito larvicide revealed by de novo phasing with an X-ray free-electron laser]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colletier, Jacques -Philippe; Sawaya, Michael R.; Gingery, Mari

    BinAB is a naturally occurring paracrystalline larvicide distributed worldwide to combat the devastating diseases borne by mosquitoes. These crystals are composed of homologous molecules, BinA and BinB, which play distinct roles in the multi-step intoxication process, transforming from harmless, robust crystals, to soluble protoxin heterodimers, to internalized mature toxin, and finally to toxic oligomeric pores. The small size of the crystals, 50 unit cells per edge on average, has impeded structural characterization by conventional means. Here we report the structure of Lysinibacillus sphaericus BinAB solved de novo by serial-femtosecond crystallography at an X-ray free-electron laser. The structure reveals tyrosine- and carboxylate-mediated contacts acting as pH switches to release soluble protoxin in the alkaline larval midgut. An enormous heterodimeric interface appears to be responsible for anchoring BinA to receptor-bound BinB for co-internalization. Furthermore, this interface is largely composed of propeptides, suggesting that proteolytic maturation would trigger dissociation of the heterodimer and progression to pore formation.

  9. De novo phasing with X-ray laser reveals mosquito larvicide BinAB structure [A potent binary mosquito larvicide revealed by de novo phasing with an X-ray free-electron laser]

    DOE PAGES

    Colletier, Jacques -Philippe; Sawaya, Michael R.; Gingery, Mari; ...

    2016-09-28

    BinAB is a naturally occurring paracrystalline larvicide distributed worldwide to combat the devastating diseases borne by mosquitoes. These crystals are composed of homologous molecules, BinA and BinB, which play distinct roles in the multi-step intoxication process, transforming from harmless, robust crystals, to soluble protoxin heterodimers, to internalized mature toxin, and finally to toxic oligomeric pores. The small size of the crystals, 50 unit cells per edge on average, has impeded structural characterization by conventional means. Here we report the structure of Lysinibacillus sphaericus BinAB solved de novo by serial-femtosecond crystallography at an X-ray free-electron laser. The structure reveals tyrosine- and carboxylate-mediated contacts acting as pH switches to release soluble protoxin in the alkaline larval midgut. An enormous heterodimeric interface appears to be responsible for anchoring BinA to receptor-bound BinB for co-internalization. Furthermore, this interface is largely composed of propeptides, suggesting that proteolytic maturation would trigger dissociation of the heterodimer and progression to pore formation.

  10. Kinematical Comparison Analysis on the Discus Athletes Throwing Techniques Based on Data Project

    NASA Astrophysics Data System (ADS)

    Junming, Li; Jihe, Zhou; Ting, Long

    2017-09-01

    In the discus final of the throwing-event series of China's track and field competition in April 2015, a three-dimensional camera analytical method, an application of kinematic data processing, was used to study the female discus athletes' throwing technique. The top four throwers' final exertion actions were analyzed and the related kinematic parameters obtained. The analysis shows that: first, Lu Xiaoxin performs best in the body-twist effect when the left foot lands and in the capacity to keep the body ahead of the implement, followed by Su Xinyue and Tan Jian, with Feng Bin relatively weaker; second, our athletes' release speed needs improvement compared with the world's elite female discus athletes; third, the discus is released slightly early, with Tan Jian throwing at a reasonable angle, Feng Bin and Lu Xiaoxin at larger angles, and Su Xinyue at a smaller angle; Feng Bin has the highest release height, followed by Lu Xiaoxin and Tan Jian.

  11. LSE investigation of the thermal effect on band gap energy and thermodynamic parameters of BInGaAs/GaAs Single Quantum Well

    NASA Astrophysics Data System (ADS)

    Hidouri, T.; Saidi, F.; Maaref, H.; Rodriguez, Ph.; Auvray, L.

    2016-12-01

    In this paper, we report an experimental and theoretical study of a BInGaAs/GaAs Single Quantum Well grown by Metal Organic Chemical Vapor Deposition (MOCVD). We measured the temperature dependence of the photoluminescence (PL) peak energy over the range 10-300 K. It shows an S-shaped behavior resulting from a competition between localized and delocalized states. We simulate the peak evolution with the empirical model and a modified model. The first is limited at high PL temperatures; the second adds a correction for thermal redistribution based on the Localized State Ensemble (LSE) model. The new fit gives good agreement between theoretical and experimental data over the entire temperature range. Furthermore, we derive approximate analytical expressions and an interpretation for the entropy and enthalpy of formation of electron-hole pairs in the quaternary BInGaAs/GaAs SQW.

  12. FPGA implementation of sparse matrix algorithm for information retrieval

    NASA Astrophysics Data System (ADS)

    Bojanic, Slobodan; Jevtic, Ruzica; Nieto-Taladriz, Octavio

    2005-06-01

    Information text data retrieval requires a tremendous amount of processing time because of the size of the data and the complexity of information retrieval algorithms. In this paper a solution to this problem is proposed via hardware-supported information retrieval algorithms. Reconfigurable computing can accommodate frequent hardware modifications through its tailorable hardware and exploits parallelism for a given application through reconfigurable and flexible hardware units; the degree of parallelism can be tuned for the data. In this work we implemented the standard BLAS (basic linear algebra subprogram) sparse matrix algorithm named Compressed Sparse Row (CSR), which is shown to be more efficient in terms of storage space requirements and query-processing time than other sparse matrix algorithms for information retrieval applications. Although the inverted index algorithm has been treated as the de facto standard for information retrieval for years, an alternative approach that stores the index of a text collection in a sparse matrix structure is gaining attention. This approach performs query processing using sparse matrix-vector multiplication and, due to parallelization, achieves substantial efficiency gains over the sequential inverted index. The parallel implementations of the information retrieval kernel presented in this work target a Virtex II Field Programmable Gate Array (FPGA) board from Xilinx. A recent development in scientific applications is the use of FPGAs to achieve high-performance results. Computational results are compared to implementations on other platforms. The design achieves a high level of parallelism for the overall function while retaining highly optimised hardware within the processing unit.
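The CSR layout and the sparse matrix-vector product that drives this style of query processing can be sketched as follows. This is a plain Python sketch of the standard CSR scheme (three arrays: values, column indices, row pointers), not the paper's FPGA implementation; on hardware each row's dot product is the unit that gets parallelized.

```python
# Sketch of the Compressed Sparse Row (CSR) layout and the sparse
# matrix-vector product used for query processing: rows are documents,
# columns are terms, and a query is a (dense here, for simplicity) vector.

def csr_matvec(values, col_idx, row_ptr, x):
    """y = A @ x with A stored in CSR form (values, col_idx, row_ptr)."""
    y = []
    for r in range(len(row_ptr) - 1):
        s = 0.0
        # Nonzeros of row r live in values[row_ptr[r]:row_ptr[r+1]].
        for k in range(row_ptr[r], row_ptr[r + 1]):
            s += values[k] * x[col_idx[k]]
        y.append(s)
    return y

# A = [[1, 0, 2],
#      [0, 3, 0]]  stored as three flat arrays:
values  = [1.0, 2.0, 3.0]
col_idx = [0, 2, 1]
row_ptr = [0, 2, 3]
scores = csr_matvec(values, col_idx, row_ptr, [1.0, 1.0, 1.0])
print(scores)  # [3.0, 3.0]
```

Each entry of `scores` is one document's match score against the query vector; because the per-row loops are independent, they map naturally onto parallel hardware units.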

  13. Parallel Geospatial Data Management for Multi-Scale Environmental Data Analysis on GPUs

    NASA Astrophysics Data System (ADS)

    Wang, D.; Zhang, J.; Wei, Y.

    2013-12-01

    As the spatial and temporal resolutions of Earth observatory data and Earth system simulation outputs get higher, in-situ and/or post-processing of such large amounts of geospatial data increasingly becomes a bottleneck in scientific inquiries into Earth systems and their human impacts. Existing geospatial techniques based on outdated computing models (e.g., serial algorithms and disk-resident systems), as implemented in many commercial and open-source packages, are incapable of processing large-scale geospatial data at the desired level of performance. In this study, we have developed a set of parallel data structures and algorithms capable of utilizing the massively data-parallel computing power available on commodity Graphics Processing Units (GPUs) for a popular geospatial technique called Zonal Statistics. Given two input datasets, one representing measurements (e.g., temperature or precipitation) and the other representing polygonal zones (e.g., ecological or administrative zones), Zonal Statistics computes major statistics (or complete distribution histograms) of the measurements in all zones. Our technique has four steps, and each step can be mapped to GPU hardware by identifying its inherent data parallelism. First, the raster is divided into blocks and per-block histograms are derived. Second, the Minimum Bounding Rectangles (MBRs) of polygons are computed and spatially matched with raster blocks; matched polygon-block pairs are tested, and blocks that are either inside or intersect with polygons are identified. Third, per-block histograms are aggregated to polygons for blocks that are completely within polygons. Finally, for blocks that intersect with polygon boundaries, all raster cells within the blocks are examined using point-in-polygon tests, and cells that fall within polygons are used to update the corresponding histograms. 
As the task becomes I/O-bound after applying spatial indexing and GPU hardware acceleration, we have developed a GPU-based data compression technique by reusing our previous work on Bitplane Quadtree (BPQ-Tree) based indexing of binary bitmaps. Results show that our GPU-based parallel Zonal Statistics technique, applied to 3000+ US counties over 20+ billion NASA SRTM 30 m resolution Digital Elevation Model (DEM) raster cells, achieves impressive end-to-end runtimes: 101 seconds and 46 seconds on a low-end workstation equipped with an Nvidia GTX Titan GPU using cold and hot cache, respectively; 60-70 seconds using a single OLCF TITAN computing node; and 10-15 seconds using 8 nodes. These experimental results clearly show the potential of using high-end computing facilities for large-scale geospatial processing.
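The end result of the pipeline described above can be sketched with a toy serial version. This is an illustrative Python sketch of Zonal Statistics itself, not the authors' blocked GPU algorithm: given a measurement raster and a same-shaped zone-label raster, it accumulates a per-zone statistic (here, the mean).

```python
# Toy serial sketch of Zonal Statistics (the paper maps each stage of a
# blocked version of this computation onto GPU data parallelism): given a
# measurement raster and a zone-label raster of the same shape, accumulate
# per-zone statistics.
from collections import defaultdict

def zonal_stats(measurements, zones):
    acc = defaultdict(lambda: [0.0, 0])   # zone -> [sum, count]
    for row_m, row_z in zip(measurements, zones):
        for m, z in zip(row_m, row_z):
            acc[z][0] += m
            acc[z][1] += 1
    return {z: s / n for z, (s, n) in acc.items()}  # per-zone mean

measurements = [[1.0, 2.0],
                [3.0, 4.0]]
zones        = [[0,   0  ],
                [1,   1  ]]
means = zonal_stats(measurements, zones)
print(means)  # {0: 1.5, 1: 3.5}
```

The GPU version reaches the same answer by histogramming raster blocks in parallel and only falling back to per-cell point-in-polygon tests on blocks that straddle zone boundaries.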

  14. 37. VIEW NORTH FROM EAST CRUDE ORE BIN TO CRUSHER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    37. VIEW NORTH FROM EAST CRUDE ORE BIN TO CRUSHER ADDITION AND CRUSHED OXIDIZED ORE BIN. VISIBLE ARE DINGS MAGNETIC PULLEY (CENTER), THE 100-TON STEEL CRUSHED UNOXIDIZED ORE BIN, AND UPPER PORTION OF THE STEPHENS-ADAMSON 25 TON/HR BUCKET ELEVATOR. THE UPPER TAILINGS POND LIES BEYOND THE MILL WITH THE UPPER TAILINGS DAM UNDER THE GRAVEL ROAD IN THE UPPER RIGHT CORNER. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  15. Cosmological constraints with clustering-based redshifts

    NASA Astrophysics Data System (ADS)

    Kovetz, Ely D.; Raccanelli, Alvise; Rahman, Mubdi

    2017-07-01

    We demonstrate that observations lacking reliable redshift information, such as photometric and radio continuum surveys, can produce robust measurements of cosmological parameters when empowered by clustering-based redshift estimation. This method infers the redshift distribution from the spatial clustering of sources, using cross-correlation with a reference data set with known redshifts. Applying this method to the existing Sloan Digital Sky Survey (SDSS) photometric galaxies, and projecting to future radio continuum surveys, we show that sources can be efficiently divided into several redshift bins, increasing their ability to constrain cosmological parameters. We forecast constraints on the dark-energy equation of state and on local non-Gaussianity parameters. We explore several pertinent issues, including the trade-off between including more sources and minimizing the overlap between bins, the shot-noise limitations on binning, and the predicted performance of the method at high redshifts; most importantly, we pay special attention to possible degeneracies with the galaxy bias. Remarkably, we find that once this technique is implemented, constraints on dynamical dark energy from the SDSS imaging catalogue can be competitive with, or better than, those from the spectroscopic BOSS survey and even future planned experiments. Further, constraints on primordial non-Gaussianity from future large-sky radio-continuum surveys can outperform those from the Planck cosmic microwave background experiment and rival those from future spectroscopic galaxy surveys. The application of this method thus holds tremendous promise for cosmology.

  16. Holocaust Exposure Induced Intergenerational Effects on FKBP5 Methylation.

    PubMed

    Yehuda, Rachel; Daskalakis, Nikolaos P; Bierer, Linda M; Bader, Heather N; Klengel, Torsten; Holsboer, Florian; Binder, Elisabeth B

    2016-09-01

    The involvement of epigenetic mechanisms in intergenerational transmission of stress effects has been demonstrated in animals but not in humans. Cytosine methylation within the gene encoding FK506 binding protein 5 (FKBP5) was measured in Holocaust survivors (n = 32), their adult offspring (n = 22), and demographically comparable parent (n = 8) and offspring (n = 9) control subjects, respectively. Cytosine-phosphate-guanine sites for analysis were chosen based on their spatial proximity to the intron 7 glucocorticoid response elements. Holocaust exposure had an effect on FKBP5 methylation that was observed in exposed parents as well as in their offspring. These effects were observed at bin 3/site 6. Interestingly, in Holocaust survivors, methylation at this site was higher in comparison with control subjects, whereas in Holocaust offspring, methylation was lower. Methylation levels for exposed parents and their offspring were significantly correlated. In contrast to the findings at bin 3/site 6, offspring methylation at bin 2/sites 3 to 5 was associated with childhood physical and sexual abuse in interaction with an FKBP5 risk allele previously associated with vulnerability to the psychological consequences of childhood adversity. The findings suggest the possibility of site specificity to environmental influences, as sites in bins 3 and 2 were differentially associated with parental trauma and the offspring's own childhood trauma, respectively. FKBP5 methylation averaged across the three bins examined was associated with wake-up cortisol levels, indicating functional relevance of the methylation measures. This is the first demonstration of an association of preconception parental trauma with epigenetic alterations that is evident in both exposed parent and offspring, providing potential insight into how severe psychophysiological trauma can have intergenerational effects.

  17. Low Frequency Variants, Collapsed Based on Biological Knowledge, Uncover Complexity of Population Stratification in 1000 Genomes Project Data

    PubMed Central

    Moore, Carrie B.; Wallace, John R.; Wolfe, Daniel J.; Frase, Alex T.; Pendergrass, Sarah A.; Weiss, Kenneth M.; Ritchie, Marylyn D.

    2013-01-01

    Analyses investigating low frequency variants have the potential for explaining additional genetic heritability of many complex human traits. However, the natural frequencies of rare variation between human populations strongly confound genetic analyses. We have applied a novel collapsing method to identify biological features with low frequency variant burden differences in thirteen populations sequenced by the 1000 Genomes Project. Our flexible collapsing tool utilizes expert biological knowledge from multiple publicly available database sources to direct feature selection. Variants were collapsed according to genetically driven features, such as evolutionarily conserved regions, regulatory regions, genes, and pathways. We have conducted an extensive comparison of low frequency variant burden differences (MAF<0.03) between populations from 1000 Genomes Project Phase I data. We found that on average 26.87% of gene bins, 35.47% of intergenic bins, 42.85% of pathway bins, 14.86% of ORegAnno regulatory bins, and 5.97% of evolutionarily conserved regions show statistically significant differences in low frequency variant burden across populations from the 1000 Genomes Project. The proportion of bins with significant differences in low frequency burden depends on the ancestral similarity of the two populations compared and the types of features tested. Even closely related populations had notable differences in low frequency burden, but fewer differences than populations from different continents. Furthermore, conserved or functionally relevant regions had fewer significant differences in low frequency burden than regions under less evolutionary constraint. This degree of low frequency variant differentiation across diverse populations and feature elements highlights the critical importance of considering population stratification in the new era of DNA sequencing and low frequency variant genomic analyses. PMID:24385916
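The collapsing step described above — counting low-frequency variants per biological feature bin for each population — can be sketched as follows. The variant IDs, populations, feature names, and the toy data are all hypothetical; only the MAF<0.03 cutoff comes from the abstract.

```python
from collections import defaultdict

MAF_CUTOFF = 0.03  # low-frequency threshold used in the study

def collapse_burden(variants, feature_map):
    """Count low-frequency variants per feature bin for each population.

    variants: list of (variant_id, population, maf)
    feature_map: dict variant_id -> feature bin (gene, pathway, region, ...)
    Returns dict: (feature, population) -> burden count.
    """
    burden = defaultdict(int)
    for vid, pop, maf in variants:
        if maf < MAF_CUTOFF and vid in feature_map:
            burden[(feature_map[vid], pop)] += 1
    return burden

# Hypothetical toy data: two populations, two gene bins.
variants = [
    ("rs1", "YRI", 0.01), ("rs2", "YRI", 0.02), ("rs3", "YRI", 0.40),
    ("rs1", "CEU", 0.10), ("rs4", "CEU", 0.005),
]
feature_map = {"rs1": "GENE_A", "rs2": "GENE_A", "rs3": "GENE_A", "rs4": "GENE_B"}

b = collapse_burden(variants, feature_map)
print(b[("GENE_A", "YRI")])  # 2: only the two MAF<0.03 variants collapse into GENE_A
```

Burden counts per bin could then be compared between populations with a standard two-proportion test, which is the comparison the abstract reports.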

  18. The cyclic-di-GMP phosphodiesterase BinA negatively regulates cellulose-containing biofilms in Vibrio fischeri.

    PubMed

    Bassis, Christine M; Visick, Karen L

    2010-03-01

    Bacteria produce different types of biofilms under distinct environmental conditions. Vibrio fischeri has the capacity to produce at least two distinct types of biofilms, one that relies on the symbiosis polysaccharide Syp and another that depends upon cellulose. A key regulator of biofilm formation in bacteria is the intracellular signaling molecule cyclic diguanylate (c-di-GMP). In this study, we focused on a predicted c-di-GMP phosphodiesterase encoded by the gene binA, located directly downstream of syp, a cluster of 18 genes critical for biofilm formation and the initiation of symbiotic colonization of the squid Euprymna scolopes. Disruption or deletion of binA increased biofilm formation in culture and led to increased binding of Congo red and calcofluor, which are indicators of cellulose production. Using random transposon mutagenesis, we determined that the phenotypes of the ΔbinA mutant strain could be disrupted by insertions in genes in the bacterial cellulose biosynthesis cluster (bcs), suggesting that cellulose production is negatively regulated by BinA. Replacement of critical amino acids within the conserved EAL residues of the EAL domain disrupted BinA activity, and deletion of binA increased c-di-GMP levels in the cell. Together, these data support the hypotheses that BinA functions as a phosphodiesterase and that c-di-GMP activates cellulose biosynthesis. Finally, overexpression of the syp regulator sypG induced binA expression. Thus, this work reveals a mechanism by which V. fischeri inhibits cellulose-dependent biofilm formation and suggests that the production of two different polysaccharides may be coordinated through the action of the cellulose inhibitor BinA.

  19. Sensitivity of a Cloud-Resolving Model to the Bulk and Explicit Bin Microphysical Schemes. Part 1; Validations with a PRE-STORM Case

    NASA Technical Reports Server (NTRS)

    Li, Xiao-Wen; Tao, Wei-Kuo; Khain, Alexander P.; Simpson, Joanne; Johnson, Daniel E.

    2004-01-01

    A cloud-resolving model is used to study the sensitivities of two different microphysical schemes, one a bulk type and the other an explicit bin scheme, in simulating a mid-latitude squall line case (PRE-STORM, June 10-11, 1985). Simulations using the different microphysical schemes are compared with each other and with the observations. Both the bulk and bin models reproduce the general features of the developing and mature stages of the system. The leading convective zone, the trailing stratiform region, the horizontal wind flow patterns, the pressure perturbation associated with the storm dynamics, and the cool pool in front of the system all agree well with the observations. Both the observations and the bulk scheme simulation serve as validations for the newly incorporated bin scheme. However, it is also shown that the bulk and bin simulations have distinct differences, most notably in the stratiform region. Weak convective cells exist in the stratiform region in the bulk simulation, but not in the bin simulation. These weak convective cells in the stratiform region are remnants of previous, stronger convection at the leading edge of the system. The bin simulation, on the other hand, has a horizontally homogeneous stratiform cloud structure, which agrees better with the observations. Preliminary examinations of the downdraft core strength, the potential temperature perturbation, and the evaporative cooling rate show that the differences between the bulk and bin models are due mainly to the stronger low-level evaporative cooling in the convective zone simulated in the bulk model. Further quantitative analysis and sensitivity tests for this case using both the bulk and bin models will be presented in a companion paper.

  20. DETAIL VIEW OF LOWER TRAM TERMINAL, SECONDARY ORE BIN, CRUSHER ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL VIEW OF LOWER TRAM TERMINAL, SECONDARY ORE BIN, CRUSHER FOUNDATION, AND BALL MILL FOUNDATIONS, LOOKING NORTH NORTHWEST. ORE FROM THE MINES WAS DUMPED FROM THE TRAM BUCKETS INTO THE PRIMARY ORE BIN UNDER THE TRAM TERMINAL. A SLIDING CONTROL DOOR INTRODUCED THE ORE INTO THE JAW CRUSHER (FOUNDATIONS, CENTER). THE CRUSHED ORE WAS THEN CONVEYED INTO THE SECONDARY ORE BIN AT CENTER LEFT. A HOLE IN THE FLOOR OF THE ORE BIN PASSED ORE ONTO ANOTHER CONVEYOR THAT BROUGHT IT OUT TO THE BALL MILL (FOUNDATIONS, CENTER BOTTOM). THIS SYSTEM IS MOST LIKELY NOT THE ORIGINAL SET UP, PROBABLY INSTALLED IN THE MINE'S LAST OCCUPATION IN THE EARLY 1940s. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  1. Long-range barcode labeling-sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Feng; Zhang, Tao; Singh, Kanwar K.

    Methods for sequencing single large DNA molecules by clonal multiple displacement amplification using barcoded primers. Sequences are binned based on barcode sequences and sequenced using a microdroplet-based method for sequencing large polynucleotide templates to enable assembly of haplotype-resolved complex genomes and metagenomes.
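The binning step this record describes — grouping sequence reads by their barcode so that each bin represents one original long molecule — can be sketched in a few lines. The fixed-length leading-barcode layout, the read strings, and the barcode length are all illustrative assumptions, not details from the patent record.

```python
from collections import defaultdict

def bin_reads_by_barcode(reads, barcode_len=8):
    """Group reads into bins keyed by their leading barcode sequence.

    reads: iterable of DNA strings whose first `barcode_len` bases are assumed
    to be the barcode attached during clonal amplification.
    Returns dict: barcode -> list of insert sequences (barcode stripped).
    """
    bins = defaultdict(list)
    for read in reads:
        barcode, insert = read[:barcode_len], read[barcode_len:]
        bins[barcode].append(insert)
    return bins

# Toy reads: the first two share a barcode, i.e. came from the same molecule.
reads = ["AAAACCCCGGTTAG", "AAAACCCCTTGGCA", "TTTTGGGGACGTAC"]
bins = bin_reads_by_barcode(reads)
print(sorted(bins))      # two distinct barcode bins
print(bins["AAAACCCC"])  # both inserts from the first molecule's bin
```

Each per-barcode bin would then be assembled independently, which is what enables the haplotype-resolved assembly the record mentions.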

  2. 25. DETAIL OF STRUCTURAL TIMBERS, ORE BIN, AND STAIRWAY TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. DETAIL OF STRUCTURAL TIMBERS, ORE BIN, AND STAIRWAY TO TOP FLOOR OF MILL, LOOKING SOUTH FROM SECOND FLOOR OF MILL. PORTION OF ORE BIN ON RIGHT, STAIRS ON LEFT. - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  3. 14. Interior view, grain tanks (bins). Barrel view of tunnel ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. Interior view, grain tanks (bins). Barrel view of tunnel for load-out belt conveyor system located below tanks. Square, numbered spouts gravity-feed grain from overhead bins onto belt. - Saint Anthony Elevator No. 3, 620 Malcom Avenue, Southeast, Minneapolis, Hennepin County, MN

  4. Spectral CT Reconstruction with Image Sparsity and Spectral Mean

    PubMed Central

    Zhang, Yi; Xi, Yan; Yang, Qingsong; Cong, Wenxiang; Zhou, Jiliu

    2017-01-01

    Photon-counting detectors can acquire x-ray intensity data in different energy bins. The signal-to-noise ratio of the resultant raw data in each energy bin is generally low due to the narrow bin width and quantum noise. To address this problem, here we propose an image reconstruction approach for spectral CT that simultaneously reconstructs x-ray attenuation coefficients in all the energy bins. Because the measured spectral data are highly correlated among the x-ray energy bins, the intra-image sparsity and inter-image similarity are important prior knowledge for image reconstruction. Inspired by this observation, the total variation (TV) and spectral mean (SM) measures are combined to improve the quality of reconstructed images. For this purpose, a linear mapping function is used to minimize image differences between energy bins. The split Bregman technique is applied to perform image reconstruction. Our numerical and experimental results show that the proposed algorithms outperform competing iterative algorithms in this context. PMID:29034267
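The combined regularizer can be illustrated with a toy objective: per-bin total variation plus a term penalizing deviation of each energy-bin image from the spectral mean. This is a deliberately simplified stand-in — the paper uses a learned linear mapping between bins and minimizes the objective inside a split Bregman loop, neither of which is reproduced here, and the weights are hypothetical.

```python
import numpy as np

def tv(img):
    """Anisotropic total variation of a 2-D image (sum of |forward differences|)."""
    return np.abs(np.diff(img, axis=0)).sum() + np.abs(np.diff(img, axis=1)).sum()

def spectral_penalty(bins_stack, lam_tv=1.0, lam_sm=1.0):
    """Toy combined objective: per-bin TV plus squared deviation of each
    energy-bin image from the spectral mean image (illustrative only)."""
    mean_img = bins_stack.mean(axis=0)              # spectral mean over energy bins
    tv_term = sum(tv(b) for b in bins_stack)        # intra-image sparsity term
    sm_term = ((bins_stack - mean_img) ** 2).sum()  # inter-image similarity term
    return lam_tv * tv_term + lam_sm * sm_term

# Three flat 4x4 "bin images": TV vanishes, only the inter-bin term remains.
stack = np.stack([np.full((4, 4), v, float) for v in (1.0, 2.0, 3.0)])
print(spectral_penalty(stack))  # 32.0: TV = 0, SM term = 16*(1 + 0 + 1)
```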

  5. Green material selection for sustainability: A hybrid MCDM approach.

    PubMed

    Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng

    2017-01-01

    Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern among the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines the decision-making trial and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order preference by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection.
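Of the four techniques named above, TOPSIS is the most mechanical and is easy to sketch: normalize the decision matrix, weight it, and score each alternative by its closeness to an ideal solution. The materials, criteria, scores, and weights below are entirely made up; the paper's full method additionally derives the weights via DEMATEL/ANP and blends GRA via a nonlinear program, none of which is shown here.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Plain TOPSIS closeness scores.

    matrix: alternatives x criteria scores; weights: per-criterion weights;
    benefit: boolean per criterion, True where larger values are better.
    """
    m = matrix / np.linalg.norm(matrix, axis=0)  # vector-normalize each column
    m = m * weights                              # apply criteria weights
    ideal = np.where(benefit, m.max(axis=0), m.min(axis=0))
    worst = np.where(benefit, m.min(axis=0), m.max(axis=0))
    d_pos = np.linalg.norm(m - ideal, axis=1)    # distance to ideal solution
    d_neg = np.linalg.norm(m - worst, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)               # closeness in [0, 1]

# Three candidate materials scored on cost (lower better) and recyclability.
scores = np.array([[200.0, 0.9], [150.0, 0.6], [300.0, 0.95]])
close = topsis(scores, np.array([0.5, 0.5]), np.array([False, True]))
print(int(np.argmax(close)))  # 0: the first material balances both criteria best
```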

  6. Green material selection for sustainability: A hybrid MCDM approach

    PubMed Central

    Zhang, Honghao; Peng, Yong; Tian, Guangdong; Wang, Danqi; Xie, Pengpeng

    2017-01-01

    Green material selection is a crucial step for the material industry to comprehensively improve material properties and promote sustainable development. However, because of the subjectivity and conflicting evaluation criteria in its process, green material selection, as a multi-criteria decision making (MCDM) problem, has been a widespread concern among the relevant experts. Thus, this study proposes a hybrid MCDM approach that combines the decision-making trial and evaluation laboratory (DEMATEL), analytical network process (ANP), grey relational analysis (GRA) and technique for order preference by similarity to ideal solution (TOPSIS) to select the optimal green material for sustainability based on the product's needs. A nonlinear programming model with constraints was proposed to obtain the integrated closeness index. Subsequently, an empirical application of rubbish bins was used to illustrate the proposed method. In addition, a sensitivity analysis and a comparison with existing methods were employed to validate the accuracy and stability of the obtained final results. We found that this method provides a more accurate and effective decision support tool for alternative evaluation or strategy selection. PMID:28498864

  7. Exenatide once weekly versus daily basal insulin as add-on treatment to metformin with or without a sulfonylurea: a retrospective pooled analysis in patients with poor glycemic control.

    PubMed

    Grimm, Michael; Li, Yan; Brunell, Steven C; Blase, Erich

    2013-09-01

    Basal insulin (b-INS) is typically the add-on treatment of choice for patients with poor glycemic control (ie, glycated hemoglobin [HbA1c] level ≥ 8.5%), but it is unclear whether b-INS is the best option. In this post hoc analysis, the efficacy and tolerability of exenatide once weekly (EQW) were compared with those of b-INS in patients with type 2 diabetes mellitus and a baseline HbA1c level ≥ 8.5% who were undergoing treatment with metformin ± a sulfonylurea. Data were pooled from two 26-week, randomized, controlled trials (EQW vs insulin glargine and EQW vs insulin detemir [EQW, N = 137; b-INS, N = 126]). Treatment with either EQW or b-INS for 26 weeks was associated with significant improvements in HbA1c level compared with baseline, although patients treated with EQW experienced a significantly greater decrease in HbA1c level than those treated with b-INS (least squares [LS] mean ± SE: -2.0% ± 0.08% vs -1.6% ± 0.08%; P = 0.0008). Treatment with EQW was associated with a weight loss of 2.4 kg ± 0.23 kg (LS mean ± SE), whereas treatment with b-INS was associated with a weight gain of 2.0 kg ± 0.24 kg (LS mean difference between groups, -4.4 kg ± 0.33; P < 0.0001). Patients in the EQW group were significantly more likely to achieve the composite endpoint of an HbA1c level < 7.0%, no weight gain, and no hypoglycemic events (defined as a blood glucose level < 54 mg/dL requiring self-treatment or assistance to resolve) than patients in the b-INS group (33.6% vs 3.2%; P < 0.0001). The exposure-adjusted hypoglycemic event rates were 0.08 and 0.37 events per patient-year in the EQW and b-INS groups, respectively. Gastrointestinal adverse events occurred at a higher rate in patients who underwent EQW treatment than those who were treated with b-INS.
These results show that EQW treatment was associated with significantly greater improvement in HbA1c level compared with b-INS treatment among patients with poor glycemic control, with the added benefits of weight loss (vs weight gain with b-INS therapy) and a lower incidence of hypoglycemic events. These results suggest that EQW is an alternative treatment to b-INS for patients with type 2 diabetes mellitus and a baseline HbA1c level ≥ 8.5%.

  8. The generation and cost of litter resulting from the curbside collection of recycling.

    PubMed

    Wagner, Travis P; Broaddus, Nathan

    2016-04-01

    This study examined the generation of litter, defined as spillage and uncollected residue, from a curbside collection system for residential recycling. The primary recycling containers used in the study were 18-gal (68 L), open-top bins. The study, conducted over a seven-week period, comprised both an urban and a suburban area. Six litter characterizations were conducted in which all new litter larger than 1 in.² was collected, segregated, counted, and weighed. We found that each week the open-top recycling bins contributed approximately 20,590 pieces of litter over 1 in. in size per every 1000 households, which resulted in the generation of 3.74 tons of litter per 1000 households per year. In addition to the bins having no top, the primary root causes of the litter were constantly overflowing recycling bins, the method of collection, and material scavenging. Based on an estimated cost of litter cleanup ranging from $0.17 to $0.79 per piece of litter, the direct economic costs from the collection of litter and loss in recycling revenues were estimated at US$3920 to US$19,250 per 1000 households per year. Other notable impacts of the litter, such as an increased risk of flood damage from storm drain impairment and damage to marine ecosystems, exist but were not monetized. The results strongly suggest that modification of the curbside collection system would decrease the amount and associated cost of litter by replacing the existing curbside collection containers with larger-volume containers with covers and by modifying the task-based incentive system to emphasize litter prevention rather than the current aim of completing the task most quickly. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Storage of sparse files using parallel log-structured file system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bent, John M.; Faibish, Sorin; Grider, Gary

    A sparse file is stored without holes by storing a data portion of the sparse file using a parallel log-structured file system; and generating an index entry for the data portion, the index entry comprising a logical offset, physical offset and length of the data portion. The holes can be restored to the sparse file upon a reading of the sparse file. The data portion can be stored at a logical end of the sparse file. Additional storage efficiency can optionally be achieved by (i) detecting a write pattern for a plurality of the data portions and generating a single patterned index entry for the plurality of the patterned data portions; and/or (ii) storing the patterned index entries for a plurality of the sparse files in a single directory, wherein each entry in the single directory comprises an identifier of a corresponding sparse file.
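The core idea — pack only the non-hole portions into a log and record (logical offset, physical offset, length) for each, then restore holes on read — can be sketched as follows. The function names and data are illustrative, not taken from the patent.

```python
def store_sparse(data_portions):
    """Pack non-hole portions end-to-end in a log and index each one.

    data_portions: list of (logical_offset, bytes). Returns (log, index),
    where each index entry records the logical offset, physical offset, and
    length of a data portion, mirroring the entry described in the abstract.
    """
    log, index = bytearray(), []
    for logical_off, chunk in data_portions:
        index.append({"logical": logical_off, "physical": len(log), "length": len(chunk)})
        log.extend(chunk)
    return bytes(log), index

def read_with_holes(log, index, total_size):
    """Restore holes (as zero bytes) on read, using the indexed layout."""
    out = bytearray(total_size)
    for e in index:
        out[e["logical"]:e["logical"] + e["length"]] = log[e["physical"]:e["physical"] + e["length"]]
    return bytes(out)

# Two 4-byte portions separated by a 96-byte hole.
log, idx = store_sparse([(0, b"head"), (100, b"tail")])
print(len(log))                              # 8: only real data is stored
print(read_with_holes(log, idx, 104)[100:])  # b'tail' restored at its logical offset
```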

  10. Parallel In Situ Indexing for Data-intensive Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinoh; Abbasi, Hasan; Chacon, Luis

    2011-09-09

    As computing power increases exponentially, vast amounts of data are created by many scientific research activities. However, the bandwidth for storing the data to disks and reading the data from disks has been improving at a much slower pace. These two trends produce an ever-widening data access gap. Our work brings together two distinct technologies to address this data access issue: indexing and in situ processing. From decades of database research literature, we know that indexing is an effective way to address the data access issue, particularly for accessing a relatively small fraction of data records. As data sets increase in size, more and more analysts need to use selective data access, which makes indexing even more important for improving data access. The challenge is that most implementations of indexing technology are embedded in large database management systems (DBMS), but most scientific datasets are not managed by any DBMS. In this work, we choose to include indexes with the scientific data instead of requiring the data to be loaded into a DBMS. We use compressed bitmap indexes from the FastBit software, which are known to be highly effective for query-intensive workloads common to scientific data analysis. To use the indexes, we need to build them first. The index building procedure needs to access the whole data set and may also require a significant amount of compute time. In this work, we adapt the in situ processing technology to generate the indexes, thus removing the need to read data from disks and enabling the indexes to be built in parallel. The in situ data processing system used is ADIOS, a middleware for high-performance I/O.
    Our experimental results show that the indexes can improve the data access time up to 200 times depending on the fraction of data selected, and using the in situ data processing system can effectively reduce the time needed to create the indexes, up to 10 times with our in situ technique when using identical parallel settings.
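A binned bitmap index of the kind FastBit popularizes can be sketched with Python integers as bitsets: one bitmap per value bin, with range queries answered by OR-ing the bitmaps of fully qualifying bins. This is a bare-bones illustration under assumed bin edges and data; it omits FastBit's compression and the candidate-check step needed for thresholds that fall inside a bin.

```python
def build_bitmap_index(values, bin_edges):
    """Binned bitmap index: one bitmap (a Python int used as a bitset) per bin;
    bit r of bitmap b is set when row r's value falls in bin b."""
    bitmaps = [0] * (len(bin_edges) - 1)
    for row, v in enumerate(values):
        for b in range(len(bin_edges) - 1):
            if bin_edges[b] <= v < bin_edges[b + 1]:
                bitmaps[b] |= 1 << row
                break
    return bitmaps

def query_ge(bitmaps, bin_edges, threshold):
    """Rows with value >= threshold, for thresholds lying on a bin edge:
    OR the bitmaps of all fully qualifying bins (candidate check omitted)."""
    result = 0
    for b in range(len(bin_edges) - 1):
        if bin_edges[b] >= threshold:
            result |= bitmaps[b]
    return [row for row in range(64) if result >> row & 1]

vals = [0.1, 2.5, 7.3, 4.8, 9.9]
bm = build_bitmap_index(vals, [0, 2, 4, 6, 8, 10])
print(query_ge(bm, [0, 2, 4, 6, 8, 10], 4))  # [2, 3, 4]: rows with value >= 4
```

Because each query is bitwise OR over precomputed bitmaps, it never touches the base data — the property that makes this structure attractive for the selective-access workloads described above.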

  11. A Comprehensive Expedient Methods Field Manual.

    DTIC Science & Technology

    1984-09-01

    structures. "Revetments may be constructed of sandbags, sod blocks, and other expedients [17:93]." Bunkers are emplacements with overhead protective... Lapland Fence ... 75; 19. Hardening: Dimensional Timber (Soil Bin) Revetment ... 76; 20. Hardening: Log Bulkhead (Soil Bin) Revetment ... 77; 21. Hardening: Landing Mat Bulkhead (Soil Bin) Revetment

  12. Moleculo Long-Read Sequencing Facilitates Assembly and Genomic Binning from Complex Soil Metagenomes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Richard Allen; Bottos, Eric M.; Roy Chowdhury, Taniya

    ABSTRACT Soil metagenomics has been touted as the “grand challenge” for metagenomics, as the high microbial diversity and spatial heterogeneity of soils make them unamenable to current assembly platforms. Here, we aimed to improve soil metagenomic sequence assembly by applying the Moleculo synthetic long-read sequencing technology. In total, we obtained 267 Gbp of raw sequence data from a native prairie soil; these data included 109.7 Gbp of short-read data (~100 bp) from the Joint Genome Institute (JGI), an additional 87.7 Gbp of rapid-mode read data (~250 bp), plus 69.6 Gbp (>1.5 kbp) from Moleculo sequencing. The Moleculo data alone yielded over 5,600 reads of >10 kbp in length, and over 95% of the unassembled reads mapped to contigs of >1.5 kbp. Hybrid assembly of all data resulted in more than 10,000 contigs over 10 kbp in length. We mapped three replicate metatranscriptomes derived from the same parent soil to the Moleculo subassembly and found that 95% of the predicted genes, based on their assignments to Enzyme Commission (EC) numbers, were expressed. The Moleculo subassembly also enabled binning of >100 microbial genome bins. We obtained via direct binning the first complete genome, that of “Candidatus Pseudomonas sp. strain JKJ-1,” from a native soil metagenome. By mapping metatranscriptome sequence reads back to the bins, we found that several bins corresponding to low-relative-abundance Acidobacteria were highly transcriptionally active, whereas bins corresponding to high-relative-abundance Verrucomicrobia were not. These results demonstrate that Moleculo sequencing provides a significant advance for resolving complex soil microbial communities. IMPORTANCE Soil microorganisms carry out key processes for life on our planet, including cycling of carbon and other nutrients and supporting growth of plants. However, there is poor molecular-level understanding of their functional roles in ecosystem stability and responses to environmental perturbations.
    This knowledge gap is largely due to the difficulty in culturing the majority of soil microbes. Thus, use of culture-independent approaches, such as metagenomics, promises the direct assessment of the functional potential of soil microbiomes. Soil is, however, a challenge for metagenomic assembly due to its high microbial diversity and variable evenness, resulting in low coverage and uneven sampling of microbial genomes. Despite increasingly large soil metagenome data volumes (>200 Gbp), the majority of the data do not assemble. Here, we used the cutting-edge approach of synthetic long-read sequencing technology (Moleculo) to assemble soil metagenome sequence data into long contigs and used the assemblies for binning of genomes. Author Video: An author video summary of this article is available.

  13. Moleculo Long-Read Sequencing Facilitates Assembly and Genomic Binning from Complex Soil Metagenomes

    PubMed Central

    White, Richard Allen; Bottos, Eric M.; Roy Chowdhury, Taniya; Zucker, Jeremy D.; Brislawn, Colin J.; Nicora, Carrie D.; Fansler, Sarah J.; Glaesemann, Kurt R.; Glass, Kevin

    2016-01-01

    ABSTRACT Soil metagenomics has been touted as the “grand challenge” for metagenomics, as the high microbial diversity and spatial heterogeneity of soils make them unamenable to current assembly platforms. Here, we aimed to improve soil metagenomic sequence assembly by applying the Moleculo synthetic long-read sequencing technology. In total, we obtained 267 Gbp of raw sequence data from a native prairie soil; these data included 109.7 Gbp of short-read data (~100 bp) from the Joint Genome Institute (JGI), an additional 87.7 Gbp of rapid-mode read data (~250 bp), plus 69.6 Gbp (>1.5 kbp) from Moleculo sequencing. The Moleculo data alone yielded over 5,600 reads of >10 kbp in length, and over 95% of the unassembled reads mapped to contigs of >1.5 kbp. Hybrid assembly of all data resulted in more than 10,000 contigs over 10 kbp in length. We mapped three replicate metatranscriptomes derived from the same parent soil to the Moleculo subassembly and found that 95% of the predicted genes, based on their assignments to Enzyme Commission (EC) numbers, were expressed. The Moleculo subassembly also enabled binning of >100 microbial genome bins. We obtained via direct binning the first complete genome, that of “Candidatus Pseudomonas sp. strain JKJ-1” from a native soil metagenome. By mapping metatranscriptome sequence reads back to the bins, we found that several bins corresponding to low-relative-abundance Acidobacteria were highly transcriptionally active, whereas bins corresponding to high-relative-abundance Verrucomicrobia were not. These results demonstrate that Moleculo sequencing provides a significant advance for resolving complex soil microbial communities. IMPORTANCE Soil microorganisms carry out key processes for life on our planet, including cycling of carbon and other nutrients and supporting growth of plants. However, there is poor molecular-level understanding of their functional roles in ecosystem stability and responses to environmental perturbations. 
This knowledge gap is largely due to the difficulty in culturing the majority of soil microbes. Thus, use of culture-independent approaches, such as metagenomics, promises the direct assessment of the functional potential of soil microbiomes. Soil is, however, a challenge for metagenomic assembly due to its high microbial diversity and variable evenness, resulting in low coverage and uneven sampling of microbial genomes. Despite increasingly large soil metagenome data volumes (>200 Gbp), the majority of the data do not assemble. Here, we used the cutting-edge approach of synthetic long-read sequencing technology (Moleculo) to assemble soil metagenome sequence data into long contigs and used the assemblies for binning of genomes. Author Video: An author video summary of this article is available. PMID:27822530

  14. An Index and Test of Linear Moderated Mediation.

    PubMed

    Hayes, Andrew F

    2015-01-01

    I describe a test of linear moderated mediation in path analysis based on an interval estimate of the parameter of a function linking the indirect effect to values of a moderator, a parameter that I call the index of moderated mediation. This test can be used for models that integrate moderation and mediation in which the relationship between the indirect effect and the moderator is estimated as linear, including many of the models described by Edwards and Lambert (2007) and Preacher, Rucker, and Hayes (2007), as well as extensions of these models to processes involving multiple mediators operating in parallel or in serial. Generalization of the method to latent variable models is straightforward. Three empirical examples describe the computation of the index and the test, and its implementation is illustrated using Mplus and the PROCESS macro for SPSS and SAS.
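For the common first-stage moderated mediation model (M = a1·X + a3·X·W + ..., Y = b·M + ...), the indirect effect as a function of the moderator is ω(W) = (a1 + a3·W)·b, and the index of moderated mediation is its slope, a3·b. The sketch below computes both under that assumed model with made-up coefficients; the inferential part of the method (the interval estimate of the index) is not shown.

```python
def indirect_effect(a1, a3, b, w):
    """Conditional indirect effect of X on Y through M when the X->M path
    is linearly moderated by W: (a1 + a3*w) * b. Coefficients are hypothetical."""
    return (a1 + a3 * w) * b

def index_of_moderated_mediation(a3, b):
    """The index: the slope of the line relating the indirect effect to the
    moderator, which is a3 * b for a linearly moderated first-stage path."""
    return a3 * b

a1, a3, b = 0.40, 0.25, 0.50   # illustrative path coefficients
print(index_of_moderated_mediation(a3, b))  # 0.125
# A one-unit increase in W changes the indirect effect by exactly the index:
print(round(indirect_effect(a1, a3, b, 2.0) - indirect_effect(a1, a3, b, 1.0), 6))  # 0.125
```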

  15. Binning and filtering: the six-color solution

    NASA Astrophysics Data System (ADS)

    Ashdown, Ian; Robinson, Shane; Salsbury, Marc

    2006-08-01

    The use of LED backlighting for LCD displays requires careful binning of red, green, and blue LEDs by dominant wavelength to maintain the color gamuts as specified by NTSC, SMPTE, and EBU/ITU standards. This problem also occurs to a lesser extent with RGB and RGBA assemblies for solid-state lighting, where color gamut consistency is required for color-changing luminaires. In this paper, we propose a "six-color solution," based on Grassmann's laws, that does not require color binning, but nevertheless guarantees a fixed color gamut that subsumes the color gamuts of carefully-binned RGB assemblies. A further advantage of this solution is that it solves the problem of peak wavelength shifts with varying junction temperatures. The color gamut can thus remain fixed over the full range of LED intensities and ambient temperatures. A related problem occurs with integrated circuit (IC) colorimeters used for optical feedback with LED backlighting and RGB(A) solid-state lighting, wherein it can be difficult to distinguish between peak wavelength shifts and changes in LED intensity. We apply our six-color solution to the design of a novel colorimeter for LEDs that independently measures changes in peak wavelength and intensity. The design is compatible with current manufacturing techniques for tristimulus colorimeter ICs. Together, the six-color solution for LEDs and colorimeters enables less expensive LED backlighting and solid-state lighting systems with improved color stability.
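Grassmann additivity is what makes such schemes linear-algebraic: a mixture's tristimulus values are a weighted sum of the primaries' tristimulus values, so hitting a target color means solving a linear system for the drive levels. The sketch below shows this for three primaries with hypothetical tristimulus data; the paper's six-color scheme works with an over-complete set of primaries (a least-squares or constrained solve) rather than the square system used here.

```python
import numpy as np

# Columns: tristimulus (X, Y, Z) of each LED primary at full drive.
# Values are hypothetical. By Grassmann additivity, a mixture's tristimulus
# vector is primaries @ weights, where weights are the relative drive levels.
primaries = np.array([
    [0.64, 0.30, 0.15],   # X row for R, G, B
    [0.33, 0.60, 0.06],   # Y row
    [0.03, 0.10, 0.79],   # Z row
])

target = primaries @ np.array([0.5, 0.3, 0.2])  # a mixture we know is reachable

weights = np.linalg.solve(primaries, target)    # recover the drive levels
print(np.round(weights, 6))  # [0.5 0.3 0.2]
```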

  16. Uncorrelated measurements of the cosmic expansion history and dark energy from supernovae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Yun; Tegmark, Max; Department of Physics, University of Pennsylvania, Philadelphia, Pennsylvania 19104

    We present a method for measuring the cosmic expansion history H(z) in uncorrelated redshift bins, and apply it to current and simulated type Ia supernova data assuming spatial flatness. If the matter density parameter Ω_m can be accurately measured from other data, then the dark-energy density history X(z) = ρ_X(z)/ρ_X(0) can trivially be derived from this expansion history H(z). In contrast to customary 'black box' parameter fitting, our method is transparent and easy to interpret: the measurement of H(z)⁻¹ in a redshift bin is simply a linear combination of the measured comoving distances for supernovae in that bin, making it obvious how systematic errors propagate from input to output. We find the Riess et al. (2004) gold sample to be consistent with the vanilla concordance model where the dark energy is a cosmological constant. We compare two mission concepts for the NASA/DOE Joint Dark-Energy Mission (JDEM), the Joint Efficient Dark-energy Investigation (JEDI) and the Supernova Acceleration Probe (SNAP), using simulated data including the effect of weak lensing (based on numerical simulations) and a systematic bias from K corrections. Estimating H(z) in seven uncorrelated redshift bins, we find that both provide dramatic improvements over current data: JEDI can measure H(z) to about 10% accuracy and SNAP to 30%-40% accuracy.
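The transparency claim above rests on a simple identity: in a flat universe with c = 1, the comoving distance is r(z) = ∫₀ᶻ dz′/H(z′), so the average of 1/H over a bin is just the difference of comoving distances across the bin divided by the bin width. A minimal sketch, with made-up bin edges and a constant-H toy cosmology for checking (this is the identity only, not the paper's full estimator with its error treatment):

```python
import numpy as np

def hubble_inverse_in_bins(z_edges, comoving_dist):
    """Estimate the bin-averaged 1/H per redshift bin (c = 1, flat universe).

    Since r(z) = integral_0^z dz'/H(z'), the bin average of 1/H over
    [z_lo, z_hi] is (r(z_hi) - r(z_lo)) / (z_hi - z_lo): a linear
    combination of measured comoving distances, as the abstract says.
    """
    r = np.asarray(comoving_dist)
    return np.diff(r) / np.diff(z_edges)

# Toy check: constant H = 70 (arbitrary units) gives r(z) = z / 70 exactly.
z_edges = np.array([0.0, 0.5, 1.0, 1.5])
r = z_edges / 70.0
inv_H = hubble_inverse_in_bins(z_edges, r)
print(np.round(1.0 / inv_H, 6))  # recovers H = 70 in every bin
```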

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Larry K.; Gustafson, William I.; Kassianov, Evgueni I.

    A new treatment for shallow clouds has been introduced into the Weather Research and Forecasting (WRF) model. The new scheme, called the cumulus potential (CuP) scheme, replaces the ad hoc trigger function used in the Kain-Fritsch cumulus parameterization with a trigger function related to the distribution of temperature and humidity in the convective boundary layer via probability density functions (PDFs). An additional modification to the default version of WRF is the computation of a cumulus cloud fraction based on the time scales relevant for shallow cumuli. Results from three case studies over the U.S. Department of Energy’s Atmospheric Radiation Measurement (ARM) site in north central Oklahoma are presented. These days were selected because of the presence of shallow cumuli over the ARM site. The modified version of WRF does a much better job predicting the cloud fraction and the downwelling shortwave irradiance than control simulations utilizing the default Kain-Fritsch scheme. The modified scheme includes a number of additional free parameters, including the number and size of bins used to define the PDF, the minimum frequency of a bin within the PDF before that bin is considered for shallow clouds to form, and the critical cumulative frequency of bins required to trigger deep convection. A series of tests were undertaken to evaluate the sensitivity of the simulations to these parameters. Overall, the scheme was found to be relatively insensitive to each of the parameters.
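The trigger logic described above — bin boundary-layer properties into a PDF, require a minimum per-bin frequency for shallow clouds, and trigger deep convection when qualifying bins accumulate enough frequency — can be caricatured in a few lines. Everything here (the 1-D parcel property, bin count, and both thresholds) is a hypothetical placeholder, not a value from the CuP scheme.

```python
import numpy as np

def cup_trigger(parcels, n_bins=10, min_bin_freq=0.05, deep_cum_freq=0.30):
    """Toy analogue of the CuP-style trigger: build a PDF of parcel
    properties by histogram binning; bins above `min_bin_freq` are eligible
    for shallow clouds, and deep convection triggers when their cumulative
    frequency reaches `deep_cum_freq`. All thresholds are hypothetical."""
    counts, _ = np.histogram(parcels, bins=n_bins)
    freq = counts / counts.sum()
    qualifying = freq[freq >= min_bin_freq]
    return bool(qualifying.sum() >= deep_cum_freq)

# Toy parcels: potential temperatures (K) concentrated in two populated bins.
parcels = np.concatenate([np.full(700, 300.2), np.full(300, 300.9)])
print(cup_trigger(parcels))  # True: the populated bins cover all the mass
```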

  18. Multi-threading: A new dimension to massively parallel scientific computation

    NASA Astrophysics Data System (ADS)

    Nielsen, Ida M. B.; Janssen, Curtis L.

    2000-06-01

    Multi-threading is becoming widely available for Unix-like operating systems, and the application of multi-threading opens new ways for performing parallel computations with greater efficiency. We here briefly discuss the principles of multi-threading and illustrate the application of multi-threading for a massively parallel direct four-index transformation of electron repulsion integrals. Finally, other potential applications of multi-threading in scientific computing are outlined.

  19. Suppression of Prostate Tumor Progression by Bin 1

    DTIC Science & Technology

    2006-02-01

    experiment). The full protocol was approved by IACUC review. Cohort A. Castration + Testosterone propionate s.c. + MNU i.v. Cohort B. Castration... Testosterone propionate s.c. + MNU i.v. + testosterone pellet Strain 1. mosaic Bin1 flox)/KO  Strain 2. Bin1 flox/+ (control for strain 1) Strain

  20. Bioinformatics and Astrophysics Cluster (BinAC)

    NASA Astrophysics Data System (ADS)

    Krüger, Jens; Lutz, Volker; Bartusch, Felix; Dilling, Werner; Gorska, Anna; Schäfer, Christoph; Walter, Thomas

    2017-09-01

    BinAC provides central high performance computing capacities for bioinformaticians and astrophysicists from the state of Baden-Württemberg. The bwForCluster BinAC is part of the implementation concept for scientific computing for the universities in Baden-Württemberg. Community specific support is offered through the bwHPC-C5 project.

  1. A neural network-based method for spectral distortion correction in photon counting x-ray CT

    NASA Astrophysics Data System (ADS)

    Touch, Mengheng; Clark, Darin P.; Barber, William; Badea, Cristian T.

    2016-08-01

    Spectral CT using a photon counting x-ray detector (PCXD) shows great potential for measuring material composition based on energy-dependent x-ray attenuation. Spectral CT is especially suited for imaging with K-edge contrast agents to address the otherwise limited contrast in soft tissues. We have developed a micro-CT system based on a PCXD. This system enables both a 4-energy-bin acquisition mode and a full-spectrum mode in which the energy thresholds of the PCXD are swept to sample the full energy spectrum for each detector element and projection angle. Measurements provided by the PCXD, however, are distorted due to undesirable physical effects in the detector and can be very noisy due to photon starvation in narrow energy bins. To address spectral distortions, we propose and demonstrate a novel artificial neural network (ANN)-based spectral distortion correction mechanism, which learns to undo the distortion in spectral CT, resulting in improved material decomposition accuracy. To address noise, post-reconstruction denoising based on bilateral filtration, which jointly enforces intensity gradient sparsity between spectral samples, is used to further improve the robustness of ANN training and material decomposition accuracy. Our ANN-based distortion correction method is calibrated using 3D-printed phantoms and a model of our spectral CT system. To enable realistic simulations and validation of our method, we first modeled the spectral distortions using experimental data acquired from 109Cd and 133Ba radioactive sources measured with our PCXD. Next, we trained an ANN to learn the relationship between the distorted spectral CT projections and the ideal, distortion-free projections in a calibration step. This required knowledge of the ground truth, distortion-free spectral CT projections, which were obtained by simulating a spectral CT scan of the digital version of a 3D-printed phantom.
Once the training was completed, the trained ANN was used to perform distortion correction on any subsequent scans of the same system with the same parameters. We used joint bilateral filtration to perform noise reduction by jointly enforcing intensity gradient sparsity between the reconstructed images for each energy bin. Following reconstruction and denoising, the CT data was spectrally decomposed using the photoelectric effect, Compton scattering, and a K-edge material (i.e. iodine). The ANN-based distortion correction approach was tested using both simulations and experimental data acquired in phantoms and a mouse with our PCXD-based micro-CT system for 4 bins and full-spectrum acquisition modes. The iodine detectability and decomposition accuracy were assessed using the contrast-to-noise ratio and relative error in iodine concentration estimation metrics in images with and without distortion correction. In simulation, the material decomposition accuracy in the reconstructed data was vastly improved following distortion correction and denoising, with 50% and 20% reductions in material concentration measurement error in full-spectrum and 4 energy bins cases, respectively. Overall, experimental data confirms that full-spectrum mode provides superior results to 4-energy mode when the distortion corrections are applied. The material decomposition accuracy in the reconstructed data was vastly improved following distortion correction and denoising, with as much as a 41% reduction in material concentration measurement error for full-spectrum mode, while also bringing the iodine detectability to 4-6 mg ml-1. Distortion correction also improved the 4 bins mode data, but to a lesser extent. The results demonstrate the experimental feasibility and potential advantages of ANN-based distortion correction and joint bilateral filtration-based denoising for accurate K-edge imaging with a PCXD. 
Given the computational efficiency with which the ANN can be applied to projection data, the proposed scheme can be readily integrated into existing CT reconstruction pipelines.
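    The joint bilateral filtration step can be illustrated in one dimension: the range weights come from a separate guide signal (e.g. a less noisy energy bin or a spectrally averaged image) rather than from the noisy signal itself, which is what couples the denoising across spectral samples. A minimal sketch with illustrative parameters, not the authors' implementation:

```python
import numpy as np

def joint_bilateral_1d(signal, guide, sigma_s=2.0, sigma_r=0.1, radius=4):
    """Minimal 1-D joint bilateral filter: each sample becomes a weighted
    average of its neighbors, with spatial weights from distance and range
    weights computed from a guide signal instead of the noisy signal.
    Illustrative parameter values."""
    signal = np.asarray(signal, float)
    guide = np.asarray(guide, float)
    out = np.empty_like(signal)
    n = len(signal)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        d = np.arange(lo, hi) - i
        w = (np.exp(-0.5 * (d / sigma_s) ** 2)
             * np.exp(-0.5 * ((guide[lo:hi] - guide[i]) / sigma_r) ** 2))
        out[i] = np.sum(w * signal[lo:hi]) / np.sum(w)
    return out
```

Edges in the guide suppress averaging across them, so structure shared between energy bins is preserved while uncorrelated noise is smoothed.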

  2. EFFECTS OF NUMBER AND LOCATION OF BINS ON PLASTIC RECYCLING AT A UNIVERSITY

    PubMed Central

    O'Connor, Ryan T; Lerman, Dorothea C; Fritz, Jennifer N; Hodde, Henry B

    2010-01-01

    The proportion of plastic bottles that consumers placed in appropriate recycling receptacles rather than trash bins was examined across 3 buildings on a university campus. We extended previous research on interventions to increase recycling by controlling the number of recycling receptacles across conditions and by examining receptacle location without the use of posted signs. Manipulating the appearance or number of recycling bins in common areas did not increase recycling. Consumers recycled substantially more plastic bottles when the recycling bins were located in classrooms. PMID:21541154

  3. Effects of number and location of bins on plastic recycling at a university.

    PubMed

    O'Connor, Ryan T; Lerman, Dorothea C; Fritz, Jennifer N; Hodde, Henry B

    2010-01-01

    The proportion of plastic bottles that consumers placed in appropriate recycling receptacles rather than trash bins was examined across 3 buildings on a university campus. We extended previous research on interventions to increase recycling by controlling the number of recycling receptacles across conditions and by examining receptacle location without the use of posted signs. Manipulating the appearance or number of recycling bins in common areas did not increase recycling. Consumers recycled substantially more plastic bottles when the recycling bins were located in classrooms.

  4. BinMag: Widget for comparing stellar observed with theoretical spectra

    NASA Astrophysics Data System (ADS)

    Kochukhov, O.

    2018-05-01

    BinMag examines theoretical stellar spectra computed with the Synth/SynthMag/Synmast/Synth3/SME spectrum synthesis codes and compares them to observations. An IDL widget program, BinMag applies radial velocity shifts and broadening to the theoretical spectra to account for the effects of stellar rotation, radial-tangential macroturbulence, and instrumental smearing. The code can also simulate spectra of spectroscopic binary stars by appropriate coaddition of two synthetic spectra. Additionally, BinMag can be used to measure equivalent widths, fit line profile shapes with analytical functions, and automatically determine radial velocity and broadening parameters. BinMag interfaces with the Synth3 (ascl:1212.010) and SME (ascl:1202.013) codes, allowing the user to determine chemical abundances and stellar atmospheric parameters from the observed spectra.
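    The two basic operations BinMag applies to synthetic spectra, a radial-velocity shift and Gaussian instrumental broadening, can be sketched as follows. This is a toy version in Python for illustration, not BinMag's IDL code:

```python
import numpy as np

C_KM_S = 299792.458  # speed of light in km/s

def shift_and_broaden(wave, flux, rv_km_s, fwhm_pix):
    """Toy version of two operations described above: Doppler-shift the
    wavelength grid by a radial velocity and convolve the flux with a
    normalized Gaussian to mimic instrumental smearing. A minimal sketch,
    not BinMag's actual implementation."""
    wave = np.asarray(wave, float)
    flux = np.asarray(flux, float)
    shifted_wave = wave * (1.0 + rv_km_s / C_KM_S)   # non-relativistic Doppler shift
    sigma = fwhm_pix / 2.3548                        # FWHM -> Gaussian sigma
    x = np.arange(-int(4 * sigma) - 1, int(4 * sigma) + 2)
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()                           # preserve total flux
    broadened = np.convolve(flux, kernel, mode="same")
    return shifted_wave, broadened
```

Rotational and macroturbulent broadening would use different (non-Gaussian) kernels, but follow the same convolution pattern.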

  5. Scaling for the SOL/separatrix χ⊥ following from the heuristic drift model for the power scrape-off layer width

    NASA Astrophysics Data System (ADS)

    Huber, A.; Chankin, A. V.

    2017-06-01

    A simple two-point representation of the tokamak scrape-off layer (SOL) in the conduction-limited regime, based on the parallel and perpendicular energy balance equations in combination with the heat flux width predicted by a heuristic drift-based model, was used to derive a scaling for the cross-field thermal diffusivity χ⊥. For fixed plasma shape, and neglecting weak power-law dependences with indexes of 1/8, the scaling χ⊥ ∝ P_SOL/(n B_θ R²) is derived.

  6. a Novel Approach of Indexing and Retrieving Spatial Polygons for Efficient Spatial Region Queries

    NASA Astrophysics Data System (ADS)

    Zhao, J. H.; Wang, X. Z.; Wang, F. Y.; Shen, Z. H.; Zhou, Y. C.; Wang, Y. L.

    2017-10-01

    Spatial region queries are increasingly widely used in web-based applications, so mechanisms to provide efficient query processing over geospatial data are essential. However, due to the massive geospatial data volume, heavy geometric computation, and high access concurrency, it is difficult to return responses in real time. Spatial indexes are usually used in this situation. In this paper, based on the k-d tree, we introduce a distributed KD-Tree (DKD-Tree) suitable for polygon data, and a two-step query algorithm. The spatial index construction is recursive and iterative, and the query is an in-memory process. Both the index and query methods can be processed in parallel, and are implemented based on HDFS, Spark and Redis. Experiments on a large volume of remote sensing image metadata have been carried out, and the advantages of our method are investigated by comparison with spatial region queries executed on PostgreSQL and PostGIS. Results show that our approach not only greatly improves the efficiency of spatial region queries, but also has good scalability. Moreover, the two-step spatial range query algorithm can also save cluster resources to support a large number of concurrent queries. Therefore, this method is very useful when building large geographic information systems.
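    The two-step query idea, a cheap filter on bounding boxes followed by exact geometric refinement on the survivors, can be sketched in a few lines. Here a linear scan stands in for the distributed DKD-Tree lookup, and all class and function names are illustrative, not from the paper:

```python
from dataclasses import dataclass

@dataclass
class Polygon:
    id: int
    bbox: tuple  # (xmin, ymin, xmax, ymax); exact geometry omitted in this sketch

def bbox_intersects(a, b):
    """True if two axis-aligned bounding boxes overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def two_step_query(polygons, query_bbox):
    """Sketch of a two-step region query: a cheap filter step on bounding
    boxes (standing in for the index lookup) followed by a refinement step
    on the surviving candidates."""
    candidates = [p for p in polygons if bbox_intersects(p.bbox, query_bbox)]
    # Refinement step: exact polygon-intersection tests would go here;
    # the filter step's job is only to shrink this candidate set.
    return candidates
```

The index exists to make the filter step sublinear; the expensive exact tests then run only on the small candidate set, which is what makes the approach parallelize well.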

  7. Semiconductor Based Transverse Bragg Resonance (TBR) Optical Amplifiers and Lasers

    DTIC Science & Technology

    2007-02-14

    modes with small modal angles experience zero or very low radiation loss. We call these modes small modal angle (SMA) modes. SMA modes include both...lossless effective index-guided modes and low-loss leaky modes. They are almost parallel to the grating and do not radiate significantly. As the modal...angle increases, all the modes experience higher radiation loss. However, around the transverse resonance angle of 13.8°, low-loss modes exist. These

  8. A tomographic test of cosmological principle using the JLA compilation of type Ia supernovae

    NASA Astrophysics Data System (ADS)

    Chang, Zhe; Lin, Hai-Nan; Sang, Yu; Wang, Sai

    2018-05-01

    We test the cosmological principle by fitting a dipolar modulation of distance modulus and searching for an evolution of this modulation with respect to cosmological redshift. Based on a redshift tomographic method, we divide the Joint Light-curve Analysis compilation of supernovae of type Ia into different redshift bins, and employ a Markov-Chain Monte-Carlo method to infer the anisotropic amplitude and direction in each redshift bin. However, we do not find any significant deviations from the cosmological principle, and the anisotropic amplitude is stringently constrained to be less than a few thousandths at 95% confidence level.

  9. Meta-analysis of 32 genome-wide linkage studies of schizophrenia

    PubMed Central

    Ng, MYM; Levinson, DF; Faraone, SV; Suarez, BK; DeLisi, LE; Arinami, T; Riley, B; Paunio, T; Pulver, AE; Irmansyah; Holmans, PA; Escamilla, M; Wildenauer, DB; Williams, NM; Laurent, C; Mowry, BJ; Brzustowicz, LM; Maziade, M; Sklar, P; Garver, DL; Abecasis, GR; Lerer, B; Fallin, MD; Gurling, HMD; Gejman, PV; Lindholm, E; Moises, HW; Byerley, W; Wijsman, EM; Forabosco, P; Tsuang, MT; Hwu, H-G; Okazaki, Y; Kendler, KS; Wormley, B; Fanous, A; Walsh, D; O’Neill, FA; Peltonen, L; Nestadt, G; Lasseter, VK; Liang, KY; Papadimitriou, GM; Dikeos, DG; Schwab, SG; Owen, MJ; O’Donovan, MC; Norton, N; Hare, E; Raventos, H; Nicolini, H; Albus, M; Maier, W; Nimgaonkar, VL; Terenius, L; Mallet, J; Jay, M; Godard, S; Nertney, D; Alexander, M; Crowe, RR; Silverman, JM; Bassett, AS; Roy, M-A; Mérette, C; Pato, CN; Pato, MT; Roos, J Louw; Kohn, Y; Amann-Zalcenstein, D; Kalsi, G; McQuillin, A; Curtis, D; Brynjolfson, J; Sigmundsson, T; Petursson, H; Sanders, AR; Duan, J; Jazin, E; Myles-Worsley, M; Karayiorgou, M; Lewis, CM

    2009-01-01

    A genome scan meta-analysis (GSMA) was carried out on 32 independent genome-wide linkage scan analyses that included 3255 pedigrees with 7413 genotyped cases affected with schizophrenia (SCZ) or related disorders. The primary GSMA divided the autosomes into 120 bins, rank-ordered the bins within each study according to the most positive linkage result in each bin, summed these ranks (weighted for study size) for each bin across studies and determined the empirical probability of a given summed rank (PSR) by simulation. Suggestive evidence for linkage was observed in two single bins, on chromosomes 5q (142-168 Mb) and 2q (103-134 Mb). Genome-wide evidence for linkage was detected on chromosome 2q (119-152 Mb) when bin boundaries were shifted to the middle of the previous bins. The primary analysis met empirical criteria for ‘aggregate’ genome-wide significance, indicating that some or all of 10 bins are likely to contain loci linked to SCZ, including regions of chromosomes 1, 2q, 3q, 4q, 5q, 8p and 10q. In a secondary analysis of 22 studies of European-ancestry samples, suggestive evidence for linkage was observed on chromosome 8p (16-33 Mb). Although the newer genome-wide association methodology has greater power to detect weak associations to single common DNA sequence variants, linkage analysis can detect diverse genetic effects that segregate in families, including multiple rare variants within one locus or several weakly associated loci in the same region. Therefore, the regions supported by this meta-analysis deserve close attention in future studies. PMID:19349958
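    The core GSMA statistic described above, within-study ranking of bins by their best linkage score followed by a size-weighted rank sum across studies, can be sketched as follows (significance would then be assessed by simulation, which this sketch omits):

```python
import numpy as np

def gsma_summed_ranks(scores, weights):
    """Sketch of the genome scan meta-analysis (GSMA) rank-sum statistic:
    within each study, bins are ranked by their most positive linkage
    result (higher score -> higher rank), ranks are weighted by study
    size, and summed across studies per bin. Ties are broken arbitrarily
    in this simplified version."""
    scores = np.asarray(scores, float)                 # shape (n_studies, n_bins)
    ranks = scores.argsort(axis=1).argsort(axis=1) + 1  # within-study ranks, 1..n_bins
    w = np.asarray(weights, float)[:, None]             # per-study weights
    return (ranks * w).sum(axis=0)                      # summed weighted rank per bin
```

The empirical probability of a given summed rank (P_SR) would then be obtained by recomputing this statistic over many random permutations of the within-study ranks.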

  10. Differentiation of Benign and Malignant Breast Tumors by In-Vivo Three-Dimensional Parallel-Plate Diffuse Optical Tomography

    PubMed Central

    Choe, Regine; Konecky, Soren D.; Corlu, Alper; Lee, Kijoon; Durduran, Turgut; Busch, David R.; Pathak, Saurav; Czerniecki, Brian J.; Tchou, Julia; Fraker, Douglas L.; DeMichele, Angela; Chance, Britton; Arridge, Simon R.; Schweiger, Martin; Culver, Joseph P.; Schnall, Mitchell D.; Putt, Mary E.; Rosen, Mark A.; Yodh, Arjun G.

    2009-01-01

    We have developed a novel parallel-plate diffuse optical tomography (DOT) system for three-dimensional in vivo imaging of human breast tumor based on large optical data sets. Images of oxy-, deoxy-, total-hemoglobin concentration, blood oxygen saturation, and tissue scattering were reconstructed. Tumor margins were derived using the optical data with guidance from radiology reports and Magnetic Resonance Imaging. Tumor-to-normal ratios of these endogenous physiological parameters and an optical index were computed for 51 biopsy-proven lesions from 47 subjects. Malignant cancers (N=41) showed statistically significant higher total hemoglobin, oxy-hemoglobin concentration, and scattering compared to normal tissue. Furthermore, malignant lesions exhibited a two-fold average increase in optical index. The influence of core biopsy on DOT results was also explored; the difference between the malignant group measured before core biopsy and the group measured more than one week after core biopsy was not significant. Benign tumors (N=10) did not exhibit statistical significance in the tumor-to-normal ratios of any parameter. Optical index and tumor-to-normal ratios of total hemoglobin, oxy-hemoglobin concentration, and scattering exhibited high area under the receiver operating characteristic curve values from 0.90 to 0.99, suggesting good discriminatory power. The data demonstrate that benign and malignant lesions can be distinguished by quantitative three-dimensional DOT. PMID:19405750

  11. Uncertainty in Modeling Dust Mass Balance and Radiative Forcing from Size Parameterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Chun; Chen, Siyu; Leung, Lai-Yung R.

    2013-11-05

    This study examines the uncertainties in simulating mass balance and radiative forcing of mineral dust due to biases in the aerosol size parameterization. Simulations are conducted quasi-globally (180°W-180°E and 60°S-70°N) using the WRF-Chem model with three different approaches to represent aerosol size distribution (8-bin, 4-bin, and 3-mode). The biases in the 3-mode or 4-bin approaches against a relatively more accurate 8-bin approach in simulating dust mass balance and radiative forcing are identified. Compared to the 8-bin approach, the 4-bin approach simulates similar but coarser size distributions of dust particles in the atmosphere, while the 3-mode approach retains more fine dust particles but fewer coarse dust particles due to its prescribed σg of each mode. Although the 3-mode approach yields up to 10 days longer dust mass lifetime over the remote oceanic regions than the 8-bin approach, the three size approaches produce similar dust mass lifetime (3.2 days to 3.5 days) on quasi-global average, reflecting that the global dust mass lifetime is mainly determined by the dust mass lifetime near the dust source regions. With the same global dust emission (~6000 Tg yr-1), the 8-bin approach produces a dust mass loading of 39 Tg, while the 4-bin and 3-mode approaches produce 3% (40.2 Tg) and 25% (49.1 Tg) higher dust mass loading, respectively. The difference in dust mass loading between the 8-bin approach and the 4-bin or 3-mode approaches has large spatial variations, with generally smaller relative difference (<10%) near the surface over the dust source regions. The three size approaches also result in significantly different dry and wet deposition fluxes and number concentrations of dust. The difference in dust aerosol optical depth (AOD) (a factor of 3) among the three size approaches is much larger than their difference (25%) in dust mass loading.
Compared to the 8-bin approach, the 4-bin approach yields stronger dust absorptivity, while the 3-mode approach yields weaker dust absorptivity. Overall, on quasi-global average, the three size parameterizations result in a significant difference of a factor of 2~3 in dust surface cooling (-1.02~-2.87 W m-2) and atmospheric warming (0.39~0.96 W m-2) and in a tremendous difference of a factor of ~10 in dust TOA cooling (-0.24~-2.20 W m-2). An uncertainty of a factor of 2 is quantified in dust emission estimation due to the different size parameterizations. This study also highlights the uncertainties in modeling dust mass and number loading, deposition fluxes, and radiative forcing resulting from different size parameterizations, and motivates further investigation of the impact of size parameterizations on modeling dust impacts on air quality, climate, and ecosystems.
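    The difference between modal and sectional (bin) representations comes down to how a lognormal mode with prescribed geometric standard deviation σg maps onto discrete size bins. A generic sketch of that mapping, using standard lognormal mathematics rather than WRF-Chem code, with illustrative parameter values:

```python
import math
import numpy as np

def lognormal_mass_in_bins(bin_edges, d_med, sigma_g):
    """Mass fraction of a lognormal size mode falling in each sectional
    bin: the difference of the lognormal CDF evaluated at the bin edges.
    d_med is the mass median diameter, sigma_g the geometric standard
    deviation. Parameter values in the test are illustrative, not
    WRF-Chem defaults."""
    def cdf(d):
        return 0.5 * (1.0 + math.erf(math.log(d / d_med)
                                     / (math.sqrt(2.0) * math.log(sigma_g))))
    edges = np.asarray(bin_edges, float)
    return np.array([cdf(b) - cdf(a) for a, b in zip(edges[:-1], edges[1:])])
```

A fixed σg forces a fixed shape on each mode, which is why a modal scheme can retain too many fine particles relative to a sectional scheme that resolves the tails bin by bin.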

  12. Low-Temperature Effects on the Design and Performance of Composting of Explosives-Contaminated Soils

    DTIC Science & Technology

    1991-03-01

    7. Aerated bins used in field composting tests on dairy manure ............................. 10 8. Typical temperature developed during bin composting of dairy manure under conditions of constant airflow and optimum moisture ................. 10 9. Effect of agitation on the temperature profile during bin composting of dairy manure

  13. 16 CFR 1301.6 - Test conditions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 2 2011-01-01 2011-01-01 false Test conditions. 1301.6 Section 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  14. 16 CFR 1301.6 - Test conditions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 16 Commercial Practices 2 2014-01-01 2014-01-01 false Test conditions. 1301.6 Section 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  15. 16 CFR 1301.6 - Test conditions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 2 2010-01-01 2010-01-01 false Test conditions. 1301.6 Section 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  16. 90. Photographic copy of plan of bins, section of boot, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    90. Photographic copy of plan of bins, section of boot, and photograph of construction originally published in Plans of Grain Elevators (Chicago: Grain Dealers Journal, 1918), p.53. PLAN OF BINS; SECTION OF BOOT; VIEW OF CONSTRUCTION LOOKING NORTHWEST - Northwestern Consolidated Elevator "A", 119 Fifth Avenue South, Minneapolis, Hennepin County, MN

  17. Increasing donations to supermarket food-bank bins using proximal prompts.

    PubMed

    Farrimond, Samantha J; Leland, Louis S

    2006-01-01

    There has been little research into interventions to increase participation in donating items to food-bank bins. In New Zealand, there has been an increased demand from food banks (Stewart, 2002). This study demonstrated that point-of-sale prompts can be an effective method of increasing donations to a supermarket food-bank bin.

  18. 64. NORTH WALL OF CRUSHED OXIDIZED ORE BIN. THE PRIMARY ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    64. NORTH WALL OF CRUSHED OXIDIZED ORE BIN. THE PRIMARY MILL FEEDS AT BOTTOM. MILL SOLUTION TANKS WERE TO THE LEFT (EAST) AND BARREN SOLUTION TANK TO THE RIGHT (WEST) OF THE CRUSHED ORE BIN. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  19. Shapes on a plane: Evaluating the impact of projection distortion on spatial binning

    USGS Publications Warehouse

    Battersby, Sarah E.; Strebe, Daniel “daan”; Finn, Michael P.

    2017-01-01

    One method for working with large, dense sets of spatial point data is to aggregate the measure of the data into polygonal containers, such as political boundaries, or into regular spatial bins such as triangles, squares, or hexagons. When mapping these aggregations, the map projection must inevitably distort relationships. This distortion can impact the reader’s ability to compare count and density measures across the map. Spatial binning, particularly via hexagons, is becoming a popular technique for displaying aggregate measures of point data sets. Increasingly, we see questionable use of the technique without attendant discussion of its hazards. In this work, we discuss when and why spatial binning works and how mapmakers can better understand the limitations caused by distortion from projecting to the plane. We introduce equations for evaluating distortion’s impact on one common projection (Web Mercator) and discuss how the methods used generalize to other projections. While we focus on hexagonal binning, these same considerations affect spatial bins of any shape, and more generally, any analysis of geographic data performed in planar space.
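    For Web Mercator specifically, the dominant effect is the latitude-dependent area inflation of roughly sec²(φ) on the spherical model, which makes equal screen-area hexagons cover very different ground areas. A quick way to gauge that distortion (a generic Mercator property, not the paper's exact equations):

```python
import math

def web_mercator_area_scale(lat_deg):
    """Approximate area inflation factor of spherical Web Mercator at a
    given latitude: the linear scale factor is sec(lat), so areas scale
    as sec^2(lat). A hexagonal bin of fixed projected size therefore
    covers ~1/sec^2(lat) of the ground area it would cover at the
    equator."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2
```

At 60° latitude the factor is 4, meaning a projected hexagon there covers only a quarter of the ground area of an equatorial hexagon of the same screen size, which directly biases density comparisons across the map.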

  20. Tensor contraction engine: Abstraction and automated parallel implementation of configuration-interaction, coupled-cluster, and many-body perturbation theories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirata, So

    2003-11-20

    We develop a symbolic manipulation program and program generator (Tensor Contraction Engine or TCE) that automatically derives the working equations of a well-defined model of second-quantized many-electron theories and synthesizes efficient parallel computer programs on the basis of these equations. Provided an ansatz of a many-electron theory model, TCE performs valid contractions of creation and annihilation operators according to Wick's theorem, consolidates identical terms, and reduces the expressions into the form of multiple tensor contractions acted on by permutation operators. Subsequently, it determines the binary contraction order for each multiple tensor contraction with the minimal operation and memory cost, factorizes common binary contractions (defining intermediate tensors), and identifies reusable intermediates. The resulting ordered list of binary tensor contractions, additions, and index permutations is translated into an optimized program that is combined with the NWChem and UTChem computational chemistry software packages. The programs synthesized by TCE take advantage of spin symmetry, Abelian point-group symmetry, and index permutation symmetry at every stage of calculations to minimize the number of arithmetic operations and storage requirements, adjust the peak local memory usage by index range tiling, and support parallel I/O interfaces and dynamic load balancing for parallel executions. We demonstrate the utility of TCE through automatic derivation and implementation of parallel programs for various models of configuration-interaction theory (CISD, CISDT, CISDTQ), many-body perturbation theory [MBPT(2), MBPT(3), MBPT(4)], and coupled-cluster theory (LCCD, CCD, LCCSD, CCSD, QCISD, CCSDT, and CCSDTQ).
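    The cost that must be minimized when ordering binary contractions is, per contraction, the product of the extents of all distinct indices involved; this is the standard operation count for tensor contraction, sketched here as an illustration of the kind of quantity TCE optimizes, not TCE's actual cost model:

```python
def contraction_cost(a_idx, b_idx, dims):
    """Operation count of one binary tensor contraction: the product of
    the extents of all distinct indices appearing in either operand
    (summed indices and free indices alike). Index labels are single
    characters, einsum-style."""
    cost = 1
    for i in set(a_idx) | set(b_idx):
        cost *= dims[i]
    return cost
```

For example, contracting A[i,j] with B[j,k] touches indices {i, j, k}, so the cost is the product of their three extents; choosing which pair of tensors to contract first in a multi-tensor term amounts to minimizing the sum of such costs over the binary contraction tree.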

  1. Proceedings: Sisal `93

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feo, J.T.

    1993-10-01

    This report contains papers on: Programmability and performance issues; The case of an iterative partial differential equation solver; Implementing the kernel of the Australian Region Weather Prediction Model in Sisal; Even and quarter-even prime length symmetric FFTs and their Sisal implementations; Top-down thread generation for Sisal; Overlapping communications and computations on NUMA architectures; Compiling technique based on dataflow analysis for the functional programming language Valid; Copy elimination for true multidimensional arrays in Sisal 2.0; Increasing parallelism for an optimization that reduces copying in IF2 graphs; Caching in on Sisal; Cache performance of Sisal vs. FORTRAN; FFT algorithms on a shared-memory multiprocessor; A parallel implementation of nonnumeric search problems in Sisal; Computer vision algorithms in Sisal; Compilation of Sisal for a high-performance data-driven vector processor; Sisal on distributed memory machines; A virtual shared addressing system for distributed memory Sisal; Developing a high-performance FFT algorithm in Sisal for a vector supercomputer; Implementation issues for IF2 on a static data-flow architecture; and Systematic control of parallelism in array-based data-flow computation. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  2. 3013/9975 Surveillance Program Interim Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, K.; Hackney, B.; McClard, J.

    2011-06-22

    The K-Area Materials Storage (KAMS) Documented Safety Analysis (DSA) requires a surveillance program to monitor the safety performance of 3013 containers and 9975 shipping packages stored in KAMS. The SRS surveillance program [Reference 1] outlines activities for field surveillance and laboratory tests that demonstrate the packages meet the functional performance requirements described in the DSA. The SRS program also supports the complex-wide Integrated Surveillance Program (ISP) [Reference 2] for 3013 containers. The purpose of this report is to provide a summary of the SRS portion of the surveillance program activities through fiscal year 2010 (FY10) and formally communicate the interpretation of these results by the Surveillance Program Authority (SPA). Surveillance for the initial 3013 container random sampling of the Innocuous bin and the Pressure bin has been completed and there has been no indication of corrosion or significant pressurization. The maximum pressure observed was less than 50 psig, which is well below the design pressure of 699 psig for the 3013 container [Reference 3]. The data collected during surveillance of these bins has been evaluated by the Materials Identification and Surveillance (MIS) Working Group and no additional surveillance is necessary for these bins at least through FY13. A decision will be made whether additional surveillance of these bins is needed during future years of storage and as additional containers are generated. Based on the data collected to date, the SPA concludes that 3013 containers in these bins can continue to be safely stored in KAMS. This year, 13 destructive examinations (DE) were performed on random samples from the Pressure & Corrosion bin. To date, DE has been completed for approximately 30% of the random samples from the Pressure & Corrosion bin. In addition, DE has been performed on 6 engineering judgment (EJ) containers, for a total of 17 to date.
This includes one container that exceeded the 3013 Standard moisture limit, which was opened at LANL. The container pieces and an oxide sample were sent to SRNL for examination in FY11. No significant pressurization has been observed for the Pressure & Corrosion bin containers. Relatively minor corrosion has been observed on some convenience containers and the inside of two inner containers. While the limited extent of corrosion does not jeopardize the integrity of the outer 3013 containers, it does highlight the importance of continuing to perform DE and the Shelf Life program to assure that the corrosion rate is not accelerating or changing to a different corrosion mechanism (e.g., stress corrosion cracking). Statistical sampling is currently scheduled to be completed in FY17, but there is a proposed reduction of the number of DEs per year for FY11 and beyond which may delay the completion date. Since 3013 containers are stored inside 9975 containers, surveillances of 9975 containers are performed in conjunction with 3013 container surveillances. Results of 9975 container nondestructive examinations (NDEs) and DEs indicate that the containers will provide adequate protection of the 3013 containers in K-Area storage for at least 15 years [Reference 4].

  3. Measuring the 2D baryon acoustic oscillation signal of galaxies in WiggleZ: cosmological constraints

    PubMed Central

    Hinton, Samuel R.; Kazin, Eyal; Davis, Tamara M.; Blake, Chris; Brough, Sarah; Colless, Matthew; Couch, Warrick J.; Drinkwater, Michael J.; Glazebrook, Karl; Jurek, Russell J.; Parkinson, David; Pimbblet, Kevin A.; Poole, Gregory B.; Pracy, Michael; Woods, David

    2016-01-01

    We present results from the 2D anisotropic baryon acoustic oscillation (BAO) signal present in the final data set from the WiggleZ Dark Energy Survey. We analyse the WiggleZ data in two ways: first using the full shape of the 2D correlation function and secondly focusing only on the position of the BAO peak in the reconstructed data set. When fitting for the full shape of the 2D correlation function we use a multipole expansion to compare with theory. When we use the reconstructed data we marginalize over the shape and just measure the position of the BAO peak, analysing the data in wedges separating the signal along the line of sight from that parallel to the line of sight. We verify our method with mock data and find the results to be free of bias or systematic offsets. We also redo the pre-reconstruction angle-averaged (1D) WiggleZ BAO analysis with an improved covariance and present an updated result. The final results are presented in the form of Ωc h2, H(z), and DA(z) for three redshift bins with effective redshifts z = 0.44, 0.60, and 0.73. Within these bins and methodologies, we recover constraints between 5 and 22 per cent error. Our cosmological constraints are consistent with flat ΛCDM cosmology and agree with results from the Baryon Oscillation Spectroscopic Survey. PMID:28066154

  4. Measuring the 2D baryon acoustic oscillation signal of galaxies in WiggleZ: cosmological constraints.

    PubMed

    Hinton, Samuel R; Kazin, Eyal; Davis, Tamara M; Blake, Chris; Brough, Sarah; Colless, Matthew; Couch, Warrick J; Drinkwater, Michael J; Glazebrook, Karl; Jurek, Russell J; Parkinson, David; Pimbblet, Kevin A; Poole, Gregory B; Pracy, Michael; Woods, David

    2017-02-01

We present results from the 2D anisotropic baryon acoustic oscillation (BAO) signal present in the final data set from the WiggleZ Dark Energy Survey. We analyse the WiggleZ data in two ways: first using the full shape of the 2D correlation function and secondly focusing only on the position of the BAO peak in the reconstructed data set. When fitting for the full shape of the 2D correlation function we use a multipole expansion to compare with theory. When we use the reconstructed data we marginalize over the shape and just measure the position of the BAO peak, analysing the data in wedges separating the signal along the line of sight from that parallel to the line of sight. We verify our method with mock data and find the results to be free of bias or systematic offsets. We also redo the pre-reconstruction angle-averaged (1D) WiggleZ BAO analysis with an improved covariance and present an updated result. The final results are presented in the form of Ωc h2, H(z), and DA(z) for three redshift bins with effective redshifts z = 0.44, 0.60, and 0.73. Within these bins and methodologies, we recover constraints between 5 and 22 per cent error. Our cosmological constraints are consistent with flat ΛCDM cosmology and agree with results from the Baryon Oscillation Spectroscopic Survey.

  5. Measuring the 2D baryon acoustic oscillation signal of galaxies in WiggleZ: cosmological constraints

    NASA Astrophysics Data System (ADS)

    Hinton, Samuel R.; Kazin, Eyal; Davis, Tamara M.; Blake, Chris; Brough, Sarah; Colless, Matthew; Couch, Warrick J.; Drinkwater, Michael J.; Glazebrook, Karl; Jurek, Russell J.; Parkinson, David; Pimbblet, Kevin A.; Poole, Gregory B.; Pracy, Michael; Woods, David

    2017-02-01

    We present results from the 2D anisotropic baryon acoustic oscillation (BAO) signal present in the final data set from the WiggleZ Dark Energy Survey. We analyse the WiggleZ data in two ways: first using the full shape of the 2D correlation function and secondly focusing only on the position of the BAO peak in the reconstructed data set. When fitting for the full shape of the 2D correlation function we use a multipole expansion to compare with theory. When we use the reconstructed data we marginalize over the shape and just measure the position of the BAO peak, analysing the data in wedges separating the signal along the line of sight from that parallel to the line of sight. We verify our method with mock data and find the results to be free of bias or systematic offsets. We also redo the pre-reconstruction angle-averaged (1D) WiggleZ BAO analysis with an improved covariance and present an updated result. The final results are presented in the form of Ωc h2, H(z), and DA(z) for three redshift bins with effective redshifts z = 0.44, 0.60, and 0.73. Within these bins and methodologies, we recover constraints between 5 and 22 per cent error. Our cosmological constraints are consistent with flat ΛCDM cosmology and agree with results from the Baryon Oscillation Spectroscopic Survey.
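    The wedge analysis described in this record, separating clustering along the line of sight from clustering across it, can be sketched in a few lines. The two-wedge split at mu = 0.5 and the toy correlation values below are illustrative assumptions, not the survey's actual binning.

```python
# Illustrative wedge averaging of a 2D correlation function xi(s, mu).
# Bins with mu < 0.5 form the transverse wedge, mu >= 0.5 the
# line-of-sight wedge (a common two-wedge split; assumed here).

def wedge_average(xi, mu_values, split=0.5):
    """xi: one row per separation bin s, holding xi at each mu in
    mu_values. Returns (transverse, line-of-sight) means per s bin."""
    out = []
    for row in xi:
        trans = [x for x, mu in zip(row, mu_values) if mu < split]
        los = [x for x, mu in zip(row, mu_values) if mu >= split]
        out.append((sum(trans) / len(trans), sum(los) / len(los)))
    return out

mu = [0.1, 0.3, 0.5, 0.7, 0.9]          # mu-bin centres
xi = [[1.0, 0.8, 0.6, 0.4, 0.2],        # toy xi values for two s bins
      [0.5, 0.4, 0.3, 0.2, 0.1]]
wedges = wedge_average(xi, mu)
```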

  6. Taxator-tk: precise taxonomic assignment of metagenomes by fast approximation of evolutionary neighborhoods

    PubMed Central

    Dröge, J.; Gregor, I.; McHardy, A. C.

    2015-01-01

    Motivation: Metagenomics characterizes microbial communities by random shotgun sequencing of DNA isolated directly from an environment of interest. An essential step in computational metagenome analysis is taxonomic sequence assignment, which allows identifying the sequenced community members and reconstructing taxonomic bins with sequence data for the individual taxa. For the massive datasets generated by next-generation sequencing technologies, this cannot be performed with de-novo phylogenetic inference methods. We describe an algorithm and the accompanying software, taxator-tk, which performs taxonomic sequence assignment by fast approximate determination of evolutionary neighbors from sequence similarities. Results: Taxator-tk was precise in its taxonomic assignment across all ranks and taxa for a range of evolutionary distances and for short as well as for long sequences. In addition to the taxonomic binning of metagenomes, it is well suited for profiling microbial communities from metagenome samples because it identifies bacterial, archaeal and eukaryotic community members without being affected by varying primer binding strengths, as in marker gene amplification, or copy number variations of marker genes across different taxa. Taxator-tk has an efficient, parallelized implementation that allows the assignment of 6 Gb of sequence data per day on a standard multiprocessor system with 10 CPU cores and microbial RefSeq as the genomic reference data. Availability and implementation: Taxator-tk source and binary program files are publicly available at http://algbio.cs.uni-duesseldorf.de/software/. Contact: Alice.McHardy@uni-duesseldorf.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25388150
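    A common way to turn per-segment similarity hits into a conservative taxonomic assignment, in the spirit of the neighborhood-based binning described above, is a lowest-common-ancestor (LCA) rule. The sketch below is a simplification for illustration, not taxator-tk's actual algorithm, and the lineages are hypothetical.

```python
# Illustrative lowest-common-ancestor (LCA) assignment: a segment is
# binned at the deepest taxon shared by all of its similarity hits.
# A simplification, not taxator-tk's actual algorithm.

def lca(lineages):
    """Deepest common prefix of root-to-leaf lineage lists."""
    assignment = []
    for ranks in zip(*lineages):
        if all(r == ranks[0] for r in ranks):
            assignment.append(ranks[0])
        else:
            break
    return assignment

hits = [  # hypothetical lineages of one query segment's best hits
    ['Bacteria', 'Proteobacteria', 'Escherichia', 'E. coli'],
    ['Bacteria', 'Proteobacteria', 'Escherichia', 'E. fergusonii'],
    ['Bacteria', 'Proteobacteria', 'Salmonella', 'S. enterica'],
]
assignment = lca(hits)   # genus-level conflict, so binned at phylum
```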

  7. Modelling nanoscale objects in order to conduct an empirical research into their properties as part of an engineering system designed

    NASA Astrophysics Data System (ADS)

    Makarov, M.; Shchanikov, S.; Trantina, N.

    2017-01-01

We have investigated the properties of nanoscale objects that matter most for their future application, modelling these objects both as free-standing physical elements outside the structure of the engineering system designed to integrate them and as part of a system operating under the influence of the external environment. For the empirical research proposed in this work, we chose a nanoscale electronic element intended for use in designing information processing systems with a parallel architecture: the memristor. The target function of the research was to maximize the fault-tolerance index of a memristor-based system under all possible impacts of internal destabilizing factors and the external environment. The results have enabled us to identify and classify the factors that determine the fault-tolerance index of a hardware implementation of a computing system built on this nanoscale electronic element base.

  8. Clinical appraisal of arterial stiffness: the Argonauts in front of the Golden Fleece

    PubMed Central

    Vlachopoulos, C; Aznaouridis, K; Stefanadis, C

    2006-01-01

    Interest in evaluating arterial elastic properties has grown in parallel with the widespread availability of non‐invasive methods for assessing arterial stiffness. A clinically useful diagnostic index must be pathophysiologically relevant, must be readily measurable, and must indicate the severity of the disease and predict the corresponding risk. Interventional modification of this index must parallel disease regression and benefit prognosis. The current evidence for the clinical value of estimating arterial stiffness (mainly of large, elastic‐type arteries, such as the aorta and the carotids) in the contemporary era of cardiovascular medicine is reviewed. PMID:16339817

  9. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. To do this in real-time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements; horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
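    The horizontal partitioning described above can be illustrated with a simple hash-based shard assignment. The server count, key choice, and row layout here are assumptions for the sketch, not the actual LSST DMS schema.

```python
# Illustrative horizontal partitioning: rows are routed to servers by
# hashing the object id, so table scans can fan out across all
# partitions in parallel. Server count and key choice are assumptions
# of this sketch, not the actual LSST DMS scheme.
import hashlib

def partition_for(object_id, n_servers=8):
    """Stable hash-based partition assignment for one row key."""
    digest = hashlib.sha1(str(object_id).encode()).hexdigest()
    return int(digest, 16) % n_servers

rows = [{'object_id': i, 'ra': 0.1 * i} for i in range(1000)]
shards = {}
for row in rows:
    shards.setdefault(partition_for(row['object_id']), []).append(row)
```

Because the assignment depends only on the key, any query node can compute which shard holds a given object without consulting a central index.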

  10. Establishment of key grid-connected performance index system for integrated PV-ES system

    NASA Astrophysics Data System (ADS)

    Li, Q.; Yuan, X. D.; Qi, Q.; Liu, H. M.

    2016-08-01

In order to further promote integrated optimization operation of distributed new energy/energy storage/active load, this paper studies the integrated photovoltaic-energy storage (PV-ES) system which is connected with the distribution network, and analyzes typical structure and configuration selection for integrated PV-ES generation system. By combining practical grid-connected characteristics requirements and technology standard specification of photovoltaic generation system, this paper takes full account of energy storage system, and then proposes several new grid-connected performance indexes such as paralleled current sharing characteristic, parallel response consistency, adjusting characteristic, virtual moment of inertia characteristic, on-grid/off-grid switch characteristic, and so on. A comprehensive and feasible grid-connected performance index system is then established to support grid-connected performance testing on integrated PV-ES system.

  11. Political orientation moderates worldview defense in response to Osama bin Laden’s death

    PubMed Central

    Chopik, William J.; Konrath, Sara H.

    2016-01-01

The current study examines Americans’ psychological responses to Osama bin Laden’s death. We tracked changes in how different participants responded to dissimilar others from the night of bin Laden’s death for five weeks. Liberal participants reported lower worldview defense (i.e., a defensive reaction to uphold one’s cultural worldview) immediately after bin Laden’s death but then returned to similar levels as their conservative counterparts over time. Conservative participants reported greater worldview defense during each point of the study and did not significantly change over time. These temporal differences between liberals and conservatives were only present in the year of bin Laden’s death and not one year prior. The current findings demonstrate that liberals and conservatives may react differently after major societal events in predictable ways considering their moral foundations. PMID:28239251

  12. Use of nonpathogenic, green fluorescent protein-marked Escherichia coli Biotype I cultures to evaluate the self-cleansing capabilities of a commercial beef grinding system after a contamination event.

    PubMed

    Wages, Jennifer A; Williams, Jennifer; Adams, Jacquelyn; George, Bruce; Oxford, Eric; Zelenka, Dan

    2014-11-01

    Inoculated beef trim containing a cocktail of green fluorescent protein-marked Escherichia coli biotype I cultures as surrogates for E. coli O157:H7 was introduced into two large, commercial grinding facilities capable of producing 180,000 kg of ground product in 1 day. Three repetitions were performed over 3 days. Sampling occurred at three different points within the process: postprimary grind, postsecondary grind-blender, and postpackaging. Resulting data show that, as the inoculated meat passes through the system, the presence of the marked surrogate quickly diminishes. The depletion rates are directly related to the amount of product in kilograms (represented by time) that has passed through the system, but these rates vary with each step of the process. The primary grinder appears to rid itself of the contaminant the most quickly; in all repetitions, the contaminant was not detected within 5 min of introduction of the contaminated combo bin into the system, which in all cases, was prior to the introduction of a second combo bin and within 1,800 kg of product. After the blending step and subsequent secondary grinding, the contaminant was detected in product produced from both the parent combo and the combo bin added directly after the parent combo bin; however, for those days on which three combo bins (approximately 2,700 kg) were available for sampling, the contaminant was not detected from product representing the third combo bin. Similarly, at the packaging step, the contaminant was detected in the product produced by both the parent and second combo bins; however, on those days when a third combo bin was available for sampling (repetitions 2 and 3), the contaminant was not detected from product produced from the third combo bin.

  13. Small cell ovarian carcinoma: genomic stability and responsiveness to therapeutics.

    PubMed

    Gamwell, Lisa F; Gambaro, Karen; Merziotis, Maria; Crane, Colleen; Arcand, Suzanna L; Bourada, Valerie; Davis, Christopher; Squire, Jeremy A; Huntsman, David G; Tonin, Patricia N; Vanderhyden, Barbara C

    2013-02-21

The biology of small cell ovarian carcinoma of the hypercalcemic type (SCCOHT), which is a rare and aggressive form of ovarian cancer, is poorly understood. Tumourigenicity, in vitro growth characteristics, genetic and genomic anomalies, and sensitivity to standard and novel chemotherapeutic treatments were investigated in the unique SCCOHT cell line, BIN-67, to provide further insight into the biology of this rare type of ovarian cancer. The tumourigenic potential of BIN-67 cells was determined and the tumours formed in a xenograft model were compared to human SCCOHT. DNA sequencing, spectral karyotyping and high density SNP array analysis were performed. The sensitivity of the BIN-67 cells to standard chemotherapeutic agents and to vesicular stomatitis virus (VSV) and the JX-594 vaccinia virus was tested. BIN-67 cells were capable of forming spheroids in hanging drop cultures. When xenografted into immunodeficient mice, BIN-67 cells developed into tumours that reflected the hypercalcemia and histology of human SCCOHT, notably intense expression of WT-1 and vimentin, and lack of expression of inhibin. Somatic mutations in TP53 and the most common activating mutations in KRAS and BRAF were not found in BIN-67 cells by DNA sequencing. Spectral karyotyping revealed a largely normal diploid karyotype (in greater than 95% of cells) with a visibly shorter chromosome 20 contig. High density SNP array analysis also revealed few genomic anomalies in BIN-67 cells, which included loss of heterozygosity of an estimated 16.7 Mb interval on chromosome 20. SNP array analyses of four SCCOHT samples also indicated a low frequency of genomic anomalies in the majority of cases. Although resistant to platinum chemotherapeutic drugs, BIN-67 cell viability in vitro was reduced by > 75% after infection with oncolytic viruses. These results show that SCCOHT differs from high-grade serous carcinomas by exhibiting few chromosomal anomalies and lacking TP53 mutations.
Although BIN-67 cells are resistant to standard chemotherapeutic agents, their sensitivity to oncolytic viruses suggests that their therapeutic use in SCCOHT should be considered.

  14. Method of multiplexed analysis using ion mobility spectrometer

    DOEpatents

    Belov, Mikhail E [Richland, WA; Smith, Richard D [Richland, WA

    2009-06-02

A method for analyzing analytes from a sample introduced into a Spectrometer by generating a pseudo-random sequence of modulation bins, organizing each modulation bin as a series of submodulation bins, thereby forming an extended pseudo-random sequence of submodulation bins, releasing the analytes in a series of analyte packets into a Spectrometer, thereby generating an unknown original ion signal vector, detecting the analytes at a detector, and characterizing the sample using the plurality of analyte signal subvectors. The method is advantageously applied to an Ion Mobility Spectrometer, and an Ion Mobility Spectrometer interfaced with a Time of Flight Mass Spectrometer.
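    A minimal sketch of the sequence construction, assuming an LFSR-generated maximal-length pseudo-random sequence and a fixed number of submodulation bins per modulation bin. Both are illustrative choices, not the patent's specific parameters.

```python
# Illustrative sketch (not the patented method's exact parameters):
# build a pseudo-random modulation sequence with a maximal-length
# LFSR, then expand each modulation bin into submodulation bins.

def lfsr_sequence(nbits=7):
    """Maximal-length (2**nbits - 1) binary sequence from a Fibonacci
    LFSR; feedback taps correspond to a degree-7 primitive trinomial."""
    state = 1
    seq = []
    for _ in range((1 << nbits) - 1):
        seq.append(state & 1)
        fb = (state ^ (state >> 1)) & 1            # taps 7 and 6
        state = (state >> 1) | (fb << (nbits - 1))
    return seq

def expand_to_submodulation(mod_bins, subbins_per_bin=4):
    """Each modulation bin becomes subbins_per_bin submodulation bins;
    an open bin opens only its first sub-bin (a narrow release pulse)."""
    extended = []
    for b in mod_bins:
        extended.append(b)
        extended.extend([0] * (subbins_per_bin - 1))
    return extended

mod = lfsr_sequence()               # 127-bin pseudo-random sequence
ext = expand_to_submodulation(mod)  # 508-bin extended sequence
```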

  15. An ultra-high-density bin map facilitates high-throughput QTL mapping of horticultural traits in pepper (Capsicum annuum).

    PubMed

    Han, Koeun; Jeong, Hee-Jin; Yang, Hee-Bum; Kang, Sung-Min; Kwon, Jin-Kyung; Kim, Seungill; Choi, Doil; Kang, Byoung-Cheorl

    2016-04-01

Most agricultural traits are controlled by quantitative trait loci (QTLs); however, there are few studies on QTL mapping of horticultural traits in pepper (Capsicum spp.) due to the lack of high-density molecular maps and sequence information. In this study, an ultra-high-density map and 120 recombinant inbred lines (RILs) derived from a cross between C. annuum 'Perennial' and C. annuum 'Dempsey' were used for QTL mapping of horticultural traits. Parental lines and RILs were resequenced at 18× and 1× coverage, respectively. Using a sliding window approach, an ultra-high-density bin map containing 2,578 bins was constructed. The total length of the map was 1,372 cM, and the average interval between bins was 0.53 cM. A total of 86 significant QTLs controlling 17 horticultural traits were detected. Among these, 32 QTLs controlling 13 traits were major QTLs. Our research shows that the construction of bin maps using low-coverage sequence is a powerful method for QTL mapping, and that the short intervals between bins are helpful for fine-mapping of QTLs. Furthermore, bin maps can be used to improve the quality of reference genomes by elucidating the genetic order of unordered regions and anchoring unassigned scaffolds to linkage groups. © The Author 2016. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
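    The sliding-window bin construction can be sketched roughly as follows. Window size, majority-vote calling, and the toy genotype string are assumed for illustration and are not the authors' exact pipeline.

```python
# Illustrative sliding-window genotype binning (assumed parameters,
# not the authors' pipeline). SNP calls for one RIL along a
# chromosome are 'A' (one parent), 'B' (other parent), or 'N' (missing).

def window_calls(snps, size=15):
    """Majority-vote parental call for each non-overlapping window."""
    calls = []
    for i in range(0, len(snps) - size + 1, size):
        win = snps[i:i + size]
        a, b = win.count('A'), win.count('B')
        calls.append('A' if a >= b else 'B')
    return calls

def merge_into_bins(calls):
    """Consecutive windows with the same call collapse into one bin."""
    bins = []
    for c in calls:
        if bins and bins[-1][0] == c:
            bins[-1][1] += 1
        else:
            bins.append([c, 1])
    return [(g, n) for g, n in bins]

snps = 'A' * 40 + 'N' * 5 + 'B' * 45   # toy chromosome for one RIL
bins = merge_into_bins(window_calls(snps))
```

Merging identical adjacent windows is what keeps the bin count (2,578 here) far below the raw SNP count while preserving every recombination breakpoint.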

  16. 40 CFR 1037.520 - Modeling CO2 emissions to show compliance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW HEAVY-DUTY MOTOR VEHICLES Test and Modeling... to two decimal places. Where we allow you to group multiple configurations together, measure the drag... the drag area bin of an equivalent high-roof tractor. If the high-roof tractor is in Bin I or Bin II...

  17. 40 CFR 1037.520 - Modeling CO2 emissions to show compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS CONTROL OF EMISSIONS FROM NEW HEAVY-DUTY MOTOR VEHICLES Test and Modeling... to two decimal places. Where we allow you to group multiple configurations together, measure the drag... the drag area bin of an equivalent high-roof tractor. If the high-roof tractor is in Bin I or Bin II...

  18. A compost bin for handling privy wastes: its fabrication and use

    Treesearch

    R.E. Leonard; S.C. Fay

    1978-01-01

    A 24-ft3 (6.8-m3) fiberglass bin was constructed and tested for its effectiveness in composting privy wastes. A mixture of ground hardwood bark and raw sewage was used for composting. Temperatures in excess of 60°C for 36 hours were produced in the bin by aerobic, thermophilic composting. This temperature is...

  19. Development of a Novel Therapeutic Paradigm Utilizing a Mammary Gland-Targeted, Bin-1 Knockout Mouse Model

    DTIC Science & Technology

    2007-03-01

    Cell. Biol. 23, 4295 (Jun, 2003). Bin1 Ablation in Mammary Gland Delays Tissue Remodeling and Drives Cancer Progression Mee Young Chang, 1...Basu A, et al. Bin1 functionally interacts with Myc in cells and inhibits cell proliferation by multiple mechanisms. Oncogene 1999;18:3564–73. 5. Pineda

  20. Causes of Students' Violence at Al-Hussein Bin Talal University

    ERIC Educational Resources Information Center

    Alrawwad, Theeb M.; Alrfooh, Atif Eid

    2014-01-01

    This study aimed at identifying the causes of students' violence from the student's point of view, and also aimed at investigating the proper solutions to reduce the spread of violence at Al-Hussein Bin Talal University. The study sample consisted of (906) male and female students from Al-Hussein Bin Talal University, who have enrolled the summer…

  1. Development and preliminary evaluation of a new bin filler for apple harvesting and and infield sorting

    USDA-ARS?s Scientific Manuscript database

    The bin filler, which is used for filling the fruit container or bin with apples coming from the sorting system, plays a critical role for the self-propelled apple harvest and infield sorting (HIS) machine that is being developed in our laboratory. Two major technical challenges in developing the bi...

  2. Development of a new bin filler for apple harvesting and infield sorting with a review of existing technologies

    USDA-ARS?s Scientific Manuscript database

    The bin filler, which receives apples from the sorting system and then places them in the bin evenly without causing bruise damage, plays a critical role for the self-propelled apple harvest and infield sorting (HIS) machine that is being developed in our laboratory. Two major technical challenges ...

  3. A Software Assurance Framework for Mitigating the Risks of Malicious Software in Embedded Systems Used in Aircraft

    DTIC Science & Technology

    2011-09-01

to show cryptographic signature # generation on a UNIX system # SHA=/bin/sha256 CSDB=/tmp/csdb CODEBASE=. touch "$CSDB" find "$CODEBASE" -type f...artifacts generated earlier. #! /bin/sh # # Demo program to show cryptographic signature # verification on a UNIX system # SHA=/bin/sha256 CSDB=/tmp

  4. Increasing Donations to Supermarket Food-Bank Bins Using Proximal Prompts

    ERIC Educational Resources Information Center

    Farrimond, Samantha J.; Leland, Louis S., Jr.

    2006-01-01

    There has been little research into interventions to increase participation in donating items to food-bank bins. In New Zealand, there has been an increased demand from food banks (Stewart, 2002). This study demonstrated that point-of-sale prompts can be an effective method of increasing donations to a supermarket food-bank bin. (Contains 1…

  5. 14. OBLIQUE VIEW OF UPPER ORE BIN AND LOADING DECK, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    14. OBLIQUE VIEW OF UPPER ORE BIN AND LOADING DECK, LOOKING WEST. DETAIL OF SUPPORTING TIMBERS. THE LOCATION OF THIS ORE BIN IN RELATION TO THE MILL CAN BE SEEN IN MANY OF THE MILL OVERVIEWS. (CA-290-4 THROUGH CA-290-8). - Skidoo Mine, Park Route 38 (Skidoo Road), Death Valley Junction, Inyo County, CA

  6. A Neural Network Architecture For Rapid Model Indexing In Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Pawlicki, Ted

    1988-03-01

    Models of objects stored in memory have been shown to be useful for guiding the processing of computer vision systems. A major consideration in such systems, however, is how stored models are initially accessed and indexed by the system. As the number of stored models increases, the time required to search memory for the correct model becomes high. Parallel distributed, connectionist, neural networks' have been shown to have appealing content addressable memory properties. This paper discusses an architecture for efficient storage and reference of model memories stored as stable patterns of activity in a parallel, distributed, connectionist, neural network. The emergent properties of content addressability and resistance to noise are exploited to perform indexing of the appropriate object centered model from image centered primitives. The system consists of three network modules each of which represent information relative to a different frame of reference. The model memory network is a large state space vector where fields in the vector correspond to ordered component objects and relative, object based spatial relationships between the component objects. The component assertion network represents evidence about the existence of object primitives in the input image. It establishes local frames of reference for object primitives relative to the image based frame of reference. The spatial relationship constraint network is an intermediate representation which enables the association between the object based and the image based frames of reference. This intermediate level represents information about possible object orderings and establishes relative spatial relationships from the image based information in the component assertion network below. It is also constrained by the lawful object orderings in the model memory network above. The system design is consistent with current psychological theories of recognition by component. 
It also seems to support Marr's notions of hierarchical indexing (i.e., the specificity, adjunct, and parent indices). It supports the notion that multiple canonical views of an object may have to be stored in memory to enable its efficient identification. The use of variable fields in the state space vectors appears to keep the number of required nodes in the network down to a tractable number while imposing a semantic value on different areas of the state space. This semantic imposition supports an interface between the analogical aspects of neural networks and the propositional paradigms of symbolic processing.

  7. Optimizing and evaluating the reconstruction of Metagenome-assembled microbial genomes.

    PubMed

    Papudeshi, Bhavya; Haggerty, J Matthew; Doane, Michael; Morris, Megan M; Walsh, Kevin; Beattie, Douglas T; Pande, Dnyanada; Zaeri, Parisa; Silva, Genivaldo G Z; Thompson, Fabiano; Edwards, Robert A; Dinsdale, Elizabeth A

    2017-11-28

Microbiome/host interactions describe characteristics that affect the host's health. Shotgun metagenomics includes sequencing a random subset of the microbiome to analyze its taxonomic and metabolic potential. Reconstruction of DNA fragments into genomes from metagenomes (called metagenome-assembled genomes) assigns unknown fragments to taxa/function and facilitates discovery of novel organisms. Genome reconstruction incorporates sequence assembly and sorting of assembled sequences into bins, characteristic of a genome. However, the microbial community composition, including taxonomic and phylogenetic diversity, may influence genome reconstruction. We determine the optimal reconstruction method for four microbiome projects that had variable sequencing platforms (IonTorrent and Illumina), diversity (high or low), and environment (coral reefs and kelp forests), using a set of parameters to select for optimal assembly and binning tools. We tested the effects of the assembly and binning processes on population genome reconstruction using 105 marine metagenomes from 4 projects. Reconstructed genomes were obtained from each project using 3 assemblers (IDBA, MetaVelvet, and SPAdes) and 2 binning tools (GroopM and MetaBat). We assessed the efficiency of assemblers using statistics including contig continuity and contig chimerism, and the effectiveness of binning tools using genome completeness and taxonomic identification. We concluded that SPAdes assembled more contigs (143,718 ± 124 contigs) of longer length (N50 = 1632 ± 108 bp), and incorporated the most sequences (sequences-assembled = 19.65%). The microbial richness and evenness were maintained across the assembly, suggesting low contig chimeras. SPAdes assembly was responsive to the biological and technological variations within the project, compared with other assemblers.
Among binning tools, we conclude that MetaBat produced bins with less variation in GC content (average standard deviation: 1.49), low species richness (4.91 ± 0.66), and higher genome completeness (40.92 ± 1.75) across all projects. MetaBat extracted 115 bins from the 4 projects, of which 66 bins were identified as reconstructed metagenome-assembled genomes with sequences belonging to a specific genus. We identified 13 novel genomes, some of which were 100% complete, but showed low similarity to genomes within databases. In conclusion, we present a set of biologically relevant parameters for evaluation to select for optimal assembly and binning tools. For the tools we tested, the SPAdes assembler and MetaBat binning tool reconstructed quality metagenome-assembled genomes for the four projects. We also conclude that metagenomes from microbial communities with high coverage of phylogenetically distinct members and low taxonomic diversity yield the highest-quality metagenome-assembled genomes.
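    The N50 continuity statistic quoted above is the contig length L such that contigs of length at least L contain at least half the total assembly. A minimal sketch:

```python
# N50: sort contigs by length, accumulate from the longest, and
# report the length at which the running total first reaches half
# the assembly size. Minimal illustrative implementation.

def n50(contig_lengths):
    """Length L such that contigs >= L hold at least half the total."""
    lengths = sorted(contig_lengths, reverse=True)
    half = sum(lengths) / 2
    running = 0
    for length in lengths:
        running += length
        if running >= half:
            return length

lengths = [100, 200, 300, 400, 500]   # toy assembly, 1500 bp total
value = n50(lengths)                  # 500 + 400 = 900 >= 750, so 400
```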

  8. Solar Radiation Pressure Binning for the Geosynchronous Orbit

    NASA Technical Reports Server (NTRS)

    Hejduk, M. D.; Ghrist, R. W.

    2011-01-01

    Orbital maintenance parameters for individual satellites or groups of satellites have traditionally been set by examining orbital parameters alone, such as through apogee and perigee height binning; this approach ignored the other factors that governed an individual satellite's susceptibility to non-conservative forces. In the atmospheric drag regime, this problem has been addressed by the introduction of the "energy dissipation rate," a quantity that represents the amount of energy being removed from the orbit; such an approach is able to consider both atmospheric density and satellite frontal area characteristics and thus serve as a mechanism for binning satellites of similar behavior. The geo-synchronous orbit (of broader definition than the geostationary orbit -- here taken to be from 1300 to 1800 minutes in orbital period) is not affected by drag; rather, its principal non-conservative force is that of solar radiation pressure -- the momentum imparted to the satellite by solar radiometric energy. While this perturbation is solved for as part of the orbit determination update, no binning or division scheme, analogous to the drag regime, has been developed for the geo-synchronous orbit. The present analysis has begun such an effort by examining the behavior of geosynchronous rocket bodies and non-stabilized payloads as a function of solar radiation pressure susceptibility. A preliminary examination of binning techniques used in the drag regime gives initial guidance regarding the criteria for useful bin divisions. Applying these criteria to the object type, solar radiation pressure, and resultant state vector accuracy for the analyzed dataset, a single division of "large" satellites into two bins for the purposes of setting related sensor tasking and orbit determination (OD) controls is suggested. When an accompanying analysis of high area-to-mass objects is complete, a full set of binning recommendations for the geosynchronous orbit will be available.
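    The proposed single division of large satellites into two bins can be sketched as a threshold on the solved-for solar radiation pressure coefficient. The cutoff value and catalog entries below are made up for illustration and are not the study's values.

```python
# Illustrative two-bin division by solar radiation pressure
# susceptibility. The coefficient stands in for the solved-for SRP
# term (reflectivity times area-to-mass); the 0.02 cutoff is a
# made-up threshold for demonstration, not the study's value.

def srp_bin(objects, threshold=0.02):
    """Split (name, srp_coeff) pairs into low/high susceptibility bins."""
    bins = {'low': [], 'high': []}
    for name, coeff in objects:
        bins['low' if coeff < threshold else 'high'].append(name)
    return bins

catalog = [('rb-1', 0.005), ('payload-7', 0.018), ('rb-9', 0.041)]
result = srp_bin(catalog)
```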

  9. Context adaptive binary arithmetic coding-based data hiding in partially encrypted H.264/AVC videos

    NASA Astrophysics Data System (ADS)

    Xu, Dawen; Wang, Rangding

    2015-05-01

    A scheme of data hiding directly in a partially encrypted version of H.264/AVC videos is proposed which includes three parts, i.e., selective encryption, data embedding and data extraction. Selective encryption is performed on context adaptive binary arithmetic coding (CABAC) bin-strings via stream ciphers. By careful selection of CABAC entropy coder syntax elements for selective encryption, the encrypted bitstream is format-compliant and has exactly the same bit rate. Then a data-hider embeds the additional data into partially encrypted H.264/AVC videos using a CABAC bin-string substitution technique without accessing the plaintext of the video content. Since bin-string substitution is carried out on those residual coefficients with approximately the same magnitude, the quality of the decrypted video is satisfactory. Video file size is strictly preserved even after data embedding. In order to adapt to different application scenarios, data extraction can be done either in the encrypted domain or in the decrypted domain. Experimental results have demonstrated the feasibility and efficiency of the proposed scheme.

  10. A scoring metric for multivariate data for reproducibility analysis using chemometric methods

    PubMed Central

    Sheen, David A.; de Carvalho Rocha, Werickson Fortunato; Lippa, Katrice A.; Bearden, Daniel W.

    2017-01-01

    Process quality control and reproducibility in emerging measurement fields such as metabolomics is normally assured by interlaboratory comparison testing. As a part of this testing process, spectral features from a spectroscopic method such as nuclear magnetic resonance (NMR) spectroscopy are attributed to particular analytes within a mixture, and it is the metabolite concentrations that are returned for comparison between laboratories. However, data quality may also be assessed directly by using binned spectral data before the time-consuming identification and quantification. Use of the binned spectra has some advantages, including preserving information about trace constituents and enabling identification of process difficulties. In this paper, we demonstrate the use of binned NMR spectra to conduct a detailed interlaboratory comparison and composition analysis. Spectra of synthetic and biologically-obtained metabolite mixtures, taken from a previous interlaboratory study, are compared with cluster analysis using a variety of distance and entropy metrics. The individual measurements are then evaluated based on where they fall within their clusters, and a laboratory-level scoring metric is developed, which provides an assessment of each laboratory’s individual performance. PMID:28694553
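The spectral binning step this record relies on (integrating an NMR spectrum into fixed-width chemical-shift bins before normalization) can be sketched as below; the 0.04 ppm default is a commonly used bin width assumed for illustration, not taken from the paper:

```python
import numpy as np

def bin_spectrum(ppm, intensity, bin_width=0.04):
    """Integrate an NMR spectrum into fixed-width chemical-shift bins and
    apply total-intensity normalization. bin_width (ppm) is an assumed,
    commonly used value."""
    edges = np.arange(ppm.min(), ppm.max() + bin_width, bin_width)
    idx = np.digitize(ppm, edges) - 1            # bin index for each data point
    binned = np.zeros(len(edges) - 1)
    np.add.at(binned, np.clip(idx, 0, len(binned) - 1), intensity)
    return binned / binned.sum()                 # normalize to unit total intensity
```

Binned vectors produced this way can then be compared directly with distance or entropy metrics for the cluster analysis described above.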

  11. Solid State Spin-Wave Quantum Memory for Time-Bin Qubits.

    PubMed

    Gündoğan, Mustafa; Ledingham, Patrick M; Kutluer, Kutlu; Mazzera, Margherita; de Riedmatten, Hugues

    2015-06-12

    We demonstrate the first solid-state spin-wave optical quantum memory with on-demand read-out. Using the full atomic frequency comb scheme in a Pr^3+:Y2SiO5 crystal, we store weak coherent pulses at the single-photon level with a signal-to-noise ratio >10. Narrow-band spectral filtering based on spectral hole burning in a second Pr^3+:Y2SiO5 crystal is used to filter out the excess noise created by control pulses to reach an unconditional noise level of (2.0±0.3)×10^-3 photons per pulse. We also report spin-wave storage of photonic time-bin qubits with conditional fidelities higher than achievable by a measure-and-prepare strategy, demonstrating that the spin-wave memory operates in the quantum regime. This makes our device the first demonstration of a quantum memory for time-bin qubits, with on-demand read-out of the stored quantum information. These results represent an important step for the use of solid-state quantum memories in scalable quantum networks.

  12. Vicarious revenge and the death of Osama bin Laden.

    PubMed

    Gollwitzer, Mario; Skitka, Linda J; Wisneski, Daniel; Sjöström, Arne; Liberman, Peter; Nazir, Syed Javed; Bushman, Brad J

    2014-05-01

    Three hypotheses were derived from research on vicarious revenge and tested in the context of the assassination of Osama bin Laden in 2011. In line with the notion that revenge aims at delivering a message (the "message hypothesis"), Study 1 shows that Americans' vengeful desires in the aftermath of 9/11 predicted a sense of justice achieved after bin Laden's death, and that this effect was mediated by perceptions that his assassination sent a message to the perpetrators to not "mess" with the United States. In line with the "blood lust hypothesis," his assassination also sparked a desire to take further revenge and to continue the "war on terror." Finally, in line with the "intent hypothesis," Study 2 shows that Americans (but not Pakistanis or Germans) considered the fact that bin Laden was killed intentionally more satisfactory than the possibility of bin Laden being killed accidentally (e.g., in an airplane crash).

  13. Garbage monitoring system using IoT

    NASA Astrophysics Data System (ADS)

    Anitha, A.

    2017-11-01

    Nowadays various actions are being taken to improve the level of cleanliness in the country. People are becoming more active in doing everything possible to clean their surroundings, and various movements have also been started by the government to increase cleanliness. We propose a system that notifies the corporation to empty the bin on time. In this system, a sensor placed on top of the garbage bin detects the level of garbage inside it relative to the total size of the bin. When the garbage reaches the maximum level, a notification is sent to the corporation's office, and the employees can then take further action to empty the bin. This system helps in cleaning the city in a better way: people do not have to check the bins manually, because a notification is sent whenever a bin becomes full.
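The threshold logic this record describes can be sketched as follows; the function names, bin depth, and alert threshold are hypothetical, and a real deployment would convert an ultrasonic sensor's distance-to-garbage reading in this way before sending the notification:

```python
def fill_percent(distance_cm, bin_depth_cm):
    """Convert a top-mounted sensor's distance-to-garbage reading (cm)
    into a fill level as a percentage of the bin depth."""
    level = bin_depth_cm - distance_cm           # garbage height inside the bin
    return max(0.0, min(100.0, 100.0 * level / bin_depth_cm))

def check_bin(distance_cm, bin_depth_cm=100.0, threshold=90.0):
    """Return a notification message once the bin passes the fill threshold,
    or None if no action is needed. Threshold and depth are illustrative."""
    pct = fill_percent(distance_cm, bin_depth_cm)
    if pct >= threshold:
        return f"ALERT: bin {pct:.0f}% full - schedule collection"
    return None
```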

  14. Deterministically swapping frequency-bin entanglement from photon-photon to atom-photon hybrid systems

    NASA Astrophysics Data System (ADS)

    Ou, Bao-Quan; Liu, Chang; Sun, Yuan; Chen, Ping-Xing

    2018-02-01

    Inspired by recent developments in research on the atom-photon quantum interface and on energy-time entanglement between single-photon pulses, we study a deterministic protocol for frequency-bin entanglement of an atom-photon hybrid system, analogous to the frequency-bin entanglement between single-photon pulses. We show that such entanglement arises naturally when considering the interaction between a frequency-bin-entangled single-photon pulse pair and a single atom coupled to an optical cavity, via straightforward atom-photon phase gate operations. Its anticipated properties and preliminary examples of its potential application in quantum networking are also demonstrated. Moreover, we construct a specific quantum entanglement witness to detect such extended frequency-bin entanglement against a reasonably general set of separable states, and prove its capability theoretically. We focus on energy-time considerations throughout the analysis.

  15. The electron drift velocity, ion acoustic speed and irregularity drifts in high-latitude E-region

    NASA Astrophysics Data System (ADS)

    Uspensky, M. V.; Pellinen, R. J.; Janhunen, P.

    2008-10-01

    The purpose of this study is to examine the STARE irregularity drift velocity dependence on the EISCAT line-of-sight (los or l-o-s) electron drift velocity magnitude, VE×Blos, and the flow angle ΘN,F (superscript N and/or F refers to the STARE Norway and Finland radar). In the noon-evening sector the flow angle dependence of the Doppler velocities, VirrN,F, inside and outside the Farley-Buneman (FB) instability cone (|VE×Blos| > Cs and |VE×Blos| < Cs, respectively) was studied [...] |VE×Blos|. Both features (a and b), as well as the weak flow angle velocity dependence, indicate that the l-o-s electron drift velocity cannot be the sole factor which controls the motion of the backscatter ~1-m irregularities at large flow angles. Importantly, the backscatter was collected at aspect angles ~1° and flow angles Θ > 60°, where linear fluid and kinetic theories invariably predict negative growth rates. At least qualitatively, all of these facts can be reasonably explained by the nonlinear wave-wave coupling found and described by Kudeki and Farley (1989) and Lu et al. (2008) for the equatorial electrojet and studied in numerical simulations by Otani and Oppenheim (1998, 2006).

  16. Temporal and spatial variations in fly ash quality

    USGS Publications Warehouse

    Hower, J.C.; Trimble, A.S.; Eble, C.F.

    2001-01-01

    Fly ash quality, both as the amount of petrographically distinguishable carbons and in chemistry, varies in both time and space. Temporal variations are a function of a number of variables. Variables can include variations in the coal blend organic petrography, mineralogy, and chemistry; variations in the pulverization of the coal, both as a function of the coal's Hardgrove grindability index and as a function of the maintenance and settings of the pulverizers; and variations in the operating conditions of the boiler, including changes in the pollution control system. Spatial variation, as an instantaneous measure of fly ash characteristics, should not involve changes in the first two sets of variables listed above. Spatial variations are a function of the gas flow within the boiler and ducts, certain flow conditions leading to a tendency for segregation of the less-dense carbons in one portion of the gas stream. Caution must be applied in sampling fly ash. Samples from a single bin, or series of bins, m ay not be representative of the whole fly ash, providing a biased view of the nature of the material. Further, it is generally not possible to be certain about variation until the analysis of the ash is complete. ?? 2001 Elsevier Science B.V. All rights reserved.

  17. Improved protocols to accelerate the assembly of DNA barcode reference libraries for freshwater zooplankton.

    PubMed

    Elías-Gutiérrez, Manuel; Valdez-Moreno, Martha; Topan, Janet; Young, Monica R; Cohuo-Colli, José Angel

    2018-03-01

    Currently, freshwater zooplankton sampling and identification methodologies have remained virtually unchanged since they were first established at the beginning of the 20th century. One major contributing factor to this slow progress is the limited success of modern genetic methodologies, such as DNA barcoding, in several of the main groups. This study demonstrates improved protocols that enable the rapid assessment of most animal taxa inhabiting any freshwater system by combining the use of light traps, careful fixation at low temperatures using ethanol, and zooplankton-specific primers. We DNA-barcoded 2,136 specimens from a diverse array of taxonomic assemblages (rotifers, mollusks, mites, crustaceans, insects, and fishes) from several Canadian and Mexican lakes with an average sequence success rate of 85.3%. In total, 325 Barcode Index Numbers (BINs) were detected, with only three BINs (two cladocerans and one copepod) shared between Canada and Mexico, suggesting a much narrower distribution range of freshwater zooplankton than previously thought. This study is the first to broadly explore the metazoan biodiversity of freshwater systems with DNA barcodes to construct a reference library that represents the first step for future programs which aim to monitor ecosystem health, track invasive species, or improve knowledge of the ecology and distribution of freshwater zooplankton.

  18. DNA barcoding as a useful tool in the systematic study of wild bees of the tribe Augochlorini (Hymenoptera: Halictidae).

    PubMed

    González-Vaquero, Rocío Ana; Roig-Alsina, Arturo; Packer, Laurence

    2016-10-01

    Special care is needed in the delimitation and identification of halictid bee species, which are renowned for being morphologically monotonous. Corynura Spinola and Halictillus Moure (Halictidae: Augochlorini) contain species that are key elements in southern South American ecosystems. These bees are very difficult to identify due to close morphological similarity among species and high sexual dimorphism. We analyzed 170 barcode-compliant COI sequences from 19 species. DNA barcodes were useful to confirm gender associations and to detect two new cryptic species. Interspecific distances were significantly higher than those reported for other bees. Maximum intraspecific divergence was less than 1% in 14 species. Barcode index numbers (BINs) were useful to identify putative species that need further study. More than one BIN was assigned to five species. The name Corynura patagonica (Cockerell) probably refers to two cryptic species. The results suggest that Corynura and Halictillus species can be identified using DNA barcodes. The sequences of the species included in this study can be used as a reference to assess the identification of unknown specimens. This study provides additional support for the use of DNA barcodes in bee taxonomy and the identification of specimens, which is particularly relevant in insects of ecological importance such as pollinators.

  19. DNA barcodes and citizen science provoke a diversity reappraisal for the "ring" butterflies of Peninsular Malaysia (Ypthima: Satyrinae: Nymphalidae: Lepidoptera).

    PubMed

    Jisming-See, Shi-Wei; Sing, Kong-Wah; Wilson, John-James

    2016-10-01

    The "rings" belonging to the genus Ypthima are amongst the most common butterflies in Peninsular Malaysia. However, the species can be difficult to tell apart, with keys relying on minor and often non-discrete ring characters found on the hindwing. Seven species have been reported from Peninsular Malaysia, but this is thought to be an underestimate of diversity. DNA barcodes of 165 individuals, and wing and genital morphology, were examined to reappraise species diversity of this genus in Peninsular Malaysia. DNA barcodes collected during citizen science projects-School Butterfly Project and Peninsular Malaysia Butterfly Count-recently conducted in Peninsular Malaysia were included. The new DNA barcodes formed six groups with different Barcode Index Numbers (BINs) representing four species reported in Peninsular Malaysia. When combined with public DNA barcodes from the Barcode Of Life Datasystems, several taxonomic issues arose. We consider the taxon Y. newboldi, formerly treated as a subspecies of Y. baldus, as a distinct species. DNA barcodes also supported an earlier suggestion that Y. nebulosa is a synonym under Y. horsfieldii humei. Two BINs of the genus Ypthima comprising DNA barcodes collected during citizen science projects did not correspond to any species previously reported in Peninsular Malaysia.

  20. DNA barcoding of human-biting black flies (Diptera: Simuliidae) in Thailand.

    PubMed

    Pramual, Pairot; Thaijarern, Jiraporn; Wongpakam, Komgrit

    2016-12-01

    Black flies (Diptera: Simuliidae) are important insect vectors and pests of humans and animals. Accurate identification, therefore, is important for control and management. In this study, we used mitochondrial cytochrome oxidase I (COI) barcoding sequences to test the efficiency of species identification for the human-biting black flies in Thailand. We used human-biting specimens because they enabled us to link information with previous studies involving the immature stages. Three black fly taxa, Simulium nodosum, S. nigrogilvum and S. doipuiense complex, were collected. The S. doipuiense complex was confirmed for the first time as having human-biting habits. The COI sequences revealed considerable genetic diversity in all three species. Comparisons to a COI sequence library of black flies in Thailand and in a public database indicated a high efficiency for specimen identification for S. nodosum and S. nigrogilvum, but this method was not successful for the S. doipuiense complex. Phylogenetic analyses revealed two divergent lineages in the S. doipuiense complex. Human-biting specimens formed a separate clade from other members of this complex. The results are consistent with the Barcoding Index Number System (BINs) analysis that found six BINs in the S. doipuiense complex. Further taxonomic work is needed to clarify the species status of these human-biting specimens. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Testing DNA barcode performance in 1000 species of European lepidoptera: large geographic distances have small genetic impacts.

    PubMed

    Huemer, Peter; Mutanen, Marko; Sefc, Kristina M; Hebert, Paul D N

    2014-01-01

    This study examines the performance of DNA barcodes (mt cytochrome c oxidase 1 gene) in the identification of 1004 species of Lepidoptera shared by two localities (Finland, Austria) that are 1600 km apart. Maximum intraspecific distances for the pooled data were less than 2% for 880 species (87.6%), while deeper divergence was detected in 124 species. Despite such variation, the overall DNA barcode library possessed diagnostic COI sequences for 98.8% of the taxa. Because a reference library based on Finnish specimens was highly effective in identifying specimens from Austria, we conclude that barcode libraries based on regional sampling can often be effective for a much larger area. Moreover, dispersal ability (poor, good) and distribution patterns (disjunct, fragmented, continuous, migratory) had little impact on levels of intraspecific geographic divergence. Furthermore, the present study revealed that, despite the intensity of past taxonomic work on European Lepidoptera, nearly 20% of the species shared by Austria and Finland require further work to clarify their status. Particularly discordant BIN (Barcode Index Number) cases should be checked to ascertain possible explanatory factors such as incorrect taxonomy, hybridization, introgression, and Wolbachia infections.

  2. Bioremediation of diesel oil-contaminated soil by composting with biowaste.

    PubMed

    Van Gestel, Kristin; Mergaert, Joris; Swings, Jean; Coosemans, Jozef; Ryckeboer, Jaak

    2003-01-01

    Soil spiked with diesel oil was mixed with biowaste (vegetable, fruit and garden waste) at a 1:10 ratio (fresh weight) and composted in a monitored composting bin system for 12 weeks. Pure biowaste was composted in parallel. In order to discern the temperature effect from the additional biowaste effect on diesel degradation, one recipient with contaminated soil was held at room temperature, while another was kept at the actual composting temperature. Measurements of composting parameters, together with enumerations and identifications of microorganisms, demonstrate that the addition of the contaminated soil had a minor impact on the composting process. The first-order rate constant of diesel degradation in the biowaste mixture was four times higher than in the soil at room temperature, and 1.2 times higher than in the soil at composting temperature.

  3. Influences of Local Sea-Surface Temperatures and Large-scale Dynamics on Monthly Precipitation Inferred from Two 10-year GCM-Simulations

    NASA Technical Reports Server (NTRS)

    Sud, Y. C.; Walker, G. K.; Zhou, Y.; Lau, W. K.-M.

    2007-01-01

    Two parallel sets of 10-year-long simulations (January 1, 1982 to December 31, 1991) were made with the finite volume General Circulation Model (fvGCM), in which the model integrations were forced with prescribed sea-surface temperature (SST) fields available as two separate SST datasets. One dataset contained naturally varying monthly SSTs for the chosen period, and the other had the 12-monthly mean SSTs for the same period. Plots of evaporation, precipitation, and atmosphere-column moisture convergence, binned by 1 °C SST intervals, show that, except for the tropics, precipitation is more strongly constrained by large-scale dynamics than by local SST. Binning data by SST naturally provided an ensemble average of data contributed from disparate locations with the same SST; such averages could be expected to mitigate all location-related influences. However, the plots revealed: i) evaporation, vertical velocity, and precipitation are very robust and remarkably similar for each of the two simulations, and even for data from the 1987 ENSO-year simulation; ii) while evaporation increased monotonically with SST up to about 27 °C, precipitation did not; iii) precipitation correlated much better with the column vertical velocity than with SST, suggesting that the influence of the dynamical circulation, including non-local SSTs, is stronger than that of local SSTs. The precipitation fields were doubly binned with respect to SST and boundary-layer mass and/or moisture convergence. The analysis expresses the rate of change of precipitation with local SST as the partial derivative of precipitation with respect to local SST plus the partial derivative of precipitation with respect to boundary-layer moisture convergence multiplied by the rate of change of boundary-layer moisture convergence with SST (see Eqn. 3 of Section 4.5). This analysis is mathematically rigorous and provides a quantitative measure of the influence of local SST on the local precipitation.
The results were recast to examine the dependence of local rainfall on local SSTs; this dependence was discernible only in the tropics. Our methodology can be used for computing the relationship between any forcing function and its effect(s) on a chosen field.
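The fixed-width SST binning this record describes can be sketched as follows; the function name and interface are illustrative, not from the paper:

```python
import numpy as np

def bin_by_sst(sst, precip, bin_width=1.0):
    """Average a field (here precipitation) within fixed-width SST bins,
    as in the 1 degC binning described in the abstract. Returns bin
    centers and the per-bin means (NaN for empty bins)."""
    edges = np.arange(np.floor(sst.min()), np.ceil(sst.max()) + bin_width, bin_width)
    idx = np.digitize(sst, edges) - 1            # bin index for each grid point
    centers = edges[:-1] + bin_width / 2
    means = np.array([precip[idx == i].mean() if np.any(idx == i) else np.nan
                      for i in range(len(edges) - 1)])
    return centers, means
```

Because each bin pools grid points from disparate locations sharing the same SST, the per-bin mean is effectively the ensemble average the abstract relies on.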

  4. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
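The quantile binning approach this record evaluates can be sketched as below. Estimating the conditional bin probability by stratifying on a single binary covariate is a toy stand-in for the regression-based propensity models used in practice; the function name and interface are illustrative:

```python
import numpy as np

def quantile_bin_weights(exposure, covariate, n_bins=10):
    """Stabilized inverse probability weights via quantile binning of a
    continuous exposure: w_i = P(bin_i) / P(bin_i | X_i). The conditional
    probability is estimated within strata of one binary covariate, a toy
    stand-in for a full propensity model."""
    qs = np.quantile(exposure, np.linspace(0.0, 1.0, n_bins + 1))
    bins = np.digitize(exposure, qs[1:-1])       # bin index 0 .. n_bins-1
    w = np.empty(len(exposure))
    for i in range(len(exposure)):
        marginal = np.mean(bins == bins[i])                   # P(bin)
        stratum = covariate == covariate[i]
        conditional = np.mean(bins[stratum] == bins[i])       # P(bin | X)
        w[i] = marginal / conditional
    return w
```

When the exposure is independent of the covariate, the weights are all close to 1, which is the expected behavior for stabilized weights under no confounding.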

  5. Evolution of column density distributions within Orion A⋆

    NASA Astrophysics Data System (ADS)

    Stutz, A. M.; Kainulainen, J.

    2015-05-01

    We compare the structure of star-forming molecular clouds in different regions of Orion A to determine how the column density probability distribution function (N-PDF) varies with environmental conditions such as the fraction of young protostars. A correlation between the N-PDF slope and Class 0 protostar fraction has been previously observed in a low-mass star-formation region (Perseus); here we test whether a similar correlation is observed in a high-mass star-forming region. We used Herschel PACS and SPIRE cold dust emission observations to derive a column density map of Orion A. We used the Herschel Orion Protostar Survey catalog to accurately identify and classify the Orion A young stellar object content, including the cold and relatively short-lived Class 0 protostars (with a lifetime of ~0.14 Myr). We divided Orion A into eight independent regions of 0.25 square degrees (13.5 pc2); in each region we fit the N-PDF distribution with a power law, and we measured the fraction of Class 0 protostars. We used a maximum-likelihood method to measure the N-PDF power-law index without binning the column density data. We find that the Class 0 fraction is higher in regions with flatter column density distributions. We tested the effects of incompleteness, extinction-driven misclassification of Class 0 sources, resolution, and adopted pixel-scales. We show that these effects cannot account for the observed trend. Our observations demonstrate an association between the slope of the power-law N-PDF and the Class 0 fractions within Orion A. Various interpretations are discussed, including timescales based on the Class 0 protostar fraction assuming a constant star-formation rate. The observed relation suggests that the N-PDF can be related to an evolutionary state of the gas. If universal, such a relation permits evaluating the evolutionary state from the N-PDF power-law index at much greater distances than those accessible with protostar counts. 
Appendices are available in electronic form at http://www.aanda.org. The N(H) map as a FITS file is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/577/L6
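A maximum-likelihood power-law fit without binning, as mentioned in this record, is commonly done with the standard continuous power-law estimator, alpha = 1 + n / sum(ln(N_i / N_min)). A minimal sketch under that assumption (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def powerlaw_index_mle(column_density, n_min):
    """Maximum-likelihood estimate of the index alpha of p(N) ~ N^-alpha
    for N >= n_min, computed directly from the samples with no histogram
    binning, using the standard continuous power-law estimator."""
    x = np.asarray(column_density, dtype=float)
    x = x[x >= n_min]                            # fit only above the threshold
    alpha = 1.0 + len(x) / np.sum(np.log(x / n_min))
    stderr = (alpha - 1.0) / np.sqrt(len(x))     # asymptotic uncertainty
    return alpha, stderr
```

Avoiding binning removes the dependence of the fitted slope on an arbitrary bin-width choice, which matters when comparing N-PDF slopes across regions.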

  6. EFFECT OF A NOVEL ESSENTIAL OIL MOUTHRINSE WITHOUT ALCOHOL ON GINGIVITIS: A DOUBLE-BLINDED RANDOMIZED CONTROLLED TRIAL

    PubMed Central

    Botelho, Marco Antonio; Bezerra, José Gomes; Correa, Luciano Lima; Fonseca, Said Gonçalves da Cruz; Montenegro, Danusa; Gapski, Ricardo; Brito, Gerly Anne Castro; Heukelbach, Jörg

    2007-01-01

    Several different plant extracts have been evaluated with respect to their antimicrobial effects against oral pathogens and for reduction of gingivitis. Given that a large number of these substances have been associated with significant side effects that contraindicate their long-term use, new compounds need to be tested. The aim of this study was to assess the short-term safety and efficacy of a Lippia sidoides ("alecrim pimenta")-based essential oil mouthrinse on gingival inflammation and bacterial plaque. Fifty-five patients were enrolled into a pilot, double-blinded, randomized, parallel-armed study. Patients were randomly assigned to undergo a 7-day treatment regimen with either the L. sidoides-based mouthrinse or 0.12% chlorhexidine mouthrinse. The results demonstrated decreased plaque index, gingival index and gingival bleeding index scores at 7 days, as compared to baseline. There was no statistically significant difference (p>0.05) between test and control groups for any of the clinical parameters assessed throughout the study. Adverse events were mild and transient. The findings of this study demonstrated that the L. sidoides-based mouthrinse was safe and efficacious in reducing bacterial plaque and gingival inflammation. PMID:19089126

  7. 16 CFR § 1301.6 - Test conditions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 2 2013-01-01 2013-01-01 false Test conditions. § 1301.6 Section § 1301.6... UNSTABLE REFUSE BINS § 1301.6 Test conditions. (a) The refuse bin shall be empty and have its lids or covers in a position which would most adversely affect the stability of the bin when tested. (b) The...

  8. 29 CFR 1917.49 - Spouts, chutes, hoppers, bins, and associated equipment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... the entry; and (2) The power supply to the equipment carrying the cargo to the bin shall be turned off... been notified of the entry; (2) The power supply to the equipment carrying the cargo to the bin is... adjustments are made to a power shovel, wire, or associated equipment, the power supply to the shovel shall be...

  9. 29 CFR 1917.49 - Spouts, chutes, hoppers, bins, and associated equipment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... the entry; and (2) The power supply to the equipment carrying the cargo to the bin shall be turned off... been notified of the entry; (2) The power supply to the equipment carrying the cargo to the bin is... adjustments are made to a power shovel, wire, or associated equipment, the power supply to the shovel shall be...

  10. DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN. CONVEYOR PLATFORM, TRAM TRESTLE, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN. CONVEYOR PLATFORM, TRAM TRESTLE, AND LOADING PLATFORM. LOOKING SOUTHWEST. THE HOLE IN THE ORE BIN FLOOR CAN BE SEEN, AND BALL MILL FOUNDATION AT LOWER LEFT CORNER. SEE CA-291-47(CT) FOR IDENTICAL COLOR TRANSPARENCY. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  11. DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN, CONVEYOR PLATFORM TRAM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    DETAIL OVERHEAD VIEW OF SECONDARY ORE BIN, CONVEYOR PLATFORM TRAM TRESTLE, AND LOADING PLATFORM, LOOKING SOUTHWEST. THE HOLE IN THE ORE BIN FLOOR CAN BE SEEN, AND BALL MILL FOUNDATION AT LOWER LEFT CORNER. SEE CA-291-13 FOR IDENTICAL B&W NEGATIVE. - Keane Wonder Mine, Park Route 4 (Daylight Pass Cutoff), Death Valley Junction, Inyo County, CA

  12. 19 CFR 19.29 - Sealing of bins or other bonded space.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Bonded for the Storage of Wheat § 19.29 Sealing of bins or other bonded space. The outlets to all bins or other space bonded for the storage of imported wheat shall be sealed by affixing locks or in bond seals... which will effectively prevent the removal of, or access to, the wheat in the bonded space except under...

  13. Fungal volatiles associated with moldy grain in ventilated and non-ventilated bin-stored wheat.

    PubMed

    Sinha, R N; Tuma, D; Abramson, D; Muir, W E

    1988-01-01

    The fungal odor compounds 3-methyl-1-butanol, 1-octen-3-ol and 3-octanone were monitored in nine experimental bins in Winnipeg, Manitoba containing a hard red spring wheat during the autumn, winter and summer seasons of 1984-85. Quality changes were associated with seed-borne microflora and moisture content in both ventilated and non-ventilated bins containing wheat of 15.6 and 18.2% initial moisture content. All three odor compounds occurred in considerably greater amounts in bulk wheat in non-ventilated than in ventilated bins, particularly in those with wheat having 18.2% moisture content. The presence of these compounds usually coincided with infection of the seeds by the fungi Alternaria alternata (Fr.) Keissler, Aspergillus repens DeBarry, A. versicolor (Vuill.) Tiraboschi, Penicillium crustosum Thom, P. oxalicum Currie and Thom, P. aurantiogriseum Dierckx, and P. citrinum Thom. High production of all three odor compounds in damp wheat stored in non-ventilated bins was associated with heavy fungal infection of the seeds and reduction in seed germinability. High initial moisture content of the harvested grain accelerated the production of all three fungal volatiles in non-ventilated bins.

  14. Non-elliptic wavevector anisotropy for magnetohydrodynamic turbulence

    NASA Astrophysics Data System (ADS)

    Narita, Y.

    2015-11-01

    A model of non-elliptic wavevector anisotropy is developed for the inertial-range spectrum of magnetohydrodynamic turbulence and is presented in the two-dimensional wavevector domain spanning the directions parallel and perpendicular to the mean magnetic field. The non-elliptic model is a variation of the elliptic model with different scalings along the parallel and the perpendicular components of the wavevectors to the mean magnetic field. The non-elliptic anisotropy model reproduces the smooth transition of the power-law spectra from an index of -2 in the parallel projection with respect to the mean magnetic field to an index of -5/3 in the perpendicular projection observed in solar wind turbulence, and is as competitive as the critical balance model to explain the measured frequency spectra in the solar wind. The parameters in the non-elliptic spectrum model are compared with the solar wind observations.

  15. Experimental preparation and characterization of four-dimensional quantum states using polarization and time-bin modes of a single photon

    NASA Astrophysics Data System (ADS)

    Yoo, Jinwon; Choi, Yujun; Cho, Young-Wook; Han, Sang-Wook; Lee, Sang-Yun; Moon, Sung; Oh, Kyunghwan; Kim, Yong-Su

    2018-07-01

    We present a detailed method to prepare and characterize four-dimensional pure quantum states, or ququarts, using the polarization and time-bin modes of a single photon. In particular, we provide a simple method to generate an arbitrary pure ququart and fully characterize the state with quantum state tomography. We also verify the reliability of the recipe by experimentally preparing and characterizing 20 ququart states in mutually unbiased bases. As qudits provide superior properties over qubits in many fundamental tests of quantum physics and applications in quantum information processing, the presented method will be useful for photonic quantum information science.

  16. Scalable boson sampling with time-bin encoding using a loop-based architecture.

    PubMed

    Motes, Keith R; Gilchrist, Alexei; Dowling, Jonathan P; Rohde, Peter P

    2014-09-19

    We present an architecture for arbitrarily scalable boson sampling using two nested fiber loops. The architecture has fixed experimental complexity, irrespective of the size of the desired interferometer, whose scale is limited only by fiber and switch loss rates. The architecture employs time-bin encoding, whereby the incident photons form a pulse train, which enters the loops. Dynamically controlled loop coupling ratios allow the construction of the arbitrary linear optics interferometers required for boson sampling. The architecture employs only a single point of interference and may thus be easier to stabilize than other approaches. The scheme has polynomial complexity and could be realized using demonstrated present-day technologies.

  17. Evaluating Nighttime CALIOP 0.532 micron Aerosol Optical Depth and Extinction Coefficient Retrievals

    NASA Technical Reports Server (NTRS)

    Campbell, J. R.; Tackett, J. L.; Reid, J. S.; Zhang, J.; Curtis, C. A.; Hyer, E. J.; Sessions, W. R.; Westphal, D. L.; Prospero, J. M.; Welton, E. J.; hide

    2012-01-01

    NASA Cloud Aerosol Lidar with Orthogonal Polarization (CALIOP) Version 3.01 5-km nighttime 0.532 micron aerosol optical depth (AOD) datasets from 2007 are screened, averaged and evaluated at 1 deg X 1 deg resolution versus corresponding/coincident 0.550 micron AOD derived using the US Navy Aerosol Analysis and Prediction System (NAAPS), featuring two-dimensional variational assimilation of quality-assured NASA Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging SpectroRadiometer (MISR) AOD. In the absence of sunlight, since passive radiometric AOD retrievals rely overwhelmingly on scattered radiances, the model represents one of the few practical global estimates available from which to attempt such a validation. Daytime comparisons, though, provide useful context. Regional-mean CALIOP vertical profiles of night/day 0.532 micron extinction coefficient are compared with 0.523/0.532 micron ground-based lidar measurements to investigate representativeness and diurnal variability. In this analysis, mean nighttime CALIOP AOD are mostly lower than daytime (0.121 vs. 0.126 for all aggregated data points, and 0.099 vs. 0.102 when averaged globally per normalised 1 deg X 1 deg bin), though the relationship is reversed over land and coastal regions when the data are averaged per normalised bin (0.134/0.108 vs. 0.140/0.112, respectively). Offsets assessed within single bins alone approach +/- 20%. CALIOP AOD, both day and night, are higher than NAAPS over land (0.137 vs. 0.124) and equal over water (0.082 vs. 0.083) when averaged globally per normalised bin. However, for all data points inclusive, NAAPS exceeds CALIOP over land, coast and ocean, both day and night. Again, differences assessed within single bins approach 50% in extreme cases. Correlation between CALIOP and NAAPS AOD is comparable during both day and night.
Higher correlation is found nearest the equator, both as a function of sample size and relative signal magnitudes inherent at these latitudes. Root mean square deviation between CALIOP and NAAPS varies between 0.1 and 0.3 globally during both day/night. Averaging of CALIOP along-track AOD data points within a single NAAPS grid bin improves correlation and RMSD, though day/night and land/ocean biases persist and are believed systematic. Vertical profiles of extinction coefficient derived in the Caribbean compare well with ground-based lidar observations, though potentially anomalous selection of a priori lidar ratios for CALIOP retrievals is likely inducing some discrepancies. Mean effective aerosol layer top heights are stable between day and night, indicating consistent layer-identification diurnally, which is noteworthy considering the potential limiting effects of ambient solar noise during day.
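    The 1 deg X 1 deg aggregation underlying these comparisons can be sketched as a simple bin average; the function name and data layout below are illustrative assumptions, not taken from the paper.

    ```python
    import math
    from collections import defaultdict

    def grid_average_aod(retrievals, res_deg=1.0):
        """Average along-track AOD retrievals onto a lat/lon grid.

        `retrievals` is an iterable of (lat, lon, aod) tuples; each value is
        assigned to a res_deg x res_deg cell and the cell mean is returned,
        mirroring the per-bin normalised averages compared in the study.
        """
        acc = defaultdict(lambda: [0.0, 0])  # cell -> [sum, count]
        for lat, lon, aod in retrievals:
            cell = (math.floor(lat / res_deg), math.floor(lon / res_deg))
            acc[cell][0] += aod
            acc[cell][1] += 1
        return {cell: total / n for cell, (total, n) in acc.items()}
    ```

    Comparing two AOD sources on the same grid then reduces to differencing the resulting dictionaries cell by cell.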

  18. The stratospheric quasi-biennial oscillation in the NCEP reanalyses: Climatological structures

    NASA Astrophysics Data System (ADS)

    Huesmann, Amihan S.; Hitchman, Matthew H.

    2001-06-01

    Global quasi-biennial variation in the lower stratosphere and tropopause region is studied using 41 years (1958-1998) of reanalyses from the National Centers for Environmental Prediction (NCEP). Horizontal wind, temperature, geopotential height, tropopause temperature and pressure fields are used. A new quasi-biennial oscillation (QBO) indexing method is presented, which is based on the zonal mean zonal wind shear anomaly at the equator and is compared to the Singapore index. A phase-difference compositing technique provides ``snapshots'' of the QBO meridional-vertical structure as it descends, and ``composite phases'' provide a look at its time progression. By binning large amounts of data, the first observation-based estimate of the QBO meridional circulation is obtained. High-latitude QBO variability supports previous studies that invoke planetary wave-mean flow interaction as an explanation. The meridional distribution of the range in QBO zonal wind is compared with the stratospheric annual cycle, with the annual cycle dominating poleward of ~12° latitude but still being significant in the deep tropics. The issues of temporal shear zone asymmetries and phase locking with the annual cycle are critically examined. Subtracting the time mean and annual cycle removes ~2/3 of the asymmetry in wind (and wind shear) zone descent rate. The NCEP data validate previous findings that both the easterly and westerly QBO anomalous wind regimes in the lower stratosphere change sign preferentially during northern summer. It is noteworthy that the NCEP QBO amplitude and the relationships among the reanalysed zonal wind, temperature, and meridional circulation undergo a substantial change around 1978.
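    As a hedged illustration, a shear-anomaly index of the kind described could be computed as follows; the two-level form and variable names are assumptions, not the paper's exact definition.

    ```python
    def qbo_shear_index(u_upper, u_lower):
        """Shear-based QBO index sketch: vertical shear of the equatorial
        zonal-mean zonal wind between two stratospheric levels, with the
        time mean removed to leave the anomaly.

        `u_upper` and `u_lower` are equal-length monthly series (m/s).
        """
        shear = [hi - lo for hi, lo in zip(u_upper, u_lower)]
        mean = sum(shear) / len(shear)
        return [s - mean for s in shear]
    ```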

  19. Hα-activity and ages for stars in the SARG survey

    NASA Astrophysics Data System (ADS)

    Sissa, E.; Gratton, R.; Desidera, S.; Martinez Fiorenzano, A. F.; Bonfanti, A.; Carolo, E.; Vassallo, D.; Claudi, R. U.; Endl, M.; Cosentino, R.

    2016-12-01

    Stellar activity influences radial velocity (RV) measurements and can also mimic the presence of orbiting planets. As part of the search for planets around the components of wide binaries performed with the SARG High Resolution Spectrograph at the TNG, it was discovered that HD 200466A shows strong variation in RV that is well correlated with the activity index based on Hα. We used SARG to study the Hα line variations in each component of the binaries and a few bright stars to test the capability of the Hα index of revealing the rotation period or activity cycle. We also analysed the relations between the average activity level and other physical properties of the stars. We finally tried to reveal signals in the RVs that are due to the activity. At least in some cases the variation in the observed RVs is due to the stellar activity. We confirm that Hα can be used as an activity indicator for solar-type stars and as an age indicator for stars younger than 1.5 Gyr. Based on observations made with the Italian Telescopio Nazionale Galileo (TNG) operated on the island of La Palma by the Fundación Galileo Galilei of the INAF (Istituto Nazionale di Astrofisica) at the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias. A table of the individual Hα measurements is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/596/A76

  20. Construction technique of disposable bin from sludge cake and its environmental risk.

    PubMed

    Kongmuang, Udomsak; Kiykaew, Duangta; Morioka, Ikuharu

    2015-01-01

    Many researchers have tried to make rigid recycled materials from the sludge cake produced by paper mills in order to reduce its volume. In this study, the researchers tried to make a disposable bin economically and to examine whether it is toxic to the outside environment. To make a disposable bin, the researchers used the sludge cake; a plastic basket as a fixed mold; white cloth or newspaper as a removable supporter wrapped around the mold; and latex or plaster as a binder. The strength of the samples was measured by tensile-stress testing. Water absorption was evaluated by the Cobb test. As toxicological tests, a leaching test and a seed germination test were selected. It was possible to form the disposable bin from the cleaned sludge cake. Judging from the results of tensile-stress testing, the bins seemed safe for carrying industrial garbage. Some of them showed low water absorptiveness (high water resistance) in the Cobb test. The leaching test showed small amounts of three heavy metals, lead, nickel and copper, in the leachate. The seed germination test suggested no adverse effects of the bins, in either clay or sand, on tomato growth. Together these results suggest that the bins have good strength, sufficient water resistance and no toxicological effect on the environment. This new recycled bin has the potential to help solve the environmental and health problems of disposing of the sludge cake.

  1. Atomic density effects on temperature characteristics and thermal transport at grain boundaries through a proper bin size selection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vo, Truong Quoc; Kim, BoHung, E-mail: bohungk@ulsan.ac.kr; Barisik, Murat, E-mail: muratbarisik@iyte.edu.tr

    2016-05-21

    This study focuses on the proper characterization of temperature profiles across grain boundaries (GBs) in order to calculate the correct interfacial thermal resistance (ITR) and reveal the influence of GB geometries onto thermal transport. The solid-solid interfaces resulting from the orientation difference between the (001), (011), and (111) copper surfaces were investigated. Temperature discontinuities were observed at the boundary of grains due to the phonon mismatch, phonon backscattering, and atomic forces between dissimilar structures at the GBs. We observed that the temperature decreases gradually in the GB area rather than a sharp drop at the interface. As a result, three distinct temperature gradients developed at the GB which were different than the one observed in the bulk solid. This behavior extends a couple molecular diameters into both sides of the interface where we defined a thickness at GB based on the measured temperature profiles for characterization. Results showed dependence on the selection of the bin size used to average the temperature data from the molecular dynamics system. The bin size on the order of the crystal layer spacing was found to present an accurate temperature profile through the GB. We further calculated the GB thickness of various cases by using potential energy (PE) distributions which showed agreement with direct measurements from the temperature profile and validated the proper binning. The variation of grain crystal orientation developed different molecular densities which were characterized by the average atomic surface density (ASD) definition. Our results revealed that the ASD is the primary factor affecting the structural disorders and heat transfer at the solid-solid interfaces. Using a system in which the planes are highly close-packed can enhance the probability of interactions and the degree of overlap between vibrational density of states (VDOS) of atoms forming at interfaces, leading to a reduced ITR. Thus, an accurate understanding of thermal characteristics at the GB can be formulated by selecting a proper bin size.

  2. Atomic density effects on temperature characteristics and thermal transport at grain boundaries through a proper bin size selection

    NASA Astrophysics Data System (ADS)

    Vo, Truong Quoc; Barisik, Murat; Kim, BoHung

    2016-05-01

    This study focuses on the proper characterization of temperature profiles across grain boundaries (GBs) in order to calculate the correct interfacial thermal resistance (ITR) and reveal the influence of GB geometries onto thermal transport. The solid-solid interfaces resulting from the orientation difference between the (001), (011), and (111) copper surfaces were investigated. Temperature discontinuities were observed at the boundary of grains due to the phonon mismatch, phonon backscattering, and atomic forces between dissimilar structures at the GBs. We observed that the temperature decreases gradually in the GB area rather than a sharp drop at the interface. As a result, three distinct temperature gradients developed at the GB which were different than the one observed in the bulk solid. This behavior extends a couple molecular diameters into both sides of the interface where we defined a thickness at GB based on the measured temperature profiles for characterization. Results showed dependence on the selection of the bin size used to average the temperature data from the molecular dynamics system. The bin size on the order of the crystal layer spacing was found to present an accurate temperature profile through the GB. We further calculated the GB thickness of various cases by using potential energy (PE) distributions which showed agreement with direct measurements from the temperature profile and validated the proper binning. The variation of grain crystal orientation developed different molecular densities which were characterized by the average atomic surface density (ASD) definition. Our results revealed that the ASD is the primary factor affecting the structural disorders and heat transfer at the solid-solid interfaces. Using a system in which the planes are highly close-packed can enhance the probability of interactions and the degree of overlap between vibrational density of states (VDOS) of atoms forming at interfaces, leading to a reduced ITR. 
Thus, an accurate understanding of thermal characteristics at the GB can be formulated by selecting a proper bin size.
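    A minimal sketch of the bin-averaged temperature profile described above, assuming a classical equipartition estimate per slab; the names and data layout are illustrative, not the authors' code.

    ```python
    def temperature_profile(z_coords, kinetic_energies, bin_width, z_min, z_max):
        """Bin-averaged temperature profile for an MD slab.

        Atoms are grouped into slabs of width `bin_width` along z (the paper
        finds a width on the order of the crystal layer spacing gives the
        most faithful profile across the grain boundary); each bin's
        temperature follows from equipartition, T = 2<KE> / (3 kB).
        """
        KB = 1.380649e-23  # Boltzmann constant, J/K
        n_bins = max(1, int((z_max - z_min) / bin_width))
        ke_sum = [0.0] * n_bins
        count = [0] * n_bins
        for z, ke in zip(z_coords, kinetic_energies):
            i = min(n_bins - 1, int((z - z_min) / bin_width))
            ke_sum[i] += ke
            count[i] += 1
        # Empty bins yield None rather than a spurious zero temperature.
        return [2.0 * (s / c) / (3.0 * KB) if c else None
                for s, c in zip(ke_sum, count)]
    ```

    Shrinking `bin_width` below the layer spacing leaves bins with few or no atoms and a noisy profile; widening it smears out the very gradients at the GB the study characterizes.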

  3. Genome Scan Meta-Analysis of Schizophrenia and Bipolar Disorder, Part II: Schizophrenia

    PubMed Central

    Lewis, Cathryn M.; Levinson, Douglas F.; Wise, Lesley H.; DeLisi, Lynn E.; Straub, Richard E.; Hovatta, Iiris; Williams, Nigel M.; Schwab, Sibylle G.; Pulver, Ann E.; Faraone, Stephen V.; Brzustowicz, Linda M.; Kaufmann, Charles A.; Garver, David L.; Gurling, Hugh M. D.; Lindholm, Eva; Coon, Hilary; Moises, Hans W.; Byerley, William; Shaw, Sarah H.; Mesen, Andrea; Sherrington, Robin; O’Neill, F. Anthony; Walsh, Dermot; Kendler, Kenneth S.; Ekelund, Jesper; Paunio, Tiina; Lönnqvist, Jouko; Peltonen, Leena; O’Donovan, Michael C.; Owen, Michael J.; Wildenauer, Dieter B.; Maier, Wolfgang; Nestadt, Gerald; Blouin, Jean-Louis; Antonarakis, Stylianos E.; Mowry, Bryan J.; Silverman, Jeremy M.; Crowe, Raymond R.; Cloninger, C. Robert; Tsuang, Ming T.; Malaspina, Dolores; Harkavy-Friedman, Jill M.; Svrakic, Dragan M.; Bassett, Anne S.; Holcomb, Jennifer; Kalsi, Gursharan; McQuillin, Andrew; Brynjolfson, Jon; Sigmundsson, Thordur; Petursson, Hannes; Jazin, Elena; Zoëga, Tomas; Helgason, Tomas

    2003-01-01

    Schizophrenia is a common disorder with high heritability and a 10-fold increase in risk to siblings of probands. Replication has been inconsistent for reports of significant genetic linkage. To assess evidence for linkage across studies, rank-based genome scan meta-analysis (GSMA) was applied to data from 20 schizophrenia genome scans. Each marker for each scan was assigned to 1 of 120 30-cM bins, with the bins ranked by linkage scores (1 = most significant) and the ranks averaged across studies (Ravg) and then weighted for sample size (√N[affected cases]). A permutation test was used to compute the probability of observing, by chance, each bin's average rank (PAvgRnk) or of observing it for a bin with the same place (first, second, etc.) in the order of average ranks in each permutation (Pord). The GSMA produced significant genomewide evidence for linkage on chromosome 2q (PAvgRnk<.000417). Two aggregate criteria for linkage were also met (clusters of nominally significant P values that did not occur in 1,000 replicates of the entire data set with no linkage present): 12 consecutive bins with both PAvgRnk and Pord<.05, including regions of chromosomes 5q, 3p, 11q, 6p, 1q, 22q, 8p, 20q, and 14p, and 19 consecutive bins with Pord<.05, additionally including regions of chromosomes 16q, 18q, 10p, 15q, 6q, and 17q. There is greater consistency of linkage results across studies than has been previously recognized. The results suggest that some or all of these regions contain loci that increase susceptibility to schizophrenia in diverse populations. PMID:12802786
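    The ranking-and-averaging step of GSMA can be sketched as follows; the data layout and function name are illustrative, and the permutation test for PAvgRnk and Pord is omitted.

    ```python
    import math

    def gsma_weighted_average_ranks(scans, n_affected=None):
        """Rank-based GSMA sketch.

        `scans` maps study -> {bin_id: linkage score}. Within each study,
        bins are ranked (1 = most significant, i.e. highest score) and the
        ranks are averaged across studies, optionally weighted by
        sqrt(N affected cases) as in the paper.
        """
        studies = list(scans)
        w = {s: math.sqrt(n_affected[s]) if n_affected else 1.0 for s in studies}
        total_w = sum(w.values())
        bins = sorted(scans[studies[0]])
        avg = dict.fromkeys(bins, 0.0)
        for s in studies:
            order = sorted(bins, key=lambda b: -scans[s][b])
            rank = {b: i + 1 for i, b in enumerate(order)}
            for b in bins:
                avg[b] += w[s] * rank[b] / total_w
        return avg  # lower average rank = stronger aggregate evidence
    ```

    Significance would then come from recomputing these averages over many within-study permutations of the bin ranks.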

  4. Estimating the potential intensification of global grazing systems based on climate adjusted yield gap analysis

    NASA Astrophysics Data System (ADS)

    Sheehan, J. J.

    2016-12-01

    We report here a first-of-its-kind analysis of the potential for intensification of global grazing systems. Intensification is calculated using the statistical yield gap methodology developed previously by others for global crop systems (Mueller et al 2012 and Licker et al 2010). Yield gaps are estimated by binning global pasture land into 100 equal-area bins of similar climate (defined by ranges of rainfall and growing degree days). Within each bin, grid cells of pastureland are ranked from lowest to highest productivity. The global intensification potential is defined as the sum of global production across all bins at a given percentile ranking (e.g. performance at the 90th percentile) divided by the total current global production. The previous yield gap studies focused on crop systems because productivity data on these systems are readily available. Nevertheless, global crop land represents only one-third of total global agricultural land, while pasture systems account for the remaining two-thirds. Thus, it is critical to conduct the same kind of analysis on what is the largest human use of land on the planet—pasture systems. In 2013, Herrero et al. announced the completion of a geospatial data set that augmented the animal census data with data and modeling about production systems and overall food productivity (Herrero et al., PNAS 2013). With this data set, it is now possible to apply yield gap analysis to global pasture systems. We used the Herrero et al. data set to evaluate yield gaps for meat and milk production from pasture-based systems for cattle, sheep and goats. The figure included with this abstract shows the intensification potential for kcal per hectare per year of meat and milk from global cattle, sheep and goats as a function of increasing levels of performance. Performance is measured as the productivity achieved at a given ranked percentile within each bin. We find that if all pasture land were raised to its 90th percentile of performance, global output of meat and milk could increase 2.8-fold. This is much higher than that reported previously for major grain crops like corn and wheat. Our results suggest that efforts to address the poor performance of pasture systems around the world could substantially improve the outlook for meeting future food demand.
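    A hedged sketch of the climate-bin percentile calculation; the data layout, nearest-rank percentile, and raising every cell to at least the attainable yield are simplifying assumptions, since the actual study bins by rainfall and growing degree days on gridded data.

    ```python
    from collections import defaultdict

    def intensification_potential(cells, percentile=90):
        """Climate-binned yield-gap sketch.

        `cells` is a list of (climate_bin, area_ha, yield_per_ha) tuples.
        Within each bin the attainable yield is the given percentile of
        observed cell yields; the result is potential production at that
        level divided by current production.
        """
        by_bin = defaultdict(list)
        for b, area, y in cells:
            by_bin[b].append((area, y))
        current = sum(area * y for _, area, y in cells)
        potential = 0.0
        for members in by_bin.values():
            yields = sorted(y for _, y in members)
            # nearest-rank percentile of the cell yields in this bin
            k = min(len(yields) - 1, round(percentile / 100 * (len(yields) - 1)))
            attainable = yields[k]
            # raise every cell to at least the attainable yield
            potential += sum(area * max(y, attainable) for area, y in members)
        return potential / current
    ```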

  5. The effect of beam pre-bunching on the excitation of terahertz plasmons in a parallel plane guiding system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Suresh C.; Malik, Pratibha

    2015-04-15

    The excitation of terahertz (THz) plasmons by a pre-bunched relativistic electron beam propagating in a parallel-plane semiconducting guiding system is studied. It is found that the n-InSb semiconductor strongly supports confined surface plasmons in the terahertz frequency range. The growth rate and efficiency of the THz surface plasmons increase linearly with the modulation index and reach their largest values as the modulation index approaches unity. Moreover, the growth rate of the instability scales as the one-third power of the beam density and the inverse one-third power of the THz radiation frequency.

  6. Effects of Land Use/Cover Changes and Urban Forest Configuration on Urban Heat Islands in a Loess Hilly Region: Case Study Based on Yan'an City, China.

    PubMed

    Zhang, Xinping; Wang, Dexiang; Hao, Hongke; Zhang, Fangfang; Hu, Youning

    2017-07-26

    In this study, Yan'an City, a typical hilly valley city, was chosen as the study area. Land surface temperature (LST), remote sensing indexes, and landscape pattern metrics of land use/land cover (LULC) types were retrieved from Landsat data for 1990-2015 to explain the relationships between the surface urban heat island (SUHI) and LULC types, and field measurements of 34 green spaces were used to find factors contributing to green space cool island intensity (GSCI). The results showed that during 1990-2015, because of local anthropogenic activities, the SUHI was mainly located in areas of lower vegetation cover. There was a significant suburban-urban gradient in the average LST, as well as in its heterogeneity and fluctuations. Spatiotemporal changes in six landscape metrics of the classified LST (the fractal dimension index, percentage of landscape, aggregation index, division index, Shannon's diversity index, and expansion intensity) paralleled the LULC changes, especially for construction land, during the past 25 years. In the urban area, an index-based built-up index was the key positive factor explaining LST increases, whereas the normalized difference vegetation index and the modified normalized difference water index were crucial factors explaining LST decreases during the study periods. In terms of heat mitigation performance, mixed forest was better than pure forest, and the urban forest configuration had positive effects on GSCI. The results of this study provide insights into the importance of species choice and the spatial design of green spaces for cooling the environment.

  7. Effects of Land Use/Cover Changes and Urban Forest Configuration on Urban Heat Islands in a Loess Hilly Region: Case Study Based on Yan’an City, China

    PubMed Central

    Zhang, Xinping; Hao, Hongke; Zhang, Fangfang; Hu, Youning

    2017-01-01

    In this study, Yan’an City, a typical hilly valley city, was chosen as the study area. Land surface temperature (LST), remote sensing indexes, and landscape pattern metrics of land use/land cover (LULC) types were retrieved from Landsat data for 1990–2015 to explain the relationships between the surface urban heat island (SUHI) and LULC types, and field measurements of 34 green spaces were used to find factors contributing to green space cool island intensity (GSCI). The results showed that during 1990–2015, because of local anthropogenic activities, the SUHI was mainly located in areas of lower vegetation cover. There was a significant suburban-urban gradient in the average LST, as well as in its heterogeneity and fluctuations. Spatiotemporal changes in six landscape metrics of the classified LST (the fractal dimension index, percentage of landscape, aggregation index, division index, Shannon’s diversity index, and expansion intensity) paralleled the LULC changes, especially for construction land, during the past 25 years. In the urban area, an index-based built-up index was the key positive factor explaining LST increases, whereas the normalized difference vegetation index and the modified normalized difference water index were crucial factors explaining LST decreases during the study periods. In terms of heat mitigation performance, mixed forest was better than pure forest, and the urban forest configuration had positive effects on GSCI. The results of this study provide insights into the importance of species choice and the spatial design of green spaces for cooling the environment. PMID:28933770

  8. Benefits of a Hospital Two-Bin Kanban System

    DTIC Science & Technology

    2014-09-01

    Excerpt: … stocks split between primary (in front) and secondary bins (directly behind). RFID tags are placed on the front of each bin. Photos taken at WRNMMC. (Remaining indexed text is report front matter: thesis advisor Nedialko Dimitrov, co-advisor Rachel Silvestrini, second reader Michael Dixon, chair Robert F. Dell.)

  9. Evaluation of five sampling methods for Liposcelis entomophila (Enderlein) and L. decolor (Pearman) (Psocoptera: Liposcelididae) in steel bins containing wheat

    USDA-ARS?s Scientific Manuscript database

    An evaluation of five sampling methods for studying psocid population levels was conducted in two steel bins containing 32.6 metric tonnes of wheat in Manhattan, KS. Psocids were sampled using a 1.2-m open-ended trier, corrugated cardboard refuges placed on the underside of the bin hatch or the surf...

  10. SeaQuaKE: Sea-optimized Quantum Key Exchange

    DTIC Science & Technology

    2015-01-01

    Excerpt: … of photon pairs in both polarization [3] and time-bin [4] degrees of freedom simultaneously. Entanglement analysis components in both the … greater throughput per entangled photon pair compared to alternative sources that encode in only a … (Remaining indexed text is figure residue describing a hyperentangled photon-pair source: time-bin and polarization entanglement and pair generation; wavelength availability, power, pulse rate; time-bin mux, waveguide.)

  11. A Powerful, Cost Effective, Web Based Engineering Solution Supporting Conjunction Detection and Visual Analysis

    NASA Astrophysics Data System (ADS)

    Novak, Daniel M.; Biamonti, Davide; Gross, Jeremy; Milnes, Martin

    2013-08-01

    An innovative and visually appealing tool is presented for efficient all-vs-all conjunction analysis on a large catalogue of objects. The conjunction detection uses a nearest-neighbour search algorithm based on spatial binning and identification of pairs of objects in adjacent bins. This results in the fastest all-vs-all filtering the authors are aware of. The tool is built on a server-client architecture: the server broadcasts the conjunction data and ephemerides to the client, while the client supports the user interface through a modern browser, without plug-ins. To make the tool flexible and maintainable, Java software technologies were used on the server side, including Spring, Camel, ActiveMQ and CometD. The user interface and visualisation are based on the latest web technologies: HTML5, WebGL, THREE.js. Importance has been given to the ergonomics and visual appeal of the software; in fact, certain design concepts have been borrowed from the gaming industry.
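    The binning-based candidate filter can be sketched as follows (a Python illustration under an assumed data layout; the tool itself runs server-side in Java). With a cell edge equal to the conjunction threshold, any pair closer than the threshold must lie in the same or an adjacent cell, so only those pairs need an exact distance check.

    ```python
    import math
    from collections import defaultdict

    def conjunction_candidates(positions, threshold):
        """All-vs-all close-approach filter via spatial binning.

        `positions` is a list of (x, y, z) tuples; returns sorted index
        pairs (i, j), i < j, whose separation is below `threshold`.
        """
        cells = defaultdict(list)
        for idx, (x, y, z) in enumerate(positions):
            cells[(math.floor(x / threshold),
                   math.floor(y / threshold),
                   math.floor(z / threshold))].append(idx)

        hits = set()
        for (cx, cy, cz), members in cells.items():
            # Check this cell and its 26 neighbours only.
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    for dz in (-1, 0, 1):
                        for j in cells.get((cx + dx, cy + dy, cz + dz), ()):
                            for i in members:
                                if i < j:
                                    pi, pj = positions[i], positions[j]
                                    d2 = sum((a - b) ** 2 for a, b in zip(pi, pj))
                                    if d2 < threshold ** 2:
                                        hits.add((i, j))
        return sorted(hits)
    ```

    For a roughly uniform catalogue this replaces the O(n²) all-pairs check with work proportional to the number of locally co-located objects, which is what makes the all-vs-all filtering fast.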

  12. Self-organizing maps: a versatile tool for the automatic analysis of untargeted imaging datasets.

    PubMed

    Franceschi, Pietro; Wehrens, Ron

    2014-04-01

    MS-based imaging approaches allow for location-specific identification of chemical components in biological samples, opening up possibilities of much more detailed understanding of biological processes and mechanisms. Data analysis, however, is challenging, mainly because of the sheer size of such datasets. This article presents a novel approach based on self-organizing maps, extending previous work in order to be able to handle the large number of variables present in high-resolution mass spectra. The key idea is to generate prototype images, representing spatial distributions of ions, rather than prototypical mass spectra. This allows for a two-stage approach, first generating typical spatial distributions and associated m/z bins, and later analyzing the interesting bins in more detail using accurate masses. The possibilities and advantages of the new approach are illustrated on an in-house dataset of apple slices. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
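    The first stage, forming one ion image per m/z bin, might look like this; the names and data layout are illustrative, not the authors' code, and the SOM clustering of these images into prototypes is omitted.

    ```python
    import bisect

    def ion_images(pixel_spectra, mz_edges):
        """Bin each pixel's mass spectrum into m/z bins, producing one
        spatial intensity image per bin.

        `pixel_spectra` maps (x, y) -> list of (mz, intensity) peaks;
        `mz_edges` is a sorted list of bin boundaries. The per-bin ion
        images are the spatial distributions the SOM then clusters.
        """
        n_bins = len(mz_edges) - 1
        images = [dict() for _ in range(n_bins)]  # {(x, y): intensity} per bin
        for (x, y), peaks in pixel_spectra.items():
            for mz, intensity in peaks:
                i = bisect.bisect_right(mz_edges, mz) - 1
                if 0 <= i < n_bins:
                    images[i][(x, y)] = images[i].get((x, y), 0.0) + intensity
        return images
    ```

    Bins whose prototype images show interesting spatial structure can then be revisited at full mass accuracy, mirroring the two-stage approach described above.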

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leng, Shuai; Yu, Lifeng; Wang, Jia

    Purpose: Our purpose was to reduce image noise in spectral CT by exploiting data redundancies in the energy domain to allow flexible selection of the number, width, and location of the energy bins. Methods: Using a variety of spectral CT imaging methods, conventional filtered backprojection (FBP) reconstructions were performed and resulting images were compared to those processed using a Local HighlY constrained backPRojection Reconstruction (HYPR-LR) algorithm. The mean and standard deviation of CT numbers were measured within regions of interest (ROIs), and results were compared between FBP and HYPR-LR. For these comparisons, the following spectral CT imaging methods were used: (i) numerical simulations based on a photon-counting, detector-based CT system, (ii) a photon-counting, detector-based micro CT system using rubidium and potassium chloride solutions, (iii) a commercial CT system equipped with integrating detectors utilizing tube potentials of 80, 100, 120, and 140 kV, and (iv) a clinical dual-energy CT examination. The effects of tube energy and energy bin width were evaluated appropriate to each CT system. Results: The mean CT number in each ROI was unchanged between FBP and HYPR-LR images for each of the spectral CT imaging scenarios, irrespective of bin width or tube potential. However, image noise, as represented by the standard deviation of CT numbers in each ROI, was reduced by 36%-76%. In all scenarios, image noise after HYPR-LR algorithm was similar to that of composite images, which used all available photons. No difference in spatial resolution was observed between HYPR-LR processing and FBP. Dual energy patient data processed using HYPR-LR demonstrated reduced noise in the individual, low- and high-energy images, as well as in the material-specific basis images. Conclusions: Noise reduction can be accomplished for spectral CT by exploiting data redundancies in the energy domain. HYPR-LR is a robust method for reducing image noise in a variety of spectral CT imaging systems without losing spatial resolution or CT number accuracy. This method improves the flexibility to select energy bins in the manner that optimizes material identification and separation without paying the penalty of increased image noise or its corollary, increased patient dose.
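    The energy-domain constraint at the heart of HYPR-LR can be sketched as the composite (all-photon) image modulated by a low-pass-filtered bin-to-composite ratio, so bin contrast is preserved while noise follows the composite; the box filter and pure-Python image layout below are simplifications, not the study's implementation.

    ```python
    def box_filter(img, k=3):
        """Simple k x k box low-pass filter with edge clamping."""
        h, w = len(img), len(img[0])
        r = k // 2
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                vals = [img[yy][xx]
                        for yy in range(max(0, y - r), min(h, y + r + 1))
                        for xx in range(max(0, x - r), min(w, x + r + 1))]
                out[y][x] = sum(vals) / len(vals)
        return out

    def hypr_lr(bin_img, composite, k=3, eps=1e-12):
        """HYPR-LR sketch: noisy energy-bin image re-expressed as the
        low-noise composite times a low-pass-filtered bin/composite ratio."""
        h, w = len(bin_img), len(bin_img[0])
        fb, fc = box_filter(bin_img, k), box_filter(composite, k)
        return [[composite[y][x] * fb[y][x] / (fc[y][x] + eps)
                 for x in range(w)] for y in range(h)]
    ```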

  14. MRUniNovo: an efficient tool for de novo peptide sequencing utilizing the hadoop distributed computing framework.

    PubMed

    Li, Chuang; Chen, Tao; He, Qiang; Zhu, Yunping; Li, Kenli

    2017-03-15

    Tandem mass spectrometry-based de novo peptide sequencing is a complex and time-consuming process. Current algorithms for de novo peptide sequencing cannot rapidly and thoroughly process large mass spectrometry datasets. In this paper, we propose MRUniNovo, a novel tool for parallel de novo peptide sequencing. MRUniNovo parallelizes UniNovo on the Hadoop compute platform. Our experimental results demonstrate that MRUniNovo significantly reduces the computation time of de novo peptide sequencing without sacrificing the correctness and accuracy of the results, and thus can process very large datasets that UniNovo cannot. MRUniNovo is an open-source software tool implemented in Java. The source code and the parameter settings are available at http://bioinfo.hupo.org.cn/MRUniNovo/index.php. Contact: s131020002@hnu.edu.cn; taochen1019@163.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  15. Improved workflow for quantification of left ventricular volumes and mass using free-breathing motion corrected cine imaging.

    PubMed

    Cross, Russell; Olivieri, Laura; O'Brien, Kendall; Kellman, Peter; Xue, Hui; Hansen, Michael

    2016-02-25

    Traditional cine imaging for cardiac functional assessment requires breath-holding, which can be problematic in some situations. Free-breathing techniques have relied on multiple averages or real-time imaging, producing images that can be spatially and/or temporally blurred. To overcome this, methods have been developed to acquire real-time images over multiple cardiac cycles, which are subsequently motion corrected and reformatted to yield a single image series displaying one cardiac cycle with high temporal and spatial resolution. Application of these algorithms has required significant additional reconstruction time. The use of distributed computing was recently proposed as a way to improve clinical workflow with such algorithms. In this study, we have deployed a distributed computing version of motion corrected re-binning reconstruction for free-breathing evaluation of cardiac function. Twenty-five patients and 25 volunteers underwent cardiovascular magnetic resonance (CMR) for evaluation of left ventricular end-systolic volume (ESV), end-diastolic volume (EDV), and end-diastolic mass. Measurements using motion corrected re-binning were compared to those using breath-held SSFP and to free-breathing SSFP with multiple averages, and were performed by two independent observers. Pearson correlation coefficients and Bland-Altman plots tested agreement across techniques. Concordance correlation coefficient and Bland-Altman analysis tested inter-observer variability. Total scan plus reconstruction times were tested for significant differences using paired t-test. Measured volumes and mass obtained by motion corrected re-binning and by averaged free-breathing SSFP compared favorably to those obtained by breath-held SSFP (r = 0.9863/0.9813 for EDV, 0.9550/0.9685 for ESV, 0.9952/0.9771 for mass). Inter-observer variability was good with concordance correlation coefficients between observers across all acquisition types suggesting substantial agreement.
Both motion corrected re-binning and averaged free-breathing SSFP acquisition and reconstruction times were shorter than those of breath-held SSFP (p < 0.0001). On average, motion corrected re-binning required 3 min less than breath-held SSFP imaging, a 37% reduction in acquisition and reconstruction time. The motion corrected re-binning technique provides robust cardiac images for quantification that compare favorably to breath-held SSFP as well as multiple-average free-breathing SSFP, but are obtained in a fraction of the time when using cloud-based distributed computing reconstruction.
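The statistical comparison described above, Pearson correlation for agreement across techniques plus Bland-Altman bias and 95% limits of agreement, can be sketched as follows; function and variable names are illustrative:

```python
import numpy as np

def agreement_stats(a, b):
    """Pearson r plus Bland-Altman bias and 95% limits of agreement for
    paired measurements of the same quantity by two techniques
    (e.g. EDV by motion corrected re-binning vs. breath-held SSFP)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    r = np.corrcoef(a, b)[0, 1]
    diff = a - b
    bias = diff.mean()                    # mean difference between techniques
    half_width = 1.96 * diff.std(ddof=1)  # 95% limits assume ~normal differences
    return r, bias, (bias - half_width, bias + half_width)
```

A high r with a small bias and narrow limits of agreement is what "compared favorably" summarizes in the abstract.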

  16. Fully Dynamic Bin Packing

    NASA Astrophysics Data System (ADS)

    Ivković, Zoran; Lloyd, Errol L.

    Classic bin packing seeks to pack a given set of items of possibly varying sizes into a minimum number of identically sized bins. A number of approximation algorithms have been proposed for this NP-hard problem for both the on-line and off-line cases. In this chapter we discuss fully dynamic bin packing, where items may arrive (Insert) and depart (Delete) dynamically. In accordance with standard practice for fully dynamic algorithms, it is assumed that the packing may be arbitrarily rearranged to accommodate arriving and departing items. The goal is to maintain an approximately optimal solution of provably high quality in a total amount of time comparable to that used by an off-line algorithm delivering a solution of the same quality.
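The Insert/Delete interface can be illustrated with a toy packer that simply repacks the current item set with First Fit Decreasing whenever a packing is requested. This sketches the interface only; it is not the chapter's provably competitive fully dynamic algorithm, which rearranges the existing packing incrementally rather than repacking from scratch:

```python
class DynamicBinPacker:
    """Toy fully dynamic bin packing: items with sizes in (0, 1] arrive
    and depart; bins have unit capacity."""

    def __init__(self):
        self.items = {}  # item id -> size

    def insert(self, item_id, size):
        self.items[item_id] = size

    def delete(self, item_id):
        del self.items[item_id]

    def pack(self):
        """Repack everything with First Fit Decreasing; return bins as
        lists of item ids."""
        bins = []  # each entry: [remaining capacity, [item ids]]
        for iid, size in sorted(self.items.items(), key=lambda kv: -kv[1]):
            for b in bins:
                if b[0] >= size:  # first existing bin the item fits into
                    b[0] -= size
                    b[1].append(iid)
                    break
            else:                 # no bin fits: open a new one
                bins.append([1.0 - size, [iid]])
        return [b[1] for b in bins]
```

Repacking from scratch costs time proportional to the whole item set per operation; the point of the fully dynamic setting is to achieve comparable packing quality with far less work per Insert or Delete.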

  17. TH-A-18C-03: Noise Correlation in CBCT Projection Data and Its Application for Noise Reduction in Low-Dose CBCT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, H; Huang, J; Ma, J

    2014-06-15

    Purpose: To study the noise correlation properties of cone-beam CT (CBCT) projection data and to incorporate the noise correlation information to a statistics-based projection restoration algorithm for noise reduction in low-dose CBCT. Methods: In this study, we systematically investigated the noise correlation properties among detector bins of CBCT projection data by analyzing repeated projection measurements. The measurements were performed on a TrueBeam on-board CBCT imaging system with a 4030CB flat panel detector. An anthropomorphic male pelvis phantom was used to acquire 500 repeated projection data at six different dose levels from 0.1 mAs to 1.6 mAs per projection at three fixed angles. To minimize the influence of the lag effect, lag correction was performed on the consecutively acquired projection data. The noise correlation coefficient between detector bin pairs was calculated from the corrected projection data. The noise correlation among CBCT projection data was then incorporated into the covariance matrix of the penalized weighted least-squares (PWLS) criterion for noise reduction of low-dose CBCT. Results: The analyses of the repeated measurements show that noise correlation coefficients are non-zero between the nearest neighboring bins of CBCT projection data. The average noise correlation coefficients for the first- and second-order neighbors are about 0.20 and 0.06, respectively. The noise correlation coefficients are independent of the dose level. Reconstruction of the pelvis phantom shows that the PWLS criterion with consideration of noise correlation (PWLS-Cor) results in a lower noise level as compared to the PWLS criterion without considering the noise correlation (PWLS-Dia) at the matched resolution. Conclusion: Noise is correlated among nearest neighboring detector bins of CBCT projection data.
An accurate noise model of CBCT projection data can improve the performance of the statistics-based projection restoration algorithm for low-dose CBCT.
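The estimation step, computing correlation coefficients between neighboring detector bins from the zero-mean residuals of repeated acquisitions, can be sketched as below. The pooled estimator and the array layout are illustrative assumptions, not the authors' code:

```python
import numpy as np

def neighbor_correlation(repeats, order=1):
    """Pooled noise correlation coefficient between each detector bin
    and its neighbor `order` bins away.

    repeats: array (n_repeats, n_bins) holding the same projection
    acquired many times (the study above used 500 repeats)."""
    noise = repeats - repeats.mean(axis=0, keepdims=True)  # zero-mean residuals
    a, b = noise[:, :-order], noise[:, order:]             # neighboring bin pairs
    return (a * b).mean() / (a.std() * b.std())
```

In the synthetic check below, adjacent bins share half their noise variance, so the true first-order coefficient is 0.5 and the second-order coefficient is 0, mirroring the qualitative pattern (nearest-neighbor correlation decaying with distance) reported in the abstract.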

  18. Landscape-scale soil moisture heterogeneity and its influence on surface fluxes at the Jornada LTER site: Evaluating a new model parameterization for subgrid-scale soil moisture variability

    NASA Astrophysics Data System (ADS)

    Baker, I. T.; Prihodko, L.; Vivoni, E. R.; Denning, A. S.

    2017-12-01

    Arid and semiarid regions represent a large fraction of global land, with attendant importance of surface energy and trace gas flux to global totals. These regions are characterized by strong seasonality, especially in precipitation, that defines the level of ecosystem stress. Individual plants have been observed to respond non-linearly to increasing soil moisture stress, where plant function is generally maintained as soils dry down to a threshold at which rapid closure of stomates occurs. Incorporating this nonlinear mechanism into landscape-scale models can result in unrealistic binary "on-off" behavior that is especially problematic in arid landscapes. Consequently, models have `relaxed' their simulation of soil moisture stress on evapotranspiration (ET). Unfortunately, these relaxations are not physically based, but are imposed upon model physics as a means to force a more realistic response. Previously, we have introduced a new method to represent soil moisture regulation of ET, whereby the landscape is partitioned into `BINS' of soil moisture wetness, each associated with a fractional area of the landscape or grid cell. A physically- and observationally-based nonlinear soil moisture stress function is applied, but when convolved with the relative area distribution represented by the wetness BINS, the system has the emergent property of `smoothing' the landscape-scale response without the need for non-physical impositions on model physics. In this research we confront BINS simulations of Bowen ratio, soil moisture variability and trace gas flux with soil moisture and eddy covariance observations taken at the Jornada LTER dryland site in southern New Mexico. We calculate the mean annual wetting cycle and associated variability about the mean state and evaluate model performance against this variability and time series of land surface fluxes from the highly instrumented Tromble Weir watershed.
The BINS simulations capture the relatively rapid reaction to wetting events and more prolonged response to drying cycles, as opposed to binary behavior in the control.
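The emergent smoothing can be demonstrated with a toy version of the scheme: an abrupt point-scale stress function convolved with a distribution of wetness bins across the grid cell. The threshold value, uniform spread, and equal-area bins are illustrative assumptions, not the parameterization's actual choices:

```python
import numpy as np

def point_stress(w, w_crit=0.2):
    """Nonlinear point-scale stress: full plant function above a
    wetness threshold, abrupt stomatal closure below it."""
    return np.where(w >= w_crit, 1.0, 0.0)

def landscape_stress(mean_w, spread=0.15, n_bins=20):
    """Grid-cell stress: area-weighted sum of the point response over a
    uniform distribution of wetness bins around the cell mean."""
    edges = np.linspace(mean_w - spread, mean_w + spread, n_bins)
    frac = np.full(n_bins, 1.0 / n_bins)  # equal-area bins (toy choice)
    return float(np.sum(frac * point_stress(np.clip(edges, 0.0, 1.0))))
```

At the point scale the response jumps from 0 to 1 at the threshold, but the area-weighted sum rises gradually as the cell-mean wetness crosses it, which is the `smoothing' emergent property described above.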

  19. DOA-informed source extraction in the presence of competing talkers and background noise

    NASA Astrophysics Data System (ADS)

    Taseska, Maja; Habets, Emanuël A. P.

    2017-12-01

    A desired speech signal in hands-free communication systems is often degraded by noise and interfering speech. Even though the number and locations of the interferers are often unknown in practice, it is justified to assume in certain applications that the direction-of-arrival (DOA) of the desired source is approximately known. Using the known DOA, fixed spatial filters such as the delay-and-sum beamformer can be steered to extract the desired source. However, it is well-known that fixed data-independent spatial filters do not provide sufficient reduction of directional interferers. Instead, the DOA information can be used to estimate the statistics of the desired and the undesired signals and to compute optimal data-dependent spatial filters. One way the DOA is exploited for optimal spatial filtering in the literature is by designing DOA-based narrowband detectors to determine whether a desired or an undesired signal is dominant at each time-frequency (TF) bin. Subsequently, the statistics of the desired and the undesired signals can be estimated during the TF bins where the respective signal is dominant. In a similar manner, a Gaussian signal model-based detector which does not incorporate DOA information has been used in scenarios where the undesired signal consists of stationary background noise. However, when the undesired signal is non-stationary, resulting for example from interfering speakers, such a Gaussian signal model-based detector is unable to robustly distinguish desired from undesired speech. To this end, we propose a DOA model-based detector to determine the dominant source at each TF bin and estimate the desired and undesired signal statistics. We demonstrate that data-dependent spatial filters that use the statistics estimated by the proposed framework achieve very good undesired signal reduction, even when using only three microphones.
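A minimal sketch of such a DOA-model-based detector for a two-microphone array: each TF bin is labeled desired-dominant when the observed inter-microphone phase difference lies close to the phase a plane wave from the known desired DOA would produce. The free-field geometry, microphone spacing, and tolerance are illustrative assumptions, not the authors' detector:

```python
import numpy as np

C = 343.0  # speed of sound, m/s

def expected_phase(freqs, doa_deg, mic_dist=0.05):
    """Inter-microphone phase difference a plane wave from `doa_deg`
    would produce at each frequency (two-mic free-field model)."""
    delay = mic_dist * np.cos(np.deg2rad(doa_deg)) / C
    return 2.0 * np.pi * freqs * delay

def desired_dominant(stft_a, stft_b, freqs, doa_deg, tol=0.5):
    """Per-TF-bin detector: True where the observed phase difference is
    within `tol` radians of the phase expected for the desired DOA."""
    observed = np.angle(stft_a * np.conj(stft_b))       # (freq, time)
    expected = expected_phase(freqs, doa_deg)[:, None]  # broadcast over time
    dev = np.angle(np.exp(1j * (observed - expected)))  # wrapped deviation
    return np.abs(dev) < tol
```

The statistics of desired and undesired signals would then be accumulated over the TF bins where the respective label holds, and fed to a data-dependent spatial filter.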

  20. Dereplication, Aggregation and Scoring Tool (DAS Tool) v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sieber, Christian

    Communities of uncultivated microbes are critical to ecosystem function and microorganism health, and a key objective of metagenomic studies is to analyze organism-specific metabolic pathways and reconstruct community interaction networks. This requires accurate assignment of genes to genomes, yet existing binning methods often fail to predict a reasonable number of genomes and report many bins of low quality and completeness. Furthermore, the performance of existing algorithms varies between samples and biotopes. Here, we present a dereplication, aggregation and scoring strategy, DAS Tool, that combines the strengths of a flexible set of established binning algorithms. DAS Tool applied to a constructed community generated more accurate bins than any automated method. Further, when applied to samples of different complexity, including soil, natural oil seeps, and the human gut, DAS Tool recovered substantially more near-complete genomes than any single binning method alone. Included were three genomes from a novel lineage. The ability to reconstruct many near-complete genomes from metagenomics data will greatly advance genome-centric analyses of ecosystems.
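The scoring-and-selection idea, rewarding single-copy-gene (SCG) completeness while penalizing duplicated SCGs, can be sketched as below. The score form, weight, and SCG count are hypothetical simplifications; DAS Tool's actual scoring function and its resolution of contigs shared between candidate bins are more involved:

```python
def bin_score(scg_counts, n_scg=51, contamination_weight=0.5):
    """Hypothetical simplified bin quality score in the spirit of DAS
    Tool: completeness minus weighted contamination.

    scg_counts: dict mapping SCG name -> copies found in the bin."""
    present = sum(1 for c in scg_counts.values() if c >= 1)
    duplicated = sum(c - 1 for c in scg_counts.values() if c > 1)
    completeness = present / n_scg
    contamination = duplicated / n_scg
    return completeness - contamination_weight * contamination

def dereplicate(candidate_bins, n_scg=51):
    """Greedy selection: rank candidate bins from all binning tools by
    score, best first (shared-contig resolution omitted here)."""
    return sorted(candidate_bins, key=lambda b: bin_score(b, n_scg), reverse=True)
```

Aggregating candidates from several binners and keeping only the best-scoring, non-redundant bins is what lets the strategy outperform any single binning method alone.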
