Protocols for efficient simulations of long-time protein dynamics using coarse-grained CABS model.
Jamroz, Michal; Kolinski, Andrzej; Kmiecik, Sebastian
2014-01-01
Coarse-grained (CG) modeling is a well-established simulation approach for gaining insight into long-timescale protein folding events at reasonable computational cost. Depending on the design of a CG model, simulation protocols range from highly case-specific approaches, which require user-defined assumptions about the folding scenario, to more sophisticated blind prediction methods for which only a protein sequence is required. Here we describe a framework protocol for simulating the long-term dynamics of globular proteins using the CABS CG protein model and sequence data. The simulations can start from a random or a selected (e.g., native) structure. The described protocol has been validated against experimental data for protein folding model systems, and the predictions agreed well with the experimental results.
Aigrain, Louise; Gu, Yong; Quail, Michael A
2016-06-13
The emergence of next-generation sequencing (NGS) technologies over the past decade has democratized DNA sequencing, both in terms of price per sequenced base and ease of producing DNA libraries. When preparing DNA sequencing libraries for Illumina, the current market leader, a plethora of kits are available, and it can be difficult for users to determine which kit is the most appropriate and efficient for their application; the main concerns are not only cost but also minimal bias, yield, and time efficiency. We compared 9 commercially available library preparation kits in a systematic manner using the same DNA sample, probing the amount of DNA remaining after each protocol step with a new droplet digital PCR (ddPCR) assay. This method allows precise quantification of fragments bearing either adaptors or P5/P7 sequences on both ends immediately after ligation or PCR enrichment. We also investigated the potential influence of DNA input and DNA fragment size on final library preparation efficiency. Overall library preparation efficiency varied considerably between kits, with those combining several steps into one exhibiting final yields 4 to 7 times higher than the others. Detailed ddPCR data also reveal that the adaptor ligation yield itself varies by more than a factor of 10 between kits, with some ligation efficiencies so low that they could impair the original library complexity and impoverish the sequencing results. When a PCR enrichment step is necessary, lower adaptor-ligated DNA inputs lead to greater amplification yields, hiding the latent disparity between kits. We describe a ddPCR assay that allows us to probe the efficiency of the most critical step in library preparation, ligation, and to draw conclusions about which kits are most likely to preserve sample heterogeneity and reduce the need for amplification.
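The step-wise yields described above reduce to ratios of ddPCR-measured concentrations. The sketch below shows how ligation efficiency and PCR fold gain might be derived under the assumption that each ddPCR assay reports fragments per microliter bearing adaptors or P5/P7 ends; all numbers are hypothetical, not the study's data.

```python
# Illustrative calculation of step efficiencies from ddPCR concentrations
# (fragments/uL). All numbers are hypothetical, not the study's data.

def step_efficiency(output_conc, input_conc, dilution_factor=1.0):
    """Fraction of input fragments carried through a protocol step."""
    return (output_conc * dilution_factor) / input_conc

input_fragments = 3.0e6    # amplifiable fragments/uL entering ligation
adaptor_ligated = 4.5e5    # adaptor-bearing fragments/uL measured by ddPCR after ligation
p5_p7_enriched  = 2.7e6    # P5/P7-bearing fragments/uL measured after PCR enrichment

ligation_eff = step_efficiency(adaptor_ligated, input_fragments)
pcr_fold_gain = p5_p7_enriched / adaptor_ligated

print(f"adaptor ligation efficiency: {ligation_eff:.1%}")
print(f"PCR enrichment fold gain:    {pcr_fold_gain:.1f}x")
```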
Jaroenram, Wansadaj; Owens, Leigh
2014-01-01
Non-infectious Penaeus stylirostris densovirus (PstDV)-related sequences in the shrimp genome cause false positive results with current PCR protocols. Here, we examined and mapped the PstDV insertion profile in the genome of Australian Penaeus monodon. A DNA sequence likely to represent infectious PstDV was also identified and used as the target for a recombinase polymerase amplification (RPA)-based approach developed to detect PstDV specifically. The RPA protocol, run at 37 °C for 30 min, showed no cross-reaction with other shrimp viruses and was 10 times more sensitive than the 309F/R PCR protocol currently recommended by the World Organization for Animal Health (OIE) for PstDV diagnosis. These features, together with the simplicity of the protocol, which requires only a heating block for the reaction, offer opportunities for rapid and efficient detection of PstDV. Copyright © 2014 Elsevier Ltd. All rights reserved.
Jansma, J Martijn; de Zwart, Jacco A; van Gelderen, Peter; Duyn, Jeff H; Drevets, Wayne C; Furey, Maura L
2013-01-01
Technical developments in MRI have improved signal-to-noise ratio, allowing the use of analysis methods such as finite impulse response (FIR) analysis of rapid event-related functional MRI (er-fMRI). FIR is one of the most informative analysis methods, as it determines the onset and full shape of the hemodynamic response function (HRF) without any a priori assumptions. FIR is, however, vulnerable to multicollinearity, which is directly related to the distribution of stimuli over time. Efficiency can be optimized by simplifying a design and restricting the stimulus distribution to specific sequences, while greater design flexibility necessarily reduces efficiency. However, the actual effect of efficiency on fMRI results has never been tested in vivo, so it is currently difficult to make an informed choice between protocol flexibility and statistical efficiency. The main goal of this study was to assign concrete fMRI signal-to-noise values to the abstract scale of FIR statistical efficiency. Ten subjects repeated a perception task with five random and m-sequence-based protocols, with varying but, according to the literature, acceptable levels of multicollinearity. Results indicated substantial differences in signal standard deviation, with the level being a function of multicollinearity; experimental protocols varied by up to 55.4% in standard deviation. The results confirm that the quality of fMRI in an FIR analysis can significantly and substantially vary with statistical efficiency. Our in vivo measurements can be used to aid in making an informed decision between freedom in protocol design and statistical efficiency. PMID:23473798
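As a rough illustration of the statistical efficiency being traded against design flexibility, the sketch below builds an FIR design matrix for an arbitrary stimulus sequence and scores it with the common 1/trace((XᵀX)⁻¹) efficiency measure. The stimulus timings, FIR window, and scan counts are invented for illustration and are not the protocols used in the study.

```python
# Sketch: score event-related designs for FIR estimation efficiency.
# Onsets and FIR length are illustrative, not the study's protocols.
import numpy as np

def fir_design_matrix(onsets, n_scans, fir_len):
    """One column per post-stimulus lag (FIR basis), plus an intercept column."""
    X = np.zeros((n_scans, fir_len))
    for t in onsets:
        for lag in range(fir_len):
            if t + lag < n_scans:
                X[t + lag, lag] = 1.0
    return np.column_stack([X, np.ones(n_scans)])

def design_efficiency(X):
    """1 / trace((X'X)^-1): collinear designs inflate the inverse and score low."""
    return 1.0 / np.trace(np.linalg.inv(X.T @ X))

rng = np.random.default_rng(0)
n_scans, fir_len, n_events = 400, 10, 40

# Jittered design: onsets drawn at random (well spread, low collinearity on average).
jittered = np.sort(rng.choice(np.arange(n_scans - fir_len), size=n_events, replace=False))
# Rapid fixed-ISI design: an event every 2 scans (heavily overlapping FIR regressors).
rapid = np.arange(5, 5 + 2 * n_events, 2)

print("jittered design efficiency:", design_efficiency(fir_design_matrix(jittered, n_scans, fir_len)))
print("rapid fixed-ISI efficiency:", design_efficiency(fir_design_matrix(rapid, n_scans, fir_len)))
```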
Defrance, Matthieu; Janky, Rekin's; Sand, Olivier; van Helden, Jacques
2008-01-01
This protocol explains how to discover functional signals in genomic sequences by detecting over- or under-represented oligonucleotides (words) or spaced pairs thereof (dyads) with the Regulatory Sequence Analysis Tools (http://rsat.ulb.ac.be/rsat/). Two typical applications are presented: (i) predicting transcription factor-binding motifs in promoters of coregulated genes and (ii) discovering phylogenetic footprints in promoters of orthologous genes. The steps of this protocol include purging genomic sequences to discard redundant fragments, discovering over-represented patterns and assembling them to obtain degenerate motifs, scanning sequences and drawing feature maps. The main strength of the method is its statistical ground: the binomial significance provides an efficient control on the rate of false positives. In contrast with optimization-based pattern discovery algorithms, the method supports the detection of under- as well as over-represented motifs. Computation times vary from seconds (gene clusters) to minutes (whole genomes). The execution of the whole protocol should take approximately 1 h.
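The binomial significance at the heart of the method can be reproduced in a few lines: given a word's expected prior probability per scanned position and the number of positions scanned, the P-value of observing at least k occurrences is the binomial tail. The sketch below uses invented counts and a crude Bonferroni-like correction for the number of possible words; it is only an illustration, not the RSAT implementation.

```python
# Illustrative binomial over-representation test for a word (not RSAT's code).
from math import log10
from scipy.stats import binom

def word_overrepresentation_pvalue(observed, n_positions, prior_prob):
    """P(X >= observed) for X ~ Binomial(n_positions, prior_prob)."""
    return binom.sf(observed - 1, n_positions, prior_prob)

# Hypothetical example: a hexanucleotide seen 30 times in 20 promoters of 800 bp,
# with a background probability of 2.5e-4 per scanned position.
n_positions = 20 * (800 - 6 + 1) * 2          # positions on both strands
pval = word_overrepresentation_pvalue(30, n_positions, 2.5e-4)
significance = -log10(pval * 4 ** 6)          # correct for the 4^6 possible hexanucleotides

print(f"P-value = {pval:.2e}, significance ≈ {significance:.2f}")
```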
Nakato, Ryuichiro; Itoh, Tahehiko; Shirahige, Katsuhiko
2013-07-01
Chromatin immunoprecipitation with high-throughput sequencing (ChIP-seq) can identify genomic regions that bind proteins involved in various chromosomal functions. Although the development of next-generation sequencers offers the technology needed to identify these protein-binding sites, the analysis can be computationally challenging because sequencing data sometimes consist of >100 million reads/sample. Herein, we describe a cost-effective and time-efficient protocol that is generally applicable to ChIP-seq analysis; this protocol uses a novel peak-calling program termed DROMPA to identify peaks and an additional program, parse2wig, to preprocess read-map files. This two-step procedure drastically reduces computational time and memory requirements compared with other programs. DROMPA enables the identification of protein localization sites in repetitive sequences and efficiently identifies both broad and sharp protein localization peaks. Specifically, DROMPA outputs a protein-binding profile map in pdf or png format, which can be easily manipulated by users who have a limited background in bioinformatics. © 2013 The Authors Genes to Cells © 2013 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.
Musculoskeletal MRI at 3.0 T and 7.0 T: a comparison of relaxation times and image contrast.
Jordan, Caroline D; Saranathan, Manojkumar; Bangerter, Neal K; Hargreaves, Brian A; Gold, Garry E
2013-05-01
The purpose of this study was to measure and compare the relaxation times of musculoskeletal tissues at 3.0 T and 7.0 T, and to use these measurements to select appropriate parameters for musculoskeletal protocols at 7.0 T. We measured the T₁ and T₂ relaxation times of cartilage, muscle, synovial fluid, bone marrow and subcutaneous fat at both 3.0 T and 7.0 T in the knees of five healthy volunteers. The T₁ relaxation times were measured using a spin-echo inversion recovery sequence with six inversion times. The T₂ relaxation times were measured using a spin-echo sequence with seven echo times. The accuracy of both the T₁ and T₂ measurement techniques was verified in phantoms at both magnetic field strengths. We used the measured relaxation times to help design 7.0 T musculoskeletal protocols that preserve the favorable contrast characteristics of our 3.0 T protocols, while achieving significantly higher resolution at higher SNR efficiency. The T₁ relaxation times in all tissues at 7.0 T were consistently higher than those measured at 3.0 T, while the T₂ relaxation times at 7.0 T were consistently lower than those measured at 3.0 T. The measured relaxation times were used to help develop high resolution 7.0 T protocols that had similar fluid-to-cartilage contrast to that of the standard clinical 3.0 T protocols for the following sequences: proton-density-weighted fast spin-echo (FSE), T₂-weighted FSE, and 3D-FSE-Cube. The T₁ and T₂ changes were within the expected ranges. Parameters for musculoskeletal protocols at 7.0 T can be optimized based on these values, yielding improved resolution in musculoskeletal imaging with similar contrast to that of standard 3.0 T clinical protocols. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Allegra, Carmen J.
2015-01-01
During the past decade, biomedical technologies have undergone an explosive evolution: from the publication of the first complete human genome in 2003, after more than a decade of effort and at a cost of hundreds of millions of dollars, to the present, when a complete genomic sequence can be obtained in less than a day and at a small fraction of the original cost. The widespread availability of next-generation genomic sequencing has opened the door to the development of precision oncology. The need to test multiple new targeted agents, both alone and in combination with other targeted therapies as well as classic cytotoxic agents, demands the development of novel therapeutic platforms (particularly Master Protocols) capable of efficiently and effectively testing multiple targeted agents or targeted therapeutic strategies in relatively small patient subpopulations. Here, we describe the Master Protocol concept, with a focus on the expected gains and complexities of this design. An overview of Master Protocols currently active or in development is provided, along with a more extensive discussion of the Lung Master Protocol (Lung-MAP study). PMID:26433553
Stirnberg, Rüdiger; Huijbers, Willem; Brenner, Daniel; Poser, Benedikt A; Breteler, Monique; Stöcker, Tony
2017-12-01
State-of-the-art simultaneous-multi-slice (SMS-)EPI and 3D-EPI share several properties that benefit functional MRI acquisition. Both sequences employ equivalent parallel imaging undersampling with controlled aliasing to achieve high temporal sampling rates. As a volumetric imaging sequence, 3D-EPI offers additional means of acceleration complementary to 2D-CAIPIRINHA sampling, such as fast water excitation and elliptical sampling. We performed an application-oriented comparison between a tailored, six-fold CAIPIRINHA-accelerated 3D-EPI protocol at 530 ms temporal and 2.4 mm isotropic spatial resolution and an SMS-EPI protocol with identical spatial and temporal resolution for whole-brain resting-state fMRI at 3 T. The latter required eight-fold slice acceleration to compensate for the lack of elliptical sampling and fast water excitation. Both sequences used vendor-supplied on-line image reconstruction. We acquired test/retest resting-state fMRI scans in ten volunteers, with simultaneous acquisition of cardiac and respiration data, subsequently used for optional physiological noise removal (nuisance regression). We found that the 3D-EPI protocol has significantly increased temporal signal-to-noise ratio throughout the brain as compared to the SMS-EPI protocol, especially when employing motion and nuisance regression. Both sequence types reliably identified known functional networks with stronger functional connectivity values for the 3D-EPI protocol. We conclude that the more time-efficient 3D-EPI primarily benefits from reduced parallel imaging noise due to a higher, actual k-space sampling density compared to SMS-EPI. The resultant BOLD sensitivity increase makes 3D-EPI a valuable alternative to SMS-EPI for whole-brain fMRI at 3 T, with voxel sizes well below 3 mm isotropic and sampling rates high enough to separate dominant cardiac signals from BOLD signals in the frequency domain. Copyright © 2017 Elsevier Inc. All rights reserved.
Oliveira, R R; Viana, A J C; Reátegui, A C E; Vincentz, M G A
2015-12-29
Determination of gene expression is an important tool to study biological processes and relies on the quality of the extracted RNA. Changes in gene expression profiles may be directly related to mutations in regulatory DNA sequences or alterations in DNA cytosine methylation, which is an epigenetic mark. Correlation of gene expression with DNA sequence or epigenetic mark polymorphism is often desirable; for this, a robust protocol to isolate high-quality RNA and DNA simultaneously from the same sample is required. Although commercial kits and protocols are available, they are mainly optimized for animal tissues and, in general, restricted to either RNA or DNA extraction, not both. In the present study, we describe an efficient and accessible method to extract both RNA and DNA simultaneously from the same sample of various plant tissues, using small amounts of starting material. The protocol was efficient in the extraction of high-quality nucleic acids from several Arabidopsis thaliana tissues (e.g., leaf, inflorescence stem, flower, fruit, cotyledon, seedlings, root, and embryo) and from other tissues of non-model plants, such as Avicennia schaueriana (Acanthaceae), Theobroma cacao (Malvaceae), Paspalum notatum (Poaceae), and Sorghum bicolor (Poaceae). The obtained nucleic acids were used as templates for downstream analyses, such as mRNA sequencing, quantitative real-time polymerase chain reaction, bisulfite treatment, and others; the results were comparable to those obtained with commercial kits. We believe that this protocol could be applied to a broad range of plant species, help avoid technical and sampling biases, and facilitate several RNA- and DNA-dependent analyses.
Droplet-based pyrosequencing using digital microfluidics.
Boles, Deborah J; Benton, Jonathan L; Siew, Germaine J; Levy, Miriam H; Thwar, Prasanna K; Sandahl, Melissa A; Rouse, Jeremy L; Perkins, Lisa C; Sudarsan, Arjun P; Jalili, Roxana; Pamula, Vamsee K; Srinivasan, Vijay; Fair, Richard B; Griffin, Peter B; Eckhardt, Allen E; Pollack, Michael G
2011-11-15
The feasibility of implementing pyrosequencing chemistry within droplets using electrowetting-based digital microfluidics is reported. An array of electrodes patterned on a printed-circuit board was used to control the formation, transportation, merging, mixing, and splitting of submicroliter-sized droplets contained within an oil-filled chamber. A three-enzyme pyrosequencing protocol was implemented in which individual droplets contained enzymes, deoxyribonucleotide triphosphates (dNTPs), and DNA templates. The DNA templates were anchored to magnetic beads which enabled them to be thoroughly washed between nucleotide additions. Reagents and protocols were optimized to maximize signal over background, linearity of response, cycle efficiency, and wash efficiency. As an initial demonstration of feasibility, a portion of a 229 bp Candida parapsilosis template was sequenced using both a de novo protocol and a resequencing protocol. The resequencing protocol generated over 60 bp of sequence with 100% sequence accuracy based on raw pyrogram levels. Excellent linearity was observed for all of the homopolymers (two, three, or four nucleotides) contained in the C. parapsilosis sequence. With improvements in microfluidic design it is expected that longer reads, higher throughput, and improved process integration (i.e., "sample-to-sequence" capability) could eventually be achieved using this low-cost platform.
Pre-capture multiplexing improves efficiency and cost-effectiveness of targeted genomic enrichment.
Shearer, A Eliot; Hildebrand, Michael S; Ravi, Harini; Joshi, Swati; Guiffre, Angelica C; Novak, Barbara; Happe, Scott; LeProust, Emily M; Smith, Richard J H
2012-11-14
Targeted genomic enrichment (TGE) is a widely used method for isolating and enriching specific genomic regions prior to massively parallel sequencing. To make effective use of sequencer output, barcoding and sample pooling (multiplexing) after TGE and prior to sequencing (post-capture multiplexing) has become routine. While previous reports have indicated that multiplexing prior to capture (pre-capture multiplexing) is feasible, no thorough examination of the effect of this method has been completed on a large number of samples. Here we compare standard post-capture TGE to two levels of pre-capture multiplexing: 12 or 16 samples per pool. We evaluated these methods using standard TGE metrics and determined the ability to identify several classes of genetic mutations in three sets of 96 samples, including 48 controls. Our overall goal was to maximize cost reduction and minimize experimental time while maintaining a high percentage of reads on target and a high depth of coverage at thresholds required for variant detection. We adapted the standard post-capture TGE method for pre-capture TGE with several protocol modifications, including redesign of blocking oligonucleotides and optimization of enzymatic and amplification steps. Pre-capture multiplexing reduced costs for TGE by at least 38% and significantly reduced hands-on time during the TGE protocol. We found that pre-capture multiplexing reduced capture efficiency by 23% or 31% for pre-capture pools of 12 and 16, respectively. However, efficiency losses at this step can be compensated for by reducing the number of simultaneously sequenced samples. Pre-capture multiplexing and post-capture TGE performed similarly with respect to variant detection of positive control mutations. In addition, we detected no instances of sample switching due to aberrant barcode identification. Pre-capture multiplexing improves the efficiency of TGE experiments with respect to hands-on time and reagent use compared to standard post-capture TGE. A decrease in capture efficiency is observed when using pre-capture multiplexing; however, it does not negatively impact variant detection and can be accommodated by the experimental design.
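The compensation argument, that lower capture efficiency can be offset by sequencing fewer samples per lane, reduces to simple arithmetic on per-sample on-target reads. The sketch below uses a hypothetical lane yield, hypothetical sample counts, and a hypothetical baseline on-target fraction, combined with the 23% efficiency drop quoted above for pools of 12.

```python
# Back-of-the-envelope check (hypothetical lane yield and sample counts) that
# sequencing fewer samples per lane can offset a capture-efficiency drop.

def on_target_reads_per_sample(lane_reads, samples_per_lane, on_target_fraction):
    return lane_reads * on_target_fraction / samples_per_lane

lane_reads = 150e6                         # hypothetical reads per lane
baseline_on_target = 0.70                  # hypothetical post-capture on-target fraction

post_capture  = on_target_reads_per_sample(lane_reads, 96, baseline_on_target)
pre_capture12 = on_target_reads_per_sample(lane_reads, 80, baseline_on_target * (1 - 0.23))

print(f"post-capture multiplexing : {post_capture / 1e6:.2f} M on-target reads/sample")
print(f"pre-capture pools of 12   : {pre_capture12 / 1e6:.2f} M on-target reads/sample")
```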
Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.
Hawkins, Steve F C; Guest, Paul C
2018-01-01
The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial for obtaining good data on the sequencing platforms developed by the current market leader, Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved with a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
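qPCR-based library quantification is, at its core, a standard-curve calculation: Cq values of serial standard dilutions are fit to a line against log10 concentration, the unknown library's Cq is interpolated, and the result is corrected for dilution. The sketch below is a generic illustration with invented standard concentrations, Cq values, and dilution factor; it is not this protocol's exact worksheet.

```python
# Generic qPCR standard-curve quantification of an NGS library (invented values).
import numpy as np

std_conc_pm = np.array([20.0, 2.0, 0.2, 0.02])    # standards, pM
std_cq      = np.array([12.1, 15.5, 18.8, 22.2])  # measured Cq of the standards

# Linear fit: Cq = slope * log10(conc) + intercept
slope, intercept = np.polyfit(np.log10(std_conc_pm), std_cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0            # amplification efficiency check

library_cq = 16.9
dilution   = 10_000                                # library measured at a 1:10,000 dilution
conc_pm = 10 ** ((library_cq - intercept) / slope) * dilution

print(f"PCR efficiency ≈ {efficiency:.1%}")
print(f"undiluted library ≈ {conc_pm / 1000:.1f} nM")
```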
Efficient and universal quantum key distribution based on chaos and middleware
NASA Astrophysics Data System (ADS)
Jiang, Dong; Chen, Yuanyuan; Gu, Xuemei; Xie, Ling; Chen, Lijun
2017-01-01
Quantum key distribution (QKD) promises unconditionally secure communication; however, the low bit rate of QKD cannot meet the requirements of high-speed applications. Despite the many solutions proposed in recent years, they are neither efficient at generating secret keys nor compatible with other QKD systems. This paper, based on chaotic cryptography and middleware technology, proposes an efficient and universal QKD protocol that can be directly deployed on top of any existing QKD system without modifying the underlying QKD protocol or optical platform. It takes the bit string generated by the QKD system as input, periodically updates the chaotic system, and efficiently outputs bit sequences. Theoretical analysis and simulation results demonstrate that our protocol can efficiently increase the bit rate of the QKD system as well as securely generate bit sequences with perfect statistical properties. Compared with existing methods, our protocol is more efficient and universal; it can be rapidly deployed on a QKD system to increase the bit rate when the QKD system becomes the bottleneck of its communication system.
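To make the general idea concrete, the sketch below shows one way a chaotic map can be seeded and periodically refreshed with QKD-derived bits and then used to emit an expanded bit stream. The logistic map, the 32-bit seeding, the thresholding, and the refresh period are illustrative assumptions on our part; they are not the protocol specified in the paper, and such a toy construction carries no security claim.

```python
# Illustrative chaos-based bit expansion seeded by QKD key bits.
# Map choice, thresholding, and refresh schedule are assumptions for
# demonstration only; this is not the authors' protocol.
import secrets

def bits_to_unit_interval(bits):
    """Map a bit string to a value strictly inside (0, 1) usable as a map state."""
    value = int("".join(map(str, bits)), 2)
    return (value + 1) / (2 ** len(bits) + 2)

def chaotic_keystream(qkd_bits, out_len, refresh_period=1024, r=3.99):
    stream, x = [], bits_to_unit_interval(qkd_bits[:32])
    offset, next_refresh = 32, refresh_period
    for i in range(out_len):
        x = r * x * (1.0 - x)                       # logistic map iteration
        stream.append(1 if x > 0.5 else 0)          # threshold the state to a bit
        if i + 1 == next_refresh and offset + 32 <= len(qkd_bits):
            # periodically perturb the state with fresh QKD bits
            x = (x + bits_to_unit_interval(qkd_bits[offset:offset + 32])) % 1.0
            x = min(max(x, 1e-12), 1.0 - 1e-12)     # keep the state inside (0, 1)
            offset += 32
            next_refresh += refresh_period
    return stream

qkd_bits = [secrets.randbits(1) for _ in range(256)]   # stand-in for QKD output
keystream = chaotic_keystream(qkd_bits, 4096)
print(sum(keystream) / len(keystream))                  # should be near 0.5
```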
Palacio-Bielsa, Ana; Cubero, Jaime; Cambra, Miguel A; Collados, Raquel; Berruete, Isabel M; López, María M
2011-01-01
Xanthomonas arboricola pv. pruni, the causal agent of bacterial spot disease of stone fruit, is considered a quarantine organism by the European Union and the European and Mediterranean Plant Protection Organization (EPPO). The bacterium can undergo an epiphytic phase and/or be latent and can be transmitted by plant material, but currently, only visual inspections are used to certify plants as being X. arboricola pv. pruni free. A novel and highly sensitive real-time TaqMan PCR detection protocol was designed based on a sequence of a gene for a putative protein related to an ABC transporter ATP-binding system in X. arboricola pv. pruni. Pathogen detection can be completed within a few hours with a sensitivity of 10² CFU ml⁻¹, thus surpassing the sensitivity of the existing conventional PCR. Specificity was assessed for X. arboricola pv. pruni strains from different origins as well as for closely related Xanthomonas species, non-Xanthomonas species, saprophytic bacteria, and healthy Prunus samples. The efficiency of the developed protocol was evaluated with field samples of 14 Prunus species and rootstocks. For symptomatic leaf samples, the protocol was very efficient even when washed tissues of the leaves were directly amplified without any previous DNA extraction. For samples of 117 asymptomatic leaves and 285 buds, the protocol was more efficient after a simple DNA extraction, and X. arboricola pv. pruni was detected in 9.4% and 9.1% of the 402 samples analyzed, respectively, demonstrating its frequent epiphytic or endophytic phase. This newly developed real-time PCR protocol can be used as a quantitative assay, offers a reliable and sensitive test for X. arboricola pv. pruni, and is suitable as a screening test for symptomatic as well as asymptomatic plant material.
Ludgate, Jackie L; Wright, James; Stockwell, Peter A; Morison, Ian M; Eccles, Michael R; Chatterjee, Aniruddha
2017-08-31
Formalin-fixed paraffin-embedded (FFPE) tumor samples are a major source of patient DNA in cancer research. However, FFPE is a challenging material to work with due to macromolecular fragmentation and nucleic acid crosslinking. FFPE tissue poses particular challenges for methylation analysis and for preparing sequencing-based libraries that rely on bisulfite conversion. Successful bisulfite conversion is a key requirement for sequencing-based methylation analysis. Here we describe a complete and streamlined workflow for preparing next-generation sequencing libraries for methylation analysis from FFPE tissues. This includes counting cells from FFPE blocks and extracting DNA from FFPE slides, testing bisulfite conversion efficiency with a polymerase chain reaction (PCR)-based test, preparing reduced representation bisulfite sequencing libraries, and massively parallel sequencing. The main features and advantages of this protocol are: an optimized method for extracting good-quality DNA from FFPE tissues; an efficient bisulfite conversion and next-generation sequencing library preparation protocol that uses 50 ng of DNA from FFPE tissue; and incorporation of a PCR-based test to assess bisulfite conversion efficiency prior to sequencing. We provide a complete workflow and an integrated protocol for performing DNA methylation analysis at the genome scale, and we believe this will facilitate clinical epigenetic research involving FFPE tissue.
Identification of forensic samples by using an infrared-based automatic DNA sequencer.
Ricci, Ugo; Sani, Ilaria; Klintschar, Michael; Cerri, Nicoletta; De Ferrari, Francesco; Giovannucci Uzielli, Maria Luisa
2003-06-01
We have recently introduced a new protocol for analyzing all core loci of the Federal Bureau of Investigation's (FBI) Combined DNA Index System (CODIS) with an infrared (IR) automatic DNA sequencer (LI-COR 4200). The amplicons were labeled with forward oligonucleotide primers, covalently linked to a new infrared fluorescent molecule (IRDye 800). The alleles were displayed as familiar autoradiogram-like images with real-time detection. This protocol was employed for paternity testing, population studies, and identification of degraded forensic samples. We extensively analyzed some simulated forensic samples and mixed stains (blood, semen, saliva, bones, and fixed archival embedded tissues), comparing the results with donor samples. Sensitivity studies were also performed for the four multiplex systems. Our results show the efficiency, reliability, and accuracy of the IR system for the analysis of forensic samples. We also compared the efficiency of the multiplex protocol with ultraviolet (UV) technology. Paternity tests, undegraded DNA samples, and real forensic samples were analyzed with this approach based on IR technology and with UV-based automatic sequencers in combination with commercially available kits. The comparability of the results with the widespread UV methods suggests that it is possible to exchange data between laboratories using the same core group of markers but different primer sets and detection methods.
Single-Cell Semiconductor Sequencing
Kohn, Andrea B.; Moroz, Tatiana P.; Barnes, Jeffrey P.; Netherton, Mandy; Moroz, Leonid L.
2014-01-01
RNA-seq or transcriptome analysis of individual cells and small cell populations is essential for virtually any biomedical field. It is especially critical for developmental, aging, and cancer biology as well as neuroscience, where the enormous heterogeneity of cells presents a significant methodological and conceptual challenge. Here we present two methods that allow fast and cost-efficient transcriptome sequencing from ultra-small amounts of tissue or even from individual cells using semiconductor sequencing technology (Ion Torrent, Life Technologies). The first method is reduced representation sequencing, which maximizes RNA capture and preserves transcript directionality. The second, a template-switch protocol, is designed for small mammalian neurons. Both protocols, from cell/tissue isolation to final sequence data, take up to 4 days. The efficiency of these protocols has been validated with single hippocampal neurons and various invertebrate tissues, including individually identified neurons within a simpler memory-forming circuit of Aplysia californica and early (1-, 2-, 4-, and 8-cell) embryonic and developmental stages from basal metazoans. PMID:23929110
Davidsson, Marcus; Diaz-Fernandez, Paula; Schwich, Oliver D.; Torroba, Marcos; Wang, Gang; Björklund, Tomas
2016-01-01
Detailed characterization and mapping of oligonucleotide function in vivo is generally very time-consuming and allows only hypothesis-driven subsampling of the full sequence to be analysed. Recent advances in deep sequencing, together with highly efficient parallel oligonucleotide synthesis and cloning techniques, have, however, opened up entirely new ways to map genetic function in vivo. Here we present a novel, optimized protocol for the generation of universally applicable, barcode-labelled plasmid libraries. The libraries are designed to enable the production of viral vector preparations for assessing coding or non-coding RNA function in vivo. When generating high-diversity libraries, it is a challenge to achieve efficient cloning, unambiguous barcoding, and detailed characterization using low-cost sequencing technologies. With the presented protocol, a diversity of over 3 million uniquely barcoded adeno-associated viral (AAV) plasmids can be achieved in a single reaction, through a process achievable in any molecular biology laboratory. This approach opens up a multitude of in vivo assessments, from the evaluation of enhancer and promoter regions to the optimization of genome editing. The generated plasmid libraries are also useful for validation of sequencing clustering algorithms, and here we validate the newly presented message-passing clustering process named Starcode. PMID:27874090
Struniawski, R; Szpechcinski, A; Poplawska, B; Skronski, M; Chorostowska-Wynimko, J
2013-01-01
Dried blood spot (DBS) specimens have been successfully employed for large-scale diagnostics of α1-antitrypsin (AAT) deficiency as an easy-to-collect and easy-to-transport alternative to plasma/serum. In the present study we propose a fast, efficient, and cost-effective protocol for DNA extraction from DBS samples that provides sufficient quantity and quality of DNA and effectively eliminates natural PCR inhibitors, allowing successful AAT genotyping by real-time PCR and direct sequencing. DNA extracted from 84 DBS samples from chronic obstructive pulmonary disease patients was genotyped for AAT deficiency variants by real-time PCR. The results of DBS AAT genotyping were validated by serum IEF phenotyping and AAT concentration measurement. The proposed protocol allowed successful DNA extraction from all analyzed DBS samples. Both quantity and quality of DNA were sufficient for real-time PCR and, if necessary, for genetic sequence analysis. A 100% concordance between DBS AAT genotypes and serum phenotypes was achieved in the detection of the two major deficiency alleles, S and Z. Both assays, DBS AAT genotyping by real-time PCR and serum AAT phenotyping by IEF, identified the PI*S and PI*Z alleles in 8 of the 84 (9.5%) and 16 of the 84 (19.0%) patients, respectively. In conclusion, the proposed protocol noticeably reduces the cost and hands-on time of DBS sample preparation while providing genomic DNA of sufficient quantity and quality for real-time PCR or genetic sequence analysis. Consequently, it is ideally suited for large-scale AAT deficiency screening programs and should be the method of choice.
A simplified protocol for molecular identification of Eimeria species in field samples.
Haug, Anita; Thebo, Per; Mattsson, Jens G
2007-05-15
This study aimed to find a fast, sensitive, and efficient protocol for molecular identification of chicken Eimeria spp. in field samples. Various methods for each of the three steps of the protocol were evaluated: oocyst wall rupturing, DNA extraction, and identification of species-specific DNA sequences by PCR. We then compared and evaluated five complete protocols. Three series of oocyst suspensions with known numbers of oocysts from Eimeria mitis, Eimeria praecox, Eimeria maxima, and Eimeria tenella were prepared and ground using glass beads or a mini-pestle. DNA was extracted from ruptured oocysts using commercial systems (GeneReleaser, Qiagen Stoolkit, and Prepman) or phenol-chloroform extraction, followed by identification of species-specific ITS-1 sequences by optimised single-species PCR assays. The Stoolkit and Prepman protocols showed insufficient repeatability, and the former was also expensive and relatively time-consuming. In contrast, both the GeneReleaser and phenol-chloroform protocols were robust and sensitive, detecting fewer than 0.4 oocysts of each species per PCR. Finally, we evaluated our new protocol on 68 coccidia-positive field samples. Our data suggest that rupturing the oocysts by mini-pestle grinding and preparing the DNA with GeneReleaser, followed by optimised single-species PCR assays, constitutes a robust and sensitive procedure for identifying chicken Eimeria species in field samples. Importantly, it also requires minimal hands-on time in the pre-PCR process, carries a lower contamination risk, and involves no handling of toxic chemicals.
Kesanakurti, Prasad; Belton, Mark; Saeed, Hanaa; Rast, Heidi; Boyes, Ian; Rott, Michael
2016-10-01
The majority of plant viruses contain RNA genomes. Detection of viral RNA genomes in infected plant material by next-generation sequencing (NGS) is possible through the extraction and sequencing of total RNA, total RNA depleted of ribosomal RNA, small RNA interference (RNAi) molecules, or double-stranded RNA (dsRNA). Plants do not typically produce high-molecular-weight dsRNA; therefore the presence of dsRNA makes it an attractive target for plant virus diagnostics. The sensitivity of NGS as a diagnostic method demands an effective dsRNA protocol that is both representative of the sample and minimizes sample cross-contamination. We have developed a modified dsRNA extraction protocol that is more efficient than traditional protocols, requires less starting material, and is less prone to sample cross-contamination. This was accomplished by using bead-based homogenization of plant material in closed, disposable 50 ml tubes. To assess the quality of extraction, we also developed an internal control by designing a real-time (quantitative) PCR (qPCR) assay that targets endornaviruses present in Phaseolus vulgaris cultivar Black Turtle Soup (BTS). Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
Dentinger, Bryn T M; Margaritescu, Simona; Moncalvo, Jean-Marc
2010-07-01
We present two methods for DNA extraction from fresh and dried mushrooms that are adaptable to high-throughput sequencing initiatives, such as DNA barcoding. Our results show that these protocols yield ∼85% sequencing success from recently collected materials. Tests with both recent (<2 years) and older (>100 years) specimens reveal that older collections have low success rates and may be an inefficient resource for populating a barcode database. However, our method of extracting DNA from herbarium samples using a small amount of tissue is reliable and could be used for important historical specimens. The application of these protocols greatly reduces the time, and therefore the cost, of generating DNA sequences from mushrooms and other fungi compared with traditional extraction methods. The efficiency of these methods illustrates that standardization and streamlining of sample processing should be shifted from the laboratory to the field. © 2009 Blackwell Publishing Ltd.
Flow cytometry for enrichment and titration in massively parallel DNA sequencing
Sandberg, Julia; Ståhl, Patrik L.; Ahmadian, Afshin; Bjursell, Magnus K.; Lundeberg, Joakim
2009-01-01
Massively parallel DNA sequencing is revolutionizing genomics research throughout the life sciences. However, the reagent costs and labor requirements in current sequencing protocols are still substantial, although improvements are continuously being made. Here, we demonstrate an effective alternative to existing sample titration protocols for the Roche/454 system using Fluorescence Activated Cell Sorting (FACS) technology to determine the optimal DNA-to-bead ratio prior to large-scale sequencing. Our method, which eliminates the need for costly pilot sequencing of samples during titration, is capable of rapidly providing accurate DNA-to-bead ratios that are not biased by the quantification and sedimentation steps included in current protocols. Moreover, we demonstrate that FACS sorting can be readily used to highly enrich fractions of beads carrying template DNA, with near-total elimination of empty beads and no downstream sacrifice of DNA sequencing quality. Automated enrichment by FACS is a simple approach to obtain pure samples for bead-based sequencing systems, and offers an efficient, low-cost alternative to current enrichment protocols. PMID:19304748
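One common way to turn a readout like the FACS-measured fraction of template-positive beads into a titration number is Poisson statistics on bead loading: from the positive fraction one can back out the mean templates per bead and scale the DNA input toward a desired loading. The sketch below is a generic Poisson estimate under that assumption, with invented numbers; it is not the authors' exact calculation.

```python
# Poisson-based estimate of DNA-to-bead loading from the fraction of
# template-positive beads (generic assumption, not the study's worksheet).
from math import log

def mean_templates_per_bead(positive_fraction):
    """If loading is Poisson, P(>=1 template) = 1 - exp(-lambda)."""
    return -log(1.0 - positive_fraction)

def dna_input_scale(positive_fraction, target_lambda=0.1):
    """Factor by which to scale DNA input to reach the target templates/bead."""
    return target_lambda / mean_templates_per_bead(positive_fraction)

measured_positive = 0.25          # hypothetical: 25% of beads carry template
print(f"estimated templates/bead: {mean_templates_per_bead(measured_positive):.3f}")
print(f"suggested DNA input scale: x{dna_input_scale(measured_positive):.2f}")
```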
Sharma, Aseem; Chatterjee, Arindam; Goyal, Manu; Parsons, Matthew S; Bartel, Seth
2015-04-01
Targeting redundancy within MRI can improve its cost-effective utilization. We sought to quantify potential redundancy in our brain MRI protocols. In this retrospective review, we identified 207 consecutive adults who underwent brain MRI and reviewed their medical records to document the clinical indication, the core diagnostic information provided by MRI, and its clinical impact. Contributory imaging abnormalities constituted positive core diagnostic information, whereas absence of imaging abnormalities constituted negative core diagnostic information. The senior author selected core sequences deemed sufficient for extraction of the core diagnostic information. To validate the core sequence selection, four readers assessed the relative ease of extracting the core diagnostic information from the core sequences. Potential redundancy was calculated by comparing the average number of core sequences to the average number of sequences obtained. Scanning had been performed using 9.4±2.8 sequences over 37.3±12.3 minutes. Core diagnostic information was deemed extractable from 2.1±1.1 core sequences, with an assumed scanning time of 8.6±4.8 minutes, reflecting a potential redundancy of 74.5%±19.1%. Potential redundancy was lowest in scans obtained for treatment planning (14.9%±25.7%) and highest in scans obtained for follow-up of benign diseases (81.4%±12.6%). In 97.4% of cases, all four readers considered the core diagnostic information to be either easily extractable from the core sequences or as easy to extract as from the entire study. With only one MRI lacking clinical impact (0.48%), overutilization did not seem to contribute to potential redundancy. Substantial potential redundancy that can be targeted for more efficient scanner utilization exists in brain MRI protocols.
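The redundancy figure quoted above is essentially one minus the ratio of core to acquired sequences, averaged over scans. A minimal sketch of that per-scan calculation, as we read it and with invented example scans, is below.

```python
# Minimal sketch of the per-scan "potential redundancy" metric as we read it:
# 1 - (core sequences / sequences acquired), averaged over scans.
# Example data are invented, not the study's 207 scans.
scans = [
    {"acquired": 10, "core": 2},   # e.g., follow-up of a benign lesion
    {"acquired": 8,  "core": 5},   # e.g., treatment planning
    {"acquired": 9,  "core": 1},
]

redundancies = [1 - s["core"] / s["acquired"] for s in scans]
print([f"{r:.1%}" for r in redundancies])
print(f"mean potential redundancy: {sum(redundancies) / len(redundancies):.1%}")
```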
A review of recommendations for sequencing receptive and expressive language instruction.
Petursdottir, Anna Ingeborg; Carr, James E
2011-01-01
We review recommendations for sequencing instruction in receptive and expressive language objectives in early and intensive behavioral intervention (EIBI) programs. Several books recommend completing receptive protocols before introducing corresponding expressive protocols. However, this recommendation has little empirical support, and some evidence exists that the reverse sequence may be more efficient. Alternative recommendations include teaching receptive and expressive skills simultaneously (M. L. Sundberg & Partington, 1998) and building learning histories that lead to acquisition of receptive and expressive skills without direct instruction (Greer & Ross, 2008). Empirical support for these recommendations also is limited. Future research should assess the relative efficiency of receptive-before-expressive, expressive-before-receptive, and simultaneous training with children who have diagnoses of autism spectrum disorders. In addition, further evaluation is needed of the potential benefits of multiple-exemplar training and other variables that may influence the efficiency of receptive and expressive instruction.
Accurate, Streamlined Analysis of mRNA Translation by Sucrose Gradient Fractionation
Aboulhouda, Soufiane; Di Santo, Rachael; Therizols, Gabriel; Weinberg, David
2017-01-01
The efficiency with which proteins are produced from mRNA molecules can vary widely across transcripts, cell types, and cellular states. Methods that accurately assay the translational efficiency of mRNAs are critical to gaining a mechanistic understanding of post-transcriptional gene regulation. One way to measure translational efficiency is to determine the number of ribosomes associated with an mRNA molecule, normalized to the length of the coding sequence. The primary method for this analysis of individual mRNAs is sucrose gradient fractionation, which physically separates mRNAs based on the number of bound ribosomes. Here, we describe a streamlined protocol for accurate analysis of mRNA association with ribosomes. Compared to previous protocols, our method incorporates internal controls and improved buffer conditions that together reduce artifacts caused by non-specific mRNA–ribosome interactions. Moreover, our direct-from-fraction qRT-PCR protocol eliminates the need for RNA purification from gradient fractions, which greatly reduces the amount of hands-on time required and facilitates parallel analysis of multiple conditions or gene targets. Additionally, no phenol waste is generated during the procedure. We initially developed the protocol to investigate the translationally repressed state of the HAC1 mRNA in S. cerevisiae, but we also detail adapted procedures for mammalian cell lines and tissues. PMID:29170751
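Downstream of the fraction-by-fraction qRT-PCR, the per-transcript readout reduces to a weighted average of ribosomes bound, computed from the share of the mRNA found in each gradient fraction and normalized to coding-sequence length. The sketch below illustrates that arithmetic with invented fraction-to-ribosome assignments and abundances; it is not data from the protocol.

```python
# Illustrative ribosome-density calculation from sucrose-gradient qRT-PCR data.
# Fraction-to-ribosome assignments and abundances are invented examples.

# Relative abundance of one transcript in each gradient fraction, keyed by the
# number of ribosomes that fraction represents (0 = free mRNP).
fraction_abundance = {0: 0.10, 1: 0.15, 2: 0.25, 3: 0.30, 4: 0.20}
cds_length_nt = 900

total = sum(fraction_abundance.values())
mean_ribosomes = sum(n * a for n, a in fraction_abundance.items()) / total
density_per_kb = mean_ribosomes / (cds_length_nt / 1000)

print(f"mean ribosomes per mRNA: {mean_ribosomes:.2f}")
print(f"ribosome density: {density_per_kb:.2f} ribosomes per kb of CDS")
```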
Time Synchronization and Distribution Mechanisms for Space Networks
NASA Technical Reports Server (NTRS)
Woo, Simon S.; Gao, Jay L.; Clare, Loren P.; Mills, David L.
2011-01-01
This work discusses research on the problems of synchronizing and distributing time information between spacecraft based on the Network Time Protocol (NTP), a standard time synchronization protocol widely used in terrestrial networks. The Proximity-1 Space Link Interleaved Time Synchronization (PITS) Protocol was designed and developed for synchronizing spacecraft in proximity, defined here as separations of less than 100,000 km. A particular application is synchronization between a Mars orbiter and a rover; lunar scenarios as well as outer-planet deep-space mothership-probe missions may also apply. The spacecraft with the more accurate time information functions as a time server, and the other spacecraft functions as a time client. PITS can be easily integrated into, and adapted to, the CCSDS Proximity-1 Space Link Protocol with minor modifications. In particular, PITS can take advantage of the timestamping strategy that the underlying link layer provides for accurate time offset calculation. The PITS algorithm achieves time synchronization with eight consecutive space network time packet exchanges between two spacecraft. PITS can detect and avoid possible errors from duplicate and out-of-order packets by comparing them with the current state variables and timestamps. Further, PITS is able to detect error events and autonomously recover from unexpected events that may occur during the time synchronization and distribution process, providing an additional level of protocol protection on top of CRC or error correction codes. PITS is a lightweight and efficient protocol, eliminating the need for explicit frame sequence numbers and long buffer storage. The PITS protocol is capable of providing time synchronization and distribution services in a more general setting where multiple entities need to achieve time synchronization over a single point-to-point link.
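Because PITS builds on NTP, the core arithmetic is the familiar four-timestamp exchange: the client records its send and receive times (t1, t4) and the server stamps its receive and send times (t2, t3), from which clock offset and round-trip delay follow. The sketch below shows that standard NTP calculation; the timestamp values are invented, and the averaging over several exchanges reflects our reading of the eight-exchange description above, not the PITS specification.

```python
# Standard NTP-style clock-offset and delay calculation from timestamped
# exchanges (invented values; averaging reflects our reading of the abstract,
# not the PITS specification).

def offset_and_delay(t1, t2, t3, t4):
    """t1/t4: client send/receive times; t2/t3: server receive/send times (s)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2.0
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

exchanges = [
    (100.000, 100.400, 100.401, 100.803),
    (101.000, 101.399, 101.400, 101.801),
    (102.000, 102.401, 102.402, 102.805),
]

offsets = [offset_and_delay(*ts)[0] for ts in exchanges]
print(f"estimated clock offset: {sum(offsets) / len(offsets):+.4f} s")
```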
MRI of penile fracture: what should be a tailored protocol in emergency?
Esposito, Andrea Alessandro; Giannitto, Caterina; Muzzupappa, Claudia; Maccagnoni, Sara; Gadda, Franco; Albo, Giancarlo; Biondetti, Pietro Raimondo
2016-09-01
We conducted a review of the literature to summarize existing MRI protocols for penile trauma and to suggest a tailored protocol that reduces the cost and time of examination. A systematic search was performed in the Medline, Embase, Cochrane Library, and Cinahl databases from 1995 to 2015 to identify studies evaluating penile trauma with MRI. Studies were included if they described the MRI protocol, including at least the sequences and orthogonal planes used. We chose a systematic approach for data extraction and descriptive synthesis. Twelve articles were included in our study: 2 were case reports, 3 were clinical series, and 7 were reviews; no clinical trials were found. There is no unanimous consensus among authors. Summarizing the data, the most commonly used protocol consists of T2-weighted sequences in three orthogonal planes plus a T1-weighted sequence in one plane (either axial or sagittal), without contrast medium injection. A standard protocol is lacking. A tailored protocol that answers the diagnostic question while reducing the cost and time of examination consists of T2-weighted sequences in three orthogonal planes plus at least one T1-weighted sequence (axial or sagittal plane).
Combinatorial Pooling Enables Selective Sequencing of the Barley Gene Space
Lonardi, Stefano; Duma, Denisa; Alpert, Matthew; Cordero, Francesca; Beccuti, Marco; Bhat, Prasanna R.; Wu, Yonghui; Ciardo, Gianfranco; Alsaihati, Burair; Ma, Yaqin; Wanamaker, Steve; Resnik, Josh; Bozdag, Serdar; Luo, Ming-Cheng; Close, Timothy J.
2013-01-01
For the vast majority of species, including many economically or ecologically important organisms, progress in biological research is hampered by the lack of a reference genome sequence. Despite recent advances in sequencing technologies, several factors still limit the availability of such a critical resource. At the same time, many research groups and international consortia have already produced BAC libraries and physical maps and are now in a position to proceed with the development of whole-genome sequences organized around a physical map anchored to a genetic map. We propose a BAC-by-BAC sequencing protocol that combines combinatorial pooling design and second-generation sequencing technology to efficiently approach de novo selective genome sequencing. We show that combinatorial pooling is a cost-effective and practical alternative to exhaustive DNA barcoding when preparing sequencing libraries for hundreds or thousands of DNA samples, such as, in this case, gene-bearing minimum-tiling-path BAC clones. The novelty of the protocol hinges on the computational ability to efficiently compare hundreds of millions of short reads and assign them to the correct BAC clones (deconvolution) so that the assembly can be carried out clone by clone. Experimental results on simulated data for the rice genome show that the deconvolution is very accurate and the resulting BAC assemblies have high quality. Results on real data for a gene-rich subset of the barley genome confirm that the deconvolution is accurate and the BAC assemblies have good quality. While our method cannot provide the level of completeness achievable with a comprehensive whole-genome sequencing project, we show that it is quite successful in reconstructing the gene sequences within BACs. In the case of plants such as barley, this level of sequence knowledge is sufficient to support critical end-point objectives such as map-based cloning and marker-assisted breeding. PMID:23592960
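The deconvolution idea can be illustrated compactly: each BAC is assigned a unique signature of pools in the combinatorial design, each read is observed in some subset of pools, and a read is attributed to the BAC whose signature matches (or uniquely contains) the pools in which its sequence occurs. The toy example below uses a made-up design, hypothetical identifiers, and exact matching; it conveys the principle only and is not the paper's algorithm or its error handling.

```python
# Toy illustration of combinatorial-pool deconvolution: map each read back to
# the BAC whose pool signature matches the pools in which the read was seen.
# The pooling design and reads are invented; real designs tolerate errors.

bac_signatures = {                   # BAC clone -> set of pools it was placed in
    "BAC_A": frozenset({1, 4, 7}),
    "BAC_B": frozenset({2, 4, 8}),
    "BAC_C": frozenset({3, 5, 7}),
}
signature_to_bac = {sig: bac for bac, sig in bac_signatures.items()}

reads_seen_in_pools = {              # read id -> pools where its sequence occurred
    "read_001": {1, 4, 7},
    "read_002": {2, 4, 8},
    "read_003": {3, 5},              # one pool missing (e.g., dropout)
}

for read, pools in reads_seen_in_pools.items():
    bac = signature_to_bac.get(frozenset(pools))
    if bac is None:
        # fall back to the signature containing all observed pools, if unique
        candidates = [b for b, sig in bac_signatures.items() if pools <= sig]
        bac = candidates[0] if len(candidates) == 1 else "unassigned"
    print(read, "->", bac)
```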
A universal TaqMan-based RT-PCR protocol for cost-efficient detection of small noncoding RNA.
Jung, Ulrike; Jiang, Xiaoou; Kaufmann, Stefan H E; Patzel, Volker
2013-12-01
Several methods for the detection of RNA have been developed over time. For small RNA detection, a stem-loop reverse-primer-based protocol relying on TaqMan RT-PCR has been described. This protocol requires an individual, target-specific TaqMan probe for each RNA and is therefore highly cost-intensive for experiments with small sample sizes or large numbers of different samples. We describe a universal TaqMan probe-based protocol that can be used to detect any target sequence and demonstrate its applicability for the detection of endogenous as well as artificial eukaryotic and bacterial small RNAs. While the specific and the universal probe-based protocols showed the same sensitivity, the absolute sensitivity of detection was found to be more than 100-fold lower for both than previously reported. In subsequent experiments, we found previously unknown limitations intrinsic to the method that affect its feasibility for determining incorporation of the mature template into RISC as well as for multiplexing. Both protocols were equally specific in discriminating between correct and incorrect small RNA targets and between a mature miRNA and its unprocessed RNA precursor, indicating that the stem-loop RT primer, not the TaqMan probe, confers target specificity. The presented universal TaqMan-based RT-PCR protocol thus represents a cost-efficient method for the detection of small RNAs.
Efficient isolation method for high-quality genomic DNA from cicada exuviae.
Nguyen, Hoa Quynh; Kim, Ye Inn; Borzée, Amaël; Jang, Yikweon
2017-10-01
In recent years, animal ethics considerations have led researchers to explore nondestructive methods of obtaining material for genetic studies. Cicada exuviae are one such material, because they are the cast skins that individuals leave behind after molting and are easily collected. In this study, we aimed to identify the extraction method that yields the highest quantity and quality of DNA from cicada exuviae. We compared the relative DNA yield and purity of six extraction protocols, including both manual protocols and commercial kits, applied to four different exoskeleton parts. Furthermore, amplification and sequencing of the extracted genomic DNA were evaluated by whether usable sequence was obtained at the expected fragment size. Both the choice of protocol and the exuvia part significantly affected DNA yield and purity. Only samples extracted using the PowerSoil DNA Isolation kit generated gel bands of the expected size and were successfully sequenced. The failed extractions with the other protocols could be explained partly by low DNA yield from cicada exuviae and partly by contamination with humic acids present in the soil where cicada nymphs reside before emergence, as shown by spectroscopic measurements. Genomic DNA extracted from cicada exuviae could provide valuable information for species identification, allowing investigation of genetic diversity across consecutive broods or of spatiotemporal variation among populations. Consequently, we hope to provide a simple method to acquire pure genomic DNA applicable to multiple research purposes.
Deterministic generation of remote entanglement with active quantum feedback
Martin, Leigh; Motzoi, Felix; Li, Hanhan; ...
2015-12-10
We develop and study protocols for deterministic remote entanglement generation using quantum feedback, without relying on an entangling Hamiltonian. In order to formulate the most effective experimentally feasible protocol, we introduce the notion of average-sense locally optimal feedback protocols, which do not require real-time quantum state estimation, a difficult component of real-time quantum feedback control. We use this notion of optimality to construct two protocols that can deterministically create maximal entanglement: a semiclassical feedback protocol for low-efficiency measurements and a quantum feedback protocol for high-efficiency measurements. The latter reduces to direct feedback in the continuous-time limit, whose dynamics can be modeled by a Wiseman-Milburn feedback master equation, which yields an analytic solution in the limit of unit measurement efficiency. Our formalism can smoothly interpolate between continuous-time and discrete-time descriptions of feedback dynamics and we exploit this feature to derive a superior hybrid protocol for arbitrary nonunit measurement efficiency that switches between quantum and semiclassical protocols. Lastly, we show using simulations incorporating experimental imperfections that deterministic entanglement of remote superconducting qubits may be achieved with current technology using the continuous-time feedback protocol alone.
Li, Ruichao; Xie, Miaomiao; Dong, Ning; Lin, Dachuan; Yang, Xuemei; Wong, Marcus Ho Yin; Chan, Edward Wai-Chi; Chen, Sheng
2018-03-01
Multidrug resistance (MDR)-encoding plasmids are considered major molecular vehicles responsible for transmission of antibiotic resistance genes among bacteria of the same or different species. Delineating the complete sequences of such plasmids could provide valuable insight into the evolution and transmission mechanisms underlying bacterial antibiotic resistance development. However, due to the presence of multiple repeats of mobile elements, complete sequencing of MDR plasmids remains technically complicated, expensive, and time-consuming. Here, we demonstrate a rapid and efficient approach to obtaining multiple MDR plasmid sequences through the use of the MinION nanopore sequencing platform, which is incorporated in a portable device. By assembling the long sequencing reads generated by a single MinION run according to a rapid barcoding sequencing protocol, we obtained the complete sequences of 20 plasmids harbored by multiple bacterial strains. Importantly, single long reads covering a plasmid end-to-end were recorded, indicating that de novo assembly may be unnecessary if the single reads exhibit high accuracy. This workflow represents a convenient and cost-effective approach for systematic assessment of MDR plasmids responsible for treatment failure of bacterial infections, offering the opportunity to perform detailed molecular epidemiological studies to probe the evolutionary and transmission mechanisms of MDR-encoding elements.
Real-Time QoS Routing Protocols in Wireless Multimedia Sensor Networks: Study and Analysis.
Alanazi, Adwan; Elleithy, Khaled
2015-09-02
Many routing protocols have been proposed for wireless sensor networks. These routing protocols are almost always based on energy efficiency. However, recent advances in complementary metal-oxide semiconductor (CMOS) cameras and small microphones have led to the development of Wireless Multimedia Sensor Networks (WMSNs) as a class of wireless sensor networks which pose additional challenges. The transmission of imaging and video data needs routing protocols with both energy efficiency and Quality of Service (QoS) characteristics in order to guarantee the efficient use of the sensor nodes and effective access to the collected data. Also, with the integration of real-time applications in Wireless Sensor Networks (WSNs), the use of QoS routing protocols is not only becoming a significant topic, but is also gaining the attention of researchers. In designing an efficient QoS routing protocol, reliability and guaranteed end-to-end delay are critical requirements that must be met while conserving energy. Thus, considerable research has been focused on designing energy-efficient and robust QoS routing protocols. In this paper, we present a state-of-the-art review of the real-time QoS routing protocols that have been proposed for WMSNs. This paper categorizes the real-time QoS routing protocols into probabilistic and deterministic protocols. In addition, both categories are classified into soft and hard real-time protocols by highlighting the QoS issues, including the limitations and features of each protocol. Furthermore, we have compared the performance of mobility-aware, query-based real-time QoS routing protocols from each category using Network Simulator-2 (NS2). This paper also focuses on the design challenges and future research directions as well as highlights the characteristics of each QoS routing protocol.
Ultrashort Echo Time and Zero Echo Time MRI at 7T
Larson, Peder E. Z.; Han, Misung; Krug, Roland; Jakary, Angela; Nelson, Sarah J.; Vigneron, Daniel B.; Henry, Roland G.; McKinnon, Graeme; Kelley, Douglas A. C.
2016-01-01
Object Zero echo time (ZTE) and ultrashort echo time (UTE) pulse sequences for MRI offer the unique advantage of being able to detect signal from rapidly decaying short-T2 tissue components. In this paper, we applied 3D ZTE and UTE pulse sequences at 7T to assess differences between these methods. Materials and Methods We matched the ZTE and UTE pulse sequences closely in terms of readout trajectories and image contrast. Our ZTE used the water- and fat-suppressed solid-state proton projection imaging (WASPI) method to fill the center of k-space. Images from healthy volunteers obtained at 7T were compared qualitatively as well as with SNR and CNR measurements for various ultrashort, short, and long-T2 tissues. Results We measured nearly identical contrast-to-noise and signal-to-noise ratios (CNR/SNR) in similar scan times between the two approaches for ultrashort, short, and long-T2 components in the brain, knee, and ankle. In our protocol, we observed gradient fidelity artifacts in UTE, as well as shading artifacts in ZTE due to inadvertent spatial selectivity arising from our chosen flip angle and readout. These can be corrected by advanced reconstruction methods or with different protocol parameters. Conclusion The applied ZTE and UTE pulse sequences achieved similar contrast and SNR efficiency for volumetric imaging of ultrashort-T2 components. Several key differences are that ZTE is limited to volumetric imaging but has substantially reduced acoustic noise levels during the scan, whereas UTE has higher acoustic noise levels and greater sensitivity to gradient fidelity but offers more flexibility in image contrast and volume selection. PMID:26702940
Riesgo, Ana; Pérez-Porro, Alicia R; Carmona, Susana; Leys, Sally P; Giribet, Gonzalo
2012-03-01
Transcriptome sequencing with next-generation sequencing technologies has the potential for addressing many long-standing questions about the biology of sponges. Transcriptome sequence quality depends on good cDNA libraries, which require high-quality mRNA. Standard protocols for preserving and isolating mRNA often require optimization for unusual tissue types. Our aim was to assess the efficiency of two preservation modes, (i) flash freezing with liquid nitrogen (LN₂) and (ii) immersion in RNAlater, for the recovery of high-quality mRNA from sponge tissues. We also tested whether the long-term storage of samples at -80 °C affects the quantity and quality of mRNA. We extracted mRNA from nine sponge species and analysed the quantity and quality (A260/230 and A260/280 ratios) of mRNA according to preservation method, storage time, and taxonomy. The quantity and quality of mRNA depended significantly on the preservation method used (LN₂ outperforming RNAlater), the sponge species, and the interaction between them. When the preservation was analysed in combination with either storage time or species, the quantity and A260/230 ratio were both significantly higher for LN₂-preserved samples. Interestingly, individual comparisons for each preservation method over time indicated that both methods performed equally efficiently during the first month, but RNAlater lost efficiency at storage times longer than 2 months compared with flash-frozen samples. In summary, we find that for long-term preservation of samples, flash freezing is the preferred method. If LN₂ is not available, RNAlater can be used, but mRNA extraction during the first month of storage is advised. © 2011 Blackwell Publishing Ltd.
Modifications in trypsin digestion protocol for increasing the efficiency and coverage.
Syal, Kirtimaan; Tadala, Raghu
2015-01-01
The standard trypsin digestion protocol followed by MALDI-MS analysis is an important tool for the identification and characterization of proteins. In this article, we propose eliminating the 'staining/de-staining of gel pieces' step from the in-gel digestion protocol in order to improve the efficiency of trypsin digestion. Coomassie dye is known to interfere with the digestion of proteins by trypsin, and the staining/de-staining procedure can result in the loss of photoaffinity probes, post-translational modifications, and catalytic activities of enzymes. Further, we studied parameters such as hydrophobicity and isoelectric point and attempted to relate them quantitatively to the efficiency of trypsin digestion. We suggest that the properties of a protein should be considered and the trypsin digestion protocol modified appropriately according to its sequence and other information.
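A minimal Python sketch of the two sequence-derived properties mentioned above, computed with the standard Kyte-Doolittle hydropathy scale plus a simple count of canonical trypsin cleavage sites (after K or R, not before P). The example peptide and any interpretation thresholds are illustrative assumptions, not the authors' analysis.

```python
# GRAVY (Kyte-Doolittle hydropathy average) as a hydrophobicity proxy and a
# count of canonical tryptic cleavage sites for a given protein sequence.

KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def gravy(seq):
    return sum(KD[aa] for aa in seq) / len(seq)

def tryptic_sites(seq):
    return sum(1 for i, aa in enumerate(seq[:-1])
               if aa in "KR" and seq[i + 1] != "P")

seq = "MKWVTFISLLLLFSSAYSRGVFRR"   # illustrative peptide
print(f"GRAVY = {gravy(seq):+.2f}, tryptic sites = {tryptic_sites(seq)}")
```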
Mchinda, Samira; Varma, Gopal; Prevost, Valentin H; Le Troter, Arnaud; Rapacchi, Stanislas; Guye, Maxime; Pelletier, Jean; Ranjeva, Jean-Philippe; Alsop, David C; Duhamel, Guillaume; Girard, Olivier M
2018-05-01
To implement, characterize, and optimize an interleaved inhomogeneous magnetization transfer (ihMT) gradient echo sequence allowing for whole-brain imaging within a clinically compatible scan time. A general framework for ihMT modelling was developed based on the Provotorov theory of radiofrequency saturation, which accounts for the dipolar order underpinning the ihMT effect. Experimental studies and numerical simulations were performed to characterize and optimize the ihMT-gradient echo dependency with sequence timings, saturation power, and offset frequency. The protocol was optimized in terms of maximum signal intensity and the reproducibility assessed for a nominal resolution of 1.5 mm isotropic. All experiments were performed on healthy volunteers at 1.5T. An important mechanism driving signal optimization and leading to strong ihMT signal enhancement that relies on the dynamics of radiofrequency energy deposition has been identified. By taking advantage of the delay allowed for readout between ihMT pulse bursts, it was possible to boost the ihMT signal by almost 2-fold compared to previous implementation. Reproducibility of the optimal protocol was very good, with an intra-individual error < 2%. The proposed sensitivity-boosted and time-efficient steady-state ihMT-gradient echo sequence, implemented and optimized at 1.5T, allowed robust high-resolution 3D ihMT imaging of the whole brain within a clinically compatible scan time. Magn Reson Med 79:2607-2619, 2018. © 2017 International Society for Magnetic Resonance in Medicine. © 2017 International Society for Magnetic Resonance in Medicine.
Ultrafast Brain MRI: Clinical Deployment and Comparison to Conventional Brain MRI at 3T.
Prakkamakul, Supada; Witzel, Thomas; Huang, Susie; Boulter, Daniel; Borja, Maria J; Schaefer, Pamela; Rosen, Bruce; Heberlein, Keith; Ratai, Eva; Gonzalez, Gilberto; Rapalino, Otto
2016-09-01
To compare an ultrafast brain magnetic resonance imaging (MRI) protocol to the conventional protocol in motion-prone inpatient clinical settings. This retrospective study was HIPAA compliant and approved by the Institutional Review Board with waived informed consent. Fifty-nine inpatients (30 males, 29 females; mean age 55.1 years, range 23-93 years) who underwent 3-Tesla brain MRI using ultrafast and conventional protocols, both including five sequences, were included in the study. The total scan time for the five ultrafast sequences was 4 minutes 59 seconds. The ideal conventional acquisition time was 10 minutes 32 seconds, but the actual acquisition took 15-20 minutes. The average scan times for the ultrafast localizer, T1-weighted, T2-weighted, fluid-attenuated inversion recovery (FLAIR), diffusion-weighted, and T2*-weighted sequences were 14, 41, 62, 96, 80, and 6 seconds, respectively. Two blinded neuroradiologists independently assessed three aspects: (1) image quality, (2) gray-white matter (GM-WM) differentiation, and (3) diagnostic concordance for the detection of six clinically relevant imaging findings. The Wilcoxon signed-rank test was used to compare image quality and GM-WM scores. Interobserver reproducibility was calculated. The ultrafast T1-weighted sequence demonstrated significantly better image quality (P = .005) and GM-WM differentiation (P < .001) compared to the conventional sequence. There was high agreement (>85%) between both protocols for the detection of mass-like lesions, hemorrhage, diffusion restriction, WM FLAIR hyperintensities, subarachnoid FLAIR hyperintensities, and hydrocephalus. The ultrafast protocol achieved at least comparable image quality and high diagnostic concordance compared to the conventional protocol. This fast protocol can be a viable option to replace the conventional protocol in motion-prone inpatient clinical settings. Copyright © 2016 by the American Society of Neuroimaging.
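A short Python sketch of the paired, non-parametric comparison used above: per-patient quality scores under the two protocols compared with a Wilcoxon signed-rank test. The score vectors are made up for illustration and are not the study data.

```python
# Paired comparison of protocol quality scores with a Wilcoxon signed-rank test.
from scipy.stats import wilcoxon

ultrafast    = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]   # e.g. 1-5 Likert quality scores
conventional = [3, 4, 4, 3, 4, 4, 3, 4, 3, 3]   # same patients, other protocol

stat, p = wilcoxon(ultrafast, conventional)
print(f"Wilcoxon statistic = {stat}, p = {p:.3f}")
```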
Practical quantum key distribution protocol without monitoring signal disturbance.
Sasaki, Toshihiko; Yamamoto, Yoshihisa; Koashi, Masato
2014-05-22
Quantum cryptography exploits the fundamental laws of quantum mechanics to provide a secure way to exchange private information. Such an exchange requires a common random bit sequence, called a key, to be shared secretly between the sender and the receiver. The basic idea behind quantum key distribution (QKD) has widely been understood as the property that any attempt to distinguish encoded quantum states causes a disturbance in the signal. As a result, implementation of a QKD protocol involves an estimation of the experimental parameters influenced by the eavesdropper's intervention, which is achieved by randomly sampling the signal. If the estimation of many parameters with high precision is required, the portion of the signal that is sacrificed increases, thus decreasing the efficiency of the protocol. Here we propose a QKD protocol based on an entirely different principle. The sender encodes a bit sequence onto non-orthogonal quantum states and the receiver randomly dictates how a single bit should be calculated from the sequence. The eavesdropper, who is unable to learn the whole of the sequence, cannot guess the bit value correctly. An achievable rate of secure key distribution is calculated by considering complementary choices between quantum measurements of two conjugate observables. We found that a practical implementation using a laser pulse train achieves a key rate comparable to a decoy-state QKD protocol, an often-used technique for lasers. It also has a better tolerance of bit errors and of finite-sized-key effects. We anticipate that this finding will give new insight into how the probabilistic nature of quantum mechanics can be related to secure communication, and will facilitate the simple and efficient use of conventional lasers for QKD.
A two-hop based adaptive routing protocol for real-time wireless sensor networks.
Rachamalla, Sandhya; Kancherla, Anitha Sheela
2016-01-01
One of the most important and challenging issues in wireless sensor networks (WSNs) is to optimally manage the limited energy of nodes without degrading routing efficiency. In this paper, we propose an energy-efficient adaptive routing mechanism for WSNs, which saves node energy by removing excessively delayed packets without degrading the real-time performance of the underlying routing protocol. It uses an adaptive transmission power algorithm, based on the attenuation of the wireless link, to improve energy efficiency. The proposed routing mechanism can be associated with any geographic routing protocol; its performance is evaluated by integrating it with the well-known two-hop-based real-time routing protocol PATH, and the resulting protocol is the energy-efficient adaptive routing protocol (EE-ARP). EE-ARP performs well in terms of energy consumption, deadline miss ratio, packet drop, and end-to-end delay.
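A minimal Python sketch of link-attenuation-based transmit power adaptation, the general idea behind the adaptive transmission power component described above. It assumes a log-distance path-loss model and hypothetical radio parameters; it is not the paper's actual algorithm.

```python
# Smallest transmit power that still meets receiver sensitivity under a
# log-distance path-loss model (all radio parameters are assumed values).
import math

def required_tx_power_dbm(distance_m, rx_sensitivity_dbm=-90.0,
                          pl_d0_db=40.0, d0_m=1.0, path_loss_exp=3.0,
                          margin_db=5.0):
    path_loss_db = pl_d0_db + 10.0 * path_loss_exp * math.log10(distance_m / d0_m)
    return rx_sensitivity_dbm + path_loss_db + margin_db

for d in (5, 20, 50):
    print(f"{d:3d} m -> {required_tx_power_dbm(d):6.1f} dBm")
```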
Generation and validation of homozygous fluorescent knock-in cells using CRISPR-Cas9 genome editing.
Koch, Birgit; Nijmeijer, Bianca; Kueblbeck, Moritz; Cai, Yin; Walther, Nike; Ellenberg, Jan
2018-06-01
Gene tagging with fluorescent proteins is essential for investigations of the dynamic properties of cellular proteins. CRISPR-Cas9 technology is a powerful tool for inserting fluorescent markers into all alleles of the gene of interest (GOI) and allows functionality and physiological expression of the fusion protein. It is essential to evaluate such genome-edited cell lines carefully in order to preclude off-target effects caused by (i) incorrect insertion of the fluorescent protein, (ii) perturbation of the fusion protein by the fluorescent proteins or (iii) nonspecific genomic DNA damage by CRISPR-Cas9. In this protocol, we provide a step-by-step description of our systematic pipeline to generate and validate homozygous fluorescent knock-in cell lines. We have used the paired Cas9D10A nickase approach to efficiently insert tags into specific genomic loci via homology-directed repair (HDR) with minimal off-target effects. It is time-consuming and costly to perform whole-genome sequencing of each cell clone to check for spontaneous genetic variations occurring in mammalian cell lines. Therefore, we have developed an efficient validation pipeline for the generated cell lines consisting of junction PCR, Southern blotting analysis, Sanger sequencing, microscopy, western blotting analysis and live-cell imaging for cell-cycle dynamics. This protocol takes between 6 and 9 weeks. With this protocol, up to 70% of the targeted genes can be tagged homozygously with fluorescent proteins, thus resulting in physiological levels and phenotypically functional expression of the fusion proteins.
Spin ensemble-based AC magnetometry using concatenated dynamical decoupling at low temperatures
Farfurnik, D.; Jarmola, A.; Budker, D.; Bar-Gill, N.
2018-01-01
Ensembles of nitrogen-vacancy centers in diamond are widely used as AC magnetometers. While such measurements are usually performed using standard (XY) dynamical decoupling (DD) protocols at room temperature, we study the sensitivities achieved by utilizing various DD protocols, for measuring magnetic AC fields at frequencies in the 10-250 kHz range, at room temperature and 77 K. By performing measurements on an isotopically pure 12C sample, we find that the Carr-Purcell-Meiboom-Gill protocol, which is not robust against pulse imperfections, is less efficient for magnetometry than robust XY-based sequences. The concatenation of a standard XY-based protocol may enhance the sensitivities only for measuring high-frequency fields, for which many (> 500) DD pulses are necessary and the robustness against pulse imperfections is critical. Moreover, we show that cooling is effective only for measuring low-frequency fields (~10 kHz), for which the experiment time approaches T1 at a small number of applied DD pulses.
Highly multiplexed targeted DNA sequencing from single nuclei.
Leung, Marco L; Wang, Yong; Kim, Charissa; Gao, Ruli; Jiang, Jerry; Sei, Emi; Navin, Nicholas E
2016-02-01
Single-cell DNA sequencing methods are challenged by poor physical coverage, high technical error rates and low throughput. To address these issues, we developed a single-cell DNA sequencing protocol that combines flow-sorting of single nuclei, time-limited multiple-displacement amplification (MDA), low-input library preparation, DNA barcoding, targeted capture and next-generation sequencing (NGS). This approach represents a major improvement over our previous single nucleus sequencing (SNS) Nature Protocols paper in terms of generating higher-coverage data (>90%), thereby enabling the detection of genome-wide variants in single mammalian cells at base-pair resolution. Furthermore, by pooling 48-96 single-cell libraries together for targeted capture, this approach can be used to sequence many single-cell libraries in parallel in a single reaction. This protocol greatly reduces the cost of single-cell DNA sequencing, and it can be completed in 5-6 d by advanced users. This single-cell DNA sequencing protocol has broad applications for studying rare cells and complex populations in diverse fields of biological research and medicine.
Quantifying and Mitigating the Effect of Preferential Sampling on Phylodynamic Inference
Karcher, Michael D.; Palacios, Julia A.; Bedford, Trevor; Suchard, Marc A.; Minin, Vladimir N.
2016-01-01
Phylodynamics seeks to estimate effective population size fluctuations from molecular sequences of individuals sampled from a population of interest. One way to accomplish this task formulates an observed sequence data likelihood exploiting a coalescent model for the sampled individuals’ genealogy and then integrating over all possible genealogies via Monte Carlo or, less efficiently, by conditioning on one genealogy estimated from the sequence data. However, when analyzing sequences sampled serially through time, current methods implicitly assume either that sampling times are fixed deterministically by the data collection protocol or that their distribution does not depend on the size of the population. Through simulation, we first show that, when sampling times do probabilistically depend on effective population size, estimation methods may be systematically biased. To correct for this deficiency, we propose a new model that explicitly accounts for preferential sampling by modeling the sampling times as an inhomogeneous Poisson process dependent on effective population size. We demonstrate that in the presence of preferential sampling our new model not only reduces bias, but also improves estimation precision. Finally, we compare the performance of the currently used phylodynamic methods with our proposed model through clinically-relevant, seasonal human influenza examples. PMID:26938243
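A minimal Python sketch of the preferential-sampling model described above: sampling times drawn from an inhomogeneous Poisson process whose intensity is proportional to the effective population size trajectory Ne(t), simulated here by thinning. The Ne trajectory and constants are invented for illustration and are not from the paper.

```python
# Inhomogeneous Poisson sampling times with intensity lambda(t) = beta * Ne(t),
# generated by thinning against a constant upper bound on the intensity.
import math, random

def ne(t):                       # hypothetical seasonal Ne trajectory
    return 50.0 + 40.0 * math.sin(2.0 * math.pi * t)

def sample_times(beta, t_max, seed=0):
    random.seed(seed)
    lam_max = beta * 90.0        # upper bound on beta * Ne(t) for this trajectory
    t, times = 0.0, []
    while True:
        t += random.expovariate(lam_max)
        if t > t_max:
            return times
        if random.random() < beta * ne(t) / lam_max:
            times.append(t)

print(len(sample_times(beta=0.5, t_max=2.0)), "sequences sampled in 2 'years'")
```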
A new real-time PCR protocol for detection of avian haemosporidians.
Bell, Jeffrey A; Weckstein, Jason D; Fecchio, Alan; Tkach, Vasyl V
2015-07-19
Birds possess the most diverse assemblage of haemosporidian parasites, including three genera, Plasmodium, Haemoproteus, and Leucocytozoon. Currently there are over 200 morphologically identified avian haemosporidian species, although true species richness is unknown due to great genetic diversity and insufficient sampling in highly diverse regions. Studies aimed at surveying haemosporidian diversity involve collecting and screening samples from hundreds to thousands of individuals. Currently, screening relies on microscopy and/or single or nested standard PCR. Although effective, these methods are time- and resource-consuming, and in the case of microscopy require substantial expertise. Here we report a newly developed real-time PCR protocol designed to quickly and reliably detect all three genera of avian haemosporidians in a single biochemical reaction. Using available DNA sequences from avian haemosporidians, we designed primers R330F and R480RL, which flank a 182 base pair fragment of mitochondrial conserved rDNA. These primers were initially tested using real-time PCR on samples from Malawi, Africa, previously screened for avian haemosporidians using traditional nested PCR. Our real-time protocol was further tested on 94 samples from the Cerrado biome of Brazil, previously screened using a single PCR assay for haemosporidian parasites. These samples were also amplified using modified nested PCR protocols, allowing for comparisons between the three different screening methods (single PCR, nested PCR, real-time PCR). The real-time PCR protocol successfully identified all three genera of avian haemosporidians from both single and mixed infections previously detected from Malawi. There was no significant difference between the three different screening protocols used for the 94 samples from the Brazilian Cerrado (χ² = 0.3429, df = 2, P = 0.842). After proving effective, the real-time protocol was used to screen 2113 Brazilian samples, identifying 693 positive samples. Our real-time PCR assay proved as effective as two widely used molecular screening techniques, single PCR and nested PCR. However, the real-time protocol has the distinct advantage of detecting all three genera in a single reaction, which significantly increases efficiency by greatly decreasing screening time and cost. Our real-time PCR protocol is therefore a valuable tool in the quickly expanding field of avian haemosporidian research.
Phase-insensitive storage of coherences by reversible mapping onto long-lived populations
Mieth, Simon; Genov, Genko T.; Yatsenko, Leonid P.; Vitanov, Nikolay V.; Halfmann, Thomas
2016-01-01
We theoretically develop and experimentally demonstrate a coherence population mapping (CPM) protocol to store atomic coherences in long-lived populations, enabling storage times far beyond the typically very short decoherence times of quantum systems. The amplitude and phase of an atomic coherence is written onto the populations of a three-state system by specifically designed sequences of radiation pulses from two coupling fields. As an important feature, the CPM sequences enable a retrieval efficiency, which is insensitive to the phase of the initial coherence. The information is preserved in every individual atom of the medium, enabling applications in purely homogeneously or inhomogeneously broadened ensembles even when stochastic phase jumps are the main source of decoherence. We experimentally confirm the theoretical predictions by applying CPM for storage of atomic coherences in a doped solid, reaching storage times in the regime of 1 min.
ECS: efficient communication scheduling for underwater sensor networks.
Hong, Lu; Hong, Feng; Guo, Zhongwen; Li, Zhengbao
2011-01-01
TDMA protocols have attracted a lot of attention for underwater acoustic sensor networks (UWSNs), because of the unique characteristics of acoustic signal propagation such as great energy consumption in transmission, long propagation delay and long communication range. Previous TDMA protocols all allocated transmission time to nodes based on discrete time slots. This paper proposes an efficient continuous time scheduling TDMA protocol (ECS) for UWSNs, including the continuous time based and sender oriented conflict analysis model, the transmission moment allocation algorithm and the distributed topology maintenance algorithm. Simulation results confirm that ECS improves network throughput by 20% on average, compared to existing MAC protocols.
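A minimal Python sketch of the continuous-time conflict reasoning that underlies this kind of scheduling: two transmissions conflict at a common receiver if their arrival windows overlap there, where the arrival time is the send moment plus the propagation delay. Node distances, packet duration, and sound speed are illustrative values, not the paper's model parameters.

```python
# Continuous-time conflict test at a single receiver for underwater acoustics.
SOUND_SPEED = 1500.0   # m/s, nominal underwater sound speed

def arrival_window(send_time, dist_m, pkt_duration):
    start = send_time + dist_m / SOUND_SPEED
    return start, start + pkt_duration

def conflict_at_receiver(tx1, tx2, pkt_duration=0.5):
    """tx = (send_time_s, distance_to_receiver_m)."""
    a1, b1 = arrival_window(tx1[0], tx1[1], pkt_duration)
    a2, b2 = arrival_window(tx2[0], tx2[1], pkt_duration)
    return a1 < b2 and a2 < b1   # interval overlap test

# Sender A is 1500 m away, sender B 300 m away from the same receiver.
print(conflict_at_receiver((0.0, 1500.0), (0.7, 300.0)))   # True: arrivals overlap
print(conflict_at_receiver((0.0, 1500.0), (1.6, 300.0)))   # False: B arrives after A ends
```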
Evaluation of MRI sequences for quantitative T1 brain mapping
Tsialios, P.; Thrippleton, M.; Glatz, A.; Pernet, C.
2017-11-01
T1 mapping constitutes a quantitative MRI technique with significant application in brain imaging. It allows evaluation of contrast uptake, blood perfusion, and volume, providing a more specific biomarker of disease progression than conventional T1-weighted images. While there are many techniques for T1 mapping, there is a wide range of reported T1 values in tissues, raising the issue of protocol reproducibility and standardization. The gold standard for obtaining T1 maps is based on acquiring an IR-SE sequence. Widely used alternative sequences, which speed up scanning and fitting procedures, are IR-SE-EPI, VFA (DESPOT), DESPOT-HIFI, and MP2RAGE. A custom MRI phantom was used to assess the reproducibility and accuracy of the different methods. All scans were performed using a 3T Siemens Prisma scanner. The acquired data were processed using two different codes. The main difference was observed for VFA (DESPOT), which grossly overestimated the T1 relaxation time by 214 ms [126, 270] compared to the IR-SE sequence. The MP2RAGE and DESPOT-HIFI sequences gave slightly shorter times than IR-SE (~20 to 30 ms) and can be considered alternative, time-efficient methods for acquiring accurate T1 maps of the human brain, while IR-SE-EPI gave identical results, at the cost of lower image quality.
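A short Python sketch of the standard DESPOT1/VFA estimation alluded to above: two spoiled gradient-echo signals at different flip angles are linearized as S/sin(a) = E1 * S/tan(a) + M0*(1 - E1), with E1 = exp(-TR/T1), so the slope yields T1. The "measured" signals below are synthesized from a known T1 to show that the fit recovers it; TR, flip angles, and T1 are illustrative values.

```python
# DESPOT1/VFA T1 estimation from two flip angles (noise-free demonstration).
import math

def spgr_signal(m0, t1_ms, tr_ms, alpha_deg):
    e1 = math.exp(-tr_ms / t1_ms)
    a = math.radians(alpha_deg)
    return m0 * math.sin(a) * (1 - e1) / (1 - e1 * math.cos(a))

def despot1_t1(s1, a1_deg, s2, a2_deg, tr_ms):
    a1, a2 = math.radians(a1_deg), math.radians(a2_deg)
    x1, y1 = s1 / math.tan(a1), s1 / math.sin(a1)
    x2, y2 = s2 / math.tan(a2), s2 / math.sin(a2)
    slope = (y2 - y1) / (x2 - x1)          # = E1
    return -tr_ms / math.log(slope)

tr, true_t1 = 15.0, 1200.0                 # ms (illustrative brain-like T1)
s_low  = spgr_signal(1.0, true_t1, tr, 3.0)
s_high = spgr_signal(1.0, true_t1, tr, 17.0)
print(f"estimated T1 = {despot1_t1(s_low, 3.0, s_high, 17.0, tr):.0f} ms")
```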
High efficiency endocrine operation protocol: From design to implementation.
Mascarella, Marco A; Lahrichi, Nadia; Cloutier, Fabienne; Kleiman, Simcha; Payne, Richard J; Rosenberg, Lawrence
2016-10-01
We developed a high efficiency endocrine operative protocol based on a mathematical programming approach, process reengineering, and value-stream mapping to increase the number of operations completed per day without increasing operating room time at a tertiary-care, academic center. Using this protocol, a case-control study of 72 patients undergoing endocrine operation during high efficiency days were age, sex, and procedure-matched to 72 patients undergoing operation during standard days. The demographic profile, operative times, and perioperative complications were noted. The average number of cases per 8-hour workday in the high efficiency and standard operating rooms were 7 and 5, respectively. Mean procedure times in both groups were similar. The turnaround time (mean ± standard deviation) in the high efficiency group was 8.5 (±2.7) minutes as compared with 15.4 (±4.9) minutes in the standard group (P < .001). Transient postoperative hypocalcemia was 6.9% (5/72) and 8.3% (6/72) for the high efficiency and standard groups, respectively (P = .99). In this study, patients undergoing high efficiency endocrine operation had similar procedure times and perioperative complications compared with the standard group. The proposed high efficiency protocol seems to better utilize operative time and decrease the backlog of patients waiting for endocrine operation in a country with a universal national health care program. Copyright © 2016 Elsevier Inc. All rights reserved.
Jaroenram, Wansadaj; Owens, Leigh
2014-11-01
Penaeus stylirostris densovirus (PstDV) is an important shrimp pathogen that causes mortality in P. stylirostris and runt deformity syndrome (RDS) in Penaeus vannamei and Penaeus monodon. Recently, PstDV-related sequences were found in the genomes of P. monodon and P. vannamei. These lead to false positive results with PCR-based detection systems. Here, a more efficient detection platform based on recombinase polymerase amplification (RPA) and a lateral flow dipstick (LFD) was developed for detecting PstDV. Under the optimal conditions, 30 min at 37°C for RPA followed by 5 min at room temperature for LFD, the protocol was 10 times more sensitive than Saksmerphrome et al.'s interim 3-tube nested PCR and showed no cross-reaction with other shrimp viruses. It also reduced false positive results arising from viral inserts to ∼5%, compared to 76-78% with the IQ2000™ nested PCR kit and the 309F/R PCR protocol currently recommended by the World Organization for Animal Health (OIE) for PstDV detection. Together with its simplicity and portability, the protocol serves as an alternative tool to PCR for the primary screening of PstDV, suitable for both laboratory and field application. Copyright © 2014 Elsevier B.V. All rights reserved.
How Efficient Is My (Medicinal) Chemistry?
Vanden Eynde, Jean Jacques
2016-01-01
“Greening” a chemical transformation is not only about changing the nature of a solvent or decreasing the reaction temperature. There are metrics enabling a critical quantification of the efficiency of an experimental protocol. Some of them are applied to different sequences for the preparation of paracetamol in order to understand their performance parameters and elucidate pathways for improvement. PMID:27196914
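A small Python sketch of two commonly used green-chemistry metrics of the kind referred to above, atom economy and the E factor, applied to the classical acetylation of p-aminophenol with acetic anhydride to give paracetamol (acetic acid as by-product). The molecular weights are rounded standard values; the waste and product masses are invented process figures, and these may or may not be the exact metrics discussed in the paper.

```python
# Atom economy and E factor for a single illustrative paracetamol route.
MW = {"p-aminophenol": 109.13, "acetic anhydride": 102.09,
      "paracetamol": 151.16, "acetic acid": 60.05}

def atom_economy(product, reactants):
    return 100.0 * MW[product] / sum(MW[r] for r in reactants)

def e_factor(total_waste_kg, product_kg):
    return total_waste_kg / product_kg

ae = atom_economy("paracetamol", ["p-aminophenol", "acetic anhydride"])
ef = e_factor(total_waste_kg=4.2, product_kg=1.0)   # hypothetical process data
print(f"atom economy = {ae:.1f} %, E factor = {ef:.1f} kg waste / kg product")
```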
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
Sayah, Anousheh; Jay, Ann K; Toaff, Jacob S; Makariou, Erini V; Berkowitz, Frank
2016-09-01
Reducing lumbar spine MRI scanning time while retaining diagnostic accuracy can benefit patients and reduce health care costs. This study compares the effectiveness of a rapid lumbar MRI protocol using 3D T2-weighted sampling perfection with application-optimized contrast with different flip-angle evolutions (SPACE) sequences with a standard MRI protocol for evaluation of lumbar spondylosis. Two hundred fifty consecutive unenhanced lumbar MRI examinations performed at 1.5 T were retrospectively reviewed. Full, rapid, and complete versions of each examination were interpreted for spondylotic changes at each lumbar level, including herniations and neural compromise. The full examination consisted of sagittal T1-weighted, T2-weighted turbo spin-echo (TSE), and STIR sequences; and axial T1- and T2-weighted TSE sequences (time, 18 minutes 40 seconds). The rapid examination consisted of sagittal T1- and T2-weighted SPACE sequences, with axial SPACE reformations (time, 8 minutes 46 seconds). The complete examination consisted of the full examination plus the T2-weighted SPACE sequence. Sensitivities and specificities of the full and rapid examinations were calculated using the complete study as the reference standard. The rapid and full studies had sensitivities of 76.0% and 69.3%, with specificities of 97.2% and 97.9%, respectively, for all degenerative processes. Rapid and full sensitivities were 68.7% and 66.3% for disk herniation, 85.2% and 81.5% for canal compromise, 82.9% and 69.1% for lateral recess compromise, and 76.9% and 69.7% for foraminal compromise, respectively. Isotropic SPACE T2-weighted imaging provides high-quality imaging of lumbar spondylosis, with multiplanar reformatting capability. Our SPACE-based rapid protocol had sensitivities and specificities for herniations and neural compromise comparable to those of the protocol without SPACE. This protocol fits within a 15-minute slot, potentially reducing costs and discomfort for a large subgroup of patients.
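A minimal Python sketch of how per-finding sensitivity and specificity are derived when the complete study serves as the reference standard. The 2x2 counts below are invented for illustration and are not taken from the paper.

```python
# Sensitivity and specificity from a 2x2 table against a reference standard.
def sens_spec(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts for one finding on one protocol:
sens, spec = sens_spec(tp=50, fp=10, fn=20, tn=200)
print(f"sensitivity = {100*sens:.1f} %, specificity = {100*spec:.1f} %")
```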
Zerbini, Francesca; Zanella, Ilaria; Fraccascia, Davide; König, Enrico; Irene, Carmela; Frattini, Luca F; Tomasi, Michele; Fantappiè, Laura; Ganfini, Luisa; Caproni, Elena; Parri, Matteo; Grandi, Alberto; Grandi, Guido
2017-04-24
The exploitation of the CRISPR/Cas9 machinery coupled to lambda (λ) recombinase-mediated homologous recombination (recombineering) is becoming the method of choice for genome editing in E. coli. First proposed by Jiang and co-workers, the strategy has been subsequently fine-tuned by several authors who demonstrated, by using few selected loci, that the efficiency of mutagenesis (number of mutant colonies over total number of colonies analyzed) can be extremely high (up to 100%). However, from published data it is difficult to appreciate the robustness of the technology, defined as the number of successfully mutated loci over the total number of targeted loci. This information is particularly relevant in high-throughput genome editing, where repetition of experiments to rescue missing mutants would be impractical. This work describes a "brute force" validation activity, which culminated in the definition of a robust, simple and rapid protocol for single or multiple gene deletions. We first set up our own version of the CRISPR/Cas9 protocol and then we evaluated the mutagenesis efficiency by changing different parameters including sequence of guide RNAs, length and concentration of donor DNAs, and use of single stranded and double stranded donor DNAs. We then validated the optimized conditions targeting 78 "dispensable" genes. This work led to the definition of a protocol, featuring the use of double stranded synthetic donor DNAs, which guarantees mutagenesis efficiencies consistently higher than 10% and a robustness of 100%. The procedure can be applied also for simultaneous gene deletions. This work defines for the first time the robustness of a CRISPR/Cas9-based protocol based on a large sample size. Since the technical solutions here proposed can be applied to other similar procedures, the data could be of general interest for the scientific community working on bacterial genome editing and, in particular, for those involved in synthetic biology projects requiring high throughput procedures.
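A short Python sketch of the two summary metrics defined above: per-locus mutagenesis efficiency (mutant colonies over colonies screened) and overall robustness (loci with at least one mutant over loci targeted). The colony counts are invented for illustration.

```python
# Mutagenesis efficiency and robustness as defined in the text above.
def efficiency(mutant_colonies, screened_colonies):
    return mutant_colonies / screened_colonies

def robustness(per_locus_mutant_counts):
    successful = sum(1 for n in per_locus_mutant_counts if n > 0)
    return successful / len(per_locus_mutant_counts)

# e.g. 8 colonies screened per locus for 5 hypothetical target loci:
mutants_per_locus = [3, 1, 8, 2, 5]
print([f"{efficiency(n, 8):.0%}" for n in mutants_per_locus])
print(f"robustness = {robustness(mutants_per_locus):.0%}")
```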
2014-01-01
Background Next-generation DNA sequencing (NGS) technologies have made huge impacts in many fields of biological research, but especially in evolutionary biology. One area where NGS has shown potential is for high-throughput sequencing of complete mtDNA genomes (of humans and other animals). Despite the increasing use of NGS technologies and a better appreciation of their importance in answering biological questions, there remain significant obstacles to the successful implementation of NGS-based projects, especially for new users. Results Here we present an ‘A to Z’ protocol for obtaining complete human mitochondrial (mtDNA) genomes – from DNA extraction to consensus sequence. Although designed for use on humans, this protocol could also be used to sequence small, organellar genomes from other species, and also nuclear loci. This protocol includes DNA extraction, PCR amplification, fragmentation of PCR products, barcoding of fragments, sequencing using the 454 GS FLX platform, and a complete bioinformatics pipeline (primer removal, reference-based mapping, output of coverage plots and SNP calling). Conclusions All steps in this protocol are designed to be straightforward to implement, especially for researchers who are undertaking next-generation sequencing for the first time. The molecular steps are scalable to large numbers (hundreds) of individuals and all steps post-DNA extraction can be carried out in 96-well plate format. Also, the protocol has been assembled so that individual ‘modules’ can be swapped out to suit available resources. PMID:24460871
Tan, Swee Jin; Phan, Huan; Gerry, Benjamin Michael; Kuhn, Alexandre; Hong, Lewis Zuocheng; Min Ong, Yao; Poon, Polly Suk Yean; Unger, Marc Alexander; Jones, Robert C; Quake, Stephen R; Burkholder, William F
2013-01-01
Library preparation for next-generation DNA sequencing (NGS) remains a key bottleneck in the sequencing process which can be relieved through improved automation and miniaturization. We describe a microfluidic device for automating laboratory protocols that require one or more column chromatography steps and demonstrate its utility for preparing Next Generation sequencing libraries for the Illumina and Ion Torrent platforms. Sixteen different libraries can be generated simultaneously with significantly reduced reagent cost and hands-on time compared to manual library preparation. Using an appropriate column matrix and buffers, size selection can be performed on-chip following end-repair, dA tailing, and linker ligation, so that the libraries eluted from the chip are ready for sequencing. The core architecture of the device ensures uniform, reproducible column packing without user supervision and accommodates multiple routine protocol steps in any sequence, such as reagent mixing and incubation; column packing, loading, washing, elution, and regeneration; capture of eluted material for use as a substrate in a later step of the protocol; and removal of one column matrix so that two or more column matrices with different functional properties can be used in the same protocol. The microfluidic device is mounted on a plastic carrier so that reagents and products can be aliquoted and recovered using standard pipettors and liquid handling robots. The carrier-mounted device is operated using a benchtop controller that seals and operates the device with programmable temperature control, eliminating any requirement for the user to manually attach tubing or connectors. In addition to NGS library preparation, the device and controller are suitable for automating other time-consuming and error-prone laboratory protocols requiring column chromatography steps, such as chromatin immunoprecipitation.
Abbreviated Combined MR Protocol: A New Faster Strategy for Characterizing Breast Lesions.
Moschetta, Marco; Telegrafo, Michele; Rella, Leonarda; Stabile Ianora, Amato Antonio; Angelelli, Giuseppe
2016-06-01
The use of an abbreviated magnetic resonance (MR) protocol has been recently proposed for cancer screening. The aim of our study is to evaluate the diagnostic accuracy of an abbreviated MR protocol combining short TI inversion recovery (STIR), turbo-spin-echo (TSE)-T2 sequences, a pre-contrast T1, and a single intermediate (3 minutes after contrast injection) post-contrast T1 sequence for characterizing breast lesions. A total of 470 patients underwent breast MR examination for screening, problem solving, or preoperative staging. Two experienced radiologists evaluated both standard and abbreviated protocols in consensus. Sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), and diagnostic accuracy for both protocols were calculated (with the histological findings and 6-month ultrasound follow-up as the reference standard) and compared with the McNemar test. The post-processing and interpretation times for the MR images were compared with the paired t test. In 177 of 470 (38%) patients, the MR sequences detected 185 breast lesions. Standard and abbreviated protocols obtained sensitivity, specificity, diagnostic accuracy, PPV, and NPV values respectively of 92%, 92%, 92%, 68%, and 98% and of 89%, 91%, 91%, 64%, and 98% with no statistically significant difference (P < .0001). The mean post-processing and interpretation time were, respectively, 7 ± 1 minutes and 6 ± 3.2 minutes for the standard protocol and 1 ± 1.2 minutes and 2 ± 1.2 minutes for the abbreviated protocol, with a statistically significant difference (P < .01). An abbreviated combined MR protocol represents a time-saving tool for radiologists and patients with the same diagnostic potential as the standard protocol in patients undergoing breast MRI for screening, problem solving, or preoperative staging. Copyright © 2016 Elsevier Inc. All rights reserved.
Comparative Analysis of Single-Cell RNA Sequencing Methods.
Ziegenhain, Christoph; Vieth, Beate; Parekh, Swati; Reinius, Björn; Guillaumet-Adkins, Amy; Smets, Martha; Leonhardt, Heinrich; Heyn, Holger; Hellmann, Ines; Enard, Wolfgang
2017-02-16
Single-cell RNA sequencing (scRNA-seq) offers new possibilities to address biological and medical questions. However, systematic comparisons of the performance of diverse scRNA-seq protocols are lacking. We generated data from 583 mouse embryonic stem cells to evaluate six prominent scRNA-seq methods: CEL-seq2, Drop-seq, MARS-seq, SCRB-seq, Smart-seq, and Smart-seq2. While Smart-seq2 detected the most genes per cell and across cells, CEL-seq2, Drop-seq, MARS-seq, and SCRB-seq quantified mRNA levels with less amplification noise due to the use of unique molecular identifiers (UMIs). Power simulations at different sequencing depths showed that Drop-seq is more cost-efficient for transcriptome quantification of large numbers of cells, while MARS-seq, SCRB-seq, and Smart-seq2 are more efficient when analyzing fewer cells. Our quantitative comparison offers the basis for an informed choice among six prominent scRNA-seq methods, and it provides a framework for benchmarking further improvements of scRNA-seq protocols. Copyright © 2017 Elsevier Inc. All rights reserved.
Adenine specific DNA chemical sequencing reaction.
Iverson, B L; Dervan, P B
1987-01-01
Reaction of DNA with K2PdCl4 at pH 2.0 followed by a piperidine workup produces specific cleavage at adenine (A) residues. Product analysis revealed the K2PdCl4 reaction involves selective depurination at adenine, affording an excision reaction analogous to the other chemical DNA sequencing reactions. Adenine residues methylated at the exocyclic amine (N6) react with lower efficiency than unmethylated adenine in an identical sequence. This simple protocol specific for A may be a useful addition to current chemical sequencing reactions. PMID:3671067
Chowdhury, Supriyo; Basu, Arpita; Kundu, Surekha
2014-09-01
In spite of the economic importance of sesame (Sesamum indicum L.) and the recent availability of its genome sequence, a high-frequency transformation protocol is still not available. The only two existing Agrobacterium-mediated transformation protocols have poor transformation efficiencies of less than 2%. In the present study, we report a high-frequency, simple, and reproducible transformation protocol for sesame. Transformation was done using de-embryonated cotyledons via somatic embryogenic stages. All the critical parameters of transformation, such as the incubation period of explants in pre-regeneration medium prior to infection by Agrobacterium tumefaciens, the cocultivation period, the concentration of acetosyringone in the cocultivation medium, the kanamycin concentration, and the concentrations of plant hormones, including 6-benzylaminopurine, have been optimized. This protocol is superior to the two existing protocols in its high regeneration and transformation efficiencies. The transformed sesame lines have been tested by PCR, RT-PCR for neomycin phosphotransferase II gene expression, and β-glucuronidase (GUS) assay. The regeneration frequency and transformation efficiency are 57.33 and 42.66%, respectively. T0 and T1 generation transgenic plants were analyzed, and several T1 plants homozygous for the transgenes were obtained.
Hybrid selection for sequencing pathogen genomes from clinical samples
2011-01-01
We have adapted a solution hybrid selection protocol to enrich pathogen DNA in clinical samples dominated by human genetic material. Using mock mixtures of human and Plasmodium falciparum malaria parasite DNA as well as clinical samples from infected patients, we demonstrate an average of approximately 40-fold enrichment of parasite DNA after hybrid selection. This approach will enable efficient genome sequencing of pathogens from clinical samples, as well as sequencing of endosymbiotic organisms such as Wolbachia that live inside diverse metazoan phyla. PMID:21835008
Technical Considerations for Reduced Representation Bisulfite Sequencing with Multiplexed Libraries
Chatterjee, Aniruddha; Rodger, Euan J.; Stockwell, Peter A.; Weeks, Robert J.; Morison, Ian M.
2012-01-01
Reduced representation bisulfite sequencing (RRBS), which couples bisulfite conversion and next generation sequencing, is an innovative method that specifically enriches genomic regions with a high density of potential methylation sites and enables investigation of DNA methylation at single-nucleotide resolution. Recent advances in the Illumina DNA sample preparation protocol and sequencing technology have vastly improved sequencing throughput capacity. Although the new Illumina technology is now widely used, the unique challenges associated with multiplexed RRBS libraries on this platform have not been previously described. We have made modifications to the RRBS library preparation protocol to sequence multiplexed libraries on a single flow cell lane of the Illumina HiSeq 2000. Furthermore, our analysis incorporates a bioinformatics pipeline specifically designed to process bisulfite-converted sequencing reads and evaluate the output and quality of the sequencing data generated from the multiplexed libraries. We obtained an average of 42 million paired-end reads per sample for each flow-cell lane, with a high unique mapping efficiency to the reference human genome. Here we provide a roadmap of the modifications, strategies, and troubleshooting approaches we implemented to optimize sequencing of multiplexed libraries on an RRBS background. PMID:23193365
Hooke, A. J.
1979-01-01
A set of standard telemetry protocols for downlink data flow, facilitating the end-to-end transport of instrument data from the spacecraft to the user in real time, is proposed. The direct switching of data by autonomous message 'packets' that are assembled by the source instrument on the spacecraft is discussed. The data system is thus formatted on a message basis rather than a word basis, and such packet telemetry would include standardized protocol headers. Standards are being developed within the NASA End-to-End Data System (NEEDS) program for the source packet and transport frame protocols. The source packet protocol contains identification of both the sequence number of the packet as it is generated by the source and the total length of the packet, while the transport frame protocol includes a sequence count defining the serial number of the frame as it is generated by the spacecraft data system, and a field specifying any 'options' selected in the format of the frame itself.
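A minimal Python sketch of the two header ideas described above: a source packet header carrying a per-source sequence count and a packet length, and a transport frame header carrying a frame serial number plus an options field. The field widths and layout here are illustrative assumptions, not the NEEDS (or later CCSDS) definitions.

```python
# Toy packing of a source packet and a transport frame with sequence counts.
import struct

def build_source_packet(apid, seq_count, payload):
    # >HHH = source/application ID, sequence count, packet length (big-endian shorts)
    header = struct.pack(">HHH", apid, seq_count & 0x3FFF, len(payload))
    return header + payload

def build_transport_frame(frame_count, options, packets):
    body = b"".join(packets)
    header = struct.pack(">IB", frame_count, options)   # frame serial + options flags
    return header + body

pkt = build_source_packet(apid=42, seq_count=7, payload=b"\x01\x02\x03\x04")
frame = build_transport_frame(frame_count=1001, options=0b0000_0001, packets=[pkt])
print(len(pkt), len(frame))   # 10 and 15 bytes in this toy layout
```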
Design and development of compact monitoring system for disaster remote health centres.
Santhi, S; Sadasivam, G S
2015-02-01
To enhance speedy communication between the patient and the doctor through a newly proposed routing protocol at the mobile node. The proposed model is applied to a telemedicine application during disaster recovery management. In this paper, an Energy Efficient Link Stability Routing Protocol (EELSRP) has been developed in both simulation and real-time implementations. This framework is designed for the immediate healing of affected persons in remote areas, especially at the time of a disaster when there is no hospital in proximity. In case of disasters, there might be an outbreak of infectious diseases. In such cases, the patient's medical record is also transferred by the field operator from the disaster site to the hospital to facilitate the identification of the disease-causing agent and to prescribe the necessary medication. The heterogeneous networking framework provides reliable, energy-efficient, and speedy communication between the patient and the doctor using the proposed routing protocol at the mobile node. The performance of the simulation and real-time versions of EELSRP has been analyzed. Experimental results prove the efficiency of the real-time version of the EELSRP protocol. The packet delivery ratio and throughput of the real-time version of EELSRP are increased by 3% and 10%, respectively, when compared to the simulated version. The end-to-end delay and energy consumption are reduced by 10% and 2%, respectively, in the real-time version of EELSRP.
Factors affecting the efficient transformation of Colletotrichum species
Redman, Regina S.; Rodriguez, Rusty J.
1994-01-01
Twelve isolates representing four species of Colletotrichum were transformed either by enhanced protoplast, restriction enzyme-mediated integration (REMI), or electroporation-mediated protocols. The enhanced protoplast transformation protocol resulted in 100- and 50-fold increases in the transformation efficiencies of Colletotrichum lindemuthianum and C. magna, respectively. REMI transformation involved the use of HindIII and vector DNA linearized with HindIII to increase the number of integration events and potential gene disruptions in the fungal genome. Combining the enhanced protoplast and the REMI protocols resulted in a 22-fold increase in the number of hygromycin/nystatin-resistant mutants in C. lindemuthianum. Electroporation-mediated transformation was performed on mycelial fragments and spores of four Colletotrichum species, resulting in efficiencies of up to 1000 transformants/μg DNA. The pHA1.3 vector, which confers hygromycin resistance, contains telomeric sequences from Fusarium oxysporum, transforms by autonomous replication and genomic integration, and was essential for elevated transformation efficiencies of 100 to 10,000 transformants/μg DNA. Modifications of pHA1.3 occurred during bacterial amplification and post fungal transformation, resulting in plasmids capable of significantly elevated transformation efficiencies in C. lindemuthianum.
Crispr-mediated Gene Targeting of Human Induced Pluripotent Stem Cells.
Byrne, Susan M; Church, George M
2015-01-01
CRISPR/Cas9 nuclease systems can create double-stranded DNA breaks at specific sequences to efficiently and precisely disrupt, excise, mutate, insert, or replace genes. However, human embryonic stem cells or induced pluripotent stem cells (iPSCs) are more difficult to transfect and less resilient to DNA damage than immortalized tumor cell lines. Here, we describe an optimized protocol for genome engineering of human iPSCs using a simple transient transfection of plasmids and/or single-stranded oligonucleotides. With this protocol, we achieve transfection efficiencies greater than 60%, with gene disruption efficiencies of 1-25% and gene insertion/replacement efficiencies of 0.5-10% without any further selection or enrichment steps. We also describe how to design and assess optimal sgRNA target sites and donor targeting vectors, clone individual iPSCs by single-cell FACS sorting, and genotype successfully edited cells.
Effects of a Short Drilling Implant Protocol on Osteotomy Site Temperature and Drill Torque.
Mihali, Sorin G; Canjau, Silvana; Cernescu, Anghel; Bortun, Cristina M; Wang, Hom-Lay; Bratu, Emanuel
2018-02-01
To establish a protocol for reducing the drilling sequence during implant site preparation based on temperature and insertion torque. The conventional drilling sequence (several drills, with a 0.6-mm increment each time) was compared with the proposed short drilling protocol (only 2 drills: an initial and a final drill). One hundred drilling osteotomies were performed in bovine and porcine bones. Sets of 2 osteotomy sites were created in 5 bone densities using the 2 types of drilling protocols. Thermographic pictures were captured throughout all drilling procedures and analyzed using ThermaCAM Researcher Professional 2.10. Torque values were determined during drilling by measuring electrical input and drill speed. There were statistically significant differences in bone temperature between the conventional and short drilling protocols during implant site preparation (analysis of variance, P = 0.0008). However, there were no significant differences between the 2 types of drilling protocols for both implant diameters. Implant site preparation time was significantly reduced when using the short drilling protocol compared with the conventional drilling protocol (P < 0.001). Within the limitations of the study, the short drilling protocol proposed herein may represent a safe approach for implant site preparation.
Oldrini, Guillaume; Fedida, Benjamin; Poujol, Julie; Felblinger, Jacques; Trop, Isabelle; Henrot, Philippe; Darai, Emile; Thomassin-Naggara, Isabelle
2017-10-01
To evaluate the added value of the ULTRAFAST-MR sequence to an abbreviated FAST protocol in comparison with the FULL protocol to distinguish benign from malignant lesions in a population of women, regardless of breast MR imaging indication. From March 10th to September 22nd, 2014, we retrospectively included a total of 70 consecutive patients with 106 histologically proven lesions (58 malignant and 48 benign) who underwent breast MR imaging for preoperative breast staging (n=38), high-risk screening (n=7), problem solving (n=18), and nipple discharge (n=4), with 12 time-resolved imaging of contrast kinetics (TRICKS) acquisitions during contrast inflow interleaved in a regular high-resolution dynamic MRI protocol (FULL protocol). Two readers scored MR exams as either positive or negative and described significant lesions according to the BI-RADS lexicon with TRICKS images (ULTRAFAST), an abbreviated protocol (FAST), and all images (FULL protocol). Sensitivity, specificity, positive and negative predictive values, and accuracy were calculated for each protocol and compared with McNemar's test. For all readers, the combined FAST-ULTRAFAST protocol significantly improved the reading, with a specificity of 83.3% and 70.8% in comparison with the FAST protocol or FULL protocol, respectively, without change in sensitivity. By adding the ULTRAFAST protocol to the FAST protocol, readers 1 and 2 were able to correctly change the diagnosis in 22.9% (11/48) and 10.4% (5/48) of benign lesions, respectively, without missing any malignancy. Both interpretation and image acquisition times for the combined FAST-ULTRAFAST protocol and FAST protocol were shorter compared to the FULL protocol (p<0.001). Compared to the FULL protocol, adding ULTRAFAST to the FAST protocol improves specificity, mainly by correctly reclassifying benign masses and reducing interpretation and acquisition time, without decreasing sensitivity. Copyright © 2017 Elsevier B.V. All rights reserved.
Energy Consumption Research of Mobile Data Collection Protocol for Underwater Nodes Using an USV.
Lv, Zhichao; Zhang, Jie; Jin, Jiucai; Li, Qi; Gao, Baoru
2018-04-16
The Unmanned Surface Vehicle (USV) integrated with an acoustic modem is a novel mobile vehicle for data collection, which has advantages in terms of mobility, efficiency, and collection cost. In the data collection scenario, the USV is controlled autonomously along the planned trajectory and the data of underwater nodes are dynamically collected. In order to improve the efficiency of data collection and extend the life of the underwater nodes, a mobile data collection protocol for underwater nodes using the USV is proposed. In the protocol, the stop-and-wait ARQ transmission mechanism is adopted, the duty cycle is designed considering the ratio between the sleep mode and the detection mode, and the transmission ratio is defined by the duty cycle, wake-up signal cycles, and the USV's speed. Based on the protocol, an evaluation index for energy consumption is constructed from the duty cycle and the transmission ratio. The energy consumption of the protocol is simulated and analyzed using mobile communication experiment data from the USV, taking into consideration the USV's speed, data sequence length, and duty cycle. Optimized protocol parameters are identified, which demonstrates the proposed protocol's feasibility and effectiveness.
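A minimal sketch of how a duty cycle and a per-cycle node energy budget might be expressed is shown below. All parameter names and power figures are assumptions for illustration; this is not the evaluation index derived in the paper.

```python
def duty_cycle(t_detect: float, t_sleep: float) -> float:
    """Fraction of time a node spends in detection mode (illustrative definition)."""
    return t_detect / (t_detect + t_sleep)

def node_energy_per_cycle(t_detect, t_sleep, p_detect, p_sleep,
                          n_packets=0, t_tx=0.0, p_tx=0.0):
    """Energy spent by an underwater node over one sleep/detect cycle, plus any
    stop-and-wait transmissions triggered by the passing USV (simple additive model)."""
    return p_detect * t_detect + p_sleep * t_sleep + n_packets * p_tx * t_tx

# Example: 1 s of detection per 9 s of sleep, with assumed modem power figures.
dc = duty_cycle(t_detect=1.0, t_sleep=9.0)
e = node_energy_per_cycle(1.0, 9.0, p_detect=0.1, p_sleep=0.001,
                          n_packets=3, t_tx=0.5, p_tx=2.0)
print(f"duty cycle = {dc:.2f}, energy per cycle = {e:.2f} J")
```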
Kim, Jae-Eung; Huang, Rui; Chen, Hui; You, Chun; Zhang, Y-H Percival
2016-09-01
A foolproof protocol was developed for the construction of mutant DNA libraries for directed protein evolution. First, a library of linear mutant genes was generated by error-prone PCR or molecular shuffling, and a linear vector backbone was prepared by high-fidelity PCR. Second, the amplified insert and vector fragments were assembled by overlap-extension PCR with a pair of 5'-phosphorylated primers. Third, full-length linear plasmids with phosphorylated 5'-ends were self-ligated with T4 ligase, yielding circular plasmids encoding mutant variants suitable for high-efficiency transformation. Self-made competent Escherichia coli BL21(DE3) showed a transformation efficiency of 2.4 × 10^5 cfu/µg of the self-ligated circular plasmid. Using this method, three mutants of the mCherry fluorescent protein were found to alter their colors and fluorescent intensities under visible and UV light, respectively. Also, one mutant of 6-phosphogluconate dehydrogenase from the thermophilic bacterium Moorella thermoacetica was found to show a 3.5-fold improved catalytic efficiency (kcat/Km) on NAD+ as compared to the wild-type. This protocol is DNA-sequence independent and does not require restriction enzymes, a special E. coli host, or labor-intensive optimization. In addition, this protocol can be used for subcloning relatively long DNA sequences into any position of plasmids. Copyright © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
SU-C-17A-02: Sirius MRI Markers for Prostate Post-Implant Assessment: MR Protocol Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lim, T; Wang, J; Kudchadker, R
Purpose: Currently, CT is used to visualize prostate brachytherapy sources, at the expense of accurate structure contouring. MRI is superior to CT for anatomical delineation, but the sources appear as voids on MRI images. Previously we have developed Sirius MRI markers (C4 Imaging) to replace spacers to assist source localization on MRI images. Here we develop an MRI pulse sequence protocol that enhances the signal of these markers to enable MRI-only post-implant prostate dosimetric analysis. Methods: To simulate a clinical scenario, a CIRS multi-modality prostate phantom was implanted with 66 markers and 86 sources. The implanted phantom was imaged on both 1.5T and 3.0T GE scanners under various conditions, with different pulse sequences (2D fast spin echo [FSE], 3D balanced steady-state free precession [bSSFP] and 3D fast spoiled gradient echo [FSPGR]), as well as varying amounts of padding to simulate various patient sizes and associated signal fall-off from the surface coil elements. Standard FSE sequences from the current clinical protocols were also evaluated. Marker visibility, marker size, intra-marker distance, total scan time and artifacts were evaluated for various combinations of echo time, repetition time, flip angle, number of excitations, bandwidth, slice thickness and spacing, field-of-view, frequency/phase encoding steps and frequency direction. Results: We have developed a 3D FSPGR pulse sequence that enhances marker signal and ensures the integrity of the marker shape while maintaining reasonable scan time. For patients contraindicated for 3.0T, we have also developed a similar sequence for 1.5T scanners. Signal fall-off with distance from prostate to coil can be compensated mainly by decreasing bandwidth. The markers are not visible using standard FSE sequences. FSPGR sequences are more robust for consistent marker visualization as compared to bSSFP sequences. Conclusion: The developed MRI pulse sequence protocol for Sirius MRI markers assists source localization to enable MRI-only post-implant prostate dosimetric analysis. S.J. Frank is a co-founder of C4 Imaging (manufactures the MRI markers).
Kawakami, Shuji; Hasegawa, Takuya; Imachi, Hiroyuki; Yamaguchi, Takashi; Harada, Hideki; Ohashi, Akiyoshi; Kubota, Kengo
2012-02-01
In situ detection of functional genes with single-cell resolution is currently of interest to microbiologists. Here, we developed a two-pass tyramide signal amplification (TSA)-fluorescence in situ hybridization (FISH) protocol with PCR-derived polynucleotide probes for the detection of single-copy genes in prokaryotic cells. The mcrA gene and the apsA gene in methanogens and sulfate-reducing bacteria, respectively, were targeted. The protocol showed bright fluorescence with a good signal-to-noise ratio and achieved a high efficiency of detection (>98%). The discrimination threshold was approximately 82-89% sequence identity. Microorganisms possessing the mcrA or apsA gene in anaerobic sludge samples were successfully detected by two-pass TSA-FISH with polynucleotide probes. The developed protocol is useful for identifying single microbial cells based on functional gene sequences. Copyright © 2011 Elsevier B.V. All rights reserved.
Rector, Annabel; Tachezy, Ruth; Van Ranst, Marc
2004-01-01
The discovery of novel viruses has often been accomplished by using hybridization-based methods that necessitate the availability of a previously characterized virus genome probe or knowledge of the viral nucleotide sequence to construct consensus or degenerate PCR primers. In their natural replication cycle, certain viruses employ a rolling-circle mechanism to propagate their circular genomes, and multiply primed rolling-circle amplification (RCA) with φ29 DNA polymerase has recently been applied in the amplification of circular plasmid vectors used in cloning. We employed an isothermal RCA protocol that uses random hexamer primers to amplify the complete genomes of papillomaviruses without the need for prior knowledge of their DNA sequences. We optimized this RCA technique with extracted human papillomavirus type 16 (HPV-16) DNA from W12 cells, using a real-time quantitative PCR assay to determine amplification efficiency, and obtained a 2.4 × 10^4-fold increase in HPV-16 DNA concentration. We were able to clone the complete HPV-16 genome from this multiply primed RCA product. The optimized protocol was subsequently applied to a bovine fibropapillomatous wart tissue sample. Whereas no papillomavirus DNA could be detected by restriction enzyme digestion of the original sample, multiply primed RCA enabled us to obtain a sufficient amount of papillomavirus DNA for restriction enzyme analysis, cloning, and subsequent sequencing of a novel variant of bovine papillomavirus type 1. The multiply primed RCA method allows the discovery of previously unknown papillomaviruses, and possibly also other circular DNA viruses, without a priori sequence information. PMID:15113879
Seamless Insert-Plasmid Assembly at High Efficiency and Low Cost
Benoit, Roger M.; Ostermeier, Christian; Geiser, Martin; Li, Julia Su Zhou; Widmer, Hans; Auer, Manfred
2016-01-01
Seamless cloning methods, such as co-transformation cloning, sequence- and ligation-independent cloning (SLIC) or the Gibson assembly, are essential tools for the precise construction of plasmids. The efficiency of co-transformation cloning is however low and the Gibson assembly reagents are expensive. With the aim to improve the robustness of seamless cloning experiments while keeping costs low, we examined the importance of complementary single-stranded DNA ends for co-transformation cloning and the influence of single-stranded gaps in circular plasmids on SLIC cloning efficiency. Most importantly, our data show that single-stranded gaps in double-stranded plasmids, which occur in typical SLIC protocols, can drastically decrease the efficiency at which the DNA transforms competent E. coli bacteria. Accordingly, filling-in of single-stranded gaps using DNA polymerase resulted in increased transformation efficiency. Ligation of the remaining nicks did not lead to a further increase in transformation efficiency. These findings demonstrate that highly efficient insert-plasmid assembly can be achieved by using only T5 exonuclease and Phusion DNA polymerase, without Taq DNA ligase from the original Gibson protocol, which significantly reduces the cost of the reactions. We successfully used this modified Gibson assembly protocol with two short insert-plasmid overlap regions, each only 15 nucleotides long. PMID:27073895
Geometric Heat Engines Featuring Power that Grows with Efficiency.
Raz, O; Subaşı, Y; Pugatch, R
2016-04-22
Thermodynamics places a limit on the efficiency of heat engines, but not on their output power or on how the power and efficiency change with the engine's cycle time. In this Letter, we develop a geometrical description of the power and efficiency as a function of the cycle time, applicable to an important class of heat engine models. This geometrical description is used to design engine protocols that attain both the maximal power and maximal efficiency at the fast driving limit. Furthermore, using this method, we also prove that no protocol can exactly attain the Carnot efficiency at nonzero power.
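For reference, the quantities behind the abstract's statements can be written in their usual textbook form (the Letter's geometric construction is not reproduced here): efficiency is bounded by the Carnot value set by the bath temperatures, while output power is the work per cycle divided by the cycle time.

```latex
% Standard definitions (not the Letter's geometric description):
% W is work per cycle, Q_h heat absorbed from the hot bath, \tau the cycle time.
\begin{align}
  \eta = \frac{W}{Q_h} \;\le\; \eta_C = 1 - \frac{T_c}{T_h},
  \qquad
  P(\tau) = \frac{W(\tau)}{\tau}.
\end{align}
```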
Xu, Chang; Nezami Ranjbar, Mohammad R; Wu, Zhong; DiCarlo, John; Wang, Yexun
2017-01-03
Detection of DNA mutations at very low allele fractions with high accuracy will significantly improve the effectiveness of precision medicine for cancer patients. To achieve this goal through next generation sequencing, researchers need a detection method that 1) captures rare mutation-containing DNA fragments efficiently in the mix of abundant wild-type DNA; 2) sequences the DNA library extensively to deep coverage; and 3) distinguishes low level true variants from amplification and sequencing errors with high accuracy. Targeted enrichment using PCR primers provides researchers with a convenient way to achieve deep sequencing for a small, yet most relevant region using benchtop sequencers. Molecular barcoding (or indexing) provides a unique solution for reducing sequencing artifacts analytically. Although different molecular barcoding schemes have been reported in recent literature, most variant calling has been done on limited targets, using simple custom scripts. The analytical performance of barcode-aware variant calling can be significantly improved by incorporating advanced statistical models. We present here a highly efficient, simple and scalable enrichment protocol that integrates molecular barcodes in multiplex PCR amplification. In addition, we developed smCounter, an open source, generic, barcode-aware variant caller based on a Bayesian probabilistic model. smCounter was optimized and benchmarked on two independent read sets with SNVs and indels at 5 and 1% allele fractions. Variants were called with very good sensitivity and specificity within coding regions. We demonstrated that we can accurately detect somatic mutations with allele fractions as low as 1% in coding regions using our enrichment protocol and variant caller.
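The core idea of barcode-aware calling is to collapse reads that share a molecular barcode into a single consensus observation per original DNA molecule before counting variant support. The sketch below illustrates that step only, with arbitrary thresholds; it is not smCounter's Bayesian model or its actual interface.

```python
from collections import Counter, defaultdict

def barcode_consensus_calls(reads, min_reads=3, min_fraction=0.8):
    """Collapse reads sharing a molecular barcode (UMI) at one position into a
    consensus base per molecule; reads is an iterable of (umi, base) pairs.
    Thresholds are illustrative, not smCounter's statistical model."""
    by_umi = defaultdict(list)
    for umi, base in reads:
        by_umi[umi].append(base)
    consensus = {}
    for umi, bases in by_umi.items():
        if len(bases) < min_reads:
            continue                          # too few reads to trust this barcode
        base, count = Counter(bases).most_common(1)[0]
        if count / len(bases) >= min_fraction:
            consensus[umi] = base             # likely the true molecule's base
    return consensus

# Example: two barcodes support the reference 'A', one supports a candidate 'T'.
reads = [("u1", "A")] * 5 + [("u2", "A")] * 4 + [("u3", "T")] * 6 + [("u3", "A")]
calls = barcode_consensus_calls(reads)
alt_support = sum(b == "T" for b in calls.values())
print(calls, f"barcodes supporting ALT: {alt_support}/{len(calls)}")
```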
Takahashi, Mayumi; Wu, Xiwei; Ho, Michelle; Chomchan, Pritsana; Rossi, John J; Burnett, John C; Zhou, Jiehua
2016-09-22
The systematic evolution of ligands by exponential enrichment (SELEX) technique is a powerful and effective aptamer-selection procedure. However, modifications to the process can dramatically improve selection efficiency and aptamer performance. For example, droplet digital PCR (ddPCR) has been recently incorporated into SELEX selection protocols to putatively reduce the propagation of byproducts and avoid selection bias that results from differences in PCR efficiency of sequences within the random library. However, a detailed, parallel comparison of the efficacy of conventional solution PCR versus the ddPCR modification in the RNA aptamer-selection process is needed to understand effects on overall SELEX performance. In the present study, we took advantage of powerful high-throughput sequencing technology and bioinformatics analysis coupled with SELEX (HT-SELEX) to thoroughly investigate the effects of the initial library and PCR methods on RNA aptamer identification. Our analysis revealed that distinct "biased sequences" and nucleotide compositions existed in the initial, unselected libraries purchased from two different manufacturers and that the fate of the "biased sequences" was target-dependent during selection. Our comparison of solution PCR- and ddPCR-driven HT-SELEX demonstrated that the PCR method affected not only the nucleotide composition of the enriched sequences, but also the overall SELEX efficiency and aptamer efficacy.
An expert protocol for immunofluorescent detection of calcium channels in tsA-201 cells.
Koch, Peter; Herzig, Stefan; Matthes, Jan
Pore-forming subunits of voltage-gated calcium channels (VGCC) are large membrane proteins (260 kDa) containing 24 transmembrane domains. Despite transfection with viral-promoter-driven vectors, biochemical analysis of VGCC is often hampered by rather low expression levels in heterologous systems, rendering VGCC challenging targets. Especially in immunofluorescent detection, calcium channels are demanding proteins. We provide an expert step-by-step protocol with adapted conditions for handling procedures (tsA-201 cell culture, transient transfection, incubation time and temperature at 28°C or 37°C, and immunostaining) to address the L-type calcium-channel pore CaV1.2 in an immunofluorescent approach. We performed immunocytochemical analysis of CaV1.2 expression at the single-cell level in combination with detection of different markers for cellular organelles. We show confluency levels and shapes of tsA-201 cells at different time points during an experiment. Our experiments reveal sufficient levels of CaV1.2 protein and a correct CaV1.2 expression pattern in polygonal-shaped cells already 12 h after transfection. A sequence of elaborated protocol modifications allows subcellular localization analysis of CaV1.2 in an immunocytochemical approach. We provide a protocol that may be used to achieve insights into physiological and pathophysiological processes involving voltage-gated calcium channels. Our protocol may be used for expression analysis of other challenging proteins, and efficient overexpression may be exploited in related biochemical techniques requiring immunolabels. Copyright © 2016 Elsevier Inc. All rights reserved.
Extraction of genomic DNA from yeasts for PCR-based applications.
Lõoke, Marko; Kristjuhan, Kersti; Kristjuhan, Arnold
2011-05-01
We have developed a quick and low-cost genomic DNA extraction protocol from yeast cells for PCR-based applications. This method does not require any enzymes, hazardous chemicals, or extreme temperatures, and is especially powerful for simultaneous analysis of a large number of samples. DNA can be efficiently extracted from different yeast species (Kluyveromyces lactis, Hansenula polymorpha, Schizosaccharomyces pombe, Candida albicans, Pichia pastoris, and Saccharomyces cerevisiae). The protocol involves lysis of yeast colonies or cells from liquid culture in a lithium acetate (LiOAc)-SDS solution and subsequent precipitation of DNA with ethanol. Approximately 100 nanograms of total genomic DNA can be extracted from 1 × 10^7 cells. DNA extracted by this method is suitable for a variety of PCR-based applications (including colony PCR, real-time qPCR, and DNA sequencing) for amplification of DNA fragments of ≤ 3500 bp.
Novel method for high-throughput colony PCR screening in nanoliter-reactors
Walser, Marcel; Pellaux, Rene; Meyer, Andreas; Bechtold, Matthias; Vanderschuren, Herve; Reinhardt, Richard; Magyar, Joseph; Panke, Sven; Held, Martin
2009-01-01
We introduce a technology for the rapid identification and sequencing of conserved DNA elements employing a novel suspension array based on nanoliter (nl)-reactors made from alginate. The reactors have a volume of 35 nl and serve as reaction compartments during monoseptic growth of microbial library clones, colony lysis, thermocycling and screening for sequence motifs via semi-quantitative fluorescence analyses. nl-Reactors were kept in suspension during all high-throughput steps, which allowed performing the protocol in a highly space-effective fashion and at negligible expense for consumables and reagents. As a first application, 11 high-quality microsatellites for polymorphism studies in cassava were isolated and sequenced out of a library of 20 000 clones in 2 days. The technology is widely scalable and we envision that throughputs for nl-reactor based screenings can be increased up to 100 000 and more samples per day, thereby efficiently complementing protocols based on established deep-sequencing technologies. PMID:19282448
NASA Astrophysics Data System (ADS)
Richa, Tambi; Ide, Soichiro; Suzuki, Ryosuke; Ebina, Teppei; Kuroda, Yutaka
2017-02-01
Efficient and rapid prediction of domain regions from amino acid sequence information alone is often required for swift structural and functional characterization of large multi-domain proteins. Here we introduce Fast H-DROP, a thirty-fold accelerated version of our previously reported H-DROP (Helical Domain linker pRediction using OPtimal features), which is unique in specifically predicting helical domain linkers (boundaries). Fast H-DROP, analogously to H-DROP, uses optimal features selected from a set of 3000 by combining a random forest and a stepwise feature selection protocol. We reduced the computational time from 8.5 min per sequence in H-DROP to 14 s per sequence in Fast H-DROP on a Linux server with 8 Xeon processors by using SWISS-PROT instead of the GenBank non-redundant (nr) database for generating the PSSMs. The sensitivity and precision of Fast H-DROP assessed by cross-validation were 33.7 and 36.2%, which were merely 2% lower than those of H-DROP. The reduced computational time of Fast H-DROP, without affecting prediction performance, makes it more interactive and user-friendly. Fast H-DROP and H-DROP are freely available from http://domserv.lab.tuat.ac.jp/.
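The combination of a random-forest importance ranking with stepwise (greedy forward) selection can be sketched as follows on synthetic data; the candidate pool size, thresholds, and scoring choices are illustrative assumptions, not the exact H-DROP feature selection procedure.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a large sequence-derived feature matrix.
X, y = make_classification(n_samples=300, n_features=200, n_informative=10,
                           random_state=0)

# 1) Rank candidate features by random-forest importance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(rf.feature_importances_)[::-1]

# 2) Greedy forward (stepwise) selection over the top-ranked candidates,
#    keeping a feature only if it improves cross-validated accuracy.
selected, best = [], 0.0
for f in ranked[:30]:
    trial = selected + [f]
    score = cross_val_score(RandomForestClassifier(n_estimators=100, random_state=0),
                            X[:, trial], y, cv=5).mean()
    if score > best:
        selected, best = trial, score

print(f"kept {len(selected)} features, CV accuracy {best:.2f}")
```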
Revised Mechanism and Improved Efficiency of the QuikChange Site-Directed Mutagenesis Method.
Xia, Yongzhen; Xun, Luying
2017-01-01
Site-directed mutagenesis has been widely used for the substitution, addition or deletion of nucleotide residues in a defined DNA sequence. QuikChange™ site-directed mutagenesis and its related protocols have been widely used for this purpose because of their convenience and efficiency. We have recently demonstrated that the mechanism of the QuikChange™ site-directed mutagenesis process is different from the one originally proposed. The new mechanism promotes the use of partially overlapping primers and commercial PCR enzymes for efficient PCR and mutagenesis.
ERIC Educational Resources Information Center
Wright, David L.; Magnuson, Curt E.; Black, Charles B.
2005-01-01
Individuals practiced two unique discrete sequence production tasks that differed in their relative time profile in either a blocked or random practice schedule. Each participant was subsequently administered a "precuing" protocol to examine the cost of initially compiling or modifying the plan for an upcoming movement's relative timing. The…
IRB Process Improvements: A Machine Learning Analysis.
Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A
2017-06-01
Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on the initially identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
Deep sampling of the Palomero maize transcriptome by a high throughput strategy of pyrosequencing.
Vega-Arreguín, Julio C; Ibarra-Laclette, Enrique; Jiménez-Moraila, Beatriz; Martínez, Octavio; Vielle-Calzada, Jean Philippe; Herrera-Estrella, Luis; Herrera-Estrella, Alfredo
2009-07-06
In-depth sequencing analysis has not been able to determine the overall complexity of transcriptional activity of a plant organ or tissue sample. In some cases, deep parallel sequencing of Expressed Sequence Tags (ESTs), although not yet optimized for the sequencing of cDNAs, has represented an efficient procedure for validating gene prediction and estimating overall gene coverage. This approach could be very valuable for complex plant genomes. In addition, little emphasis has been given to efforts aiming at an estimation of the overall transcriptional universe found in a multicellular organism at a specific developmental stage. To explore, in depth, the transcriptional diversity in an ancient maize landrace, we developed a protocol to optimize the sequencing of cDNAs and performed 4 consecutive GS20-454 pyrosequencing runs of a cDNA library obtained from 2-week-old Palomero Toluqueño maize plants. The protocol reported here allowed us to obtain over 90% informative sequences. These GS20-454 runs generated over 1.5 million reads, representing the largest amount of sequences reported from a single plant cDNA library. A collection of 367,391 quality-filtered reads (30.09 Mb) from a single run was sufficient to identify transcripts corresponding to 34% of public maize EST databases; total sequences generated after 4 filtered runs increased this coverage to 50%. Comparisons of all 1.5 million reads to the Maize Assembled Genomic Islands (MAGIs) provided evidence for the transcriptional activity of 11% of MAGIs. We estimate that 5.67% (86,069 sequences) do not align with public ESTs or annotated genes, potentially representing new maize transcripts. Following the assembly of 74.4% of the reads in 65,493 contigs, real-time PCR of selected genes confirmed a predicted correlation between the abundance of GS20-454 sequences and corresponding levels of gene expression. A protocol was developed that significantly increases the number, length and quality of cDNA reads using massive 454 parallel sequencing. We show that recurrent 454 pyrosequencing of a single cDNA sample is necessary to attain a thorough representation of the transcriptional universe present in maize, which can also be used to estimate transcript abundance of specific genes. These data suggest that the molecular and functional diversity contained in the vast native landraces remains to be explored, and that large-scale transcriptional sequencing of a presumed ancestor of the modern maize varieties represents a valuable approach to characterize the functional diversity of maize for future agricultural and evolutionary studies.
Targeted Genome Editing Using DNA-Free RNA-Guided Cas9 Ribonucleoprotein for CHO Cell Engineering.
Shin, Jongoh; Lee, Namil; Cho, Suhyung; Cho, Byung-Kwan
2018-01-01
Recent advances in the CRISPR/Cas9 system have dramatically facilitated genome engineering in various cell systems. Among the protocols, the direct delivery of the Cas9-sgRNA ribonucleoprotein (RNP) complex into cells is an efficient approach to increase genome editing efficiency. This method uses purified Cas9 protein and in vitro transcribed sgRNA to edit the target gene without vector DNA. We have applied the RNP complex to CHO cell engineering to obtain desirable phenotypes and to reduce unintended insertional mutagenesis and off-target effects. Here, we describe our routine methods for RNP complex-mediated gene deletion including the protocols to prepare the purified Cas9 protein and the in vitro transcribed sgRNA. Subsequently, we also describe a protocol to confirm the edited genomic positions using the T7E1 enzymatic assay and next-generation sequencing.
Melloul, Emmanuel; Raptis, Dimitri A; Boss, Andreas; Pfammater, Thomas; Tschuor, Christoph; Tian, Yinghua; Graf, Rolf; Clavien, Pierre-Alain; Lesurtel, Mickael
2014-04-01
To develop a noninvasive technique to assess liver volumetry and intrahepatic portal vein anatomy in a mouse model of liver regeneration. Fifty-two C57BL/6 male mice underwent magnetic resonance imaging (MRI) of the liver using a 4.7 T small animal MRI system after no treatment, 70% partial hepatectomy (PH), or selective portal vein embolization. The protocol consisted of the following sequences: a three-dimensional-encoded spoiled gradient-echo sequence (repetition time/echo time 15/2.7 ms, flip angle 20°) for volumetry, and a two-dimensional-encoded time-of-flight angiography sequence (repetition time/echo time 18/6.4 ms, flip angle 80°) for vessel visualization. Liver volume and portal vein segmentation was performed using a dedicated postprocessing software. In animals with portal vein embolization, portography served as reference standard. True liver volume was measured after sacrificing the animals. Measurements were carried out by two independent observers with subsequent analysis by the Cohen κ-test for interobserver agreement. MRI liver volumetry highly correlated with the true liver volume measurement using a conventional method in both the untreated liver and the liver remnant after 70% PH with a high interobserver correlation coefficient of 0.94 (95% confidence interval, 0.80-0.98 for untreated liver [P < 0.001] and 0.90-0.97 after 70% PH [P < 0.001]). The diagnostic accuracy of magnetic resonance angiography for the occlusion of one branch of the portal vein was 0.95 (95% confidence interval, 0.84-1). The level of agreement between the two observers for the description of intrahepatic vascular anatomy was excellent (Cohen κ value = 0.925). This protocol may be used for noninvasive liver volumetry and visualization of portal vein anatomy in mice. It will serve the dynamic study of new strategies to enhance liver regeneration in vivo. Copyright © 2014 Elsevier Inc. All rights reserved.
Integrated design, execution, and analysis of arrayed and pooled CRISPR genome-editing experiments.
Canver, Matthew C; Haeussler, Maximilian; Bauer, Daniel E; Orkin, Stuart H; Sanjana, Neville E; Shalem, Ophir; Yuan, Guo-Cheng; Zhang, Feng; Concordet, Jean-Paul; Pinello, Luca
2018-05-01
CRISPR (clustered regularly interspaced short palindromic repeats) genome-editing experiments offer enormous potential for the evaluation of genomic loci using arrayed single guide RNAs (sgRNAs) or pooled sgRNA libraries. Numerous computational tools are available to help design sgRNAs with optimal on-target efficiency and minimal off-target potential. In addition, computational tools have been developed to analyze deep-sequencing data resulting from genome-editing experiments. However, these tools are typically developed in isolation and oftentimes are not readily translatable into laboratory-based experiments. Here, we present a protocol that describes in detail both the computational and benchtop implementation of an arrayed and/or pooled CRISPR genome-editing experiment. This protocol provides instructions for sgRNA design with CRISPOR (computational tool for the design, evaluation, and cloning of sgRNA sequences), experimental implementation, and analysis of the resulting high-throughput sequencing data with CRISPResso (computational tool for analysis of genome-editing outcomes from deep-sequencing data). This protocol allows for design and execution of arrayed and pooled CRISPR experiments in 4-5 weeks by non-experts, as well as computational data analysis that can be performed in 1-2 d by both computational and noncomputational biologists alike using web-based and/or command-line versions.
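As a rough illustration of what the downstream analysis quantifies, the toy function below classifies amplicon reads as unmodified or carrying a putative indel by checking whether the sequence spanning the expected cut site is still intact. Real tools such as CRISPResso use alignment-based algorithms; the amplicon sequence, cut position, and window size here are made-up examples.

```python
def classify_edits(reads, amplicon, cut_site, window=10):
    """Naive classification of amplicon reads into unmodified vs. putative indel,
    based on whether the stretch spanning the expected Cas9 cut site is present
    intact. An illustration of the idea only, not CRISPResso's algorithm."""
    intact = amplicon[cut_site - window: cut_site + window]
    counts = {"unmodified": 0, "modified": 0}
    for r in reads:
        counts["unmodified" if intact in r else "modified"] += 1
    return counts

amplicon = "ACGTTGCAAGGCTAGCTAGGCTTACGATCGATCGGATCCAAGCTT"
cut = 22                                        # assumed cut position in the amplicon
reads = [amplicon,                              # unedited read
         amplicon[:20] + amplicon[23:],         # 3-bp deletion spanning the cut site
         amplicon[:22] + "T" + amplicon[22:]]   # 1-bp insertion at the cut site
print(classify_edits(reads, amplicon, cut))     # {'unmodified': 1, 'modified': 2}
```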
Quantum Authencryption with Two-Photon Entangled States for Off-Line Communicants
NASA Astrophysics Data System (ADS)
Ye, Tian-Yu
2016-02-01
In this paper, a quantum authencryption protocol is proposed by using the two-photon entangled states as the quantum resource. Two communicants Alice and Bob share two private keys in advance, which determine the generation of two-photon entangled states. The sender Alice sends the two-photon entangled state sequence encoded with her classical bits to the receiver Bob in the manner of one-step quantum transmission. Upon receiving the encoded quantum state sequence, Bob decodes out Alice's classical bits with the two-photon joint measurements and authenticates the integrity of Alice's secret with the help of one-way hash function. The proposed protocol only uses the one-step quantum transmission and needs neither a public discussion nor a trusted third party. As a result, the proposed protocol can be adapted to the case where the receiver is off-line, such as the quantum E-mail systems. Moreover, the proposed protocol provides the message authentication to one bit level with the help of one-way hash function and has an information-theoretical efficiency equal to 100 %.
Turatsinze, Jean-Valery; Thomas-Chollier, Morgane; Defrance, Matthieu; van Helden, Jacques
2008-01-01
This protocol shows how to detect putative cis-regulatory elements and regions enriched in such elements with the regulatory sequence analysis tools (RSAT) web server (http://rsat.ulb.ac.be/rsat/). The approach applies to known transcription factors, whose binding specificity is represented by position-specific scoring matrices, using the program matrix-scan. The detection of individual binding sites is known to return many false predictions. However, results can be strongly improved by estimating P value, and by searching for combinations of sites (homotypic and heterotypic models). We illustrate the detection of sites and enriched regions with a study case, the upstream sequence of the Drosophila melanogaster gene even-skipped. This protocol is also tested on random control sequences to evaluate the reliability of the predictions. Each task requires a few minutes of computation time on the server. The complete protocol can be executed in about one hour.
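The scanning step itself amounts to sliding a position-specific scoring matrix along the sequence and reporting windows whose log-odds score against a background model exceeds a threshold, as in the minimal sketch below (a toy illustration of the principle, not RSAT's matrix-scan implementation or its P-value estimation).

```python
import math

def scan_with_pssm(sequence, pssm, background=None, threshold=3.0):
    """Report windows whose log-odds score against the background exceeds a
    threshold; pssm is a list of per-position base-probability dictionaries."""
    background = background or {b: 0.25 for b in "ACGT"}
    width = len(pssm)
    hits = []
    for i in range(len(sequence) - width + 1):
        window = sequence[i:i + width]
        score = sum(math.log2(pssm[j].get(base, 1e-3) / background[base])
                    for j, base in enumerate(window))
        if score >= threshold:
            hits.append((i, window, round(score, 2)))
    return hits

# Toy 4-column matrix strongly preferring the motif "TGAC".
pssm = [{"T": 0.85, "A": 0.05, "C": 0.05, "G": 0.05},
        {"G": 0.85, "A": 0.05, "C": 0.05, "T": 0.05},
        {"A": 0.85, "C": 0.05, "G": 0.05, "T": 0.05},
        {"C": 0.85, "A": 0.05, "G": 0.05, "T": 0.05}]
print(scan_with_pssm("GGTGACTTAGTGACAA", pssm))  # two high-scoring hits
```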
Distributed reservation control protocols for random access broadcasting channels
NASA Technical Reports Server (NTRS)
Greene, E. P.; Ephremides, A.
1981-01-01
Attention is given to a communication network consisting of an arbitrary number of nodes which can communicate with each other via a time-division multiple access (TDMA) broadcast channel. The reported investigation is concerned with the development of efficient distributed multiple access protocols for traffic consisting primarily of single packet messages in a datagram mode of operation. The motivation for the design of the protocols came from the consideration of efficient multiple access utilization of moderate to high bandwidth (4-40 Mbit/s capacity) communication satellite channels used for the transmission of short (1000-10,000 bits) fixed length packets. Under these circumstances, the ratio of roundtrip propagation time to packet transmission time is between 100 to 10,000. It is shown how a TDMA channel can be adaptively shared by datagram traffic and constant bandwidth users such as in digital voice applications. The distributed reservation control protocols described are a hybrid between contention and reservation protocols.
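The quoted ratio range follows directly from the numbers in the abstract; the short calculation below uses an assumed geostationary roundtrip propagation time of about 0.25 s to reproduce the two extremes.

```python
# Roundtrip propagation to a geostationary satellite is roughly a quarter second;
# the exact value depends on geometry, so 0.25 s is an illustrative figure.
roundtrip_s = 0.25

for capacity_bps, packet_bits in [(4e6, 10_000), (40e6, 1_000)]:
    tx_time = packet_bits / capacity_bps            # packet transmission time
    ratio = roundtrip_s / tx_time                   # propagation / transmission
    print(f"{capacity_bps/1e6:.0f} Mbit/s, {packet_bits}-bit packets: ratio ≈ {ratio:,.0f}")
```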
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Stern, Frank; Spencer, Justin
Savings from electric energy efficiency measures and programs are often expressed in terms of annual energy and presented as kilowatt-hours per year (kWh/year). However, for a full assessment of the value of these savings, it is usually necessary to consider the measure or program's impact on peak demand as well as time-differentiated energy savings. This cross-cutting protocol describes methods for estimating the peak demand and time-differentiated energy impacts of measures implemented through energy efficiency programs.
Shaukat, Shahzad; Angez, Mehar; Alam, Muhammad Masroor; Jebbink, Maarten F; Deijs, Martin; Canuti, Marta; Sharif, Salmaan; de Vries, Michel; Khurshid, Adnan; Mahmood, Tariq; van der Hoek, Lia; Zaidi, Syed Sohail Zahoor
2014-08-12
The use of sequence-independent methods combined with next-generation sequencing for virus identification in clinical samples appears promising, and exciting results have been achieved in understanding unexplained infections. One sequence-independent method, Virus Discovery based on cDNA Amplified Fragment Length Polymorphism (VIDISCA), is capable of identifying viruses that would remain unidentified in standard diagnostics or cell cultures. VIDISCA is normally combined with next-generation sequencing; however, we set up a simplified VIDISCA that can be used when next-generation sequencing is not possible. Stool samples from 10 patients with unexplained acute flaccid paralysis showing a cytopathic effect in rhabdomyosarcoma cells and/or mouse cells were used to test the efficiency of this method. To further characterize the viruses, VIDISCA-positive samples were amplified and sequenced with gene-specific primers. Simplified VIDISCA detected seven viruses (70%), and the proportion of eukaryotic viral sequences per sample ranged from 8.3 to 45.8%. Human enterovirus EV-B97, EV-B100, echovirus-9 and echovirus-21, human parechovirus type-3, a human astrovirus (probably a type-3/5 recombinant), and tetnovirus-1 were identified. Phylogenetic analysis based on the VP1 region demonstrated that the human enteroviruses are more divergent isolates circulating in the community. Our data show that a simplified VIDISCA protocol can efficiently identify unrecognized viruses grown in cell culture at low cost, in limited time, and without the need for advanced technical expertise. Complex data interpretation is also avoided, so the method can be used as a powerful diagnostic tool in resource-limited settings. Redesigning routine diagnostics might lead to additional detection of previously undiagnosed viruses in clinical samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurnik, Charles W; Gowans, Dakers; Telarico, Chad
The Commercial and Industrial Lighting Evaluation Protocol (the protocol) describes methods to account for gross energy savings resulting from the programmatic installation of efficient lighting equipment in large populations of commercial, industrial, and other nonresidential facilities. This protocol does not address savings resulting from changes in codes and standards, or from education and training activities. A separate Uniform Methods Project (UMP) protocol, Chapter 3: Commercial and Industrial Lighting Controls Evaluation Protocol, addresses methods for evaluating savings resulting from lighting control measures such as adding time clocks, tuning energy management system commands, and adding occupancy sensors.
Nong, Rachel Yuan; Wu, Di; Yan, Junhong; Hammond, Maria; Gu, Gucci Jijuan; Kamali-Moghaddam, Masood; Landegren, Ulf; Darmanis, Spyros
2013-06-01
Solid-phase proximity ligation assays share properties with the classical sandwich immunoassays for protein detection. The proteins captured via antibodies on solid supports are, however, detected not by single antibodies with detectable functions, but by pairs of antibodies with attached DNA strands. Upon recognition by these sets of three antibodies, pairs of DNA strands brought in proximity are joined by ligation. The ligated reporter DNA strands are then detected via methods such as real-time PCR or next-generation sequencing (NGS). We describe how to construct assays that can offer improved detection specificity by virtue of recognition by three antibodies, as well as enhanced sensitivity owing to reduced background and amplified detection. Finally, we also illustrate how the assays can be applied for parallel detection of proteins, taking advantage of the oligonucleotide ligation step to avoid background problems that might arise with multiplexing. The protocol for the singleplex solid-phase proximity ligation assay takes ~5 h. The multiplex version of the assay takes 7-8 h depending on whether quantitative PCR (qPCR) or sequencing is used as the readout. The time for the sequencing-based protocol includes the library preparation but not the actual sequencing, as times may vary based on the choice of sequencing platform.
Basic quantitative polymerase chain reaction using real-time fluorescence measurements.
Ares, Manuel
2014-10-01
This protocol uses quantitative polymerase chain reaction (qPCR) to measure the number of DNA molecules containing a specific contiguous sequence in a sample of interest (e.g., genomic DNA or cDNA generated by reverse transcription). The sample is subjected to fluorescence-based PCR amplification and, theoretically, during each cycle, two new duplex DNA molecules are produced for each duplex DNA molecule present in the sample. The progress of the reaction during PCR is evaluated by measuring the fluorescence of dsDNA-dye complexes in real time. In the early cycles, DNA duplication is not detected because inadequate amounts of DNA are made. At a certain threshold cycle, DNA-dye complexes double each cycle for 8-10 cycles, until the DNA concentration becomes so high and the primer concentration so low that the reassociation of the product strands blocks efficient synthesis of new DNA and the reaction plateaus. There are two types of measurements: (1) the relative change of the target sequence compared to a reference sequence and (2) the determination of molecule number in the starting sample. The first requires a reference sequence, and the second requires a sample of the target sequence with known numbers of target molecules to generate a standard curve. By identifying the threshold cycle at which a sample first begins to accumulate DNA-dye complexes exponentially, the number of starting molecules in the sample can be estimated. © 2014 Cold Spring Harbor Laboratory Press.
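The two measurement types map onto two small calculations: a 2^-ΔΔCt relative quantification against a reference sequence, and interpolation of a standard curve of Ct versus log10(copies). The sketch below assumes roughly 100% amplification efficiency (a true doubling per cycle) and uses made-up Ct values.

```python
import numpy as np

def relative_quantity(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """2^-ddCt relative quantification, assuming ~100% amplification efficiency
    for both target and reference sequences."""
    ddct = (ct_target - ct_reference) - (ct_target_ctrl - ct_reference_ctrl)
    return 2.0 ** (-ddct)

def starting_copies(ct_sample, standard_cts, standard_copies):
    """Estimate starting molecule number from a standard curve: Ct versus
    log10(copies) of known dilutions is approximately linear."""
    slope, intercept = np.polyfit(np.log10(standard_copies), standard_cts, 1)
    return 10 ** ((ct_sample - intercept) / slope)

# Example: the target appears ~4x more abundant in the test vs. control sample.
print(relative_quantity(ct_target=22.0, ct_reference=18.0,
                        ct_target_ctrl=24.0, ct_reference_ctrl=18.0))
# Example: 10-fold dilution series of a known standard.
print(starting_copies(ct_sample=25.0,
                      standard_cts=[30.1, 26.8, 23.4, 20.1],
                      standard_copies=[1e3, 1e4, 1e5, 1e6]))
```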
Strods, Arnis; Ose, Velta; Bogans, Janis; Cielens, Indulis; Kalnins, Gints; Radovica, Ilze; Kazaks, Andris; Pumpens, Paul; Renhofa, Regina
2015-06-26
Hepatitis B virus (HBV) core (HBc) virus-like particles (VLPs) are one of the most powerful protein engineering tools utilised to expose immunological epitopes and/or cell-targeting signals and for the packaging of genetic material and immune stimulatory sequences. Although HBc VLPs and their numerous derivatives are produced in highly efficient bacterial and yeast expression systems, the existing purification and packaging protocols are not sufficiently optimised and standardised. Here, a simple alkaline treatment method was employed for the complete removal of internal RNA from bacteria- and yeast-produced HBc VLPs and for the conversion of these VLPs into empty particles, without any damage to the VLP structure. The empty HBc VLPs were able to effectively package the added DNA and RNA sequences. Furthermore, the alkaline hydrolysis technology appeared efficient for the purification and packaging of four different HBc variants carrying lysine residues on the HBc VLP spikes. Utilising the introduced lysine residues and the intrinsic aspartic and glutamic acid residues exposed on the tips of the HBc spikes for chemical coupling of the chosen peptide and/or nucleic acid sequences ensured a standard and easy protocol for the further development of versatile HBc VLP-based vaccine and gene therapy applications.
Seismic Parameters of Mining-Induced Aftershock Sequences for Re-entry Protocol Development
NASA Astrophysics Data System (ADS)
Vallejos, Javier A.; Estay, Rodrigo A.
2018-03-01
A common characteristic of deep mines in hard rock is induced seismicity. This results from stress changes and rock failure around mining excavations. Following large seismic events, there is an increase in the levels of seismicity, which gradually decay with time. Restricting access to areas of a mine for enough time to allow this decay of seismic events is the main approach in re-entry strategies. The statistical properties of aftershock sequences can be studied with three scaling relations: (1) the Gutenberg-Richter frequency-magnitude relation, (2) the modified Omori law (MOL) for the temporal decay, and (3) Båth's law for the magnitude of the largest aftershock. In this paper, these three scaling relations, in addition to the stochastic Reasenberg-Jones model, are applied to study the characteristic parameters of 11 large-magnitude mining-induced aftershock sequences in four mines in Ontario, Canada. To provide guidelines for re-entry protocol development, the dependence of the scaling relation parameters on the magnitude of the main event is studied. Some relations between the parameters and the magnitude of the main event are found. Using these relationships and the scaling relations, a space-time-magnitude re-entry protocol is developed. These findings provide a first approximation to concise and well-justified guidelines for re-entry protocol development applicable to the range of mining conditions found in Ontario, Canada.
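For reference, the three scaling relations in their standard seismological forms are written below; the parameters (a, b, K, c, p and the Båth offset of roughly 1.2) are general conventions, not the values fitted to the mining-induced sequences in this study.

```latex
% Standard forms of the three scaling relations used for aftershock statistics:
\begin{align}
  \log_{10} N(\ge M) &= a - bM
    && \text{(Gutenberg--Richter frequency--magnitude relation)}\\
  n(t) &= \frac{K}{(t + c)^{p}}
    && \text{(modified Omori law for the aftershock rate)}\\
  \Delta M &= M_{\text{main}} - M_{\text{largest aftershock}} \approx 1.2
    && \text{(B\aa th's law)}
\end{align}
```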
Grobarczyk, Benjamin; Franco, Bénédicte; Hanon, Kevin; Malgrange, Brigitte
2015-10-01
Genome engineering and human iPS cells are two powerful technologies, which can be combined to highlight phenotypic differences and identify pathological mechanisms of complex diseases by providing isogenic cellular material. However, very few data are available regarding precise gene correction in human iPS cells. Here, we describe an optimized stepwise protocol to deliver CRISPR/Cas9 plasmids into human iPS cells. We highlight technical issues, especially those associated with human stem cell culture and with the correction of a point mutation to obtain an isogenic iPS cell line, without inserting any resistance cassette. Based on a two-step clonal isolation protocol (mechanical picking followed by enzymatic dissociation), we succeeded in selecting and expanding a corrected human iPS cell line with great efficiency (more than 2% of the sequenced colonies). This protocol can also be used to obtain knock-out cell lines from a healthy iPS cell line via the NHEJ pathway (with about 15% efficiency) and to reproduce disease phenotypes. In addition, we also provide protocols for functional validation tests after every critical step.
2011-01-01
Purpose: Eddy current induced velocity offsets are of concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study is aimed at identifying correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study. Methods: Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement. Results: The results showed a great variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three different scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s for all three orientations or for all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematically higher offsets than the transverse aortic slices (oblique aortic 0.6 cm/s, and pulmonary 1.8 cm/s higher than transverse aortic). The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however, those protocols were not always compatible with the time constraints of breath-hold imaging and flow-related artefacts. Conclusions: This study showed that with current systems there was no generic protocol which resulted in acceptable flow offset values. Protocol optimization would have to be performed on a per-scanner and per-protocol basis. Proper optimization might make accurate (transverse) aortic flow quantification possible for most scanners. Pulmonary flow quantification would still need further (offline) correction. PMID:21388521
Genome-wide analysis of replication timing by next-generation sequencing with E/L Repli-seq.
Marchal, Claire; Sasaki, Takayo; Vera, Daniel; Wilson, Korey; Sima, Jiao; Rivera-Mulia, Juan Carlos; Trevilla-García, Claudia; Nogues, Coralin; Nafie, Ebtesam; Gilbert, David M
2018-05-01
This protocol is an extension to: Nat. Protoc. 6, 870-895 (2011); doi:10.1038/nprot.2011.328; published online 02 June 2011. Cycling cells duplicate their DNA content during S phase, following a defined program called replication timing (RT). Early- and late-replicating regions differ in terms of mutation rates, transcriptional activity, chromatin marks and subnuclear position. Moreover, RT is regulated during development and is altered in diseases. Here, we describe E/L Repli-seq, an extension of our Repli-chip protocol. E/L Repli-seq is a rapid, robust and relatively inexpensive protocol for analyzing RT by next-generation sequencing (NGS), allowing genome-wide assessment of how cellular processes are linked to RT. Briefly, cells are pulse-labeled with BrdU, and early and late S-phase fractions are sorted by flow cytometry. Labeled nascent DNA is immunoprecipitated from both fractions and sequenced. Data processing leads to a single bedGraph file containing the ratio of nascent DNA from early versus late S-phase fractions. The results are comparable to those of Repli-chip, with the additional benefits of genome-wide sequence information and an increased dynamic range. We also provide computational pipelines for downstream analyses, for parsing phased genomes using single-nucleotide polymorphisms (SNPs) to analyze RT allelic asynchrony, and for direct comparison to Repli-chip data. This protocol can be performed in up to 3 d before sequencing, and requires basic cellular and molecular biology skills, as well as a basic understanding of Unix and R.
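Conceptually, the final output reduces to a per-bin log2 ratio of early- to late-fraction read counts written as bedGraph lines, as in the sketch below; the published pipeline additionally normalizes, filters, and smooths the profiles, and the bin size and counts here are arbitrary.

```python
import math

def el_ratio_bedgraph(early_counts, late_counts, chrom, bin_size=50_000,
                      pseudocount=1):
    """Turn per-bin read counts from the early and late S-phase fractions into
    bedGraph lines of log2(early/late). A minimal sketch of the idea behind the
    final output file, not the authors' full processing pipeline."""
    lines = []
    for i, (e, l) in enumerate(zip(early_counts, late_counts)):
        ratio = math.log2((e + pseudocount) / (l + pseudocount))
        start = i * bin_size
        lines.append(f"{chrom}\t{start}\t{start + bin_size}\t{ratio:.3f}")
    return lines

# Example: an early-replicating region (more early reads) followed by a late one.
print("\n".join(el_ratio_bedgraph([120, 95, 30, 10], [40, 50, 90, 130], "chr1")))
```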
Improved multiple displacement amplification (iMDA) and ultraclean reagents.
Motley, S Timothy; Picuri, John M; Crowder, Chris D; Minich, Jeremiah J; Hofstadler, Steven A; Eshoo, Mark W
2014-06-06
Next-generation sequencing sample preparation requires nanogram to microgram quantities of DNA; however, many relevant samples are comprised of only a few cells. Genomic analysis of these samples requires a whole genome amplification method that is unbiased and free of exogenous DNA contamination. To address these challenges we have developed protocols for the production of DNA-free consumables, including reagents, and have developed an improved multiple displacement amplification (iMDA) method. A specialized ethylene oxide treatment was developed that renders free DNA and DNA present within Gram-positive bacterial cells undetectable by qPCR. To reduce DNA contamination in amplification reagents, a combination of ion exchange chromatography, filtration, and lot testing protocols was developed. Our multiple displacement amplification protocol employs a second strand-displacing DNA polymerase, improved buffers, improved reaction conditions and DNA-free reagents. The iMDA protocol, when used in combination with DNA-free laboratory consumables and reagents, significantly improved efficiency and accuracy of amplification and sequencing of specimens with moderate to low levels of DNA. The sensitivity and specificity of sequencing of amplified DNA prepared using iMDA were compared to those of DNA obtained with two commercial whole genome amplification kits using 10 fg (~1-2 bacterial cells worth) of bacterial genomic DNA as a template. Analysis showed >99% of the iMDA reads mapped to the template organism whereas only 0.02% of the reads from the commercial kits mapped to the template. To assess the ability of iMDA to achieve balanced genomic coverage, a non-stochastic amount of bacterial genomic DNA (1 pg) was amplified and sequenced, and data obtained were compared to sequencing data obtained directly from genomic DNA. The iMDA DNA and genomic DNA sequencing had comparable coverage: 99.98% of the reference genome at ≥1X and 99.9% at ≥5X, while maintaining both balance and representation of the genome. The iMDA protocol, in combination with DNA-free laboratory consumables, significantly improved the ability to sequence specimens with low levels of DNA. iMDA has broad utility in metagenomics, diagnostics, ancient DNA analysis, pre-implantation embryo screening, single-cell genomics, whole genome sequencing of unculturable organisms, and forensic applications for both human and microbial targets.
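The coverage-breadth figures quoted above are a simple calculation once per-base depth is available from the alignments. The sketch below shows that calculation on a toy depth vector; it is generic and not the authors' analysis code.

    # Illustrative sketch: breadth of genome coverage at depth thresholds,
    # given a per-base depth list for the reference (however it was produced).
    # This is a generic calculation, not the authors' exact analysis code.
    def coverage_breadth(depths, thresholds=(1, 5)):
        n = len(depths)
        return {t: sum(d >= t for d in depths) / n for t in thresholds}

    depths = [0, 3, 7, 7, 1, 2, 9, 0, 5, 6]          # toy per-base depths
    print(coverage_breadth(depths))                   # {1: 0.8, 5: 0.5}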
An accurate and efficient experimental approach for characterization of the complex oral microbiota.
Zheng, Wei; Tsompana, Maria; Ruscitto, Angela; Sharma, Ashu; Genco, Robert; Sun, Yijun; Buck, Michael J
2015-10-05
Currently, taxonomic interrogation of microbiota is based on amplification of 16S rRNA gene sequences in clinical and scientific settings. Accurate evaluation of the microbiota depends heavily on the primers used, and genus/species resolution bias can arise with amplification of non-representative genomic regions. The latest Illumina MiSeq sequencing chemistry has extended the read length to 300 bp, enabling deep profiling of a large number of samples in a single paired-end reaction at a fraction of the cost. An increasingly large number of researchers have adopted this technology for various microbiome studies targeting the 16S rRNA V3-V4 hypervariable region. To expand the applicability of this powerful platform for further descriptive and functional microbiome studies, we standardized and tested an efficient, reliable, and straightforward workflow for the amplification, library construction, and sequencing of the 16S V1-V3 hypervariable region using the new 2 × 300 MiSeq platform. Our analysis involved 11 subgingival plaque samples from diabetic and non-diabetic human subjects suffering from periodontitis. The efficiency and reliability of our experimental protocol were compared to 16S V3-V4 sequencing data from the same samples. Comparisons were based on measures of observed taxonomic richness and species evenness, along with Procrustes analyses using beta (β)-diversity distance metrics. As an experimental control, we also analyzed a total of eight technical replicates for the V1-V3 and V3-V4 regions from a synthetic community with known bacterial species operon counts. We show that our experimental protocol accurately measures true bacterial community composition. Procrustes analyses based on unweighted UniFrac β-diversity metrics depicted significant correlation between oral bacterial composition for the V1-V3 and V3-V4 regions. However, measures of phylotype richness were higher for the V1-V3 region, suggesting that V1-V3 offers a deeper assessment of population diversity and community ecology for the complex oral microbiota. This study provides researchers with valuable experimental evidence for the selection of appropriate 16S amplicons for future human oral microbiome studies. We expect that the tested 16S V1-V3 framework will be widely applicable to other types of microbiota, allowing robust, time-efficient, and inexpensive examination of thousands of samples for population, phylogenetic, and functional cross-sectional and longitudinal studies.
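The two community summaries used in the comparison, observed richness and species evenness, reduce to simple formulas. The sketch below computes observed richness and Pielou's evenness (J = H/ln S, with H the Shannon index) from a vector of taxon counts; it is the generic calculation, not the study's pipeline.

    # Sketch of the two community summaries used in the comparison:
    # observed richness (number of taxa present) and Pielou's evenness.
    # Generic ecology formulas, not the study's actual analysis pipeline.
    import math

    def richness_and_evenness(counts):
        present = [c for c in counts if c > 0]
        s = len(present)                          # observed richness
        total = sum(present)
        h = -sum((c / total) * math.log(c / total) for c in present)
        j = h / math.log(s) if s > 1 else 0.0     # Pielou's evenness
        return s, j

    print(richness_and_evenness([120, 45, 3, 0, 7, 88]))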
Yang, Ying-Jie; Wang, Ye; Li, Zhi-Feng; Gong, Ya; Zhang, Peng; Hu, Wen-Chao; Sheng, Duo-Hong; Li, Yue-Zhong
2017-08-16
The CRISPR/Cas9 system is a powerful tool for genome editing, in which the sgRNA binds and guides the Cas9 protein for sequence-specific cleavage. The protocol is employable in different organisms, but is often limited by cell damage due to the endonuclease activity of the introduced Cas9 and the potential off-target DNA cleavage from incorrect guidance by the 20-nt spacer. In this study, after resolving some critical limits, we have established an efficient CRISPR/Cas9 system for the deletion of large genome fragments related to the biosynthesis of secondary metabolites in Myxococcus xanthus cells. We revealed that the high expression of a codon-optimized cas9 gene in M. xanthus was cytotoxic, and developed a temporally high expression strategy to reduce the cell damage from high expression of Cas9. We optimized the deletion protocol by using the tRNA-sgRNA-tRNA chimeric structure to ensure a correct sgRNA sequence. We found that, in addition to the position-dependent nucleotide preference, the free energy of the 20-nt spacer was a key factor for the deletion efficiency. By using the developed protocol, we achieved the CRISPR/Cas9-induced deletion of large biosynthetic gene clusters for secondary metabolites in M. xanthus DK1622 and its epothilone-producing mutant. The findings and the proposals described in this paper are suggested to be workable in other organisms, for example, other Gram-negative bacteria with high GC content.
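As a small illustration of spacer screening, the sketch below enumerates 20-nt spacers adjacent to an NGG PAM on the forward strand and reports GC content as a crude stability proxy; the paper's position-dependent preferences and free-energy criterion are not reproduced here.

    # Sketch: enumerate candidate 20-nt spacers adjacent to an NGG PAM on the
    # forward strand and report GC content as a crude proxy for spacer
    # stability. The paper's actual free-energy criterion is not reproduced.
    def candidate_spacers(seq, spacer_len=20):
        seq = seq.upper()
        for i in range(spacer_len, len(seq) - 2):
            if seq[i + 1 : i + 3] == "GG":               # N-G-G PAM
                spacer = seq[i - spacer_len : i]
                gc = (spacer.count("G") + spacer.count("C")) / spacer_len
                yield spacer, i - spacer_len, gc

    demo = "ATGC" * 6 + "TGG" + "ATGC" * 3               # toy sequence
    for spacer, pos, gc in candidate_spacers(demo):
        print(pos, spacer, f"GC={gc:.2f}")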
Distance-Based and Low Energy Adaptive Clustering Protocol for Wireless Sensor Networks
Gani, Abdullah; Anisi, Mohammad Hossein; Ab Hamid, Siti Hafizah; Akhunzada, Adnan; Khan, Muhammad Khurram
2016-01-01
A wireless sensor network (WSN) comprises small sensor nodes with limited energy capabilities. The power constraints of WSNs necessitate efficient energy utilization to extend the overall network lifetime of these networks. We propose a distance-based and low-energy adaptive clustering (DISCPLN) protocol to address the issue of efficient, green energy utilization in WSNs. We also extend the proposed protocol into a multi-hop-DISCPLN protocol to increase the network lifetime while delivering high throughput with minimal delay and packet loss. We also propose the mobile-DISCPLN protocol to maintain the stability of the network. The modelling and comparison of these protocols with their corresponding benchmarks exhibit promising results. PMID:27658194
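The abstract does not give the DISCPLN selection rule, so the sketch below only illustrates the general idea of distance-aware, energy-aware cluster-head selection with an assumed score (residual energy divided by distance to the base station); it is not the authors' formula.

    # Sketch of distance- and energy-aware cluster-head scoring for a WSN
    # round. The exact DISCPLN criterion is not given in the abstract, so the
    # score below (residual energy divided by distance to the base station)
    # is an illustrative assumption, not the authors' formula.
    import math

    def pick_cluster_heads(nodes, base, n_heads):
        """nodes: list of dicts with 'id', 'pos' (x, y), 'energy'."""
        def score(n):
            d = math.dist(n["pos"], base)
            return n["energy"] / max(d, 1e-9)
        return sorted(nodes, key=score, reverse=True)[:n_heads]

    nodes = [{"id": i, "pos": (i * 10.0, 5.0), "energy": 2.0 - 0.1 * i}
             for i in range(10)]
    print([n["id"] for n in pick_cluster_heads(nodes, (0.0, 0.0), 3)])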
Trapnell, Cole; Roberts, Adam; Goff, Loyal; Pertea, Geo; Kim, Daehwan; Kelley, David R; Pimentel, Harold; Salzberg, Steven L; Rinn, John L; Pachter, Lior
2012-01-01
Recent advances in high-throughput cDNA sequencing (RNA-seq) can reveal new genes and splice variants and quantify expression genome-wide in a single assay. The volume and complexity of data from RNA-seq experiments necessitate scalable, fast and mathematically principled analysis software. TopHat and Cufflinks are free, open-source software tools for gene discovery and comprehensive expression analysis of high-throughput mRNA sequencing (RNA-seq) data. Together, they allow biologists to identify new genes and new splice variants of known ones, as well as compare gene and transcript expression under two or more conditions. This protocol describes in detail how to use TopHat and Cufflinks to perform such analyses. It also covers several accessory tools and utilities that aid in managing data, including CummeRbund, a tool for visualizing RNA-seq analysis results. Although the procedure assumes basic informatics skills, these tools assume little to no background with RNA-seq analysis and are meant for novices and experts alike. The protocol begins with raw sequencing reads and produces a transcriptome assembly, lists of differentially expressed and regulated genes and transcripts, and publication-quality visualizations of analysis results. The protocol's execution time depends on the volume of transcriptome sequencing data and available computing resources but takes less than 1 d of computer time for typical experiments and ~1 h of hands-on time. PMID:22383036
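The workflow is driven from the command line; the sketch below simply wraps a typical TopHat, Cufflinks and Cuffdiff run from Python. The specific flags, index prefix and file names are illustrative placeholders and should be checked against the published protocol rather than taken as the exact commands.

    # Rough sketch of a TopHat -> Cufflinks -> Cuffdiff workflow driven from
    # Python. Command-line flags and file names here are illustrative and
    # should be verified against the published protocol before use.
    import subprocess

    def run(cmd):
        print(">>", " ".join(cmd))
        subprocess.run(cmd, check=True)

    genome_index = "genome"                 # Bowtie index prefix (assumed)
    annotation = "genes.gtf"                # gene annotation file (assumed)

    run(["tophat", "-p", "8", "-G", annotation, "-o", "tophat_c1",
         genome_index, "c1_R1.fastq", "c1_R2.fastq"])
    run(["cufflinks", "-p", "8", "-o", "cufflinks_c1",
         "tophat_c1/accepted_hits.bam"])
    run(["cuffdiff", "-o", "diff_out", "-p", "8", annotation,
         "tophat_c1/accepted_hits.bam", "tophat_c2/accepted_hits.bam"])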
A peak position comparison method for high-speed quantitative Laue microdiffraction data processing
Kou, Jiawei; Chen, Kai; Tamura, Nobumichi
2018-09-12
Indexing Laue patterns of a synchrotron microdiffraction scan can take as much as ten times longer than collecting the data, impeding efficient structural analysis using this technique. Here in this paper, a novel strategy is developed. By comparing the peak positions of adjacent Laue patterns and checking the intensity sequence, grain and phase boundaries are identified, requiring only a limited number of indexing steps for each individual grain. Using this protocol, the Laue patterns can be indexed on the fly as they are taken. The validation of this method is demonstrated by analyzing the microstructure of a laser 3D printed multi-phase/multi-grain Ni-based superalloy.
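The comparison step can be pictured as follows: if most peaks of one pattern have a counterpart within a small tolerance in the neighbouring pattern, the two points are assigned to the same grain and the existing indexing is reused; otherwise a boundary is flagged. The sketch below implements only this positional test (the intensity-sequence check mentioned in the abstract is omitted); the tolerances and data structures are assumptions.

    # Sketch of the adjacent-pattern comparison idea: if the Laue peak
    # positions of neighbouring scan points match within a tolerance, assign
    # them to the same grain and reuse the indexing result; otherwise flag a
    # boundary and index anew. Data structures and tolerance are assumptions.
    def same_grain(peaks_a, peaks_b, tol=2.0, min_frac=0.8):
        """peaks_*: lists of (x, y) detector positions."""
        matched = 0
        for (xa, ya) in peaks_a:
            if any((xa - xb) ** 2 + (ya - yb) ** 2 <= tol ** 2
                   for (xb, yb) in peaks_b):
                matched += 1
        return matched >= min_frac * len(peaks_a)

    def label_scan(patterns):
        """patterns: 1-D list of peak lists along the scan direction."""
        labels, current = [0], 0
        for prev, cur in zip(patterns, patterns[1:]):
            if not same_grain(prev, cur):
                current += 1                      # boundary: new grain label
            labels.append(current)
        return labels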
Bragalini, Claudia; Ribière, Céline; Parisot, Nicolas; Vallon, Laurent; Prudent, Elsa; Peyretaillade, Eric; Girlanda, Mariangela; Peyret, Pierre; Marmeisse, Roland; Luis, Patricia
2014-01-01
Eukaryotic microbial communities play key functional roles in soil biology and potentially represent a rich source of natural products including biocatalysts. Culture-independent molecular methods are powerful tools to isolate functional genes from uncultured microorganisms. However, none of the methods used in environmental genomics allow for a rapid isolation of numerous functional genes from eukaryotic microbial communities. We developed an original adaptation of the solution hybrid selection (SHS) for an efficient recovery of functional complementary DNAs (cDNAs) synthesized from soil-extracted polyadenylated mRNAs. This protocol was tested on the Glycoside Hydrolase 11 gene family encoding endo-xylanases, for which we designed 35 explorative 31-mer capture probes. SHS was implemented on four soil eukaryotic cDNA pools. After two successive rounds of capture, >90% of the resulting cDNAs were GH11 sequences, of which 70% (38 among 53 sequenced genes) were full length. Between 1.5 and 25% of the cloned captured sequences were expressed in Saccharomyces cerevisiae. Sequencing of polymerase chain reaction-amplified GH11 gene fragments from the captured sequences highlighted hundreds of phylogenetically diverse sequences that were not yet described in public databases. This protocol offers the possibility of performing exhaustive exploration of eukaryotic gene families within microbial communities thriving in any type of environment. PMID:25281543
Suppressing spectral diffusion of emitted photons with optical pulses
Fotso, H. F.; Feiguin, A. E.; Awschalom, D. D.; ...
2016-01-22
In many quantum architectures the solid-state qubits, such as quantum dots or color centers, are interfaced via emitted photons. However, the frequency of photons emitted by solid-state systems exhibits slow uncontrollable fluctuations over time (spectral diffusion), creating a serious problem for implementation of photon-mediated protocols. Here we show that a sequence of optical pulses applied to the solid-state emitter can stabilize the emission line at the desired frequency. We demonstrate the efficiency, robustness, and feasibility of the method analytically and numerically. Taking the nitrogen-vacancy center in diamond as an example, we show that only several pulses, with a width of 1 ns, separated by a few ns (which is not difficult to achieve), can suppress spectral diffusion. As a result, our method provides a simple and robust way to greatly improve the efficiency of photon-mediated entanglement and/or coupling to photonic cavities for solid-state qubits.
Chavhan, Govind B; Babyn, Paul S; Vasanawala, Shreyas S
2013-05-01
Familiarity with basic sequence properties and their trade-offs is necessary for radiologists performing abdominal magnetic resonance (MR) imaging. Acquiring diagnostic-quality MR images in the pediatric abdomen is challenging due to motion, inability to breath hold, varying patient size, and artifacts. Motion-compensation techniques (eg, respiratory gating, signal averaging, suppression of signal from moving tissue, swapping phase- and frequency-encoding directions, use of faster sequences with breath holding, parallel imaging, and radial k-space filling) can improve image quality. Each of these techniques is more suitable for use with certain sequences and acquisition planes and in specific situations and age groups. Different T1- and T2-weighted sequences work better in different age groups and with differing acquisition planes and have specific advantages and disadvantages. Dynamic imaging should be performed differently in younger children than in older children. In younger children, the sequence and the timing of dynamic phases need to be adjusted. Different sequences work better in smaller children and in older children because of differing breath-holding ability, breathing patterns, field of view, and use of sedation. Hence, specific protocols should be maintained for younger children and older children. Combining longer-higher-resolution sequences and faster-lower-resolution sequences helps acquire diagnostic-quality images in a reasonable time. © RSNA, 2013.
Streaming fragment assignment for real-time analysis of sequencing experiments
Roberts, Adam; Pachter, Lior
2013-01-01
We present eXpress, a software package for highly efficient probabilistic assignment of ambiguously mapping sequenced fragments. eXpress uses a streaming algorithm with linear run time and constant memory use. It can determine abundances of sequenced molecules in real time, and can be applied to ChIP-seq, metagenomics and other large-scale sequencing data. We demonstrate its use on RNA-seq data, showing greater efficiency than other quantification methods. PMID:23160280
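As a rough intuition for streaming assignment, the sketch below splits each ambiguously mapping fragment across its candidate targets in proportion to running abundance estimates that are updated online, in a single pass and constant memory. It is a deliberately simplified stand-in; the actual eXpress model additionally handles fragment length, bias and error probabilities.

    # Highly simplified sketch of streaming fragment assignment: each incoming
    # fragment that maps to several targets is split in proportion to current
    # abundance estimates, which are updated online. The real eXpress model
    # also accounts for fragment length, sequence bias and error, omitted here.
    from collections import defaultdict

    def stream_assign(fragments, targets):
        abundance = {t: 1.0 for t in targets}          # uniform prior
        counts = defaultdict(float)
        for hits in fragments:                         # hits: list of target ids
            weights = [abundance[t] for t in hits]
            total = sum(weights)
            for t, w in zip(hits, weights):
                share = w / total
                counts[t] += share
                abundance[t] += share                  # online update
        return dict(counts)

    print(stream_assign([["tA"], ["tA", "tB"], ["tB"], ["tA", "tB"]],
                        ["tA", "tB"]))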
Takahashi, Mayumi; Wu, Xiwei; Ho, Michelle; Chomchan, Pritsana; Rossi, John J.; Burnett, John C.; Zhou, Jiehua
2016-01-01
The systematic evolution of ligands by exponential enrichment (SELEX) technique is a powerful and effective aptamer-selection procedure. However, modifications to the process can dramatically improve selection efficiency and aptamer performance. For example, droplet digital PCR (ddPCR) has recently been incorporated into SELEX selection protocols to putatively reduce the propagation of byproducts and avoid the selection bias that results from differences in PCR efficiency of sequences within the random library. However, a detailed, parallel comparison of the efficacy of conventional solution PCR versus the ddPCR modification in the RNA aptamer-selection process is needed to understand effects on overall SELEX performance. In the present study, we took advantage of powerful high-throughput sequencing technology and bioinformatics analysis coupled with SELEX (HT-SELEX) to thoroughly investigate the effects of the initial library and PCR method on RNA aptamer identification. Our analysis revealed that distinct "biased sequences" and nucleotide compositions existed in the initial, unselected libraries purchased from two different manufacturers and that the fate of the "biased sequences" was target-dependent during selection. Our comparison of solution PCR- and ddPCR-driven HT-SELEX demonstrated that the PCR method affected not only the nucleotide composition of the enriched sequences, but also the overall SELEX efficiency and aptamer efficacy. PMID:27652575
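One routine summary behind such comparisons is the per-position nucleotide composition of the randomized region across libraries or selection rounds. The sketch below computes it for a toy set of reads; it is generic bookkeeping, not the study's analysis code.

    # Sketch: per-position nucleotide composition of the randomized region,
    # the kind of summary used to compare initial libraries and PCR methods.
    # Generic bookkeeping only; not the study's bioinformatics pipeline.
    from collections import Counter

    def position_composition(seqs):
        length = len(seqs[0])
        table = []
        for i in range(length):
            counts = Counter(s[i] for s in seqs)
            total = sum(counts.values())
            table.append({b: counts.get(b, 0) / total for b in "ACGT"})
        return table

    reads = ["ACGTACGTAC", "ACGGACGTAC", "TCGTACGAAC"]   # toy reads
    for i, freqs in enumerate(position_composition(reads)):
        print(i, {b: round(f, 2) for b, f in freqs.items()})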
Bernsen, M R; Dijkman, H B; de Vries, E; Figdor, C G; Ruiter, D J; Adema, G J; van Muijen, G N
1998-10-01
Molecular analysis of small tissue samples has become increasingly important in biomedical studies. Using a laser dissection microscope and modified nucleic acid isolation protocols, we demonstrate that multiple mRNA as well as DNA sequences can be identified from a single-cell sample. In addition, we show that the specificity of procurement of tissue samples is not compromised by smear contamination resulting from scraping of the microtome knife during sectioning of lesions. The procedures described herein thus allow for efficient RT-PCR or PCR analysis of multiple nucleic acid sequences from small tissue samples obtained by laser-assisted microdissection.
Error reduction and parameter optimization of the TAPIR method for fast T1 mapping.
Zaitsev, M; Steinhoff, S; Shah, N J
2003-06-01
A methodology is presented for the reduction of both systematic and random errors in T(1) determination using TAPIR, a Look-Locker-based fast T(1) mapping technique. The relations between various sequence parameters were carefully investigated in order to develop recipes for choosing optimal sequence parameters. Theoretical predictions for the optimal flip angle were verified experimentally. Inversion pulse imperfections were identified as the main source of systematic errors in T(1) determination with TAPIR. An effective remedy is demonstrated which includes extension of the measurement protocol to include a special sequence for mapping the inversion efficiency itself. Copyright 2003 Wiley-Liss, Inc.
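TAPIR is Look-Locker based, so the usual three-parameter signal model and apparent-T1 correction give a sense of where the sequence parameters enter; the relations below are the generic Look-Locker expressions, not necessarily the exact TAPIR fitting model:

    S(t) = A - B\, e^{-t/T_1^{*}}, \qquad
    T_1 \approx T_1^{*}\left(\frac{B}{A} - 1\right)

Here A, B and the apparent relaxation time T_1^{*} are fitted from the sampled recovery curve; an imperfect inversion pulse perturbs B and hence the corrected T_1, which is why the extended protocol maps the inversion efficiency itself.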
Jérôme, Marc; Martinsohn, Jann Thorsten; Ortega, Delphine; Carreau, Philippe; Verrez-Bagnis, Véronique; Mouchel, Olivier
2008-05-28
Traceability in the fish food sector plays an increasingly important role for consumer protection and confidence building. This is reflected by the introduction of legislation and rules covering traceability at national and international levels. Although traceability through labeling is well established and supported by respective regulations, monitoring and enforcement of these rules are still hampered by the lack of efficient diagnostic tools. We describe protocols using a direct sequencing method based on 212-274-bp diagnostic sequences derived from species-specific mitochondrial DNA cytochrome b, 16S rRNA, and cytochrome oxidase subunit I sequences, which can efficiently be applied to unambiguously determine even closely related fish species in processed food products labeled "anchovy". Traceability of anchovy-labeled products is supported by the public online database AnchovyID (http://anchovyid.jrc.ec.europa.eu), which provided data obtained during our study and tools for analytical purposes.
Targeting vector construction through recombineering.
Malureanu, Liviu A
2011-01-01
Gene targeting in mouse embryonic stem cells is an essential, yet still very expensive and highly time-consuming, tool and method to study gene function at the organismal level or to create mouse models of human diseases. Conventional cloning-based methods have been largely used for generating targeting vectors, but are hampered by a number of limiting factors, including the variety and location of restriction enzymes in the gene locus of interest, the specific PCR amplification of repetitive DNA sequences, and cloning of large DNA fragments. Recombineering is a technique that exploits the highly efficient homologous recombination function encoded by λ phage in Escherichia coli. Bacteriophage-based recombination can recombine homologous sequences as short as 30-50 bases, allowing manipulations such as insertion, deletion, or mutation of virtually any genomic region. The large availability of mouse genomic bacterial artificial chromosome (BAC) libraries covering most of the genome facilitates the retrieval of genomic DNA sequences from the bacterial chromosomes through recombineering. This chapter describes a successfully applied protocol and aims to be a detailed guide through the steps of generation of targeting vectors through recombineering.
A Power-Optimized Cooperative MAC Protocol for Lifetime Extension in Wireless Sensor Networks
Liu, Kai; Wu, Shan; Huang, Bo; Liu, Feng; Xu, Zhen
2016-01-01
In wireless sensor networks, in order to satisfy the requirement of long working times for energy-limited nodes, we need to design an energy-efficient and lifetime-extending medium access control (MAC) protocol. In this paper, a node cooperation mechanism, in which one or more nodes with higher channel gain and sufficient residual energy help a sender relay its data packets to its recipient, is employed to achieve this objective. We first propose a transmission power optimization algorithm to prolong network lifetime by optimizing the transmission powers of the sender and its cooperative nodes to maximize their minimum residual energy after their data packet transmissions. Based on it, we propose a corresponding power-optimized cooperative MAC protocol. A cooperative node contention mechanism is designed to ensure that the sender can effectively select a group of cooperative nodes with the lowest energy consumption and the best channel quality for cooperative transmissions, thus further improving the energy efficiency. Simulation results show that compared to a typical MAC protocol with direct transmissions and an energy-efficient cooperative MAC protocol, the proposed cooperative MAC protocol can efficiently improve the energy efficiency and extend the network lifetime. PMID:27706079
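The core optimization is a max-min problem: choose transmission powers for the sender and its cooperative nodes so that the minimum residual energy after the transmission is as large as possible. The sketch below brute-forces that choice over a few discrete power levels under a toy energy and channel model; the models and numbers are assumptions, not the paper's formulation.

    # Sketch of the max-min residual-energy idea: pick transmission powers for
    # the sender and its cooperative relays that maximize the minimum residual
    # energy after sending one packet, subject to a combined SNR constraint.
    # The energy and channel models below are toy assumptions.
    import itertools

    def best_powers(residual, gains, levels, snr_target, t_packet=1e-3):
        best, best_metric = None, -1.0
        for powers in itertools.product(levels, repeat=len(residual)):
            snr = sum(p * g for p, g in zip(powers, gains))   # toy combining
            if snr < snr_target:
                continue
            after = [e - p * t_packet for e, p in zip(residual, powers)]
            if min(after) > best_metric:
                best, best_metric = powers, min(after)
        return best, best_metric

    print(best_powers(residual=[0.9, 1.2, 0.6], gains=[0.5, 0.8, 1.1],
                      levels=[0.0, 0.05, 0.1, 0.2], snr_target=0.15))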
A novel method of genomic DNA extraction for Cactaceae
Fehlberg, Shannon D.; Allen, Jessica M.; Church, Kathleen
2013-01-01
• Premise of the study: Genetic studies of Cactaceae can at times be impeded by difficult sampling logistics and/or high mucilage content in tissues. Simplifying sampling and DNA isolation through the use of cactus spines has not previously been investigated. • Methods and Results: Several protocols for extracting DNA from spines were tested and modified to maximize yield, amplification, and sequencing. Sampling of and extraction from spines resulted in a simplified protocol overall and complete avoidance of mucilage as compared to typical tissue extractions. Sequences from one nuclear and three plastid regions were obtained across eight genera and 20 species of cacti using DNA extracted from spines. • Conclusions: Genomic DNA useful for amplification and sequencing can be obtained from cactus spines. The protocols described here are valuable for any cactus species, but are particularly useful for investigators interested in sampling living collections, extensive field sampling, and/or conservation genetic studies. PMID:25202521
Research on low-latency MAC protocols for wireless sensor networks
NASA Astrophysics Data System (ADS)
He, Chenguang; Sha, Xuejun; Lee, Chankil
2007-11-01
Energy efficiency should not be the only design goal in MAC protocols for wireless sensor networks, which involve the use of battery-operated computing and sensing devices. Low-latency operation becomes as important as energy efficiency when the traffic load is very heavy or when real-time constraints apply, as in applications like tracking or locating. This paper introduces some causes of the time delays inherent in a multi-hop network using existing WSN MAC protocols, explains the importance of low-latency MAC design for wireless sensor networks, and presents three MACs as examples of low-latency protocols designed specifically to address sleep delay, wait delay and wakeup delay in wireless sensor networks, respectively. The paper also discusses design trade-offs with emphasis on low latency and points out their advantages and disadvantages, together with some design considerations and suggestions for MAC protocols for future applications and research.
Treatment Protocol for High Velocity/High Energy Gunshot Injuries to the Face
Peled, Micha; Leiser, Yoav; Emodi, Omri; Krausz, Amir
2011-01-01
Major causes of facial combat injuries include blasts, high-velocity/high-energy missiles, and low-velocity missiles. High-velocity bullets fired from assault rifles encompass special ballistic properties, creating a transient cavitation space with a small entrance wound and a much larger exit wound. There is no dispute regarding the fact that primary emergency treatment of ballistic injuries to the face commences in accordance with the current advanced trauma life support (ATLS) recommendations; the main areas in which disputes do exist concern the question of the timing, sequence, and modes of surgical treatment. The aim of the present study is to present the treatment outcome of high-velocity/high-energy gunshot injuries to the face, using a protocol based on the experience of a single level I trauma center. A group of 23 injured combat soldiers who sustained bullet and shrapnel injuries to the maxillofacial region during a 3-week regional military conflict were evaluated in this study. Nine patients met the inclusion criteria (high-velocity/high-energy injuries) and were included in the study. According to our protocol, upon arrival patients underwent endotracheal intubation and were hemodynamically stabilized in the shock-trauma unit and underwent total-body computed tomography with 3-D reconstruction of the head and neck and computed tomography angiography. All patients underwent maxillofacial surgery upon the day of arrival according to the protocol we present. In view of our treatment outcomes, results, and low complication rates, we conclude that strict adherence to a well-founded and structured treatment protocol based on clinical experience is mandatory in providing efficient, appropriate, and successful treatment to a relatively large group of patients who sustain various degrees of maxillofacial injuries during a short period of time. PMID:23449809
Wan, Shixiang; Zou, Quan
2017-01-01
Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing data has resulted in a shortage of efficient alignment approaches for ultra-large biological sequences of different types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g., files larger than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets, with files larger than 1 GB, showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences. HAlign-II shows extremely high memory efficiency and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II, with open-source code and datasets, is available at http://lab.malab.cn/soft/halign.
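HAlign-II's scalability rests on distributing alignment work over Spark. As a much-simplified illustration of that pattern only (not HAlign-II's actual algorithm), the PySpark sketch below scores sequences against a chosen centre sequence in parallel; the similarity function and data are toy placeholders.

    # Much-simplified PySpark sketch in the spirit of distributing alignment
    # work: score every sequence against a chosen centre sequence in parallel.
    # This is not HAlign-II's algorithm; it only illustrates the Spark pattern.
    from pyspark import SparkContext

    def similarity(a, b):
        # crude identity score over the shorter sequence (illustrative only)
        return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

    if __name__ == "__main__":
        sc = SparkContext(appName="centre-star-sketch")
        seqs = ["ACGTACGT", "ACGTTCGT", "ACGAACGA", "TTTTACGT"]
        centre = seqs[0]
        scores = (sc.parallelize(seqs[1:])
                    .map(lambda s: (s, similarity(centre, s)))
                    .collect())
        print(scores)
        sc.stop()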
Miller, Thomas F.
2017-01-01
We present a coarse-grained simulation model that is capable of simulating the minute-timescale dynamics of protein translocation and membrane integration via the Sec translocon, while retaining sufficient chemical and structural detail to capture many of the sequence-specific interactions that drive these processes. The model includes accurate geometric representations of the ribosome and Sec translocon, obtained directly from experimental structures, and interactions parameterized from nearly 200 μs of residue-based coarse-grained molecular dynamics simulations. A protocol for mapping amino-acid sequences to coarse-grained beads enables the direct simulation of trajectories for the co-translational insertion of arbitrary polypeptide sequences into the Sec translocon. The model reproduces experimentally observed features of membrane protein integration, including the efficiency with which polypeptide domains integrate into the membrane, the variation in integration efficiency upon single amino-acid mutations, and the orientation of transmembrane domains. The central advantage of the model is that it connects sequence-level protein features to biological observables and timescales, enabling direct simulation for the mechanistic analysis of co-translational integration and for the engineering of membrane proteins with enhanced membrane integration efficiency. PMID:28328943
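To illustrate what a sequence-to-bead mapping can look like, the sketch below groups consecutive residues into beads and assigns each an average hydrophobicity; the three-residue bead size and the Kyte-Doolittle scale are illustrative assumptions, not the published CG parameterization.

    # Illustrative coarse-graining sketch: group consecutive residues into
    # beads and assign each bead an average hydrophobicity. The three-residue
    # bead size and the Kyte-Doolittle scale are assumptions for illustration,
    # not the published CG parameterization.
    KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
          "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
          "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
          "Y": -1.3, "V": 4.2}

    def sequence_to_beads(seq, residues_per_bead=3):
        beads = []
        for i in range(0, len(seq), residues_per_bead):
            chunk = seq[i : i + residues_per_bead]
            score = sum(KD[a] for a in chunk) / len(chunk)
            beads.append({"residues": chunk, "hydrophobicity": round(score, 2)})
        return beads

    print(sequence_to_beads("MKTWLLVAGLLFSASA"))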
Efficiency in nonequilibrium molecular dynamics Monte Carlo simulations
Radak, Brian K.; Roux, Benoît
2016-10-07
Hybrid algorithms combining nonequilibrium molecular dynamics and Monte Carlo (neMD/MC) offer a powerful avenue for improving the sampling efficiency of computer simulations of complex systems. These neMD/MC algorithms are also increasingly finding use in applications where conventional approaches are impractical, such as constant-pH simulations with explicit solvent. However, selecting an optimal nonequilibrium protocol for maximum efficiency often represents a non-trivial challenge. This work evaluates the efficiency of a broad class of neMD/MC algorithms and protocols within the theoretical framework of linear response theory. The approximations are validated against constant-pH MD simulations and shown to provide accurate predictions of neMD/MC performance. An assessment of a large set of protocols confirms (both theoretically and empirically) that a linear work protocol gives the best neMD/MC performance. Lastly, a well-defined criterion for optimizing the time parameters of the protocol is proposed and demonstrated with an adaptive algorithm that improves the performance on-the-fly with minimal cost.
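For orientation, symmetric neMD/MC schemes of this type accept a nonequilibrium switching trajectory with a Metropolis criterion on the work W performed during the switch; this is the generic form, and the specific estimators evaluated in the paper may differ in detail:

    P_{\mathrm{acc}} = \min\!\left[1,\ e^{-\beta W}\right],
    \qquad \beta = \frac{1}{k_{\mathrm{B}} T}

Within linear response, the dissipated work decreases as the switching is made slower, which is why the protocol shape and its time parameters control the acceptance rate and hence the overall efficiency discussed above.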
High-throughput physical mapping of chromosomes using automated in situ hybridization.
George, Phillip; Sharakhova, Maria V; Sharakhov, Igor V
2012-06-28
Projects to obtain whole-genome sequences for 10,000 vertebrate species and for 5,000 insect and related arthropod species are expected to take place over the next 5 years. For example, the sequencing of the genomes for 15 malaria mosquito species is currently being done using an Illumina platform. This Anopheles species cluster includes both vectors and non-vectors of malaria. When the genome assemblies become available, researchers will have the unique opportunity to perform comparative analysis for inferring evolutionary changes relevant to vector ability. However, it has proven difficult to use next-generation sequencing reads to generate high-quality de novo genome assemblies. Moreover, the existing genome assemblies for Anopheles gambiae, although obtained using the Sanger method, are gapped or fragmented. Success of comparative genomic analyses will be limited if researchers deal with numerous sequencing contigs, rather than with chromosome-based genome assemblies. Fragmented, unmapped sequences create problems for genomic analyses because: (i) unidentified gaps cause incorrect or incomplete annotation of genomic sequences; (ii) unmapped sequences lead to confusion between paralogous genes and genes from different haplotypes; and (iii) the lack of chromosome assignment and orientation of the sequencing contigs does not allow for reconstructing rearrangement phylogeny and studying chromosome evolution. Developing high-resolution physical maps for species with newly sequenced genomes is a timely and cost-effective investment that will facilitate genome annotation, evolutionary analysis, and re-sequencing of individual genomes from natural populations. Here, we present innovative approaches to chromosome preparation, fluorescent in situ hybridization (FISH), and imaging that facilitate rapid development of physical maps. Using An. gambiae as an example, we demonstrate that the development of physical chromosome maps can potentially improve genome assemblies and, thus, the quality of genomic analyses. First, we use a high-pressure method to prepare polytene chromosome spreads. This method, originally developed for Drosophila, allows the user to visualize more details on chromosomes than the regular squashing technique. Second, a fully automated, front-end system for FISH is used for high-throughput physical genome mapping. The automated slide staining system runs multiple assays simultaneously and dramatically reduces hands-on time. Third, an automatic fluorescent imaging system, which includes a motorized slide stage, automatically scans and photographs labeled chromosomes after FISH. This system is especially useful for identifying and visualizing multiple chromosomal plates on the same slide. In addition, the scanning process captures a more uniform FISH result. Overall, the automated high-throughput physical mapping protocol is more efficient than a standard manual protocol.
Fluorescent in situ hybridisation to amphioxus chromosomes.
Castro, Luis Filipe Costa; Holland, Peter William Harold
2002-12-01
We describe an efficient protocol for mapping genes and other DNA sequences to amphioxus chromosomes using fluorescent in situ hybridisation. We apply this method to identify the number and location of ribosomal DNA gene clusters and telomere sequences in metaphase spreads of Branchiostoma floridae. We also describe how the locations of two single copy genes can be mapped relative to each other, and demonstrate this by mapping an amphioxus Pax gene relative to a homologue of the Notch gene. These methods have great potential for performing comparative genomics between amphioxus and vertebrates.
A new communication protocol family for a distributed spacecraft control system
NASA Technical Reports Server (NTRS)
Baldi, Andrea; Pace, Marco
1994-01-01
In this paper we describe the concepts behind and architecture of a communication protocol family, which was designed to fulfill the communication requirements of ESOC's new distributed spacecraft control system SCOS 2. A distributed spacecraft control system needs a data delivery subsystem to be used for telemetry (TLM) distribution, telecommand (TLC) dispatch and inter-application communication, characterized by the following properties: reliability, so that any operational workstation is guaranteed to receive the data it needs to accomplish its role; efficiency, so that the telemetry distribution, even for missions with high telemetry rates, does not cause a degradation of the overall control system performance; scalability, so that the network is not the bottleneck both in terms of bandwidth and reconfiguration; flexibility, so that it can be efficiently used in many different situations. The new protocol family which satisfies the above requirements is built on top of widely used communication protocols (UDP and TCP), provides reliable point-to-point and broadcast communication (UDP+) and is implemented in C++. Reliability is achieved using a retransmission mechanism based on a sequence numbering scheme. Such a scheme provides cost-effective performance compared to traditional protocols, because retransmission is only triggered by applications which explicitly need reliability. This flexibility enables applications with different profiles to take advantage of the available protocols, so that the best trade-off between speed and reliability can be achieved case by case.
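The reliability mechanism rests on sequence numbering with retransmission triggered only where needed. The sketch below shows minimal receiver-side bookkeeping for such a scheme: in-order delivery, buffering of out-of-order packets and reporting of missing sequence numbers. The class and packet layout are illustrative assumptions, not the SCOS 2 C++ implementation.

    # Minimal sketch of sequence-number bookkeeping for a reliable-over-UDP
    # scheme: the receiver tracks the next expected sequence number, delivers
    # in-order packets, buffers out-of-order ones and reports the missing
    # numbers so the application can request retransmission if it needs
    # reliability. Packet layout and names are assumptions only.
    class ReliableReceiver:
        def __init__(self):
            self.expected = 0
            self.buffer = {}          # seq -> payload, waiting for the gap

        def on_packet(self, seq, payload):
            delivered, missing = [], []
            if seq == self.expected:
                delivered.append(payload)
                self.expected += 1
                while self.expected in self.buffer:       # drain the buffer
                    delivered.append(self.buffer.pop(self.expected))
                    self.expected += 1
            elif seq > self.expected:
                self.buffer[seq] = payload
                missing = list(range(self.expected, seq))  # ask for these
            return delivered, missing

    rx = ReliableReceiver()
    print(rx.on_packet(0, "tm-frame-0"))   # (['tm-frame-0'], [])
    print(rx.on_packet(2, "tm-frame-2"))   # ([], [1])  -> retransmit 1
    print(rx.on_packet(1, "tm-frame-1"))   # (['tm-frame-1', 'tm-frame-2'], [])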
Efficient protocols for Stirling heat engines at the micro-scale
NASA Astrophysics Data System (ADS)
Muratore-Ginanneschi, Paolo; Schwieger, Kay
2015-10-01
We investigate the thermodynamic efficiency of sub-micro-scale Stirling heat engines operating under the conditions described by overdamped stochastic thermodynamics. We show how to construct optimal protocols such that at maximum power the efficiency attains for constant isotropic mobility the universal law η=2 ηC/(4-ηC) , where ηC is the efficiency of an ideal Carnot cycle. We show that these protocols are specified by the solution of an optimal mass transport problem. Such solution can be determined explicitly using well-known Monge-Ampère-Kantorovich reconstruction algorithms. Furthermore, we show that the same law describes the efficiency of heat engines operating at maximum work over short time periods. Finally, we illustrate the straightforward extension of these results to cases when the mobility is anisotropic and temperature dependent.
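Written out, the reported efficiency at maximum power reads as follows, with the Carnot efficiency expressed through the bath temperatures:

    \eta^{*} = \frac{2\,\eta_C}{4 - \eta_C},
    \qquad \eta_C = 1 - \frac{T_c}{T_h}

For small temperature differences this expands as η* ≈ η_C/2 + η_C²/8 + ..., consistent with the universal leading-order behaviour of efficiency at maximum power.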
Hennig, Bianca P.; Velten, Lars; Racke, Ines; Tu, Chelsea Szu; Thoms, Matthias; Rybin, Vladimir; Besir, Hüseyin; Remans, Kim; Steinmetz, Lars M.
2017-01-01
Efficient preparation of high-quality sequencing libraries that well represent the biological sample is a key step for using next-generation sequencing in research. Tn5 enables fast, robust, and highly efficient processing of limited input material while scaling to the parallel processing of hundreds of samples. Here, we present a robust Tn5 transposase purification strategy based on an N-terminal His6-Sumo3 tag. We demonstrate that libraries prepared with our in-house Tn5 are of the same quality as those processed with a commercially available kit (Nextera XT), while they dramatically reduce the cost of large-scale experiments. We introduce improved purification strategies for two versions of the Tn5 enzyme. The first version carries the previously reported point mutations E54K and L372P, and stably produces libraries of constant fragment size distribution, even if the Tn5-to-input molecule ratio varies. The second Tn5 construct carries an additional point mutation (R27S) in the DNA-binding domain. This construct allows for adjustment of the fragment size distribution based on enzyme concentration during tagmentation, a feature that opens new opportunities for use of Tn5 in customized experimental designs. We demonstrate the versatility of our Tn5 enzymes in different experimental settings, including a novel single-cell polyadenylation site mapping protocol as well as ultralow input DNA sequencing. PMID:29118030
Enhancement and optimization of plasmid expression in femtosecond optical transfection.
Praveen, Bavishna B; Stevenson, David J; Antkowiak, Maciej; Dholakia, Kishan; Gunn-Moore, Frank J
2011-04-01
Cell transfection using femtosecond lasers is gaining importance for its proven ability to achieve selective transfection in a sterile and relatively non-invasive manner. However, the net efficiency of this technique is limited due to a number of factors that ultimately makes it difficult to be used as a viable and widely used technique. We report here a method to achieve significant enhancement in the efficiency of femtosecond optical transfection. The transfection procedure is modified by incorporating a suitable synthetic peptide containing nuclear localization and DNA binding sequences, assisting DNA import into the nucleus. We achieved a 3-fold enhancement in the transfection efficiency for adherent Chinese Hamster Ovary (CHO-K1) cells with this modified protocol. Further, in the presence of this biochemical reagent, we were able to reduce the required plasmid concentration by ~70% without compromising the transfection efficiency. Also, we report for the first time the successful photo-transfection of recently trypsinised cells with significantly high transfection efficiency when transfected with modified plasmid. This paves the way for the development of high throughput microfluidic optical transfection devices. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Snyder-Mackler, Noah; Majoros, William H.; Yuan, Michael L.; Shaver, Amanda O.; Gordon, Jacob B.; Kopp, Gisela H.; Schlebusch, Stephen A.; Wall, Jeffrey D.; Alberts, Susan C.; Mukherjee, Sayan; Zhou, Xiang; Tung, Jenny
2016-01-01
Research on the genetics of natural populations was revolutionized in the 1990s by methods for genotyping noninvasively collected samples. However, these methods have remained largely unchanged for the past 20 years and lag far behind the genomics era. To close this gap, here we report an optimized laboratory protocol for genome-wide capture of endogenous DNA from noninvasively collected samples, coupled with a novel computational approach to reconstruct pedigree links from the resulting low-coverage data. We validated both methods using fecal samples from 62 wild baboons, including 48 from an independently constructed extended pedigree. We enriched fecal-derived DNA samples up to 40-fold for endogenous baboon DNA and reconstructed near-perfect pedigree relationships even with extremely low-coverage sequencing. We anticipate that these methods will be broadly applicable to the many research systems for which only noninvasive samples are available. The lab protocol and software (“WHODAD”) are freely available at www.tung-lab.org/protocols-and-software.html and www.xzlab.org/software.html, respectively. PMID:27098910
He, Qiye; Johnston, Jeff; Zeitlinger, Julia
2014-01-01
Understanding how eukaryotic enhancers are bound and regulated by specific combinations of transcription factors is still a major challenge. To better map transcription factor binding genome-wide at nucleotide resolution in vivo, we have developed a robust ChIP-exo protocol called ChIP experiments with nucleotide resolution through exonuclease, unique barcode and single ligation (ChIP-nexus), which utilizes an efficient DNA self-circularization step during library preparation. Application of ChIP-nexus to four proteins—human TBP and Drosophila NFkB, Twist and Max— demonstrates that it outperforms existing ChIP protocols in resolution and specificity, pinpoints relevant binding sites within enhancers containing multiple binding motifs and allows the analysis of in vivo binding specificities. Notably, we show that Max frequently interacts with DNA sequences next to its motif, and that this binding pattern correlates with local DNA sequence features such as DNA shape. ChIP-nexus will be broadly applicable to studying in vivo transcription factor binding specificity and its relationship to cis-regulatory changes in humans and model organisms. PMID:25751057
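Two data-processing steps implied by the design (random barcodes plus single ligation) are duplicate removal and counting of exonuclease stop positions. The sketch below illustrates both on a toy record layout; the field names and formats are assumptions, not the published pipeline.

    # Sketch of two ChIP-nexus-style processing steps: (1) collapse reads that
    # share the same random barcode and mapped stop position (PCR duplicates),
    # (2) count remaining stop positions per strand. Input record layout is an
    # assumption for illustration, not the published pipeline's format.
    from collections import Counter

    def nexus_stop_profile(reads):
        """reads: iterable of (chrom, stop_pos, strand, random_barcode)."""
        unique = {(c, p, s, bc) for c, p, s, bc in reads}   # deduplicate
        return Counter((c, p, s) for c, p, s, _ in unique)

    reads = [("chr2L", 1000, "+", "ACGTG"),
             ("chr2L", 1000, "+", "ACGTG"),   # PCR duplicate -> collapsed
             ("chr2L", 1000, "+", "TTACG"),   # same position, new molecule
             ("chr2L", 1005, "-", "GGATC")]
    print(nexus_stop_profile(reads))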
NASA Astrophysics Data System (ADS)
Jian, Wei; Estevez, Claudio; Chowdhury, Arshad; Jia, Zhensheng; Wang, Jianxin; Yu, Jianguo; Chang, Gee-Kung
2010-12-01
This paper presents an energy-efficient Medium Access Control (MAC) protocol for very-high-throughput millimeter-wave (mm-wave) wireless sensor communication networks (VHT-MSCNs) based on hybrid multiple access techniques of frequency division multiple access (FDMA) and time division multiple access (TDMA). An energy-efficient superframe for wireless sensor communication networks employing directional mm-wave wireless access technologies is proposed for systems that require very high throughput, such as high-definition video signals, for sensing, processing, transmitting, and actuating functions. Energy consumption modeling for each network element and comparisons among various multi-access technologies in terms of power and MAC layer operations are investigated for evaluating the energy-efficiency improvement of the proposed MAC protocol.
Noda, Yoshifumi; Goshima, Satoshi; Kojima, Toshihisa; Kawaguchi, Shimpei; Kawada, Hiroshi; Kawai, Nobuyuki; Koyasu, Hiromi; Matsuo, Masayuki; Bae, Kyongtae T
2017-04-01
To evaluate the value of adding a single-shot balanced turbo field-echo (b-TFE) sequence to conventional magnetic resonance cholangiopancreatography (MRCP) for the detection of common bile duct (CBD) stones. One hundred thirty-seven consecutive patients with suspected CBD stones underwent MRCP including a single-shot b-TFE sequence. Twenty-five patients were confirmed to have CBD stones by endoscopic retrograde cholangiopancreatography or ultrasonography. Two radiologists reviewed two image protocols: protocol A (conventional MRCP protocol: unenhanced T1-, T2-, and respiratory-triggered three-dimensional fat-suppressed single-shot turbo spin-echo MRCP sequences) and protocol B (protocol A plus the single-shot b-TFE sequence). The sensitivity, specificity, positive (PPV) and negative predictive value (NPV), and area under the receiver-operating-characteristic (ROC) curve (AUC) for the detection of CBD stones were compared. The sensitivity (72%) and NPV (94%) were the same for the two protocols. However, protocol B showed greater specificity (99%) and PPV (94%) than protocol A (92% and 67%, respectively) (P = 0.0078 and 0.031, respectively). The AUC was significantly greater for protocol B (0.93) than for protocol A (0.86) (P = 0.026). Inclusion of the single-shot b-TFE sequence in conventional MRCP significantly improved the specificity and PPV for the detection of CBD stones.
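The reported accuracy figures follow directly from a 2x2 contingency table. The sketch below shows the arithmetic; the counts used are reconstructed assumptions (18 true positives, 1 false positive, 7 false negatives, 111 true negatives, consistent with 25 stone-positive patients out of 137) that approximately reproduce the protocol-B values, and are not taken from the paper's tables.

    # Sketch: standard diagnostic metrics from a 2x2 table. The counts are
    # reconstructed assumptions that approximately reproduce the protocol-B
    # figures; they are not the paper's reported table.
    def diagnostic_metrics(tp, fp, fn, tn):
        return {"sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn)}

    for name, value in diagnostic_metrics(tp=18, fp=1, fn=7, tn=111).items():
        print(f"{name}: {value:.2f}")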
NASA Astrophysics Data System (ADS)
Tsao, Shih-Ming; Lai, Ji-Ching; Horng, Horng-Er; Liu, Tu-Chen; Hong, Chin-Yih
2017-04-01
Aptamers are oligonucleotides that can bind to specific target molecules. Most aptamers are generated using random libraries in the standard systematic evolution of ligands by exponential enrichment (SELEX). Each random library contains oligonucleotides with a randomized central region and two fixed primer regions at both ends. The fixed primer regions are necessary for amplifying target-bound sequences by PCR. However, these extra sequences may cause non-specific bindings, which potentially interfere with good binding for random sequences. The Magnetic-Assisted Rapid Aptamer Selection (MARAS) is a newly developed protocol for generating single-strand DNA aptamers. No repeat selection cycle is required in the protocol. This study proposes and demonstrates a method to isolate aptamers for C-reactive proteins (CRP) from a randomized ssDNA library containing no fixed sequences at the 5′ and 3′ termini using the MARAS platform. Furthermore, the isolated primer-free aptamer was sequenced and binding affinity for CRP was analyzed. The specificity of the obtained aptamer was validated using blind serum samples. The result was consistent with monoclonal antibody-based nephelometry analysis, which indicated that a primer-free aptamer has high specificity toward targets. MARAS is a feasible platform for efficiently generating primer-free aptamers for clinical diagnoses.
Costa, José Hélio; Arnholdt-Schmitt, Birgit
2017-01-01
The alternative oxidase (AOX) gene family is a hot candidate for functional marker development that could help plant breeding for yield stability through more robust plants based on multi-stress tolerance. However, knowledge is still missing on the interplay between gene family members that might interfere with the efficiency of marker development. It is a common view that AOX1 and AOX2 have different physiological roles. Nevertheless, both family member groups act in terms of molecular-biochemical function as "typical" alternative oxidases, and co-regulation of AOX1 and AOX2 has been reported. Although conserved sequence differences have been identified, the basis for differential effects on physiology regulation is not sufficiently explored. This protocol gives instructions for a bioinformatics approach that supports discovering potential interaction of AOX family members in regulating growth and development. It further provides a strategy to elucidate the relevance of gene sequence diversity and copy number variation for final functionality in target tissues and finally the whole plant. Thus, overall this protocol provides the means for efficiently identifying plant AOX variants as functional marker candidates related to growth and development.
A Survey on an Energy-Efficient and Energy-Balanced Routing Protocol for Wireless Sensor Networks
Ogundile, Olayinka O.; Alfa, Attahiru S.
2017-01-01
Wireless sensor networks (WSNs) form an important part of industrial applications. There has been growing interest in the potential use of WSNs in applications such as environment monitoring, disaster management, health care monitoring, intelligence surveillance and defence reconnaissance. In these applications, the sensor nodes (SNs) are envisaged to be deployed in sizeable numbers in an outlying area, and it is quite difficult to replace these SNs after complete deployment in many scenarios. Therefore, as SNs are predominantly battery-powered devices, the energy consumption of the nodes must be properly managed in order to prolong the network lifetime and functionality to a rational time. Different energy-efficient and energy-balanced routing protocols have been proposed in the literature over the years. The energy-efficient routing protocols strive to increase the network lifetime by minimizing the energy consumption in each SN. On the other hand, the energy-balanced routing protocols protract the network lifetime by uniformly balancing the energy consumption among the nodes in the network. There have been various survey papers put forward by researchers to review the performance and classify the different energy-efficient routing protocols for WSNs. However, there seems to be no clear survey emphasizing the importance, concepts, and principles of load-balanced energy routing protocols for WSNs. In this paper, we provide a clear picture of both the energy-efficient and energy-balanced routing protocols for WSNs. More importantly, this paper presents an extensive survey of the different state-of-the-art energy-efficient and energy-balanced routing protocols. A taxonomy is introduced in this paper to classify the surveyed energy-efficient and energy-balanced routing protocols based on their proposed mode of communication towards the base station (BS). In addition, we classified these routing protocols based on the solution types or algorithms, and the input decision variables defined in the routing algorithm. The strengths and weaknesses of the choice of the decision variables used in the design of these energy-efficient and energy-balanced routing protocols are emphasised. Finally, we suggest possible research directions in order to optimize the energy consumption in sensor networks. PMID:28489054
T1 weighted fat/water separated PROPELLER acquired with dual bandwidths.
Rydén, Henric; Berglund, Johan; Norbeck, Ola; Avventi, Enrico; Skare, Stefan
2018-04-24
To describe a fat/water separated dual receiver bandwidth (rBW) spin echo PROPELLER sequence that eliminates the dead time associated with single rBW sequences. A nonuniform noise whitening by regularization of the fat/water inverse problem is proposed, to enable dual rBW reconstructions. Bipolar, flyback, and dual spin echo sequences were developed. All sequences acquire two echoes with different rBW without dead time. Chemical shift displacement was corrected by performing the fat/water separation in k-space, prior to gridding. The proposed sequences were compared to fat saturation, and single rBW sequences, in terms of SNR and CNR efficiency, using clinically relevant acquisition parameters. The impact of motion was investigated. Chemical shift correction greatly improved the image quality, especially at high resolution acquired with low rBW, and also improved motion estimates. SNR efficiency of the dual spin echo sequence was up to 20% higher than the single rBW acquisition, while CNR efficiency was 50% higher for the bipolar acquisition. Noise whitening was deemed necessary for all dual rBW acquisitions, rendering high image quality with strong and homogenous fat suppression. Dual rBW sequences eliminate the dead time present in single rBW sequences, which improves SNR efficiency. In combination with the proposed regularization, this enables highly efficient T1-weighted PROPELLER images without chemical shift displacement. © 2018 International Society for Magnetic Resonance in Medicine.
Al-Khalifah, Nasser S; Shanavaskhan, A E
2017-01-01
Ambiguity in the total number of date palm cultivars across the world points to the need for an enumerative study using standard morphological and molecular markers. Among molecular markers, DNA markers are the most suitable and broadly applicable: they are highly polymorphic in nature, occur frequently in genomes, are easy to access, and are highly reproducible. Various molecular markers such as restriction fragment length polymorphism (RFLP), amplified fragment length polymorphism (AFLP), simple sequence repeats (SSR), inter-simple sequence repeats (ISSR), and random amplified polymorphic DNA (RAPD) markers have been successfully used as efficient tools for analysis of genetic variation in date palm. This chapter explains a stepwise protocol for extracting total genomic DNA from date palm leaves. A user-friendly protocol for RAPD analysis and a table showing the primers used in different molecular techniques that produce polymorphisms in date palm are also provided.
Real-time echocardiogram transmission protocol based on regions and visualization modes.
Cavero, Eva; Alesanco, Álvaro; García, José
2014-09-01
This paper proposes an Echocardiogram Transmission Protocol (ETP) for real-time end-to-end transmission of echocardiograms over IP networks. The ETP has been designed taking into account the echocardiogram characteristics of each visualized region, encoding each region according to its data type, visualization characteristics and diagnostic importance in order to improve the coding and thus the transmission efficiency. Furthermore, each region is sent separately and different error protection techniques can be used for each region. This leads to an efficient use of resources and provides greater protection for those regions with more clinical information. Synchronization is implemented for regions that change over time. The echocardiogram composition differs between devices; nevertheless, the protocol is valid for all echocardiogram devices thanks to the incorporation of configuration information which includes the composition of the echocardiogram. The efficiency of the ETP has been demonstrated in terms of the number of bits sent with the proposed protocol. The codec and transmission rates used for the regions of interest have been set according to previous recommendations. Although the saving in coded bits depends on the video composition, a coding gain of more than 7% with respect to transmission without ETP has been achieved.
Komaroff, A L; Flatley, M; Browne, C; Sherman, H; Fineberg, S E; Knopp, R H
1976-04-01
Briefly trained physician assistants using protocols (clinical algorithms) for diabetes, hypertension, and related chronic arteriosclerotic and hypertensive heart disease abstracted information from the medical record and obtained history and physical examination data on every patient visit to a city hospital chronic disease clinic over an 18-month period. The care rendered by the protocol system was compared with care rendered by a "traditional" system in the same clinic in which physicians delegated few clinical tasks. Increased thoroughness in collecting clinical data in the protocol system led to an increase in the recognition of new pathology. Outcome criteria reflected equivalent quality of care in both groups. Efficiency time-motion studies demonstrated a 20 per cent saving in physician time with the protocol system. Cost estimates, based on the time spent with patients by various providers and on the laboratory-test-ordering patterns, demonstrated equivalent costs of the two systems, given optimal staffing patterns. Laboratory tests were a major element of the cost of patient care, and the clinical yield per unit cost of different tests varied widely.
Verification of 2A peptide cleavage.
Szymczak-Workman, Andrea L; Vignali, Kate M; Vignali, Dario A A
2012-02-01
The need for reliable, multicistronic vectors for multigene delivery is at the forefront of biomedical technology. It is now possible to express multiple proteins from a single open reading frame (ORF) using 2A peptide-linked multicistronic vectors. These small sequences, when cloned between genes, allow for efficient, stoichiometric production of discrete protein products within a single vector through a novel "cleavage" event within the 2A peptide sequence. The easiest and most effective way to assess 2A cleavage is to perform transient transfection of 293T cells (human embryonic kidney cells) followed by western blot analysis, as described in this protocol. 293T cells are easy to grow and can be efficiently transfected with a variety of vectors. Cleavage can be assessed by detection with antibodies against the target proteins or anti-2A serum.
Vergani, Stefano; Korsunsky, Ilya; Mazzarello, Andrea Nicola; Ferrer, Gerardo; Chiorazzi, Nicholas; Bagnara, Davide
2017-01-01
Efficient and accurate high-throughput DNA sequencing of the adaptive immune receptor repertoire (AIRR) is necessary to study immune diversity in healthy subjects and disease-related conditions. The high complexity and diversity of the AIRR, coupled with the limited amount of starting material (which can compromise identification of the full biological diversity), make such sequencing particularly challenging. AIRR sequencing protocols often fail to fully capture the sampled AIRR diversity, especially for samples containing restricted numbers of B lymphocytes. Here, we describe a library preparation method for immunoglobulin sequencing that results in an exhaustive full-length repertoire in which virtually every sampled B cell is sequenced. This maximizes the likelihood of identifying and quantifying the entire IGHV-D-J repertoire of a sample, including the detection of rearrangements present in only one cell in the starting population. The methodology establishes the importance of circumventing genetic material dilution in the preamplification phases and builds on the following concepts: (1) balancing the amount of starting material and the depth of sequencing, (2) avoiding IGHV gene-specific amplification, and (3) using Unique Molecular Identifiers (UMIs). Together, this methodology is highly efficient, in particular for detecting rare rearrangements in the sampled population and when only a limited amount of starting material is available.
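As a rough illustration of the UMI idea in point (3), the sketch below collapses PCR duplicates that share a molecular identifier. The reads, UMI length, and the naive majority-vote consensus are assumptions for illustration, not the published library preparation workflow.

```python
from collections import defaultdict

# Hypothetical reads as (UMI, sequence). In a UMI protocol the tag is attached
# before any amplification, so PCR duplicates share the same UMI.
reads = [
    ("ACGTAGGT", "CAGGTGCAGCTGGTGCAG"),
    ("ACGTAGGT", "CAGGTGCAGCTGGTGCAG"),   # PCR duplicate of the first read
    ("TTACGCAA", "GAGGTGCAGCTGTTGGAG"),
]

def collapse_by_umi(reads):
    """Collapse PCR duplicates: one consensus record per unique UMI."""
    groups = defaultdict(list)
    for umi, seq in reads:
        groups[umi].append(seq)
    # Consensus here is simply the most common sequence per group; real
    # pipelines build a base-by-base consensus and correct UMI errors too.
    return {umi: max(set(seqs), key=seqs.count) for umi, seqs in groups.items()}

molecules = collapse_by_umi(reads)
print(len(molecules), "unique starting molecules recovered from", len(reads), "reads")
```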
Correlation and Stacking of Relative Paleointensity and Oxygen Isotope Data
NASA Astrophysics Data System (ADS)
Lurcock, P. C.; Channell, J. E.; Lee, D.
2012-12-01
The transformation of a depth-series into a time-series is routinely implemented in the geological sciences. This transformation often involves correlation of a depth-series to an astronomically calibrated time-series. Eyeball tie-points with linear interpolation are still regularly used, although these have the disadvantages of being non-repeatable and not based on firm correlation criteria. Two automated correlation methods are compared: the simulated annealing algorithm (Huybers and Wunsch, 2004) and the Match protocol (Lisiecki and Lisiecki, 2002). Simulated annealing seeks to minimize energy (cross-correlation) as "temperature" is slowly decreased. The Match protocol divides records into intervals, applies penalty functions that constrain accumulation rates, and minimizes the sum of the squares of the differences between two series while maintaining the data sequence in each series. Paired relative paleointensity (RPI) and oxygen isotope records, such as those from IODP Site U1308 and/or reference stacks such as LR04 and PISO, are warped using known warping functions, and then the un-warped and warped time-series are correlated to evaluate the efficiency of the correlation methods. Correlations are performed in tandem to simultaneously optimize RPI and oxygen isotope data. Noise spectra are introduced at differing levels to determine correlation efficiency as noise levels change. A third potential method, known as dynamic time warping, involves minimizing the sum of distances between correlated point pairs across the whole series. A "cost matrix" between the two series is analyzed to find a least-cost path through the matrix. This least-cost path is used to nonlinearly map the time/depth of one record onto the depth/time of another. Dynamic time warping can be expanded to more than two dimensions and used to stack multiple time-series. This procedure can improve on arithmetic stacks, which often lose coherent high-frequency content during the stacking process.
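For readers unfamiliar with the cost-matrix idea described above, the following minimal dynamic-time-warping sketch builds the matrix between two series and backtracks a least-cost path, i.e. the nonlinear depth-to-time mapping. It is a generic textbook implementation with made-up toy series, not the Match protocol or the code used for the RPI and oxygen isotope stacks.

```python
import numpy as np

def dtw_path(a, b):
    """Fill the cumulative cost matrix between series a and b, then backtrack
    the least-cost path that maps indices of a onto indices of b."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    # Backtrack from the far corner to recover the warping path.
    path, i, j = [], n, m
    while (i, j) != (0, 0):
        path.append((i - 1, j - 1))
        step = np.argmin([D[i - 1, j - 1], D[i - 1, j], D[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1], D[n, m]

# Toy example: a "depth series" that is a nonlinearly stretched copy of a
# "time series"; the recovered path is the depth-to-time correlation.
time_series = np.sin(np.linspace(0, 6, 60))
depth_series = np.sin(np.linspace(0, 6, 90) ** 1.05)
path, cost = dtw_path(depth_series, time_series)
print("total alignment cost:", round(float(cost), 3), "| path length:", len(path))
```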
High-Throughput Next-Generation Sequencing of Polioviruses
Montmayeur, Anna M.; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J.; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L.; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A.; Oberste, M. Steven; Burns, Cara C.
2016-01-01
The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially for those in vaccine-derived polioviruses (VDPV), circulating VDPV, or immunodeficiency-related VDPV. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will aid in facilitating global poliovirus surveillance. PMID:27927929
Modified CTAB and TRIzol protocols improve RNA extraction from chemically complex Embryophyta
Jordon-Thaden, Ingrid E.; Chanderbali, Andre S.; Gitzendanner, Matthew A.; Soltis, Douglas E.
2015-01-01
Premise of the study: Here we present a series of protocols for RNA extraction across a diverse array of plants; we focus on woody, aromatic, aquatic, and other chemically complex taxa. Methods and Results: Ninety-one taxa were subjected to RNA extraction with three methods presented here: (1) TRIzol/TURBO DNA-free kits using the manufacturer’s protocol with the addition of sarkosyl; (2) a combination method using cetyltrimethylammonium bromide (CTAB) and TRIzol/sarkosyl/TURBO DNA-free; and (3) a combination of CTAB and QIAGEN RNeasy Plant Mini Kit. Bench-ready protocols are given. Conclusions: After an iterative process of working with chemically complex taxa, we conclude that the use of TRIzol supplemented with sarkosyl and the TURBO DNA-free kit is an effective, efficient, and robust method for obtaining RNA from 100 mg of leaf tissue of land plant species (Embryophyta) examined. Our protocols can be used to provide RNA of suitable stability, quantity, and quality for transcriptome sequencing. PMID:25995975
NASA Astrophysics Data System (ADS)
Xu, Ling; Zhao, Zhiwen
2017-08-01
A new quantum protocol with the assistance of a semi-honest third party (TP) is proposed, which allows the participants to compare the equality of their private information without disclosing it. Unlike previous protocols, this protocol uses quantum key distribution robust against collective-dephasing and collective-rotation noise, which discards fewer samples, to transmit the classical information. In addition, the protocol uses the GHZ-like state and the χ⁺ state to produce entanglement swapping. The Bell basis and the dual basis are used to measure the particle pairs, so that 3 bits of each participant's private information can be compared in each round, which is more efficient and requires fewer rounds. Moreover, neither unitary operations nor hash functions are needed in this protocol. Finally, various outside and participant attacks are discussed and shown to be ineffective, so the protocol can complete the comparison securely.
An Energy Efficient MAC Protocol for Multi-Hop Swallowable Body Sensor Networks
Lin, Lin; Yang, Chengfeng; Wong, Kai Juan; Yan, Hao; Shen, Junwen; Phee, Soo Jay
2014-01-01
Swallowable body sensor networks (BSNs) are composed of sensors which are swallowed by patients and send the collected data to the outside coordinator. These sensors are energy constrained and their batteries are difficult to replace. The medium access control (MAC) protocol plays an important role in energy management. This paper investigates an energy efficient MAC protocol design for swallowable BSNs. Multi-hop communication is analyzed and proved to be more energy efficient than single-hop communication within the human body when the circuitry power is low. Based on this result, a centrally controlled time slotting schedule is proposed. The major workload is shifted from the sensors to the coordinator. The coordinator collects the path-loss map and calculates the schedules, including routing, slot assignment and transmission power. Sensor nodes follow the schedules to send data in a multi-hop way. The proposed protocol is compared with the IEEE 802.15.6 protocol in terms of energy consumption. The results show that it is more energy efficient than IEEE 802.15.6 for swallowable BSN scenarios. PMID:25330049
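The claim that multi-hop relaying beats single-hop transmission in-body only when circuitry power is low can be illustrated with a generic first-order radio model. All constants below (path-loss exponent, per-bit energies, distances) are illustrative assumptions, not parameters taken from the paper.

```python
def e_tx(bits, d, e_elec, eps=1e-6, n=3.0):
    """Transmit energy per packet: circuitry term plus path-loss term d^n.
    In-body channels are typically modelled with a high path-loss exponent."""
    return e_elec * bits + eps * bits * d ** n

def e_rx(bits, e_elec):
    return e_elec * bits

def single_hop(bits, d, e_elec):
    return e_tx(bits, d, e_elec)

def two_hop(bits, d, e_elec):
    # Source -> relay -> coordinator, each hop spanning half the distance;
    # the relay pays both a receive cost and a transmit cost.
    return e_tx(bits, d / 2, e_elec) + e_rx(bits, e_elec) + e_tx(bits, d / 2, e_elec)

bits, d = 1000, 0.4                 # 0.4 m path through tissue (illustrative)
for e_elec in (1e-9, 1e-7):         # low vs high circuitry energy per bit (J/bit)
    winner = ("multi-hop" if two_hop(bits, d, e_elec) < single_hop(bits, d, e_elec)
              else "single-hop")
    print(f"E_elec = {e_elec:.0e} J/bit -> {winner} is cheaper")
```

With a cubic path loss, halving each hop distance cuts the radiated energy by a factor of eight, so relaying wins when the fixed circuitry cost per transmission and reception is small; once that fixed cost dominates, the extra hop is no longer worthwhile.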
A generic assay for whole-genome amplification and deep sequencing of enterovirus A71
Tan, Le Van; Tuyen, Nguyen Thi Kim; Thanh, Tran Tan; Ngan, Tran Thuy; Van, Hoang Minh Tu; Sabanathan, Saraswathy; Van, Tran Thi My; Thanh, Le Thi My; Nguyet, Lam Anh; Geoghegan, Jemma L.; Ong, Kien Chai; Perera, David; Hang, Vu Thi Ty; Ny, Nguyen Thi Han; Anh, Nguyen To; Ha, Do Quang; Qui, Phan Tu; Viet, Do Chau; Tuan, Ha Manh; Wong, Kum Thong; Holmes, Edward C.; Chau, Nguyen Van Vinh; Thwaites, Guy; van Doorn, H. Rogier
2015-01-01
Enterovirus A71 (EV-A71) has emerged as the most important cause of large outbreaks of severe and sometimes fatal hand, foot and mouth disease (HFMD) across the Asia-Pacific region. EV-A71 outbreaks have been associated with (sub)genogroup switches, sometimes accompanied by recombination events. Understanding EV-A71 population dynamics is therefore essential for understanding this emerging infection, and may provide pivotal information for vaccine development. Despite the public health burden of EV-A71, relatively few EV-A71 complete-genome sequences are available for analysis and from limited geographical localities. The availability of an efficient procedure for whole-genome sequencing would stimulate effort to generate more viral sequence data. Herein, we report for the first time the development of a next-generation sequencing based protocol for whole-genome sequencing of EV-A71 directly from clinical specimens. We were able to sequence viruses of subgenogroup C4 and B5, while RNA from culture materials of diverse EV-A71 subgenogroups belonging to both genogroup B and C was successfully amplified. The nature of intra-host genetic diversity was explored in 22 clinical samples, revealing 107 positions carrying minor variants (ranging from 0 to 15 variants per sample). Our analysis of EV-A71 strains sampled in 2013 showed that they all belonged to subgenogroup B5, representing the first report of this subgenogroup in Vietnam. In conclusion, we have successfully developed a high-throughput next-generation sequencing-based assay for whole-genome sequencing of EV-A71 from clinical samples. PMID:25704598
Efficiency and large deviations in time-asymmetric stochastic heat engines
Gingrich, Todd R.; Rotskoff, Grant M.; Vaikuntanathan, Suriyanarayanan; ...
2014-10-24
In a stochastic heat engine driven by a cyclic non-equilibrium protocol, fluctuations in work and heat give rise to a fluctuating efficiency. Using computer simulations and tools from large deviation theory, we have examined these fluctuations in detail for a model two-state engine. We find in general that the form of efficiency probability distributions is similar to those described by Verley et al (2014 Nat. Commun. 5 4721), in particular featuring a local minimum in the long-time limit. In contrast to the time-symmetric engine protocols studied previously, however, this minimum need not occur at the value characteristic of a reversible Carnot engine. Furthermore, while the local minimum may reside at the global minimum of a large deviation rate function, it does not generally correspond to the least likely efficiency measured over finite time. Lastly, we introduce a general approximation for the finite-time efficiency distribution, P(η), based on large deviation statistics of work and heat, that remains very accurate even when P(η) deviates significantly from its large deviation form.
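A crude way to see why a fluctuating efficiency arises, and why its distribution sharpens with observation time, is to treat the accumulated work and input heat as correlated random sums. The toy statistics below only illustrate the concept; they do not reproduce the paper's two-state engine or its large-deviation analysis, and the means, variances, and correlation are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_efficiency(n_cycles, n_samples=100_000,
                          w_mean=-0.3, q_mean=1.0, sigma=1.0, rho=0.5):
    """Toy model: per-cycle work and input heat are correlated Gaussians;
    the finite-time efficiency is the ratio -W/Q of their sums over n_cycles."""
    cov = sigma ** 2 * np.array([[1.0, rho], [rho, 1.0]])
    draws = rng.multivariate_normal([w_mean, q_mean], cov,
                                    size=(n_samples, n_cycles)).sum(axis=1)
    w, q = draws[:, 0], draws[:, 1]
    return -w / q

for n in (5, 50, 500):
    eta = stochastic_efficiency(n)
    print(f"{n:4d} cycles: median eta = {np.median(eta):+.2f}, "
          f"P(eta > 1) = {np.mean(eta > 1):.3f}, P(eta < 0) = {np.mean(eta < 0):.3f}")
```

Over short observation windows the ratio fluctuates wildly, including apparent efficiencies above one or below zero; as the number of cycles grows, the distribution concentrates around its typical value, which is the regime where large-deviation theory applies.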
Robust Sub-nanomolar Library Preparation for High Throughput Next Generation Sequencing.
Wu, Wells W; Phue, Je-Nie; Lee, Chun-Ting; Lin, Changyi; Xu, Lai; Wang, Rong; Zhang, Yaqin; Shen, Rong-Fong
2018-05-04
Current library preparation protocols for Illumina HiSeq and MiSeq DNA sequencers require ≥2 nM initial library for subsequent loading of denatured cDNA onto flow cells. Such amounts are not always attainable from samples having a relatively low DNA or RNA input, or from those for which a limited number of PCR amplification cycles is preferred (less PCR bias and/or more even coverage). A well-tested sub-nanomolar library preparation protocol for Illumina sequencers has, however, not been reported. The aim of this study is to provide a much needed working protocol for sub-nanomolar libraries to achieve outcomes as informative as those obtained with the higher library input (≥2 nM) recommended by Illumina's protocols. Extensive studies were conducted to validate a robust sub-nanomolar (initial library of 100 pM) protocol using PhiX DNA (as a control), genomic DNA (Bordetella bronchiseptica and microbial mock community B for 16S rRNA gene sequencing), messenger RNA, microRNA, and other small noncoding RNA samples. The utility of our protocol was further explored for PhiX library concentrations as low as 25 pM, which generated only slightly fewer than 50% of the reads achieved under the standard Illumina protocol starting with >2 nM. A sub-nanomolar library preparation protocol (100 pM) could generate next generation sequencing (NGS) results as robust as the standard Illumina protocol. Following the sub-nanomolar protocol, libraries with initial concentrations as low as 25 pM could also be sequenced to yield satisfactory and reproducible sequencing results.
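Whether a library falls into the sub-nanomolar regime is usually decided by converting a fluorometric concentration to molarity. The helper below shows that arithmetic with invented numbers; the 660 g/mol per double-stranded base pair figure is the standard approximation, and the example concentration and fragment length are not from the study.

```python
def library_molarity_nM(conc_ng_per_ul, mean_fragment_bp):
    """Convert a fluorometric library concentration (ng/uL) to molarity (nM)
    using ~660 g/mol as the average mass of a double-stranded base pair."""
    return conc_ng_per_ul * 1e6 / (660.0 * mean_fragment_bp)

# Illustrative numbers: a low-input library quantified at 0.02 ng/uL with a
# 400 bp mean fragment length is only ~76 pM, i.e. well below the >=2 nM
# requirement of the standard workflow but within reach of a 25-100 pM protocol.
c_nM = library_molarity_nM(0.02, 400)
print(f"{c_nM:.3f} nM = {c_nM * 1000:.0f} pM")
```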
Bayer, Thomas; Adler, Werner; Janka, Rolf; Uder, Michael; Roemer, Frank
2017-12-01
To study the feasibility of magnetic resonance cinematography of the fingers (MRCF) with comparison of image quality of different protocols for depicting the finger anatomy during motion. MRCF was performed during a full flexion and extension movement in 14 healthy volunteers using a finger-gating device. Three real-time sequences (frame rates 17-59 images/min) and one proton density (PD) sequence (3 images/min) were acquired during incremental and continuous motion. Analyses were performed independently by three readers. Qualitative image analysis included Likert-scale grading from 0 (useless) to 5 (excellent) and specific visual analog scale (VAS) grading from 0 (insufficient) to 100 (excellent). Signal-to-noise calculation was performed. Overall percentage agreement and mean absolute disagreement were calculated. Within the real-time sequences a high frame-rate true fast imaging with steady-state free precession (TRUFI) yielded the best image quality with Likert and overall VAS scores of 3.0 ± 0.2 and 60.4 ± 25.3, respectively. The best sequence regarding image quality was an incremental PD with mean values of 4.8 ± 0.2 and 91.2 ± 9.4, respectively. Overall percentage agreement and mean absolute disagreement were 47.9 and 0.7, respectively. No statistically significant SNR differences were found between continuous and incremental motion for the real-time protocols. MRCF is feasible with appropriate image quality during continuous motion using a finger-gating device. Almost perfect image quality is achievable with incremental PD imaging, which represents a compromise for MRCF with the drawback of prolonged scanning time.
Guo, Q; Mintier, G; Ma-Edmonds, M; Storton, D; Wang, X; Xiao, X; Kienzle, B; Zhao, D; Feder, John N
2018-02-01
Using CRISPR/Cas9 delivered as an RNA modality in conjunction with a lipid specifically formulated for large RNA molecules, we demonstrate that homology directed repair (HDR) rates between 20-40% can be achieved in induced pluripotent stem cells (iPSC). Furthermore, low HDR rates (between 1-20%) can be enhanced two- to ten-fold in both iPSCs and HEK293 cells by 'cold shocking' cells at 32 °C for 24-48 hours following transfection. This method can also increase the proportion of loci that have undergone complete sequence conversion across the donor sequence, or 'perfect HDR', as opposed to partial sequence conversion in which nucleotides more distal to the CRISPR cut site are less efficiently incorporated ('partial HDR'). We demonstrate that the structure of the single-stranded DNA oligo donor can influence the fidelity of HDR, with oligos symmetric with respect to the CRISPR cleavage site and complementary to the target strand being more efficient at directing 'perfect HDR' compared to asymmetric non-target strand complementary oligos. Our protocol represents an efficient method for making CRISPR-mediated, specific DNA sequence changes within the genome that will facilitate the rapid generation of genetic models of human disease in iPSCs as well as other genome engineered cell lines.
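The perfect-versus-partial HDR distinction can be illustrated by checking, per read, which donor-encoded edits were incorporated. The sequences below are invented, and the classifier ignores indels and alignment, which a real amplicon-sequencing analysis would have to handle.

```python
# Hypothetical 20 bp window around the cut site; lower case marks the two
# donor-encoded changes (the intended edit plus a distal silent change).
WT    = "GATCCTGAACGTTAGGCTAA"
DONOR = "GATCCTGcACGTTAGGCTgA".upper()
EDITS = [i for i, (w, d) in enumerate(zip(WT, DONOR)) if w != d]

def classify_read(read):
    """Perfect HDR = every donor edit incorporated; partial HDR = only some
    (typically the cut-proximal) edits; otherwise unedited/other."""
    converted = [read[i] == DONOR[i] for i in EDITS]
    if all(converted):
        return "perfect HDR"
    if any(converted):
        return "partial HDR"
    return "unedited/other"

reads = ["GATCCTGCACGTTAGGCTGA",   # both edits      -> perfect
         "GATCCTGCACGTTAGGCTAA",   # proximal only   -> partial
         "GATCCTGAACGTTAGGCTAA"]   # wild type       -> unedited
for r in reads:
    print(r, "->", classify_read(r))
```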
Optimization of White-Matter-Nulled Magnetization Prepared Rapid Gradient Echo (MP-RAGE) Imaging
Saranathan, Manojkumar; Tourdias, Thomas; Bayram, Ersin; Ghanouni, Pejman; Rutt, Brian K.
2014-01-01
Purpose To optimize the white-matter-nulled (WMn) Magnetization Prepared Rapid Gradient Echo (MP-RAGE) sequence at 7T, with comparisons to 3T. Methods Optimal parameters for maximising SNR efficiency were derived. The effect of flip angle and TR on image blurring was modeled using simulations and validated in vivo. A novel 2D-centric radial fan beam (RFB) k-space segmentation scheme was used to shorten scan times and improve parallel imaging. Healthy subjects as well as patients with multiple sclerosis and tremor were scanned using the optimized protocols. Results Inversion repetition times (TS) of 4.5s and 6s were found to yield the highest SNR efficiency for WMn MP-RAGE at 3T and 7T, respectively. Blurring was more sensitive to flip in WMn than in CSFn MP-RAGE and relatively insensitive to TR for both regimes. The 2D RFB scheme had 19% and 47% higher thalamic SNR and SNR efficiency than the 1D centric scheme for WMn MP-RAGE. Compared to 3T, SNR and SNR efficiency were higher for the 7T WMn regime by 56% and 41% respectively. MS lesions in the cortex and thalamus as well as thalamic subnuclei in tremor patients were clearly delineated using WMn MP-RAGE. Conclusion Optimization and new view ordering enabled MP-RAGE imaging with 0.8–1 mm3 isotropic spatial resolution in scan times of 5 minutes with whole brain coverage. PMID:24889754
Zhang, Na; Zhang, Lei; Yang, Qi; Pei, Anqi; Tong, Xiaoxin; Chung, Yiu-Cho; Liu, Xin
2017-06-01
To implement a fast (~15min) MRI protocol for carotid plaque screening using 3D multi-contrast MRI sequences without contrast agent on a 3Tesla MRI scanner. 7 healthy volunteers and 25 patients with clinically confirmed transient ischemic attack or suspected cerebrovascular ischemia were included in this study. The proposed protocol, including 3D T1-weighted and T2-weighted SPACE (variable-flip-angle 3D turbo spin echo), and T1-weighted magnetization prepared rapid acquisition gradient echo (MPRAGE) was performed first and was followed by 2D T1-weighted and T2-weighted turbo spin echo, and post-contrast T1-weighted SPACE sequences. Image quality, number of plaques, and vessel wall thicknesses measured at the intersection of the plaques were evaluated and compared between sequences. Average examination time of the proposed protocol was 14.6min. The average image quality scores of 3D T1-weighted, T2-weighted SPACE, and T1-weighted magnetization prepared rapid acquisition gradient echo were 3.69, 3.75, and 3.48, respectively. There was no significant difference in detecting the number of plaques and vulnerable plaques using pre-contrast 3D images with or without post-contrast T1-weighted SPACE. The 3D SPACE and 2D turbo spin echo sequences had excellent agreement (R=0.96 for T1-weighted and 0.98 for T2-weighted, p<0.001) regarding vessel wall thickness measurements. The proposed protocol demonstrated the feasibility of attaining carotid plaque screening within a 15-minute scan, which provided sufficient anatomical coverage and critical diagnostic information. This protocol offers the potential for rapid and reliable screening for carotid plaques without contrast agent. Copyright © 2016. Published by Elsevier Inc.
PVP-SVM: Sequence-Based Prediction of Phage Virion Proteins Using a Support Vector Machine
Manavalan, Balachandran; Shin, Tae H.; Lee, Gwang
2018-01-01
Accurately identifying bacteriophage virion proteins from uncharacterized sequences is important to understand interactions between the phage and its host bacteria in order to develop new antibacterial drugs. However, identification of such proteins using experimental techniques is expensive and often time consuming; hence, development of an efficient computational algorithm for the prediction of phage virion proteins (PVPs) prior to in vitro experimentation is needed. Here, we describe a support vector machine (SVM)-based PVP predictor, called PVP-SVM, which was trained with 136 optimal features. A feature selection protocol was employed to identify the optimal features from a large set that included amino acid composition, dipeptide composition, atomic composition, physicochemical properties, and chain-transition-distribution. PVP-SVM achieved an accuracy of 0.870 during leave-one-out cross-validation, which was 6% higher than control SVM predictors trained with all features, indicating the efficiency of the feature selection method. Furthermore, PVP-SVM displayed superior performance compared to the currently available method, PVPred, and two other machine-learning methods developed in this study when objectively evaluated with an independent dataset. For the convenience of the scientific community, a user-friendly and publicly accessible web server has been established at www.thegleelab.org/PVP-SVM/PVP-SVM.html. PMID:29616000
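The general recipe in the abstract (a composition-style feature matrix, a feature-selection step, an SVM, and leave-one-out cross-validation) can be sketched with scikit-learn as below. The data are random placeholders, and the feature count, the value of k, and the kernel settings are assumptions, not the published PVP-SVM model or its selection protocol.

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder data: rows = proteins, columns = composition descriptors
# (e.g., 20 amino acid + 400 dipeptide frequencies); labels 1 = virion protein.
rng = np.random.default_rng(0)
X = rng.random((60, 420))
y = rng.integers(0, 2, 60)

# Putting feature selection inside the pipeline keeps it within each
# cross-validation fold, avoiding information leakage into the evaluation.
model = make_pipeline(StandardScaler(),
                      SelectKBest(f_classif, k=50),
                      SVC(kernel="rbf", C=1.0, gamma="scale"))
acc = cross_val_score(model, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy on random placeholder data: {acc:.3f}")
```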
NASA Astrophysics Data System (ADS)
Scott, M. L.; Gagarin, N.; Mekemson, J. R.; Chintakunta, S. R.
2011-06-01
Until recently, civil engineering material calibration data could only be obtained from material sample cores or via time consuming, stationary calibration measurements in a limited number of locations. Calibration data are used to determine material propagation velocities of electromagnetic waves in test materials for use in layer thickness measurements and subsurface imaging. Limitations these calibration methods impose have been a significant impediment to broader use of nondestructive evaluation methods such as ground-penetrating radar (GPR). In 2006, a new rapid, continuous calibration approach was designed using simulation software to address these measurement limitations during a Federal Highway Administration (FHWA) research and development effort. This continuous calibration method combines a digitally-synthesized step-frequency (SF)-GPR array and a data collection protocol sequence for the common midpoint (CMP) method. Modeling and laboratory test results for various data collection protocols and materials are presented in this paper. The continuous-CMP concept was finally implemented for FHWA in a prototype demonstration system called the Advanced Pavement Evaluation (APE) system in 2009. Data from the continuous-CMP protocol is processed using a semblance/coherency analysis to determine material propagation velocities. Continuously calibrated pavement thicknesses measured with the APE system in 2009 are presented. This method is efficient, accurate, and cost-effective.
PRESAGE: PRivacy-preserving gEnetic testing via SoftwAre Guard Extension.
Chen, Feng; Wang, Chenghong; Dai, Wenrui; Jiang, Xiaoqian; Mohammed, Noman; Al Aziz, Md Momin; Sadat, Md Nazmus; Sahinalp, Cenk; Lauter, Kristin; Wang, Shuang
2017-07-26
Advances in DNA sequencing technologies have prompted a wide range of genomic applications to improve healthcare and facilitate biomedical research. However, privacy and security concerns have emerged as a challenge for utilizing cloud computing to handle sensitive genomic data. We present one of the first implementations of a Software Guard Extension (SGX)-based securely outsourced genetic testing framework, which leverages multiple cryptographic protocols and a minimal perfect hash scheme to enable efficient and secure data storage and computation outsourcing. We compared the performance of the proposed PRESAGE framework with the state-of-the-art homomorphic encryption scheme, as well as the plaintext implementation. The experimental results demonstrated a significant performance improvement over the homomorphic encryption methods and a small computational overhead in comparison to the plaintext implementation. The proposed PRESAGE provides an alternative solution for secure and efficient genomic data outsourcing in an untrusted cloud by using a hybrid framework that combines secure hardware and multiple crypto protocols.
Tagliafico, A; Succio, G; Neumaier, C E; Baio, G; Serafini, G; Ghidara, M; Calabrese, M; Martinoli, C
2012-01-01
Objective The purpose of our study was to determine whether a three-dimensional (3D) isotropic resolution fast spin echo sequence (FSE-cube) has similar image quality and diagnostic performance to a routine MRI protocol for brachial plexus evaluation in volunteers and symptomatic patients at 3.0 T. Institutional review board approval and written informed consent were guaranteed. Methods In this prospective study FSE-cube was added to the standard brachial plexus examination protocol in eight patients (mean age, 50.2 years) with brachial plexus pathologies and in six volunteers (mean age, 54 years). Nerve visibility, tissue contrast, edge sharpness, image blurring, motion artefact and acquisition time were calculated for FSE-cube sequences and for the standard protocol on a standardised five-point scale. The visibility of brachial plexus nerve and surrounding tissues at four levels (roots, interscalene area, costoclavicular space and axillary level) was assessed. Results Image quality and nerve visibility did not significantly differ between FSE-cube and the standard protocol (p>0.05). Acquisition time was statistically and clinically significantly shorter with FSE-cube (p<0.05). Pathological findings were seen equally well with FSE-cube and the standard protocol. Conclusion 3D FSE-cube provided similar image quality in a shorter acquisition time and enabled excellent visualisation of brachial plexus anatomy and pathology in any orientation, regardless of the original scanning plane. PMID:21343321
A More Efficient Contextuality Distillation Protocol
NASA Astrophysics Data System (ADS)
Meng, Hui-xian; Cao, Huai-xin; Wang, Wen-hua; Fan, Ya-jing; Chen, Liang
2018-03-01
Based on the fact that both nonlocality and contextuality are resource theories, it is natural to ask how to amplify them more efficiently. In this paper, we present a contextuality distillation protocol which produces an n-cycle box B∗B′ from two given n-cycle boxes B and B′. It works efficiently for a class of contextual n-cycle (n ≥ 4) boxes which we term "the generalized correlated contextual n-cycle boxes". For any two generalized correlated contextual n-cycle boxes B and B′, B∗B′ is more contextual than both B and B′. Moreover, they can be distilled toward the maximally contextual box CHn as the number of iterations goes to infinity. Among the known protocols, our protocol has the strongest approximation ability and is optimal in terms of its distillation rate. It is worth noting that our protocol can witness a larger set of nonlocal boxes that make communication complexity trivial than the protocol in Brunner and Skrzypczyk (Phys. Rev. Lett. 102, 160403 2009); this might be helpful for exploring why quantum nonlocality is limited.
O'Dywer, Lian; Littlewood, Simon J; Rahman, Shahla; Spencer, R James; Barber, Sophy K; Russell, Joanne S
2016-01-01
To use a two-arm parallel trial to compare treatment efficiency between a self-ligating and a conventional preadjusted edgewise appliance system. A prospective multi-center randomized controlled clinical trial was conducted in three hospital orthodontic departments. Subjects were randomly allocated to receive treatment with either a self-ligating (3M SmartClip) or conventional (3M Victory) preadjusted edgewise appliance bracket system using a computer-generated random sequence concealed in opaque envelopes, with stratification for operator and center. Two operators followed a standardized protocol regarding bracket bonding procedure and archwire sequence. Efficiency of each ligation system was assessed by comparing the duration of treatment (months), total number of appointments (scheduled and emergency visits), and number of bracket bond failures. One hundred thirty-eight subjects (mean age 14 years 11 months) were enrolled in the study, of which 135 subjects (97.8%) completed treatment. The mean treatment time and number of visits were 25.12 months and 19.97 visits in the SmartClip group and 25.80 months and 20.37 visits in the Victory group. The overall bond failure rate was 6.6% for the SmartClip and 7.2% for Victory, with a similar debond distribution between the two appliances. No significant differences were found between the bracket systems in any of the outcome measures. No serious harm was observed from either bracket system. There was no clinically significant difference in treatment efficiency between treatment with a self-ligating bracket system and a conventional ligation system.
Tumeh, Paul C; Koya, Richard C; Chodon, Thinle; Graham, Nicholas A; Graeber, Thomas G; Comin-Anduix, Begoña; Ribas, Antoni
2010-10-01
Optimized conditions for the ex vivo activation, genetic manipulation, and expansion of human lymphocytes for adoptive cell therapy may lead to protocols that maximize their in vivo function. We analyzed the effects of 4 clinical grade activation and expansion protocols over 3 weeks on cell proliferative rate, immunophenotype, cell metabolism, and transduction efficiency of human peripheral blood mononuclear cells (PBMCs). Peak lentiviral transduction efficiency was early (days 2 to 4), at a time when cells showed a larger size, maximal uptake of metabolic substrates, and the highest level of proximal T-cell receptor signaling engagement. Anti-CD2/3/28 activation beads induced greater proliferation rate and skewed PBMCs early on to a CD4 phenotype when compared with the cells cultured in OKT3. Multicolor surface phenotyping demonstrated that changes in T-cell surface markers that define T-cell functional phenotypes were dependent on the time spent in culture as opposed to the particular activation protocol. In conclusion, ex vivo activation of human PBMCs for adoptive cell therapy demonstrate defined immunophenotypic and functional signatures over time, with cells early on showing larger sizes, higher transduction efficiency, maximal metabolic activity, and zeta-chain-associated protein-70 activation.
Tumeh, Paul C.; Koya, Richard C.; Chodon, Thinle; Graham, Nicholas A.; Graeber, Thomas G.; Comin-Anduix, Begoña; Ribas, Antoni
2011-01-01
Optimized conditions for the ex vivo activation, genetic manipulation, and expansion of human lymphocytes for adoptive cell therapy (ACT) may lead to protocols that maximize their in vivo function. We analyzed the effects of four clinical grade activation and expansion protocols over three weeks on cell proliferative rate, immunophenotype, cell metabolism, and transduction efficiency of human peripheral blood mononuclear cells (PBMCs). Peak lentiviral transduction efficiency was early (days 2 to 4), at a time when cells demonstrated a larger size, maximal uptake of metabolic substrates, and the highest level of proximal TCR signaling engagement. Anti-CD2/3/28 activation beads induced greater proliferation rate and skewed PBMCs early on to a CD4 phenotype when compared to the cells cultured in OKT3. Multicolor surface phenotyping demonstrated that changes in T cell surface markers that define T cell functional phenotypes were dependent on the time spent in culture as opposed to the particular activation protocol. In conclusion, ex vivo activation of human PBMCs for ACT demonstrate defined immunophenotypic and functional signatures over time, with cells early on showing larger sizes, higher transduction efficiency, maximal metabolic activity and ZAP-70 activation. PMID:20842061
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. The careful investigation of this paper proves that Lu et al.'s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We proved that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.'s protocol and existing similar protocols.
USDA-ARS's Scientific Manuscript database
Estradiol production is essential for reproductive efficiency. This study compared numbers of follicles in beef cows that did or did not have elevated preovulatory estradiol during a fixed-time AI (FTAI) protocol. In experiment 1, 5 low estradiol (LowE2) and 5 high estradiol (HighE2) cows were slaug...
Mackey, Aaron J; Pearson, William R
2004-10-01
Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. Relational databases are essential for management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes using relational databases to improve the efficiency of sequence similarity searching and to demonstrate various large-scale genomic analyses of homology-related data. This unit describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. These include basic use of the database to generate a novel sequence library subset, how to extend and use seqdb_demo for the storage of sequence similarity search results and making use of various kinds of stored search results to address aspects of comparative genomic analysis.
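A minimal stand-in for the relational approach described in the unit: store sequences and similarity-search hits in tables, then pull a focused sub-library with a join. The table names, columns, and E-value cutoff below are illustrative, not the actual seqdb_demo schema.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE protein (acc TEXT PRIMARY KEY, taxon TEXT, seq TEXT);
CREATE TABLE hit (query_acc TEXT, subject_acc TEXT, evalue REAL,
                  FOREIGN KEY (subject_acc) REFERENCES protein(acc));
""")
db.executemany("INSERT INTO protein VALUES (?,?,?)",
               [("P1", "E. coli", "MKT..."), ("P2", "Yeast", "MSD..."),
                ("P3", "Human", "MEE...")])
db.executemany("INSERT INTO hit VALUES (?,?,?)",
               [("Q1", "P1", 1e-30), ("Q1", "P3", 0.5), ("Q2", "P2", 1e-8)])

# Build a focused sub-library: subjects hit below an E-value cutoff, which can
# then be searched again to improve the statistics of later comparisons.
subset = db.execute("""
    SELECT DISTINCT p.acc, p.taxon
    FROM protein p JOIN hit h ON h.subject_acc = p.acc
    WHERE h.evalue < 1e-5
    ORDER BY p.acc
""").fetchall()
print(subset)   # [('P1', 'E. coli'), ('P2', 'Yeast')]
```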
None
2014-12-01
The recent development of methods applying next-generation sequencing to microbial community characterization has led to the proliferation of these studies in a wide variety of sample types. Yet, variation in the physical properties of environmental samples demands that optimal DNA extraction techniques be explored for each new environment. The microbiota associated with many species of insects offer an extraction challenge as they are frequently surrounded by an armored exoskeleton, inhibiting disruption of the tissues within. In this study, we examine the efficacy of several commonly used protocols for extracting bacterial DNA from ants. While bacterial community composition recovered using Illumina 16S rRNA amplicon sequencing was not detectably biased by any method, the quantity of bacterial DNA varied drastically, reducing the number of samples that could be amplified and sequenced. These results indicate that the concentration necessary for dependable sequencing is around 10,000 copies of target DNA per microliter. Exoskeletal pulverization and tissue digestion increased the reliability of extractions, suggesting that these steps should be included in any study of insect-associated microorganisms that relies on obtaining microbial DNA from intact body segments. Although laboratory and analysis techniques should be standardized across diverse sample types as much as possible, minimal modifications such as these will increase the number of environments in which bacterial communities can be successfully studied.
A Concurrent Multiple Negotiation Protocol Based on Colored Petri Nets.
Niu, Lei; Ren, Fenghui; Zhang, Minjie; Bai, Quan
2017-11-01
Concurrent multiple negotiation (CMN) provides a mechanism for an agent to simultaneously conduct more than one negotiation. There may exist different interdependency relationships among these negotiations and these interdependency relationships can impact the outcomes of these negotiations. The outcomes of these concurrent negotiations contribute together for the agent to achieve an overall negotiation goal. Handling a CMN while considering interdependency relationships among multiple negotiations is a challenging research problem. This paper: 1) comprehensively highlights research problems of negotiations at concurrent negotiation level; 2) provides a graph-based CMN model with consideration of the interdependency relationships; and 3) proposes a colored Petri net-based negotiation protocol for conducting CMNs. With the proposed protocol, a CMN can be efficiently and concurrently processed and negotiation agreements can be efficiently achieved. Experimental results indicate the effectiveness and efficiency of the proposed protocol in terms of the negotiation success rate, the negotiation time and the negotiation outcome.
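The role of a Petri net in gating interdependent negotiation steps can be sketched with an ordinary (uncoloured) place/transition net. The places, transitions, and firing rule below are a generic textbook construction for illustration, not the colored-Petri-net protocol proposed in the paper.

```python
# Minimal place/transition net: a transition fires when every input place
# holds a token, mirroring how one negotiation step can only proceed once its
# prerequisite negotiations have reached the required state.
NET = {
    "offer_sent":    {"in": ["ready"],                   "out": ["awaiting_reply"]},
    "counter_offer": {"in": ["awaiting_reply"],          "out": ["ready"]},
    "accept":        {"in": ["awaiting_reply", "quota"], "out": ["agreed"]},
}

def enabled(marking, t):
    return all(marking.get(p, 0) > 0 for p in NET[t]["in"])

def fire(marking, t):
    """Consume one token from each input place, add one to each output place."""
    assert enabled(marking, t), f"{t} is not enabled"
    m = dict(marking)
    for p in NET[t]["in"]:
        m[p] -= 1
    for p in NET[t]["out"]:
        m[p] = m.get(p, 0) + 1
    return m

m = {"ready": 1, "quota": 1}
m = fire(m, "offer_sent")
print("accept enabled?", enabled(m, "accept"))   # True: reply pending, quota free
print(fire(m, "accept"))
```

A colored Petri net extends this by attaching typed data (e.g., which negotiation and which counterpart) to the tokens, so one net structure can track many concurrent negotiations at once.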
Deterministic and efficient quantum cryptography based on Bell's theorem
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Zengbing; Pan Jianwei; Physikalisches Institut, Universitaet Heidelberg, Philosophenweg 12, 69120 Heidelberg
2006-05-15
We propose a double-entanglement-based quantum cryptography protocol that is both efficient and deterministic. The proposal uses photon pairs with entanglement both in polarization and in time degrees of freedom; each measurement in which both of the two communicating parties register a photon can establish one and only one perfect correlation, and thus deterministically create a key bit. Eavesdropping can be detected by violation of local realism. A variation of the protocol shows a higher security, similar to the six-state protocol, under individual attacks. Our scheme allows a robust implementation under the current technology.
Pastor, Géraldine; Jiménez-González, María; Plaza-García, Sandra; Beraza, Marta; Reese, Torsten
2017-06-01
A newly adapted zoomed ultrafast low-angle RARE (U-FLARE) sequence is described for abdominal imaging applications at 11.7 Tesla and compared with the standard echo-planar imaging (EPI) and snapshot fast low angle shot (FLASH) methods. Ultrafast EPI and snapshot-FLASH protocols were evaluated to determine relaxation times in phantoms and in the mouse kidney in vivo. Owing to their apparent shortcomings (imaging artefacts, signal-to-noise ratio (SNR), and variability in the determination of relaxation times), these methods are compared with the newly implemented zoomed U-FLARE sequence. Snapshot-FLASH has a lower SNR when compared with the zoomed U-FLARE sequence and EPI. The variability in the measurement of relaxation times is higher in the Look-Locker sequences than in inversion recovery experiments. Respectively, the average T1 and T2 values at 11.7 Tesla are as follows: kidney cortex, 1810 and 29 ms; kidney medulla, 2100 and 25 ms; subcutaneous tumour, 2365 and 28 ms. This study demonstrates that the zoomed U-FLARE sequence yields single-shot single-slice images with good anatomical resolution and high SNR at 11.7 Tesla. Thus, it offers a viable alternative to standard protocols for mapping very fast parameters, such as T1 and T2, or dynamic processes in vivo at high field.
2014-01-01
Background Although it is possible to recover the complete mitogenome directly from shotgun sequencing data, currently reported methods and pipelines are still relatively time consuming and costly. Using a sample of the Australian freshwater crayfish Engaeus lengana, we demonstrate that it is possible to achieve three-day turnaround time (four hours hands-on time) from tissue sample to NCBI-ready submission file through the integration of MiSeq sequencing platform, Nextera sample preparation protocol, MITObim assembly algorithm and MITOS annotation pipeline. Results The complete mitochondrial genome of the parastacid freshwater crayfish, Engaeus lengana, was recovered by modest shotgun sequencing (1.2 giga bases) using the Illumina MiSeq benchtop sequencing platform. Genome assembly using the MITObim mitogenome assembler recovered the mitochondrial genome as a single contig with a 97-fold mean coverage (min. = 17; max. = 138). The mitogenome consists of 15,934 base pairs and contains the typical 37 mitochondrial genes and a non-coding AT-rich region. The genome arrangement is similar to the only other published parastacid mitogenome from the Australian genus Cherax. Conclusions We infer that the gene order arrangement found in Cherax destructor is common to Australian crayfish and may be a derived feature of the southern hemisphere family Parastacidae. Further, we report to our knowledge, the simplest and fastest protocol for the recovery and assembly of complete mitochondrial genomes using the MiSeq benchtop sequencer. PMID:24484414
On Alarm Protocol in Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Cichoń, Jacek; Kapelko, Rafał; Lemiesz, Jakub; Zawada, Marcin
We consider the problem of an efficient alarm protocol for ad-hoc radio networks consisting of devices that try to gain access for transmission through a shared radio communication channel. The problem arises in applications where sensors have to quickly inform the target user about an alert situation such as the presence of fire, dangerous radiation, seismic vibrations, and so on. In this paper, we present a protocol which uses O(log n) time slots and show that Ω(log n / log log n) is a lower bound on the number of time slots used.
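For intuition about why a logarithmic number of slots suffices, here is a classical randomized "decay"-style access scheme, included for illustration only and not the protocol of the paper: in successive slots each alerted sensor halves its transmission probability, and with good probability some slot contains exactly one transmitter within about log n slots.

```python
import math
import numpy as np

def alarm_success_rate(n_alerted, trials=5000, seed=1):
    """In slot i every alerted sensor transmits independently with probability
    2**-i; a slot succeeds when exactly one node transmits (no collision).
    Returns the fraction of trials with a success within ~2*log2(n) slots."""
    rng = np.random.default_rng(seed)
    budget = 2 * math.ceil(math.log2(max(2, n_alerted)))
    ok = 0
    for _ in range(trials):
        for i in range(1, budget + 1):
            if rng.binomial(n_alerted, 2.0 ** -i) == 1:
                ok += 1
                break
    return ok / trials

for n in (4, 64, 1024):
    print(f"n = {n:5d}: success within O(log n) slots ~ {alarm_success_rate(n):.2f}")
```

Around the slot where the transmission probability is roughly 1/n, the chance that exactly one of n sensors transmits is about 1/e, so a handful of nearby slots already gives a high overall success probability.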
USDA-ARS's Scientific Manuscript database
We needed to obtain an alternative to conventional cloning to generate high-quality DNA sequences from a variety of nuclear orthologs for phylogenetic studies in potato, to save time and money and to avoid problems typically encountered in cloning. We tested a variety of SSCP protocols to include pu...
Schäffer, Sylvia; Zachos, Frank E.; Koblmüller, Stephan
2017-01-01
DNA-barcoding is a rapidly developing method for efficiently identifying samples to species level by means of short standard DNA sequences. However, reliable species assignment requires the availability of a comprehensive DNA barcode reference library, and hence numerous initiatives aim at generating such barcode databases for particular taxa or geographic regions. Historical museum collections represent a potentially invaluable source for the DNA-barcoding of many taxa. This is particularly true for birds and mammals, for which collecting fresh (voucher) material is often very difficult to (nearly) impossible due to the special animal welfare and conservation regulations that apply to vertebrates in general, and birds and mammals in particular. Moreover, even great efforts might not guarantee sufficiently complete sampling of fresh material in a short period of time. DNA extracted from historical samples is usually degraded, such that only short fragments can be amplified, rendering the recovery of the barcoding region as a single fragment impossible. Here, we present a new set of primers that allows the efficient amplification and sequencing of the entire barcoding region in most higher taxa of Central European birds and mammals in six overlapping fragments, thus greatly increasing the value of historical museum collections for generating DNA barcode reference libraries. Applying our new primer set in recently established NGS protocols promises to further increase the efficiency of barcoding old bird and mammal specimens. PMID:28358863
Brichtová, Eva; Šenkyřík, J
2017-05-01
A low radiation burden is essential during diagnostic procedures in pediatric patients due to their high tissue sensitivity. Using MR examination instead of the routinely used CT reduces the radiation exposure and the risk of adverse stochastic effects. Our retrospective study evaluated the possibility of using ultrafast single-shot (SSh) sequences and turbo spin echo (TSE) sequences in rapid MR brain imaging in pediatric patients with hydrocephalus and a programmable ventriculoperitoneal drainage system. SSh sequences seem to be suitable for examining pediatric patients due to the speed of using this technique, but significant susceptibility artifacts due to the programmable drainage valve degrade the image quality. Therefore, a rapid MR examination protocol based on TSE sequences, less sensitive to artifacts due to ferromagnetic components, has been developed. Of 61 pediatric patients who were examined using MR and the SSh sequence protocol, a group of 15 patients with hydrocephalus and a programmable drainage system also underwent TSE sequence MR imaging. The susceptibility artifact volume in both rapid MR protocols was evaluated using a semiautomatic volumetry system. A statistically significant decrease in the susceptibility artifact volume has been demonstrated in TSE sequence imaging in comparison with SSh sequences. Using TSE sequences reduced the influence of artifacts from the programmable valve, and the image quality in all cases was rated as excellent. In all patients, rapid MR examinations were performed without any need for intravenous sedation or general anesthesia. Our study results strongly suggest the superiority of the TSE sequence MR protocol compared to the SSh sequence protocol in pediatric patients with a programmable ventriculoperitoneal drainage system due to a significant reduction of susceptibility artifact volume. Both rapid sequence MR protocols provide quick and satisfactory brain imaging with no ionizing radiation and a reduced need for intravenous or general anesthesia.
Protein Sequencing with Tandem Mass Spectrometry
NASA Astrophysics Data System (ADS)
Ziady, Assem G.; Kinter, Michael
The recent introduction of electrospray ionization techniques that are suitable for peptides and whole proteins has allowed for the design of mass spectrometric protocols that provide accurate sequence information for proteins. The advantages gained by these approaches over traditional Edman Degradation sequencing include faster analysis and femtomole, sometimes attomole, sensitivity. The ability to efficiently identify proteins has allowed investigators to conduct studies on their differential expression or modification in response to various treatments or disease states. In this chapter, we discuss the use of electrospray tandem mass spectrometry, a technique whereby protein-derived peptides are subjected to fragmentation in the gas phase, revealing sequence information for the protein. This powerful technique has been instrumental for the study of proteins and markers associated with various disorders, including heart disease, cancer, and cystic fibrosis. We use the study of protein expression in cystic fibrosis as an example.
NASA Astrophysics Data System (ADS)
Li, Na; Li, Jian; Li, Lei-Lei; Wang, Zheng; Wang, Tao
2016-08-01
A deterministic secure quantum communication and authentication protocol based on an extended GHZ-W state and a quantum one-time pad is proposed. In the protocol, the state |φ⁻⟩ is used as the carrier. One photon of the |φ⁻⟩ state is sent to Alice, and Alice obtains a random key by measuring photons in bases determined by her ID. The information about the bases is secret to everyone except Alice and Bob. Extended GHZ-W states are used as decoy photons, whose positions in the information sequence are encoded with the identity string ID of the legal user, and the eavesdropping detection rate reaches 81%. The eavesdropping detection based on the extended GHZ-W state, combined with authentication and the secret ID, ensures the security of the protocol.
Review and publication of protocol submissions to Trials - what have we learned in 10 years?
Li, Tianjing; Boutron, Isabelle; Al-Shahi Salman, Rustam; Cobo, Erik; Flemyng, Ella; Grimshaw, Jeremy M; Altman, Douglas G
2016-12-16
Trials has 10 years of experience in providing open access publication of protocols for randomised controlled trials. In this editorial, the senior editors and editors-in-chief of Trials discuss editorial issues regarding managing trial protocol submissions, including the content and format of the protocol, timing of submission, approaches to tracking protocol amendments, and the purpose of peer reviewing a protocol submission. With the clarification and guidance provided, we hope we can make the process of publishing trial protocols more efficient and useful to trial investigators and readers.
Accurate typing of short tandem repeats from genome-wide sequencing data and its applications.
Fungtammasan, Arkarachai; Ananda, Guruprasad; Hile, Suzanne E; Su, Marcia Shu-Wei; Sun, Chen; Harris, Robert; Medvedev, Paul; Eckert, Kristin; Makova, Kateryna D
2015-05-01
Short tandem repeats (STRs) are implicated in dozens of human genetic diseases and contribute significantly to genome variation and instability. Yet profiling STRs from short-read sequencing data is challenging because of their high sequencing error rates. Here, we developed STR-FM, short tandem repeat profiling using flank-based mapping, a computational pipeline that can detect the full spectrum of STR alleles from short-read data, can adapt to emerging read-mapping algorithms, and can be applied to heterogeneous genetic samples (e.g., tumors, viruses, and genomes of organelles). We used STR-FM to study STR error rates and patterns in publicly available human and in-house generated ultradeep plasmid sequencing data sets. We discovered that STRs sequenced with a PCR-free protocol have up to ninefold fewer errors than those sequenced with a PCR-containing protocol. We constructed an error correction model for genotyping STRs that can distinguish heterozygous alleles containing STRs with consecutive repeat numbers. Applying our model and pipeline to Illumina sequencing data with 100-bp reads, we could confidently genotype several disease-related long trinucleotide STRs. Utilizing this pipeline, for the first time we determined the genome-wide STR germline mutation rate from a deeply sequenced human pedigree. Additionally, we built a tool that recommends minimal sequencing depth for accurate STR genotyping, depending on repeat length and sequencing read length. The required read depth increases with STR length and is lower for a PCR-free protocol. This suite of tools addresses the pressing challenges surrounding STR genotyping, and thus is of wide interest to researchers investigating disease-related STRs and STR evolution. © 2015 Fungtammasan et al.; Published by Cold Spring Harbor Laboratory Press.
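The flank-based idea behind STR-FM can be illustrated with a few lines of Python. The sketch below is not the STR-FM code: it simply locates user-supplied 5' and 3' flanking sequences in a read by exact matching and counts complete repeat units between them; the read, flanks and repeat unit are made-up examples, and the real pipeline instead maps flanks with a read aligner and applies an error-correction model.

    def count_str_units(read, flank5, flank3, unit):
        """Toy flank-based STR profiling: return the number of complete repeat
        units found between two exact-match flanks, or None if the flanks are
        missing (illustrative only; STR-FM uses read mapping plus an error model)."""
        start = read.find(flank5)
        if start == -1:
            return None
        end = read.find(flank3, start + len(flank5))
        if end == -1:
            return None
        tract = read[start + len(flank5):end]
        n = 0
        while tract.startswith(unit * (n + 1)):   # count uninterrupted copies
            n += 1
        return n

    # Hypothetical read carrying a (CAG)n tract between two invented flanks.
    read = "TTGACCTA" + "CAG" * 7 + "GGATCCAA"
    print(count_str_units(read, "TTGACCTA", "GGATCCAA", "CAG"))   # -> 7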
Synthesis of Polysubstituted Pyridines via a One-Pot Metal-Free Strategy.
Wei, Hongbo; Li, Yun; Xiao, Ke; Cheng, Bin; Wang, Huifei; Hu, Lin; Zhai, Hongbin
2015-12-18
An efficient strategy for the one-pot synthesis of polysubstituted pyridines via a cascade reaction from aldehydes, phosphorus ylides, and propargyl azide is reported. The reaction sequence involves a Wittig reaction, a Staudinger reaction, an aza-Wittig reaction, a 6π-3-azatriene electrocyclization, and a 1,3-H shift. This protocol provides quick access to the polysubstituted pyridines from readily available substrates in good to excellent yields.
Visualization of nucleic acids with synthetic exciton-controlled fluorescent oligonucleotide probes.
Wang, Dan Ohtan; Okamoto, Akimitsu
2015-01-01
Engineered probes that adopt new photochemical properties upon recognition of target nucleic acids offer powerful tools for DNA and RNA visualization. Here, we describe a rapid and effective method for visualizing nucleic acids in both fixed and living cells with hybridization-sensitive fluorescent oligonucleotide probes. These probes are efficiently quenched in an aqueous environment owing to homodimeric, excitonic interactions between the fluorophores but become highly fluorescent upon hybridization to DNA or RNA with complementary sequences. The fast hybridization kinetics and quick fluorescence activation of the new probes simplify conventional fluorescence in situ hybridization protocols and reduce the time needed to process the samples. Furthermore, the hybridization-sensitive fluorescence emission of the probes allows monitoring of the dynamic behavior of RNA in living cells.
'PACLIMS': a component LIM system for high-throughput functional genomic analysis.
Donofrio, Nicole; Rajagopalon, Ravi; Brown, Douglas; Diener, Stephen; Windham, Donald; Nolin, Shelly; Floyd, Anna; Mitchell, Thomas; Galadima, Natalia; Tucker, Sara; Orbach, Marc J; Patel, Gayatri; Farman, Mark; Pampanwar, Vishal; Soderlund, Cari; Lee, Yong-Hwan; Dean, Ralph A
2005-04-12
Recent advances in sequencing techniques leading to cost reduction have resulted in the generation of a growing number of sequenced eukaryotic genomes. Computational tools greatly assist in defining open reading frames and assigning tentative annotations. However, gene functions cannot be asserted without biological support through, among other things, mutational analysis. In taking a genome-wide approach to functionally annotating an entire organism (in this application, the approximately 11,000 predicted genes of the rice blast fungus Magnaporthe grisea), an effective platform was required for tracking and storing both the biological materials created and the data produced across several participating institutions. The platform designed, named PACLIMS, was built to support our high-throughput pipeline for generating 50,000 random insertion mutants of Magnaporthe grisea. To be a useful tool for materials and data tracking and storage, PACLIMS was designed to be simple to use, modifiable to accommodate refinement of research protocols, and cost-efficient. Data entry into PACLIMS was simplified through the use of barcodes and scanners, thus reducing potential human error, time constraints, and labor. This platform was designed in concert with our experimental protocol so that it leads the researchers through each step of the process from mutant generation through phenotypic assays, thus ensuring that every mutant produced is handled in an identical manner and all necessary data are captured. Many sequenced eukaryotes have reached the point where computational analyses are no longer sufficient and require biological support for their predicted genes. Consequently, there is an increasing need for platforms that support high-throughput genome-wide mutational analyses. While PACLIMS was designed specifically for this project, the source and ideas present in its implementation can be used as a model for other high-throughput mutational endeavors.
Quantum dense key distribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Degiovanni, I.P.; Ruo Berchera, I.; Castelletto, S.
2004-03-01
This paper proposes a protocol for quantum dense key distribution. The protocol combines the benefits of quantum dense coding and quantum key distribution and is able to generate shared secret keys four times more efficiently than the Bennett-Brassard 1984 protocol. We prove the security of this scheme against individual eavesdropping attacks and present preliminary experimental results showing its feasibility.
Energy-efficient quantum frequency estimation
NASA Astrophysics Data System (ADS)
Liuzzo-Scorpo, Pietro; Correa, Luis A.; Pollock, Felix A.; Górecka, Agnieszka; Modi, Kavan; Adesso, Gerardo
2018-06-01
The problem of estimating the frequency of a two-level atom in a noisy environment is studied. Our interest is to minimise both the energetic cost of the protocol and the statistical uncertainty of the estimate. In particular, we prepare a probe in a ‘GHZ-diagonal’ state by means of a sequence of qubit gates applied on an ensemble of n atoms in thermal equilibrium. Noise is introduced via a phenomenological time-non-local quantum master equation, which gives rise to a phase-covariant dissipative dynamics. After an interval of free evolution, the n-atom probe is globally measured at an interrogation time chosen to minimise the error bars of the final estimate. We model explicitly a measurement scheme which becomes optimal in a suitable parameter range, and are thus able to calculate the total energetic expenditure of the protocol. Interestingly, we observe that scaling up our multipartite entangled probes offers no precision enhancement when the total available energy ℰ is limited. This is in stark contrast with standard frequency estimation, where larger probes—more sensitive but also more ‘expensive’ to prepare—are always preferred. Replacing ℰ by the resource that places the most stringent limitation on each specific experimental setup would thus help to formulate more realistic metrological prescriptions.
Fojtu, Michaela; Gumulec, Jaromir; Balvan, Jan; Raudenska, Martina; Sztalmachova, Marketa; Polanska, Hana; Smerkova, Kristyna; Adam, Vojtech; Kizek, Rene; Masarik, Michal
2014-02-01
Determination of serum mRNA has gained a lot of attention in recent years, particularly from the perspective of disease markers. Streptavidin-modified paramagnetic particles (SMPs) offer an interesting isolation technique, mainly because of the possibility of automation and high efficiency. The aim of this study was to optimize a serum isolation protocol to reduce the consumption of chemicals and sample volume. The following factors were optimized: amounts of (i) paramagnetic particles, (ii) oligo(dT)20 probe, (iii) serum, and (iv) the binding sequence (SMPs, oligo(dT)20, serum vs. oligo(dT)20, serum, and SMPs). RNA content was measured, and the expression of metallothionein-2A as a possible prostate cancer marker was analyzed to demonstrate measurable RNA content suitable for RT-PCR detection. Isolation is possible over a serum volume range of 10-200 μL without altering efficiency or purity. The amount of SMPs can be reduced to as little as 5 μL, with optimal results within 10-30 μL of SMPs. The volume of oligo(dT)20 does not affect efficiency when used within 0.1-0.4 μL. This optimized protocol was also modified to fit the needs of automated one-step single-tube analysis, with efficiency identical to that of the conventional setup. The one-step analysis protocol is a promising simplification, making RNA isolation suitable for an automated process. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ma, J; Son, J; Arun, B
Purpose: To develop and demonstrate a short breast (sb) MRI protocol that acquires both T2-weighted and dynamic contrast-enhanced T1-weighted images in approximately ten minutes. Methods: The sb-MRI protocol consists of two novel pulse sequences. The first is a flexible fast spin-echo triple-echo Dixon (FTED) sequence for high-resolution fat-suppressed T2-weighted imaging, and the second is a 3D fast dual-echo spoiled gradient sequence (FLEX) for volumetric fat-suppressed T1-weighted imaging before and after contrast agent injection. The flexible FTED sequence replaces each single readout during every echo-spacing period of FSE with three fast-switching bipolar readouts to produce three raw images in a single acquisition. These three raw images are then post-processed using a Dixon algorithm to generate separate water-only and fat-only images. The FLEX sequence acquires two echoes using a dual-echo readout after each RF excitation, and the corresponding images are post-processed using a similar Dixon algorithm to yield water-only and fat-only images. The sb-MRI protocol was implemented on a 3T MRI scanner and used for patients who had undergone concurrent clinical MRI for breast cancer screening. Results: With the same scan parameters (e.g., spatial coverage, field of view, spatial and temporal resolution) as the clinical protocol, the total scan time of the sb-MRI protocol (including the localizer, bilateral T2-weighted, and dynamic contrast-enhanced T1-weighted images) was 11 minutes. In comparison, the clinical breast MRI protocol took 43 minutes. Uniform fat suppression and high image quality were consistently achieved by sb-MRI. Conclusion: We demonstrated that a sb-MRI protocol comprising both T2-weighted and dynamic contrast-enhanced T1-weighted images can be performed in approximately ten minutes. The spatial and temporal resolution of the images easily satisfies the current breast MRI accreditation guidelines of the American College of Radiology. The protocol has the potential of making breast MRI more widely accessible to, and more tolerable for, patients. JMA is the inventor of United States patents that are owned by the University of Texas Board of Regents and currently licensed to GE Healthcare and Siemens Gmbh.
Applications of the pipeline environment for visual informatics and genomics computations
2011-01-01
Background Contemporary informatics and genomics research requires efficient, flexible and robust management of large heterogeneous data, advanced computational tools, powerful visualization, reliable hardware infrastructure, interoperability of computational resources, and detailed data and analysis-protocol provenance. The Pipeline is a client-server distributed computational environment that facilitates the visual graphical construction, execution, monitoring, validation and dissemination of advanced data analysis protocols. Results This paper reports on the applications of the LONI Pipeline environment to address two informatics challenges - graphical management of diverse genomics tools, and the interoperability of informatics software. Specifically, this manuscript presents the concrete details of deploying general informatics suites and individual software tools to new hardware infrastructures, the design, validation and execution of new visual analysis protocols via the Pipeline graphical interface, and the integration of diverse informatics tools via the Pipeline eXtensible Markup Language syntax. We demonstrate each of these processes using several established informatics packages (e.g., miBLAST, EMBOSS, mrFAST, GWASS, MAQ, SAMtools, Bowtie) for basic local sequence alignment and search, molecular biology data analysis, and genome-wide association studies. These examples demonstrate the power of the Pipeline graphical workflow environment to enable integration of bioinformatics resources which provide a well-defined syntax for dynamic specification of the input/output parameters and the run-time execution controls. Conclusions The LONI Pipeline environment http://pipeline.loni.ucla.edu provides a flexible graphical infrastructure for efficient biomedical computing and distributed informatics research. The interactive Pipeline resource manager enables the utilization and interoperability of diverse types of informatics resources. The Pipeline client-server model provides computational power to a broad spectrum of informatics investigators - experienced developers and novice users, users with or without access to advanced computational resources (e.g., Grid, data), as well as basic and translational scientists. The open development, validation and dissemination of computational networks (pipeline workflows) facilitates the sharing of knowledge, tools, protocols and best practices, and enables the unbiased validation and replication of scientific findings by the entire community. PMID:21791102
Omori's Law Applied to Mining-Induced Seismicity and Re-entry Protocol Development
NASA Astrophysics Data System (ADS)
Vallejos, J. A.; McKinnon, S. D.
2010-02-01
This paper describes a detailed study of the Modified Omori's law n(t) = K/(c + t)^p applied to 163 mining-induced aftershock sequences from four different mine environments in Ontario, Canada. We demonstrate, using a rigorous statistical analysis, that this equation can be adequately used to describe the decay rate of mining-induced aftershock sequences. The parameters K, p and c are estimated using a uniform method that employs the maximum likelihood procedure and the Anderson-Darling statistic. To estimate consistent decay parameters, the method considers only the time interval that satisfies power-law behavior. The p value differs from sequence to sequence, with most (98%) ranging from 0.4 to 1.6. The parameter K can be satisfactorily expressed by K = κ·N_1, where κ is an activity ratio and N_1 is the measured number of events occurring during the first hour after the principal event. The average κ values are in a well-defined range: theoretically κ ≤ 0.8, and empirically κ ∈ [0.3-0.5]. These two findings enable us to develop a real-time event-rate re-entry protocol 1 h after the principal event. Despite the fact that the Omori formula is temporally self-similar, we found a characteristic time T_MC at the maximum curvature point, which is a function of Omori's law parameters. For a time sequence obeying an Omori process, T_MC marks the transition from the highest to the lowest rate of change of the event rate. Using solely the aftershock decay rate, therefore, we recommend T_MC as a preliminary estimate of the time at which it may be considered appropriate to re-enter an area affected by a blast or large event. We found that T_MC can be estimated without specifying a p value by the expression T_MC = a·N_1^(1/b), where a and b are two parameters dependent on local conditions. Both parameters presented well-constrained empirical ranges for the sites analyzed: a ∈ [0.3-0.5] and b ∈ [0.5-0.7]. These findings provide concise and well-justified guidelines for event-rate re-entry protocol development.
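For a concrete feel of the quantities involved, the sketch below evaluates the event rate and the maximum-curvature re-entry time using the relations reconstructed above, n(t) = K/(c + t)^p with K = κ·N_1 and T_MC = a·N_1^(1/b). The numerical values of κ, c, p, a, b and N_1 are hypothetical, chosen only to fall inside the empirical ranges quoted in the abstract, and are not calibrated to any real mine.

    def omori_rate(t, K, c, p):
        """Modified Omori event rate n(t) = K / (c + t)**p."""
        return K / (c + t) ** p

    def reentry_time(N1, a=0.4, b=0.6):
        """Preliminary re-entry estimate T_MC = a * N1**(1/b); a and b are
        placeholders inside the empirical ranges a in [0.3-0.5], b in [0.5-0.7]."""
        return a * N1 ** (1.0 / b)

    N1 = 10                      # hypothetical: 10 events in the first hour
    kappa, c, p = 0.4, 0.1, 1.0  # assumed decay parameters
    K = kappa * N1
    print("rate 1 h after the event:", omori_rate(1.0, K, c, p), "events/h")
    print("re-entry estimate T_MC :", reentry_time(N1), "h")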
Neurovascular Study of the Trigeminal Nerve at 3 T MRI
Gonzalez, Nadia; Muñoz, Alexandra; Bravo, Fernando; Sarroca, Daniel; Morales, Carlos
2015-01-01
This study aimed to show a novel visualization method to investigate neurovascular compression of the trigeminal nerve using a volume-rendering fusion imaging technique of 3D fast imaging employing steady-state acquisition (3D FIESTA) and coregistered 3D time-of-flight MR angiography (3D TOF MRA) sequences, which we called “neurovascular study of the trigeminal nerve”. We prospectively studied 30 patients with unilateral trigeminal neuralgia (TN) and 50 subjects without symptoms of TN (control group) on a 3 Tesla scanner. All patients were assessed using 3D FIESTA and 3D TOF MRA sequences centered on the pons, as well as a standard brain protocol including axial T1, T2, FLAIR and GRE sequences to exclude other pathologies that could cause TN. Post-contrast T1-weighted sequences were also performed. All cases showing arterial imprinting on the trigeminal nerve (n = 11) were identified on the side ipsilateral to the pain. No significant relationship was found between the presence of an artery in contact with the trigeminal nerve and TN. Eight cases showed arterial contact on the side ipsilateral to the pain and five cases showed arterial contact on the contralateral side. The fusion imaging technique of 3D FIESTA and 3D TOF MRA sequences, combining the high anatomical detail provided by the 3D FIESTA sequence with the capacity of the 3D TOF MRA sequence to depict arterial structures, results in a tool that enables quick and efficient visualization and assessment of the relationship between the trigeminal nerve and the neighboring vascular structures. PMID:25924169
Dilliott, Allison A; Farhan, Sali M K; Ghani, Mahdi; Sato, Christine; Liang, Eric; Zhang, Ming; McIntyre, Adam D; Cao, Henian; Racacho, Lemuel; Robinson, John F; Strong, Michael J; Masellis, Mario; Bulman, Dennis E; Rogaeva, Ekaterina; Lang, Anthony; Tartaglia, Carmela; Finger, Elizabeth; Zinman, Lorne; Turnbull, John; Freedman, Morris; Swartz, Rick; Black, Sandra E; Hegele, Robert A
2018-04-04
Next-generation sequencing (NGS) is quickly revolutionizing how research into the genetic determinants of constitutional disease is performed. The technique is highly efficient with millions of sequencing reads being produced in a short time span and at relatively low cost. Specifically, targeted NGS is able to focus investigations to genomic regions of particular interest based on the disease of study. Not only does this further reduce costs and increase the speed of the process, but it lessens the computational burden that often accompanies NGS. Although targeted NGS is restricted to certain regions of the genome, preventing identification of potential novel loci of interest, it can be an excellent technique when faced with a phenotypically and genetically heterogeneous disease, for which there are previously known genetic associations. Because of the complex nature of the sequencing technique, it is important to closely adhere to protocols and methodologies in order to achieve sequencing reads of high coverage and quality. Further, once sequencing reads are obtained, a sophisticated bioinformatics workflow is utilized to accurately map reads to a reference genome, to call variants, and to ensure the variants pass quality metrics. Variants must also be annotated and curated based on their clinical significance, which can be standardized by applying the American College of Medical Genetics and Genomics Pathogenicity Guidelines. The methods presented herein will display the steps involved in generating and analyzing NGS data from a targeted sequencing panel, using the ONDRISeq neurodegenerative disease panel as a model, to identify variants that may be of clinical significance.
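As a minimal illustration of the quality-metric step mentioned above, the snippet below filters called variants by read depth and genotype quality. The threshold values, field names and example records are assumptions made for the sketch and are not the criteria used by the ONDRISeq pipeline.

    MIN_DEPTH = 30      # assumed minimum read depth at the variant position
    MIN_GQ = 20         # assumed minimum genotype quality

    def passes_qc(variant):
        """Return True if a called variant meets the illustrative QC cut-offs."""
        return variant["depth"] >= MIN_DEPTH and variant["genotype_quality"] >= MIN_GQ

    calls = [
        {"id": "var1", "depth": 182, "genotype_quality": 99},
        {"id": "var2", "depth": 11, "genotype_quality": 35},
    ]
    print([v["id"] for v in calls if passes_qc(v)])   # -> ['var1']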
Intelligent routing protocol for ad hoc wireless network
NASA Astrophysics Data System (ADS)
Peng, Chaorong; Chen, Chang Wen
2006-05-01
A novel routing scheme for mobile ad hoc networks (MANETs), which combines hybrid and multi-inter-routing path properties with a distributed topology discovery route mechanism using control agents, is proposed in this paper. In recent years, a variety of hybrid routing protocols for mobile ad hoc wireless networks (MANETs) have been developed. These protocols proactively maintain routing information for a local neighborhood, while reactively acquiring routes to destinations beyond it. A hybrid protocol reduces routing discovery latency and end-to-end delay by providing high connectivity without requiring much of the scarce network capacity. On the other hand, hybrid routing protocols in MANETs, such as the Zone Routing Protocol, still need route re-discovery time when a link on a route between zones breaks, since the topology update information needs to be broadcast as routing requests in the local zone. Due to this delay, the routing protocol may not be applicable to real-time data and multimedia communication. We utilize the advantages of a clustering organization and multiple routing paths in the routing protocol to achieve several goals at the same time. Firstly, IRP efficiently saves network bandwidth and reduces route reconstruction time when a routing path fails. The IRP protocol does not require global periodic routing advertisements; local control agents automatically monitor and repair broken links. Secondly, it efficiently reduces congestion and traffic bottlenecks for cluster heads in the clustered network. Thirdly, it reduces the significant overheads associated with maintaining clusters. Fourthly, it improves cluster stability when the dynamic topology changes frequently. In this paper, we present the Intelligent Routing Protocol. First, we discuss the problem of routing in ad hoc networks and the motivation for IRP. We describe the hierarchical architecture of IRP, then the routing process, and illustrate it with an example. Further, we describe the control management mechanisms, which are used to control active routes and reduce the traffic generated in the route discovery procedure. Finally, numerical experiments are given to show the effectiveness of the IRP routing protocol.
Time-of-flight magnetic resonance angiography (TOF-MRA) of the normal equine head.
Manso-Díaz, G; García-Real, M I; Casteleyn, C; San-Román, F; Taeymans, O
2013-03-01
Noncontrast magnetic resonance angiography (MRA) is widely used in human and small animal medicine. However, this technique has not yet been described in the horse, and compared with other angiographic techniques MRA could be more cost-efficient and potentially safer. The aim of this study was to provide a comprehensive anatomical reference for the normal equine head vasculature using a noncontrast MRA technique on both low- and high-field MRI. Five healthy adult horses were examined, four with a low-field magnet (0.23T) and the remaining one with a high-field magnet (1.5T). The magnetic resonance angiography sequence used was 2D time-of-flight (TOF) MRA, and CT images of a vascular corrosion cast were subsequently used as anatomical references. The MRA imaging protocol provided good visualisation of all major intra- and extracranial vessels down to a size of approximately 2 mm in diameter on both low- and high-field systems. This resulted in identification of vessels to the order of 3rd-4th branches of ramification. The visibility of the arteries was higher than that of the veins, which showed lower signal intensity. Overall, MRA obtained with the high-field protocol provided better visualisation of the arteries, showing all the small arterial branches with superior resolution. The use of a specific vascular sequence such as 2D TOF MRA allows good visualisation of the equine head vasculature and eliminates the need for contrast media. Magnetic resonance angiography allows visualisation of the vasculature of the equine head. Vessel morphology, symmetry and size can be evaluated, and this may play a role in preoperative planning or characterisation of diseases of the head, such as neoplasia or guttural pouch mycosis. © 2012 EVJ Ltd.
ERIC Educational Resources Information Center
Cromley, Jennifer G.; Wills, Theodore W.
2016-01-01
Van den Broek's landscape model explicitly posits sequences of moves during reading in real time. Two other models that implicitly describe sequences of processes during reading are tested in the present research. Coded think-aloud data from 24 undergraduate students reading scientific text were analysed with lag-sequential techniques to compare…
Zhang, Yiming; Jin, Quan; Wang, Shuting; Ren, Ren
2011-05-01
The mobility behavior of 1481 peptides in ion mobility spectrometry (IMS), generated by protease digestion of the Drosophila melanogaster proteome, is modeled and predicted based on two different types of characterization methods, i.e., a sequence-based approach and a structure-based approach. The sequence-based approach considers both the amino acid composition of a peptide and the local environment profile of each amino acid in the peptide; the structure-based approach is performed with the CODESSA protocol, which regards a peptide as a common organic compound and generates more than 200 statistically significant variables to characterize the whole structure profile of a peptide molecule. Subsequently, nonlinear support vector machine (SVM) and Gaussian process (GP) regression, as well as linear partial least squares (PLS) regression, are employed to correlate the structural parameters of the characterizations with the IMS drift times of these peptides. The obtained quantitative structure-spectrum relationship (QSSR) models are evaluated rigorously and investigated systematically via both one-deep and two-deep cross-validations as well as rigorous Monte Carlo cross-validation (MCCV). We also give a comprehensive comparison of the resulting statistics arising from the different combinations of variable types with modeling methods and find that the sequence-based approach gives QSSR models with better fitting ability and predictive power but worse interpretability than the structure-based approach. In addition, because the sequence-based approach does not require preparation of energy-minimized peptide structures before modeling, it is considerably more efficient than the structure-based approach. Copyright © 2011 Elsevier Ltd. All rights reserved.
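To make the sequence-based route concrete, here is a minimal sketch, not the authors' code, that converts peptides into simple amino-acid-composition features and fits a nonlinear SVM regressor to drift times; the peptides and drift-time values are fabricated placeholders used only to show the workflow, and the real models also include local-environment descriptors and far larger training sets.

    import numpy as np
    from sklearn.svm import SVR

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def composition_features(peptide):
        """Fraction of each of the 20 amino acids in the peptide (the simplest
        sequence-based characterization; the paper adds local-environment terms)."""
        return np.array([peptide.count(a) / len(peptide) for a in AMINO_ACIDS])

    # Toy training set of (peptide, IMS drift time in ms) -- placeholder values.
    data = [("ACDEFGK", 8.1), ("LLGNPAR", 7.4), ("WYSTVKE", 9.0),
            ("MKPQRSA", 7.9), ("GGHILNT", 6.8), ("DDEEKKR", 8.6)]
    X = np.vstack([composition_features(p) for p, _ in data])
    y = np.array([t for _, t in data])

    model = SVR(kernel="rbf", C=10.0).fit(X, y)
    print(model.predict(composition_features("ACKLWGE").reshape(1, -1)))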
Efficiency of different protocols for enamel clean-up after bracket debonding: an in vitro study
Sigilião, Lara Carvalho Freitas; Marquezan, Mariana; Elias, Carlos Nelson; Ruellas, Antônio Carlos; Sant'Anna, Eduardo Franzotti
2015-01-01
Objective: This study aimed to assess the efficiency of six protocols for cleaning up tooth enamel after bracket debonding. Methods: A total of 60 premolars were divided into six groups, according to the tools used for clean-up: 12-blade bur at low speed (G12L), 12-blade bur at high speed (G12H), 30-blade bur at low speed (G30L), DU10CO ORTHO polisher (GDU), Renew System (GR) and Diagloss polisher (GD). Mean roughness (Ra) and mean roughness depth (Rz) of the enamel surface were analyzed with a profilometer. A paired t-test was used to assess Ra and Rz before and after enamel clean-up. ANOVA/Tukey tests were used for intergroup comparison. The duration of the removal procedures was recorded. The association between time and the variation in enamel roughness (∆Ra, ∆Rz) was evaluated by Pearson's correlation test. Enamel topography was assessed by scanning electron microscopy (SEM). Results: In Groups G12L and G12H, the original enamel roughness did not change significantly. In Groups G30L, GDU, GR and GD, a smoother surface (p < 0.05) was found after clean-up. In Groups G30L and GD, the protocols used were more time-consuming than those used in the other groups. Negative, moderate correlations were observed between time and ∆Ra and between time and ∆Rz (r = -0.445 and r = -0.475, respectively; p < 0.01). Conclusion: All enamel clean-up protocols were efficient because they did not result in increased surface roughness. The longer the time spent performing the protocol, the lower the surface roughness. PMID:26560825
Reddy, Alavalapati Goutham; Das, Ashok Kumar; Odelu, Vanga; Yoo, Kee-Young
2016-01-01
Biometric-based authentication protocols for multi-server architectures have gained momentum in recent times due to advancements in wireless technologies and associated constraints. Lu et al. recently proposed a robust biometric-based authentication with key agreement protocol for a multi-server environment using smart cards. They claimed that their protocol is efficient and resistant to prominent security attacks. Careful investigation in this paper shows that Lu et al.’s protocol does not provide user anonymity or perfect forward secrecy and is susceptible to server and user impersonation attacks, man-in-the-middle attacks and clock synchronization problems. In addition, this paper proposes an enhanced biometric-based authentication with key-agreement protocol for multi-server architecture based on elliptic curve cryptography using smartcards. We prove that the proposed protocol achieves mutual authentication using Burrows-Abadi-Needham (BAN) logic. The formal security of the proposed protocol is verified using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our protocol can withstand active and passive attacks. The formal and informal security analyses and the performance analysis demonstrate that the proposed protocol is robust and efficient compared to Lu et al.’s protocol and existing similar protocols. PMID:27163786
Banothu, Janardhan; Gali, Rajitha; Velpula, Ravibabu; Bavantula, Rajitha; Crooks, Peter A.
2013-01-01
A highly efficient and eco-friendly protocol for the synthesis of bis(3-indolyl)methanes by the electrophilic substitution reaction of indole with aldehydes catalyzed by poly(4-vinylpyridinium) hydrogen sulfate is described. Excellent yields, shorter reaction times, a simple work-up procedure, avoidance of hazardous organic solvents, and reusability of the catalyst are the most obvious advantages of this method. PMID:24052864
Analysis and Visualization of ChIP-Seq and RNA-Seq Sequence Alignments Using ngs.plot.
Loh, Yong-Hwee Eddie; Shen, Li
2016-01-01
The continual maturation and increasing applications of next-generation sequencing technology in scientific research have yielded ever-increasing amounts of data that need to be effectively and efficiently analyzed and innovatively mined for new biological insights. We have developed ngs.plot, a quick and easy-to-use bioinformatics tool that visualizes the spatial relationships between sequencing alignment enrichment and specific genomic features or regions. More importantly, ngs.plot is customizable beyond the use of standard genomic feature databases, allowing the analysis and visualization of user-specified regions of interest generated by the user's own hypotheses. In this protocol, we demonstrate and explain the use of ngs.plot using command line executions, as well as a web-based workflow on the Galaxy framework. We replicate the underlying commands used in the analysis of a true biological dataset that we reported and published earlier and demonstrate how ngs.plot can easily generate publication-ready figures. With ngs.plot, users are able to efficiently and innovatively mine their own datasets without having to be involved in the technical aspects of sequence coverage calculations and genomic databases.
Energy-efficient and fast data gathering protocols for indoor wireless sensor networks.
Tümer, Abdullah Erdal; Gündüz, Mesut
2010-01-01
Wireless sensor networks have become an important technology with numerous potential applications for the interaction of computers and the physical environment in civilian and military areas. In the routing protocols that are specifically designed for sensor network applications, the limited available power of the sensor nodes is taken into consideration in order to extend the lifetime of the networks. In this paper, two protocols based on LEACH, called R-EERP and S-EERP and using base and threshold values, are presented. R-EERP and S-EERP are two efficient energy-aware routing protocols that can be used for critical applications such as detecting dangerous gases (methane, ammonium, carbon monoxide, etc.) in an indoor environment. In R-EERP, sensor nodes are deployed randomly in a field, similar to LEACH. In S-EERP, nodes are deployed sequentially in the rooms of the flats of a multi-story building. In both protocols, the nodes forming the clusters do not change during a cluster change time; only the cluster heads change. Furthermore, an XOR operation is performed on the collected data in order to prevent the sending of the same data sensed by nodes close to each other. Simulation results show that our proposed protocols are more energy-efficient than the conventional LEACH protocol.
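One plausible reading of the XOR step described above is sketched below: a cluster head XORs each incoming payload with the one it last forwarded and drops the packet when the result is all zeros, i.e., when nearby nodes report identical data. This is an interpretation offered for illustration, with invented payloads, not the R-EERP/S-EERP implementation.

    def forward_if_new(readings):
        """XOR-based duplicate suppression (illustrative interpretation): forward a
        reading only when it differs from the previously forwarded one, detected by
        a non-zero XOR of the byte payloads."""
        forwarded = []
        last = None
        for payload in readings:                     # payload: bytes from a sensor node
            if last is not None and len(payload) == len(last):
                xor = bytes(a ^ b for a, b in zip(payload, last))
                if not any(xor):                     # identical reading -> drop it
                    continue
            forwarded.append(payload)
            last = payload
        return forwarded

    samples = [b"\x17\x02", b"\x17\x02", b"\x18\x02"]    # two nodes report the same value
    print(forward_if_new(samples))                       # -> [b'\x17\x02', b'\x18\x02']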
Fonseca, Alejandra; Renjifo-Ibáñez, Camila; Renjifo, Juan Manuel; Cabrera, Rodrigo
2018-03-21
Snake venoms are mixtures of different molecules that can be used in the design of drugs for various diseases. The study of these venoms has relied on strategies that use complete venom extracted from animals in captivity or from venom glands, which requires the sacrifice of the animals. In Colombia, a country with political and geographical conflicts, access to certain regions is difficult. A strategy that avoids the sacrifice of animals and allows the study of samples collected in the field is therefore needed. We report the use of lyophilized venom from Crotalus durissus cumanensis as a model to test, for the first time, a protocol for the amplification of complete toxins from Colombian venom samples collected in the field. In this protocol, primers were designed from conserved regions of Crotalus sp. mRNA and EST sequences to maximize the likelihood of coding-sequence amplification. We obtained the sequences of Metalloproteinases II, Disintegrins, Disintegrin-like proteins, Phospholipases A2, C-type Lectins and Serine proteinases from Crotalus durissus cumanensis and compared them to different Crotalus sp. sequences available in databases, obtaining concordance between the toxins amplified and those reported. Our strategy allows the use of lyophilized venom to obtain complete toxin sequences from samples collected in the field and the study of poorly characterized venoms in challenging environments. Copyright © 2018 Elsevier Ltd. All rights reserved.
Efficient generation of monoclonal antibodies from single rhesus macaque antibody secreting cells.
Meng, Weixu; Li, Leike; Xiong, Wei; Fan, Xuejun; Deng, Hui; Bett, Andrew J; Chen, Zhifeng; Tang, Aimin; Cox, Kara S; Joyce, Joseph G; Freed, Daniel C; Thoryk, Elizabeth; Fu, Tong-Ming; Casimiro, Danilo R; Zhang, Ningyan; A Vora, Kalpit; An, Zhiqiang
2015-01-01
Nonhuman primates (NHPs) are used as a preclinical model for vaccine development, and the antibody profiles elicited by experimental vaccines in NHPs can provide critical information for both vaccine design and translation to clinical efficacy. However, an efficient protocol for generating monoclonal antibodies from single antibody-secreting cells of NHPs has been lacking. In this study we established a robust protocol for cloning immunoglobulin (IG) variable domain genes from single rhesus macaque (Macaca mulatta) antibody-secreting cells. A sorting strategy was developed using a panel of molecular markers (CD3, CD19, CD20, surface IgG, intracellular IgG, CD27, Ki67 and CD38) to identify the kinetics of the B cell response after vaccination. Specific primers for the rhesus macaque IG genes were designed and validated using cDNA isolated from macaque peripheral blood mononuclear cells. Cloning efficiency averaged 90% for the variable heavy (VH) and light (VL) domains, and 78.5% of the clones (n = 335) were matched VH and VL pairs. Sequence analysis revealed that diverse IGHV subgroups (for VH) and IGKV and IGLV subgroups (for VL) were represented in the cloned antibodies. The protocol was tested in a study using an experimental dengue vaccine candidate. About 26.6% of the monoclonal antibodies cloned from the vaccinated rhesus macaques reacted with the dengue vaccine antigens. These results validate the protocol for cloning monoclonal antibodies in response to vaccination from single macaque antibody-secreting cells, with general applicability for determining monoclonal antibody profiles in response to other immunogens or vaccine studies of interest in NHPs.
Computationally mapping sequence space to understand evolutionary protein engineering.
Armstrong, Kathryn A; Tidor, Bruce
2008-01-01
Evolutionary protein engineering has been dramatically successful, producing a wide variety of new proteins with altered stability, binding affinity, and enzymatic activity. However, the success of such procedures is often unreliable, and the impact of the choice of protein, engineering goal, and evolutionary procedure is not well understood. We have created a framework for understanding aspects of the protein engineering process by computationally mapping regions of feasible sequence space for three small proteins using structure-based design protocols. We then tested the ability of different evolutionary search strategies to explore these sequence spaces. The results point to a non-intuitive relationship between the error-prone PCR mutation rate and the number of rounds of replication. The evolutionary relationships among feasible sequences reveal hub-like sequences that serve as particularly fruitful starting sequences for evolutionary search. Moreover, genetic recombination procedures were examined, and tradeoffs relating sequence diversity and search efficiency were identified. This framework allows us to consider the impact of protein structure on the allowed sequence space and therefore on the challenges that each protein presents to error-prone PCR and genetic recombination procedures.
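The trade-off between error-prone PCR mutation rate and the number of rounds can be pictured with a toy directed-evolution loop. The sketch below is not the authors' framework: the alphabet, target, fitness function and population sizes are arbitrary choices, but rerunning it with different `rate` and `rounds` settings mimics the kind of comparison discussed above.

    import random

    ALPHABET = "ACDEFGHIKLMNPQRSTVWY"
    TARGET = "MKTAYIAKQR"                  # arbitrary 'ideal' sequence for the toy fitness

    def fitness(seq):
        """Toy fitness: number of positions matching the arbitrary target."""
        return sum(a == b for a, b in zip(seq, TARGET))

    def mutate(seq, rate):
        """Error-prone-PCR-like step: each position is resampled with probability
        `rate` (and may occasionally resample the same residue)."""
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in seq)

    def evolve(start, rate, rounds, pool=200, keep=20):
        population = [start]
        for _ in range(rounds):
            offspring = [mutate(random.choice(population), rate) for _ in range(pool)]
            population = sorted(offspring, key=fitness, reverse=True)[:keep]
        return max(population, key=fitness)

    random.seed(0)
    best = evolve("AAAAAAAAAA", rate=0.05, rounds=8)
    print(best, fitness(best))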
Bidirectional Retroviral Integration Site PCR Methodology and Quantitative Data Analysis Workflow.
Suryawanshi, Gajendra W; Xu, Song; Xie, Yiming; Chou, Tom; Kim, Namshin; Chen, Irvin S Y; Kim, Sanggu
2017-06-14
Integration Site (IS) assays are a critical component of the study of retroviral integration sites and their biological significance. In recent retroviral gene therapy studies, IS assays, in combination with next-generation sequencing, have been used as a cell-tracking tool to characterize clonal stem cell populations sharing the same IS. For the accurate comparison of repopulating stem cell clones within and across different samples, the detection sensitivity, data reproducibility, and high-throughput capacity of the assay are among the most important assay qualities. This work provides a detailed protocol and data analysis workflow for bidirectional IS analysis. The bidirectional assay can simultaneously sequence both upstream and downstream vector-host junctions. Compared to conventional unidirectional IS sequencing approaches, the bidirectional approach significantly improves IS detection rates and the characterization of integration events at both ends of the target DNA. The data analysis pipeline described here accurately identifies and enumerates identical IS sequences through multiple steps of comparison that map IS sequences onto the reference genome and determine sequencing errors. Using an optimized assay procedure, we have recently published the detailed repopulation patterns of thousands of Hematopoietic Stem Cell (HSC) clones following transplant in rhesus macaques, demonstrating for the first time the precise time point of HSC repopulation and the functional heterogeneity of HSCs in the primate system. The following protocol describes the step-by-step experimental procedure and data analysis workflow that accurately identifies and quantifies identical IS sequences.
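The enumeration step can be thought of as grouping mapped vector-host junctions that fall within a few base pairs of one another, so that sequencing and mapping errors do not split one clone into several apparent sites. The sketch below is a simplified stand-in for the published pipeline; the window size and the input tuples are assumptions for illustration.

    from collections import defaultdict

    def cluster_integration_sites(mapped, window=5):
        """Collapse mapped junctions (chrom, strand, position) into clones, merging
        positions within `window` bp to absorb sequencing/mapping errors. Returns a
        clone -> read count dictionary (simplified illustration only)."""
        clones = defaultdict(int)
        anchors = {}                                  # (chrom, strand) -> anchor positions
        for chrom, strand, pos in sorted(mapped):
            key = (chrom, strand)
            for anchor in anchors.get(key, []):
                if abs(pos - anchor) <= window:
                    clones[(chrom, strand, anchor)] += 1
                    break
            else:
                anchors.setdefault(key, []).append(pos)
                clones[(chrom, strand, pos)] += 1
        return dict(clones)

    reads = [("chr2", "+", 1000), ("chr2", "+", 1002), ("chr7", "-", 5000)]
    print(cluster_integration_sites(reads))
    # -> {('chr2', '+', 1000): 2, ('chr7', '-', 5000): 1}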
Sand, Olivier; Thomas-Chollier, Morgane; Vervisch, Eric; van Helden, Jacques
2008-01-01
This protocol shows how to access the Regulatory Sequence Analysis Tools (RSAT) via a programmatic interface in order to automate the analysis of multiple data sets. We describe the steps for writing a Perl client that connects to the RSAT Web services and implements a workflow to discover putative cis-acting elements in promoters of gene clusters. In the presented example, we apply this workflow to lists of transcription factor target genes resulting from ChIP-chip experiments. For each factor, the protocol predicts the binding motifs by detecting significantly overrepresented hexanucleotides in the target promoters and generates a feature map that displays the positions of putative binding sites along the promoter sequences. This protocol is addressed to bioinformaticians and biologists with programming skills (notions of Perl). Running time is approximately 6 min on the example data set.
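The statistic at the heart of this workflow, over-representation of hexanucleotides in a promoter set, can be illustrated without the RSAT services at all. The snippet below counts all 6-mers and flags those whose counts are improbably high under a deliberately naive uniform background (a binomial upper-tail test); the background model, significance threshold and promoter sequences are illustrative assumptions, and RSAT's oligo-analysis uses more realistic backgrounds and multiple-testing corrections.

    from collections import Counter
    from scipy.stats import binom

    def overrepresented_hexamers(promoters, p_expected=1.0 / 4**6, alpha=1e-3):
        """Flag hexanucleotides whose counts are unexpectedly high under a naive
        uniform background (illustration only, not the RSAT implementation)."""
        counts = Counter()
        total = 0
        for seq in promoters:
            seq = seq.upper()
            for i in range(len(seq) - 5):
                counts[seq[i:i + 6]] += 1
                total += 1
        hits = {}
        for hexamer, k in counts.items():
            pval = binom.sf(k - 1, total, p_expected)   # P(count >= k)
            if pval < alpha:
                hits[hexamer] = (k, pval)
        return hits

    # Hypothetical promoters sharing a planted GATAAG box.
    promoters = ["TTGATAAGCTTGCAGATAAGTT", "CCGATAAGATATGGATAAGACG"]
    print(overrepresented_hexamers(promoters))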
NASA Astrophysics Data System (ADS)
Zhou, Zheng; Liu, Chen; Shen, Wensheng; Dong, Zhen; Chen, Zhe; Huang, Peng; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng
2017-04-01
A binary spike-timing-dependent plasticity (STDP) protocol based on a single resistive-switching random access memory (RRAM) device was proposed and experimentally demonstrated in a fabricated RRAM array. Based on the STDP protocol, a novel unsupervised online pattern recognition system comprising RRAM synapses and CMOS neurons was developed. Our simulations show that the system can efficiently complete the handwritten digit recognition task, which indicates the feasibility of using the RRAM-based binary STDP protocol in neuromorphic computing systems to obtain good performance.
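A software caricature of a binary STDP rule helps to see what 'binary' means here: the synaptic weight has only two levels, and the sign of the pre/post spike-time difference decides whether the device is SET or RESET. The function below is a behavioural toy with an assumed time window, not the pulse-level RRAM programming scheme reported in the paper.

    def binary_stdp_update(weight, t_pre, t_post, window=20e-3):
        """Behavioural toy of a binary STDP rule for one RRAM synapse.
        weight is 0 (high-resistance) or 1 (low-resistance). Pre before post
        within `window` -> SET (1); post before pre within `window` -> RESET (0);
        otherwise the device is left untouched."""
        dt = t_post - t_pre
        if 0 < dt <= window:
            return 1            # SET: pre spike helped cause the post spike
        if -window <= dt < 0:
            return 0            # RESET: pre spike arrived after the post spike
        return weight           # outside the window: no programming pulse

    print(binary_stdp_update(0, t_pre=0.010, t_post=0.015))   # -> 1 (SET)
    print(binary_stdp_update(1, t_pre=0.030, t_post=0.015))   # -> 0 (RESET)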
A novel approach to automatic threat detection in MMW imagery of people scanned in portals
NASA Astrophysics Data System (ADS)
Vaidya, Nitin M.; Williams, Thomas
2008-04-01
We have developed a novel approach to performing automatic detection of concealed threat objects in passive MMW imagery of people scanned in a portal setting. It is applicable to the significant class of imaging scanners that use the protocol of having the subject rotate in front of the camera in order to image them from several closely spaced directions. Customary methods of dealing with MMW sequences rely on the analysis of the spatial images in a frame-by-frame manner, with information extracted from separate frames combined by some subsequent technique of data association and tracking over time. We contend that the pooling of information over time in traditional methods is not as direct as it could be and is potentially less efficient in distinguishing threats from clutter. We have formulated a more direct approach to extracting information about the scene as it evolves over time. We propose an atypical spatio-temporal arrangement of the MMW image data, to which we give the descriptive name Row Evolution Image (REI) sequence. This representation exploits the singular aspect of having the subject rotate in front of the camera. We point out which features in REIs are most relevant to detecting threats, and describe the algorithms we have developed to extract them. We demonstrate results of successful automatic detection of threats, including ones whose faint image contrast renders their disambiguation from clutter very challenging. We highlight the ease afforded by the REI approach in permitting specialization of the detection algorithms to different parts of the subject body. Finally, we describe the execution efficiency advantages of our approach, given its natural fit to parallel processing.
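On our reading of the Row Evolution Image idea, a single image row is followed across the whole frame sequence, giving a time-versus-column slice of the video volume for every row. The NumPy sketch below builds such slices from a stack of frames; the array shapes and the function name are our own illustrative choices, not the authors' code.

    import numpy as np

    def build_rei(frames, row):
        """Assemble a Row Evolution Image for one image row: that row taken from
        every frame, stacked into a (time x columns) array (illustrative
        reconstruction of the representation described above)."""
        return frames[:, row, :]

    # Hypothetical MMW sequence: 64 frames of 128 x 96 pixels.
    frames = np.random.rand(64, 128, 96).astype(np.float32)
    rei_stack = np.stack([build_rei(frames, r) for r in range(frames.shape[1])])
    print(rei_stack.shape)     # -> (128, 64, 96): one 64 x 96 REI per image row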
ChIP-seq and RNA-seq methods to study circadian control of transcription in mammals
Takahashi, Joseph S.; Kumar, Vivek; Nakashe, Prachi; Koike, Nobuya; Huang, Hung-Chung; Green, Carla B.; Kim, Tae-Kyung
2015-01-01
Genome-wide analyses have revolutionized our ability to study the transcriptional regulation of circadian rhythms. The advent of next-generation sequencing methods has facilitated the use of two such technologies, ChIP-seq and RNA-seq. In this chapter, we describe detailed methods and protocols for these two techniques, with emphasis on their usage in circadian rhythm experiments in the mouse liver, a major target organ of the circadian clock system. Critical factors for these methods are highlighted and issues arising with time series samples for ChIP-seq and RNA-seq are discussed. Finally detailed protocols for library preparation suitable for Illumina sequencing platforms are presented. PMID:25662462
Im, Eung Jun; Bais, Anthony J; Yang, Wen; Ma, Qiangzhong; Guo, Xiuyang; Sepe, Steven M; Junghans, Richard P
2014-01-01
Transduction and expression procedures in gene therapy protocols may optimally transfer more than a single gene to correct a defect and/or transmit new functions to recipient cells or organisms. This may be accomplished by transduction with two (or more) vectors, or, more efficiently, in a single vector. Occasionally, it may be useful to coexpress homologous genes or chimeric proteins with regions of shared homology. Retroviridae include the dominant vector systems for gene transfer (e.g., gamma-retro and lentiviruses) and are capable of such multigene expression. However, these same viruses are known for efficient recombination–deletion when domains are duplicated within the viral genome. This problem can be averted by resorting to two-vector strategies (two-chain two-vector), but at a penalty to cost, convenience, and efficiency. Employing a chimeric antigen receptor system as an example, we confirm that coexpression of two genes with homologous domains in a single gamma-retroviral vector (two-chain single-vector) leads to recombination–deletion between repeated sequences, excising the equivalent of one of the chimeric antigen receptors. Here, we show that a degenerate codon substitution strategy in the two-chain single-vector format efficiently suppressed intravector deletional loss with rescue of balanced gene coexpression by minimizing sequence homology between repeated domains and preserving the final protein sequence. PMID:25419532
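The degenerate codon substitution idea can be illustrated with a small recoding function: synonymous codons are swapped in so that nucleotide identity between the repeated domains drops while the encoded protein is unchanged. The sketch below uses a deliberately partial codon table and a naive 'pick a different synonymous codon when one exists' rule; it illustrates the principle only and is not the substitution scheme used in the paper.

    # Partial table of synonymous DNA codons (standard genetic code).
    SYNONYMS = {
        "GCT": ["GCC", "GCA", "GCG"],                 # Ala
        "CTG": ["CTT", "CTC", "CTA", "TTA", "TTG"],   # Leu
        "TCT": ["TCC", "TCA", "TCG", "AGT", "AGC"],   # Ser
        "GGC": ["GGT", "GGA", "GGG"],                 # Gly
        "AAA": ["AAG"],                               # Lys
        "GAA": ["GAG"],                               # Glu
    }

    def wobble(dna):
        """Recode a coding sequence with synonymous codons to lower its nucleotide
        identity to the original while preserving the protein sequence; codons
        absent from the partial table are left unchanged."""
        out = []
        for i in range(0, len(dna) - len(dna) % 3, 3):
            codon = dna[i:i + 3]
            out.append(SYNONYMS.get(codon, [codon])[0])
        return "".join(out)

    original = "GCTCTGTCTGGCAAAGAA"          # Ala-Leu-Ser-Gly-Lys-Glu
    recoded = wobble(original)
    identity = sum(a == b for a, b in zip(original, recoded)) / len(original)
    print(recoded, f"nucleotide identity = {identity:.0%}")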
Optimization of protocol design: a path to efficient, lower cost clinical trial execution
Malikova, Marina A
2016-01-01
Managing clinical trials requires strategic planning and efficient execution. In order to achieve timely delivery of important clinical trial outcomes, it is useful to establish standardized trial management guidelines and develop a robust scoring methodology for evaluating study protocol complexity. This review explores the challenges clinical teams face in developing protocols to ensure that the right patients are enrolled and the right data are collected to demonstrate that a drug is safe and efficacious, while managing study costs and study complexity based on the proposed comprehensive scoring model. Key factors to consider when developing protocols, and techniques to minimize complexity, are discussed. A methodology to identify processes at the planning phase, and approaches to increase fiscal return and mitigate fiscal compliance risk for clinical trials, are addressed. PMID:28031939
Lu, Fu-Hao; McKenzie, Neil; Kettleborough, George; Heavens, Darren; Clark, Matthew D; Bevan, Michael W
2018-05-01
The accurate sequencing and assembly of very large, often polyploid, genomes remains a challenging task, limiting long-range sequence information and phased sequence variation for applications such as plant breeding. The 15-Gb hexaploid bread wheat (Triticum aestivum) genome has been particularly challenging to sequence, and several different approaches have recently generated long-range assemblies. Mapping and understanding the types of assembly errors are important for optimising future sequencing and assembly approaches and for comparative genomics. Here we use a Fosill 38-kb jumping library to assess medium and longer-range order of different publicly available wheat genome assemblies. Modifications to the Fosill protocol generated longer Illumina sequences and enabled comprehensive genome coverage. Analyses of two independent Bacterial Artificial Chromosome (BAC)-based chromosome-scale assemblies, two independent Illumina whole genome shotgun assemblies, and a hybrid Single Molecule Real Time (SMRT-PacBio) and short read (Illumina) assembly were carried out. We revealed a surprising scale and variety of discrepancies using Fosill mate-pair mapping and validated several of each class. In addition, Fosill mate-pairs were used to scaffold a whole genome Illumina assembly, leading to a 3-fold increase in N50 values. Our analyses, using an independent means to validate different wheat genome assemblies, show that whole genome shotgun assemblies based solely on Illumina sequences are significantly more accurate by all measures compared to BAC-based chromosome-scale assemblies and hybrid SMRT-Illumina approaches. Although current whole genome assemblies are reasonably accurate and useful, additional improvements will be needed to generate complete assemblies of wheat genomes using open-source, computationally efficient, and cost-effective methods.
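A simplified version of the mate-pair consistency check used in this kind of analysis: both reads of a Fosill pair should map to the same scaffold, in opposing orientations, at a separation close to the ~38-kb insert size; pairs violating these expectations point to candidate mis-joins or mis-ordering. The classifier below encodes that rule with assumed tolerances and invented mappings and is not the analysis code used in the study.

    def classify_mate_pair(read1, read2, insert=38_000, tol=10_000):
        """Classify one jumping-library mate pair mapped to an assembly. Each read
        is (scaffold, position, strand). Returns 'concordant', 'wrong_distance',
        'wrong_orientation' or 'split' -- a crude proxy for the discrepancy classes
        discussed above (orientation convention and thresholds are assumptions)."""
        s1, p1, strand1 = read1
        s2, p2, strand2 = read2
        if s1 != s2:
            return "split"                   # mates land on different scaffolds
        if strand1 == strand2:
            return "wrong_orientation"       # pair should map in opposing orientations
        if abs(abs(p1 - p2) - insert) > tol:
            return "wrong_distance"          # separation far from the ~38-kb insert
        return "concordant"

    pairs = [
        (("scaf_1", 100_000, "+"), ("scaf_1", 137_500, "-")),
        (("scaf_1", 100_000, "+"), ("scaf_9", 12_000, "-")),
        (("scaf_2", 50_000, "+"), ("scaf_2", 52_000, "-")),
    ]
    print([classify_mate_pair(a, b) for a, b in pairs])
    # -> ['concordant', 'split', 'wrong_distance']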
Kim, Tae Hoon; Dekker, Job
2018-05-01
Owing to its digital nature, ChIP-seq has become the standard method for genome-wide ChIP analysis. Using next-generation sequencing platforms (notably the Illumina Genome Analyzer), millions of short sequence reads can be obtained. The densities of recovered ChIP sequence reads along the genome are used to determine the binding sites of the protein. Although a relatively small amount of ChIP DNA is required for ChIP-seq, the current sequencing platforms still require amplification of the ChIP DNA by ligation-mediated PCR (LM-PCR). This protocol, which involves linker ligation followed by size selection, is the standard ChIP-seq protocol using an Illumina Genome Analyzer. The size-selected ChIP DNA is amplified by LM-PCR and size-selected for the second time. The purified ChIP DNA is then loaded into the Genome Analyzer. The ChIP DNA can also be processed in parallel for ChIP-chip results. © 2018 Cold Spring Harbor Laboratory Press.
Mapping the miRNA interactome by crosslinking ligation and sequencing of hybrids (CLASH)
Helwak, Aleksandra; Tollervey, David
2014-01-01
RNA-RNA interactions play critical roles in many cellular processes but studying them is difficult and laborious. Here, we describe an experimental procedure, termed crosslinking ligation and sequencing of hybrids (CLASH), which allows high-throughput identification of sites of RNA-RNA interaction. During CLASH, a tagged bait protein is UV crosslinked in vivo to stabilise RNA interactions and purified under denaturing conditions. RNAs associated with the bait protein are partially truncated, and the ends of RNA-duplexes are ligated together. Following linker addition, cDNA library preparation and high-throughput sequencing, the ligated duplexes give rise to chimeric cDNAs, which unambiguously identify RNA-RNA interaction sites independent of bioinformatic predictions. This protocol is optimized for studying miRNA targets bound by Argonaute proteins, but should be easily adapted for other RNA-binding proteins and classes of RNA. The protocol requires around 5 days to complete, excluding the time required for high-throughput sequencing and bioinformatic analyses. PMID:24577361
Protocol for rapid sequence intubation in pediatric patients -- a four-year study.
Marvez-Valls, Eduardo; Houry, Debra; Ernst, Amy A; Weiss, Steven J; Killeen, James
2002-04-01
To evaluate a protocol for rapid sequence intubation (RSI) for pediatric patients in a Level 1 trauma center. Retrospective review of prospectively gathered Continuing Quality Improvement (CQI) data at an inner city Level 1 trauma center with an emergency medicine residency program. Protocols for RSI were established prior to initiating the study. All pediatric intubations at the center from February 1996 to February 2000 were included. Statistical analysis included descriptive statistics for categorical data and Chi-square for comparisons between groups. Over the 4-year study period there were 83 pediatric intubations ranging in age from 18 months to 17 years; mean age 8.6. All had data collected at the time of intubation. There were 20 (24%) females and 62 (76%) males (p<0.001). Reasons for intubation were related to trauma in 71 (86%) and medical reasons in 12 (14%) (p<0.001). Of the trauma intubations 7 (10%) were for gunshot wounds, 39 (55%) were secondary to MVCs, and the remainder (25; 35%) were from assaults, falls, and closed head injuries. The non-trauma intubations were for smoke inhalation, overdose, seizure, HIV related complications, eclampsia, and near drowning. Intubations were successful with one attempt in 65 (78%) cases. No surgical airways were necessary. Rocuronium was used in 4 cases. Protocol deviations did not lead to complications. This protocol based pediatric rapid sequence intubation method worked well in an EM residency program. More intubations were in males and more were necessary due to trauma in this group.
Li, Jie; Li, Qiyue; Qu, Yugui; Zhao, Baohua
2011-01-01
Conventional MAC protocols for wireless sensor network perform poorly when faced with a delay-tolerant mobile network environment. Characterized by a highly dynamic and sparse topology, poor network connectivity as well as data delay-tolerance, delay-tolerant mobile sensor networks exacerbate the severe power constraints and memory limitations of nodes. This paper proposes an energy-efficient MAC protocol using dynamic queue management (EQ-MAC) for power saving and data queue management. Via data transfers initiated by the target sink and the use of a dynamic queue management strategy based on priority, EQ-MAC effectively avoids untargeted transfers, increases the chance of successful data transmission, and makes useful data reach the target terminal in a timely manner. Experimental results show that EQ-MAC has high energy efficiency in comparison with a conventional MAC protocol. It also achieves a 46% decrease in packet drop probability, 79% increase in system throughput, and 25% decrease in mean packet delay. PMID:22319385
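The priority-driven queue management described for EQ-MAC can be pictured with a small sketch; the class below is a hypothetical toy model (the capacity, priority scale and drop rule are invented), not the protocol's actual implementation.

```python
import heapq
import itertools

class PriorityDataQueue:
    """Toy sink-initiated queue: a lower priority value means a more urgent packet."""
    def __init__(self, capacity=8):
        self.capacity = capacity
        self.heap = []
        self.counter = itertools.count()  # tie-breaker keeps FIFO order per priority

    def enqueue(self, packet, priority):
        if len(self.heap) >= self.capacity:
            # memory-constrained node: drop the least urgent queued packet to make room
            self.heap.remove(max(self.heap))
            heapq.heapify(self.heap)
        heapq.heappush(self.heap, (priority, next(self.counter), packet))

    def transfer_to(self, target_sink):
        """Send queued packets, most urgent first, when the target sink initiates the transfer."""
        while self.heap:
            priority, _, packet = heapq.heappop(self.heap)
            yield target_sink, priority, packet

q = PriorityDataQueue()
q.enqueue({"temp": 21.5}, priority=2)
q.enqueue({"alarm": "intrusion"}, priority=0)
for sink, prio, pkt in q.transfer_to("sink-A"):
    print(sink, prio, pkt)
```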
Effective DNA Inhibitors of Cathepsin G by In Vitro Selection
Gatto, Barbara; Vianini, Elena; Lucatello, Lorena; Sissi, Claudia; Moltrasio, Danilo; Pescador, Rodolfo; Porta, Roberto; Palumbo, Manlio
2008-01-01
Cathepsin G (CatG) is a chymotrypsin-like protease released upon degranulation of neutrophils. In several inflammatory and ischaemic diseases the impaired balance between CatG and its physiological inhibitors leads to tissue destruction and platelet aggregation. Inhibitors of CatG are suitable for the treatment of inflammatory diseases and procoagulant conditions. DNA released upon the death of neutrophils at injury sites binds CatG. Moreover, short DNA fragments are more inhibitory than genomic DNA. Defibrotide, a single-stranded polydeoxyribonucleotide with an antithrombotic effect, is also a potent CatG inhibitor. Given this experimental evidence, we employed a selection protocol to assess whether DNA inhibition of CatG may be ascribed to specific sequences present in defibrotide DNA. A SELEX protocol was applied to identify the single-stranded DNA sequences exhibiting the highest affinity for CatG, the diversity of a combinatorial pool of oligodeoxyribonucleotides being a good representation of the complexity found in defibrotide. Biophysical and biochemical studies confirmed that the selected sequences bind tightly to the target enzyme and also efficiently inhibit its catalytic activity. Sequence analysis carried out to unveil a motif responsible for CatG recognition showed a recurrence of alternating TG repeats in the selected CatG binders, adopting an extended conformation that grants maximal interaction with the highly charged protein surface. This unprecedented finding is validated by our results showing high affinity and inhibition of CatG by specific DNA sequences of variable length designed to maximally reduce pairing/folding interactions. PMID:19325843
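A sketch of the kind of motif scan mentioned in the sequence analysis: it searches candidate aptamer sequences for runs of alternating TG repeats. The sequences and the three-unit cutoff below are made-up examples, not the selected binders.

```python
import re

TG_RUN = re.compile(r"(?:TG){3,}|(?:GT){3,}")  # three or more alternating TG/GT units

def longest_tg_run(seq):
    """Return the longest alternating TG stretch found in a DNA sequence."""
    runs = TG_RUN.findall(seq.upper())
    return max(runs, key=len) if runs else ""

selected = {
    "apt01": "ACGTGTGTGTGTAACCT",   # hypothetical SELEX winners
    "apt02": "TTGCATGCATGCAAGT",
}
for name, seq in selected.items():
    run = longest_tg_run(seq)
    print(name, run, len(run) // 2, "repeat units")
```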
Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data.
Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo
2011-12-15
High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein-DNA and protein-RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy eduardo.eyras@upf.edu Supplementary data are available at Bioinformatics online.
Recurrence time statistics: versatile tools for genomic DNA sequence analysis.
Cao, Yinhe; Tung, Wen-Wen; Gao, J B
2004-01-01
With the completion of the human and a few model organisms' genomes, and the genomes of many other organisms waiting to be sequenced, it has become increasingly important to develop faster computational tools which are capable of easily identifying structures and extracting features from DNA sequences. Among the more important structures in a DNA sequence are repeat-related features, which often have to be masked before protein-coding regions along a DNA sequence are identified or redundant expressed sequence tags (ESTs) are sequenced. Here we report a novel recurrence-time-based method for sequence analysis. The method can conveniently study all kinds of periodicity and exhaustively find all repeat-related features in a genomic DNA sequence. An efficient codon index is also derived from the recurrence time statistics, which has the salient features of being largely species-independent and working well on very short sequences. Efficient codon indices are key elements of successful gene-finding algorithms, and are particularly useful for determining whether a suspected EST belongs to a coding or non-coding region. We illustrate the power of the method by studying the genomes of E. coli, the yeast S. cerevisiae, the nematode worm C. elegans, and the human, Homo sapiens. Computationally, our method is very efficient and allows us to carry out whole-genome-scale analyses on a PC.
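A minimal sketch of the recurrence-time idea, assuming plain k-mer recurrence along a string: distances between successive occurrences of each k-mer are recorded, and the fraction of distances divisible by three serves as a crude period-3 (codon-like) indicator. This illustrates the concept only and is not the authors' exact statistic.

```python
from collections import defaultdict

def recurrence_times(seq, k=3):
    """Distances between successive occurrences of each k-mer along the sequence."""
    last_seen = {}
    times = defaultdict(list)
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k]
        if kmer in last_seen:
            times[kmer].append(i - last_seen[kmer])
        last_seen[kmer] = i
    return times

def codon_index(seq, k=3):
    """Fraction of recurrence times divisible by 3: a crude period-3 (coding) signal."""
    all_times = [t for ts in recurrence_times(seq, k).values() for t in ts]
    if not all_times:
        return 0.0
    return sum(1 for t in all_times if t % 3 == 0) / len(all_times)

print(codon_index("ATGGCCATGGCCATGGCCATGGCC"))   # strongly periodic, ORF-like toy sequence
print(codon_index("ATCGGATTCAGCCTAGGCATCCGA"))   # random-looking toy sequence
```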
Lightweight SIP/SDP compression scheme (LSSCS)
NASA Astrophysics Data System (ADS)
Wu, Jian J.; Demetrescu, Cristian
2001-10-01
In UMTS, new IP-based services with tight delay constraints, such as IP multimedia and interactive services, will be deployed over the W-CDMA air interface. To integrate wireline and wireless IP services, the 3GPP standards forum adopted the Session Initiation Protocol (SIP) as the call control protocol for UMTS Release 5, which will implement next-generation, all-IP networks for real-time QoS services. In its current form the SIP protocol is not well suited to wireless transmission because of its large message size, which either requires a wide radio pipe or takes far longer to transmit than the current GSM Call Control (CC) message sequence. In this paper we present a novel compression algorithm called the Lightweight SIP/SDP Compression Scheme (LSSCS), which acts at the SIP application layer and therefore removes information redundancy before the message is passed to the network and transport layers. A binary octet-aligned header is added to the compressed SIP/SDP message before it is sent to the network layer. The receiver uses this binary header, together with pre-cached information, to regenerate the original SIP/SDP message. The key features of the LSSCS compression scheme are presented along with implementation examples. It is shown that this compression algorithm makes SIP transmission efficient over the radio interface without losing SIP's generality and flexibility.
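The LSSCS algorithm itself is not reproduced here; the underlying idea (compressing against information pre-cached at both ends and prefixing a small octet-aligned binary header) can, however, be illustrated with zlib's preset-dictionary mode. The header layout and the boilerplate dictionary below are invented for the example.

```python
import struct
import zlib

# Pre-cached SIP/SDP boilerplate shared by sender and receiver (illustrative only).
PRESET = (b"INVITE sip: SIP/2.0\r\nVia: SIP/2.0/UDP \r\nMax-Forwards: 70\r\n"
          b"Content-Type: application/sdp\r\nContent-Length: ")

def compress_sip(message: bytes, scheme_id: int = 1) -> bytes:
    comp = zlib.compressobj(9, zlib.DEFLATED, zlib.MAX_WBITS, 9,
                            zlib.Z_DEFAULT_STRATEGY, PRESET)
    body = comp.compress(message) + comp.flush()
    # 1-byte scheme id + 2-byte original length: a toy octet-aligned header
    return struct.pack("!BH", scheme_id, len(message)) + body

def decompress_sip(frame: bytes) -> bytes:
    scheme_id, original_len = struct.unpack("!BH", frame[:3])
    decomp = zlib.decompressobj(zlib.MAX_WBITS, PRESET)
    message = decomp.decompress(frame[3:])
    assert len(message) == original_len
    return message

msg = b"INVITE sip:bob@example.com SIP/2.0\r\nMax-Forwards: 70\r\nContent-Length: 0\r\n\r\n"
frame = compress_sip(msg)
print(len(msg), "->", len(frame), decompress_sip(frame) == msg)
```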
Al-Majid, Abdullah M.; Barakat, Assem; AL-Najjar, Hany J.; Mabkhot, Yahia N.; Ghabbour, Hazem A.; Fun, Hoong-Kun
2013-01-01
A simple protocol involving green synthesis for the construction of novel bis-pyrimidine derivatives 3a–i and 4a–e is accomplished by an aqueous diethylamine-mediated tandem Aldol-Michael reaction between two molecules of barbituric acid derivatives 1a,b and various aldehydes. This efficient synthetic protocol, using an economical and environmentally friendly reaction medium, offers versatility and shorter reaction times and provides bis-pyrimidine derivatives in high yields (88%–99%). PMID:24317435
Lewandowska, Dagmara W; Zagordi, Osvaldo; Geissberger, Fabienne-Desirée; Kufner, Verena; Schmutz, Stefan; Böni, Jürg; Metzner, Karin J; Trkola, Alexandra; Huber, Michael
2017-08-08
Sequence-specific PCR is the most common approach for virus identification in diagnostic laboratories. However, as specific PCR only detects pre-defined targets, novel virus strains or viruses not included in routine test panels will be missed. Recently, advances in high-throughput sequencing allow for virus-sequence-independent identification of entire virus populations in clinical samples, yet standardized protocols are needed to allow broad application in clinical diagnostics. Here, we describe a comprehensive sample preparation protocol for high-throughput metagenomic virus sequencing using random amplification of total nucleic acids from clinical samples. In order to optimize metagenomic sequencing for application in virus diagnostics, we tested different enrichment and amplification procedures on plasma samples spiked with RNA and DNA viruses. A protocol including filtration, nuclease digestion, and random amplification of RNA and DNA in separate reactions provided the best results, allowing reliable recovery of viral genomes and a good correlation of the relative number of sequencing reads with the virus input. We further validated our method by sequencing a multiplexed viral pathogen reagent containing a range of human viruses from different virus families. Our method proved successful in detecting the majority of the included viruses with high read numbers and compared well to other protocols in the field validated against the same reference reagent. Our sequencing protocol does work not only with plasma but also with other clinical samples such as urine and throat swabs. The workflow for virus metagenomic sequencing that we established proved successful in detecting a variety of viruses in different clinical samples. Our protocol supplements existing virus-specific detection strategies providing opportunities to identify atypical and novel viruses commonly not accounted for in routine diagnostic panels.
Provable classically intractable sampling with measurement-based computation in constant time
NASA Astrophysics Data System (ADS)
Sanders, Stephen; Miller, Jacob; Miyake, Akimasa
We present a constant-time measurement-based quantum computation (MQC) protocol to perform a classically intractable sampling problem. We sample from the output probability distribution of a subclass of the instantaneous quantum polynomial time circuits introduced by Bremner, Montanaro and Shepherd. In contrast with the usual circuit model, our MQC implementation includes additional randomness due to byproduct operators associated with the computation. Despite this additional randomness we show that our sampling task cannot be efficiently simulated by a classical computer. We extend previous results to verify the quantum supremacy of our sampling protocol efficiently using only single-qubit Pauli measurements. Center for Quantum Information and Control, Department of Physics and Astronomy, University of New Mexico, Albuquerque, NM 87131, USA.
Roberts, William L; McKinley, Danette W; Boulet, John R
2010-05-01
Due to the high-stakes nature of medical exams it is prudent for test agencies to critically evaluate test data and control for potential threats to validity. For the typical multiple station performance assessments used in medicine, it may take time for examinees to become comfortable with the test format and administrative protocol. Since each examinee in the rotational sequence starts with a different task (e.g., simulated clinical encounter), those who are administered non-scored pretest material on their first station may have an advantage compared to those who are not. The purpose of this study is to investigate whether pass/fail rates are different across the sequence of pretest encounters administered during the testing day. First-time takers were grouped by the sequential order in which they were administered the pretest encounter. No statistically significant difference in fail rates was found between examinees who started with the pretest encounter and those who encountered the pretest encounter later in the sequence. Results indicate that current examination administration protocols do not present a threat to the validity of test score interpretations.
Improving transmission efficiency of large sequence alignment/map (SAM) files.
Sakib, Muhammad Nazmus; Tang, Jijun; Zheng, W Jim; Huang, Chin-Tser
2011-01-01
Research in bioinformatics primarily involves collection and analysis of a large volume of genomic data. Naturally, it demands efficient storage and transfer of this huge amount of data. In recent years, some research has been done to find efficient compression algorithms to reduce the size of various sequencing data. One way to improve the transmission time of large files is to apply a maximum lossless compression on them. In this paper, we present SAMZIP, a specialized encoding scheme, for sequence alignment data in SAM (Sequence Alignment/Map) format, which improves the compression ratio of existing compression tools available. In order to achieve this, we exploit the prior knowledge of the file format and specifications. Our experimental results show that our encoding scheme improves compression ratio, thereby reducing overall transmission time significantly.
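SAMZIP's exact encoding is not given in the abstract; the sketch below only illustrates the general principle of exploiting the known SAM column layout by compressing each field stream separately and delta-encoding the position column, and compares that against compressing the raw text (on real files the column-wise variant benefits from the redundancy within each field; on this tiny toy input the per-stream overhead dominates).

```python
import zlib

def column_compress(sam_lines):
    """Compress each SAM column as its own stream; delta-encode the POS column (index 3)."""
    columns = list(zip(*[line.rstrip("\n").split("\t")[:11] for line in sam_lines]))
    blobs = []
    for idx, col in enumerate(columns):
        if idx == 3:  # POS: store differences between consecutive coordinates
            pos = [int(p) for p in col]
            col = [str(pos[0])] + [str(b - a) for a, b in zip(pos, pos[1:])]
        blobs.append(zlib.compress("\n".join(col).encode(), 9))
    return blobs

sam = [
    "r1\t0\tchr1\t1000\t60\t50M\t*\t0\t0\tACGT\tFFFF\n",
    "r2\t0\tchr1\t1005\t60\t50M\t*\t0\t0\tACGA\tFFFF\n",
    "r3\t0\tchr1\t1012\t60\t50M\t*\t0\t0\tACGT\tFFFF\n",
]
per_column = sum(len(b) for b in column_compress(sam))
whole_file = len(zlib.compress("".join(sam).encode(), 9))
print(per_column, "bytes (column-wise) vs", whole_file, "bytes (whole file)")
```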
Szczesny, Roman J.; Kowalska, Katarzyna; Klosowska-Kosicka, Kamila; Chlebowski, Aleksander; Owczarek, Ewelina P.; Warkocki, Zbigniew; Kulinski, Tomasz M.; Adamska, Dorota; Affek, Kamila; Jedroszkowiak, Agata; Kotrys, Anna V.; Tomecki, Rafal; Krawczyk, Pawel S.; Borowski, Lukasz S.; Dziembowski, Andrzej
2018-01-01
Deciphering a function of a given protein requires investigating various biological aspects. Usually, the protein of interest is expressed with a fusion tag that aids or allows subsequent analyses. Additionally, downregulation or inactivation of the studied gene enables functional studies. Development of the CRISPR/Cas9 methodology opened many possibilities but in many cases it is restricted to non-essential genes. Recombinase-dependent gene integration methods, like the Flp-In system, are very good alternatives. The system is widely used in different research areas, which calls for the existence of compatible vectors and efficient protocols that ensure straightforward DNA cloning and generation of stable cell lines. We have created and validated a robust series of 52 vectors for streamlined generation of stable mammalian cell lines using the FLP recombinase-based methodology. Using the sequence-independent DNA cloning method all constructs for a given coding-sequence can be made with just three universal PCR primers. Our collection allows tetracycline-inducible expression of proteins with various tags suitable for protein localization, FRET, bimolecular fluorescence complementation (BiFC), protein dynamics studies (FRAP), co-immunoprecipitation, the RNA tethering assay and cell sorting. Some of the vectors contain a bidirectional promoter for concomitant expression of miRNA and mRNA, so that a gene can be silenced and its product replaced by a mutated miRNA-insensitive version. Our toolkit and protocols have allowed us to create more than 500 constructs with ease. We demonstrate the efficacy of our vectors by creating stable cell lines with various tagged proteins (numatrin, fibrillarin, coilin, centrin, THOC5, PCNA). We have analysed transgene expression over time to provide a guideline for future experiments and compared the effectiveness of commonly used inducers for tetracycline-responsive promoters. As proof of concept we examined the role of the exoribonuclease XRN2 in transcription termination by RNAseq. PMID:29590189
An Abbreviated Protocol for High-Risk Screening Breast MRI Saves Time and Resources.
Harvey, Susan C; Di Carlo, Phillip A; Lee, Bonmyong; Obadina, Eniola; Sippo, Dorothy; Mullen, Lisa
2016-04-01
To review the ability of an abbreviated, high-risk, screening, breast MRI protocol to detect cancer and save resources. High-risk screening breast MR images were reviewed, from both an abbreviated protocol and a full diagnostic protocol. Differences in cancer detection, scanner utilization, interpretation times, and need for additional imaging were recorded in an integrated data form, and reviewed and compared. A total of 568 MRI cases were reviewed, with the abbreviated and full protocols. No difference was found in the number of cancers detected. Scan times were decreased by 18.8 minutes per case, for a total of 10,678 minutes (178 hours). Interpretation time, on average, was 1.55 minutes for the abbreviated protocol, compared with 6.43 minutes for the full protocol. Review of the full protocol led to a significant change in the final BI-RADS(®) assessment in 12 of 568 (2.1%) cases. Abbreviated MRI is as effective as full-protocol MRI for demonstration of cancers in the high-risk screening setting, with only 12 (2.1%) cases recommended for additional MRI evaluation. The efficiency and resource savings of an abbreviated protocol would be significant, and would allow for opportunities to provide MRI for additional patients, as well as improved radiologist time management and workflow, with the potential to add real-time MRI interpretation or double reading. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
A High-Throughput Process for the Solid-Phase Purification of Synthetic DNA Sequences
Grajkowski, Andrzej; Cieślak, Jacek; Beaucage, Serge L.
2017-01-01
An efficient process for the purification of synthetic phosphorothioate and native DNA sequences is presented. The process is based on the use of an aminopropylated silica gel support functionalized with aminooxyalkyl functions to enable capture of DNA sequences through an oximation reaction with the keto function of a linker conjugated to the 5′-terminus of DNA sequences. Deoxyribonucleoside phosphoramidites carrying this linker, as a 5′-hydroxyl protecting group, have been synthesized for incorporation into DNA sequences during the last coupling step of a standard solid-phase synthesis protocol executed on a controlled pore glass (CPG) support. Solid-phase capture of the nucleobase- and phosphate-deprotected DNA sequences released from the CPG support is demonstrated to proceed near quantitatively. Shorter than full-length DNA sequences are first washed away from the capture support; the solid-phase purified DNA sequences are then released from this support upon reaction with tetra-n-butylammonium fluoride in dry dimethylsulfoxide (DMSO) and precipitated in tetrahydrofuran (THF). The purity of solid-phase-purified DNA sequences exceeds 98%. The simulated high-throughput and scalability features of the solid-phase purification process are demonstrated without sacrificing purity of the DNA sequences. PMID:28628204
Optimal quantum operations at zero energy cost
NASA Astrophysics Data System (ADS)
Chiribella, Giulio; Yang, Yuxiang
2017-08-01
Quantum technologies are developing powerful tools to generate and manipulate coherent superpositions of different energy levels. Envisaging a new generation of energy-efficient quantum devices, here we explore how coherence can be manipulated without exchanging energy with the surrounding environment. We start from the task of converting a coherent superposition of energy eigenstates into another. We identify the optimal energy-preserving operations, both in the deterministic and in the probabilistic scenario. We then design a recursive protocol, wherein a branching sequence of energy-preserving filters increases the probability of success while reaching maximum fidelity at each iteration. Building on the recursive protocol, we construct efficient approximations of the optimal fidelity-probability trade-off, by taking coherent superpositions of the different branches generated by probabilistic filtering. The benefits of this construction are illustrated in applications to quantum metrology, quantum cloning, coherent state amplification, and ancilla-driven computation. Finally, we extend our results to transitions where the input state is generally mixed and we apply our findings to the task of purifying quantum coherence.
Clark, Stephen J; Smallwood, Sébastien A; Lee, Heather J; Krueger, Felix; Reik, Wolf; Kelsey, Gavin
2017-03-01
DNA methylation (DNAme) is an important epigenetic mark in diverse species. Our current understanding of DNAme is based on measurements from bulk cell samples, which obscures intercellular differences and prevents analyses of rare cell types. Thus, the ability to measure DNAme in single cells has the potential to make important contributions to the understanding of several key biological processes, such as embryonic development, disease progression and aging. We have recently reported a method for generating genome-wide DNAme maps from single cells, using single-cell bisulfite sequencing (scBS-seq), allowing the quantitative measurement of DNAme at up to 50% of CpG dinucleotides throughout the mouse genome. Here we present a detailed protocol for scBS-seq that includes our most recent developments to optimize recovery of CpGs, mapping efficiency and success rate; reduce hands-on time; and increase sample throughput with the option of using an automated liquid handler. We provide step-by-step instructions for each stage of the method, comprising cell lysis and bisulfite (BS) conversion, preamplification and adaptor tagging, library amplification, sequencing and, lastly, alignment and methylation calling. An individual with relevant molecular biology expertise can complete library preparation within 3 d. Subsequent computational steps require 1-3 d for someone with bioinformatics expertise.
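The final methylation-calling step can be pictured with a toy sketch: at reference CpG positions, a retained C in a bisulfite-converted read is counted as methylated and a C-to-T conversion as unmethylated. It assumes already-aligned forward-strand reads and ignores quality filtering; it is not the scBS-seq pipeline itself.

```python
def call_cpg_methylation(reference, aligned_reads):
    """Count methylated (C) vs unmethylated (T) observations at reference CpG sites.

    aligned_reads: list of (start_offset, read_sequence) on the forward strand.
    Returns {position: (methylated_count, unmethylated_count)}.
    """
    cpg_sites = [i for i in range(len(reference) - 1) if reference[i:i + 2] == "CG"]
    calls = {pos: [0, 0] for pos in cpg_sites}
    for start, read in aligned_reads:
        for pos in cpg_sites:
            if start <= pos < start + len(read):
                base = read[pos - start]
                if base == "C":
                    calls[pos][0] += 1      # cytosine survived bisulfite: methylated
                elif base == "T":
                    calls[pos][1] += 1      # converted to T: unmethylated
    return {pos: tuple(c) for pos, c in calls.items()}

ref = "TTCGGACGTT"
reads = [(0, "TTCGGATGTT"), (2, "TGGACGTT")]  # toy bisulfite-converted reads
print(call_cpg_methylation(ref, reads))
```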
Sacino, Amanda N; Shuster, Jonathan J; Nowicki, Kamil; Carek, Peter J; Wegman, Martin P; Listhaus, Alyson; Gibney, Joseph M; Chang, Ku-Lang
2016-02-01
As the number of patients with access to care increases, outpatient clinics will need to implement innovative strategies to maintain or enhance clinic efficiency. One viable alternative involves reverse triage. A reverse triage protocol was implemented during a student-run free clinic. Each patient's chief complaint(s) were obtained at the beginning of the clinic session and ranked by increasing complexity. "Complexity" was defined as the subjective amount of time required to provide a full, thorough evaluation of a patient. Less complex cases were prioritized first since they could be expedited through clinic processing and allow for more time and resources to be dedicated to complex cases. Descriptive statistics were used to characterize and summarize the data obtained. Categorical variables were analyzed using chi-square. A time series analysis of the outcome versus centered time in weeks was also conducted. The average number of patients seen per clinic session increased by 35% (9.5 versus 12.8) from pre-implementation of the reverse triage protocol to 6 months after the implementation of the protocol. The implementation of a reverse triage in an outpatient setting significantly increased clinic efficiency as noted by a significant increase in the number of patients seen during a clinic session.
Torella, Joseph P.; Lienert, Florian; Boehm, Christian R.; Chen, Jan-Hung; Way, Jeffrey C.; Silver, Pamela A.
2016-01-01
Recombination-based DNA construction methods, such as Gibson assembly, have made it possible to easily and simultaneously assemble multiple DNA parts and hold promise for the development and optimization of metabolic pathways and functional genetic circuits. Over time, however, these pathways and circuits have become more complex, and the increasing need for standardization and insulation of genetic parts has resulted in sequence redundancies — for example repeated terminator and insulator sequences — that complicate recombination-based assembly. We and others have recently developed DNA assembly methods that we refer to collectively as unique nucleotide sequence (UNS)-guided assembly, in which individual DNA parts are flanked with UNSs to facilitate the ordered, recombination-based assembly of repetitive sequences. Here we present a detailed protocol for UNS-guided assembly that enables researchers to convert multiple DNA parts into sequenced, correctly-assembled constructs, or into high-quality combinatorial libraries in only 2–3 days. If the DNA parts must be generated from scratch, an additional 2–5 days are necessary. This protocol requires no specialized equipment and can easily be implemented by a student with experience in basic cloning techniques. PMID:25101822
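The bookkeeping behind UNS-guided assembly can be sketched as follows: each part is flanked with a pair of unique nucleotide sequences, and parts are stitched in the order dictated by matching UNS junctions. The UNS sequences and parts below are placeholders, not the published UNS set.

```python
def flank_with_uns(parts, uns):
    """Flank the i-th part with UNS_i / UNS_(i+1) so junctions define the assembly order."""
    return [(uns[i], seq, uns[i + 1]) for i, seq in enumerate(parts)]

def assemble(flanked):
    """Join parts by shared UNS junctions, checking that adjacent UNSs actually match."""
    assembly = flanked[0][0] + flanked[0][1]
    for (left, _, right), (next_left, next_seq, _) in zip(flanked, flanked[1:]):
        if right != next_left:
            raise ValueError(f"junction mismatch: {right} vs {next_left}")
        assembly += right + next_seq
    return assembly + flanked[-1][2]

# placeholder 10-nt UNSs and toy parts (promoter, CDS, terminator)
UNS = ["CATTACTCGC", "GCTGGGAGTT", "GAGCCAACTC", "ACTGACCGAA"]
parts = ["TTGACATATAAT", "ATGGCTTAA", "AAAAAAGCCCGC"]
print(assemble(flank_with_uns(parts, UNS)))
```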
Dabney, Jesse; Knapp, Michael; Glocke, Isabelle; Gansauge, Marie-Theres; Weihmann, Antje; Nickel, Birgit; Valdiosera, Cristina; García, Nuria; Pääbo, Svante; Arsuaga, Juan-Luis; Meyer, Matthias
2013-01-01
Although an inverse relationship is expected in ancient DNA samples between the number of surviving DNA fragments and their length, ancient DNA sequencing libraries are strikingly deficient in molecules shorter than 40 bp. We find that a loss of short molecules can occur during DNA extraction and present an improved silica-based extraction protocol that enables their efficient retrieval. In combination with single-stranded DNA library preparation, this method enabled us to reconstruct the mitochondrial genome sequence from a Middle Pleistocene cave bear (Ursus deningeri) bone excavated at Sima de los Huesos in the Sierra de Atapuerca, Spain. Phylogenetic reconstructions indicate that the U. deningeri sequence forms an early diverging sister lineage to all Western European Late Pleistocene cave bears. Our results prove that authentic ancient DNA can be preserved for hundreds of thousands of years outside of permafrost. Moreover, the techniques presented enable the retrieval of phylogenetically informative sequences from samples in which virtually all DNA is diminished to fragments shorter than 50 bp. PMID:24019490
Simpson, Jared
2018-01-24
Wellcome Trust Sanger Institute's Jared Simpson on memory-efficient sequence analysis using compressed data structures, presented at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.
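The kind of compressed index behind memory-efficient sequence analysis can be illustrated with a naive Burrows-Wheeler transform plus FM-index backward search; this is a teaching-sized sketch (quadratic construction, no rank sampling), not the implementation presented in the talk.

```python
def bwt(text):
    """Burrows-Wheeler transform via naive rotation sort (fine for small inputs)."""
    text += "$"
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(rot[-1] for rot in rotations)

def fm_index(bwt_str):
    """C[c] = number of symbols smaller than c; occ[c][i] = count of c in bwt_str[:i]."""
    alphabet = sorted(set(bwt_str))
    C, total = {}, 0
    for c in alphabet:
        C[c] = total
        total += bwt_str.count(c)
    occ = {c: [0] for c in alphabet}
    for ch in bwt_str:
        for c in alphabet:
            occ[c].append(occ[c][-1] + (ch == c))
    return C, occ

def count_occurrences(pattern, bwt_str):
    """Backward search: number of times pattern occurs in the original text."""
    C, occ = fm_index(bwt_str)
    lo, hi = 0, len(bwt_str)
    for c in reversed(pattern):
        if c not in C:
            return 0
        lo = C[c] + occ[c][lo]
        hi = C[c] + occ[c][hi]
        if lo >= hi:
            return 0
    return hi - lo

b = bwt("GATTACAGATTACA")
print(count_occurrences("ATTA", b))  # expect 2
```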
Moazzam Jazi, Maryam; Rajaei, Saideh; Seyedi, Seyed Mahdi
2015-10-01
The quality and quantity of RNA are critical for successful downstream transcriptome-based studies such as microarrays and RNA sequencing (RNA-Seq). RNA isolation from woody plants, such as Pistacia vera, with very high amounts of polyphenols and polysaccharides is an enormous challenge. Here, we describe a highly efficient protocol that overcomes the limitations posed by poor quality and low yield of isolated RNA from pistachio and various recalcitrant woody plants. The key factors that resulted in a yield of 150 μg of high-quality RNA per 200 mg of plant tissue include the elimination of phenol from the extraction buffer, a higher concentration of β-mercaptoethanol, a long incubation at 65 °C, and nucleic acid precipitation with optimized volumes of NaCl and isopropyl alcohol. The A260/A280 and A260/A230 ratios of the extracted RNA were about 1.9-2.1 and 2.2-2.3, respectively, indicating high purity. Since the isolated RNA passed highly stringent quality control standards for sensitive reactions, including RNA sequencing and real-time PCR, the protocol can be considered a reliable and cost-effective method for RNA extraction from woody plants.
NASA Astrophysics Data System (ADS)
Kesuma, Hendra; Niederkleine, Kris; Schmale, Sebastian; Ahobala, Tejas; Paul, Steffen; Sebald, Johannes
2016-08-01
In this work we design and implement an efficient time synchronization/stamping method for a wireless sensor network inside the Vehicle Equipment Bay (VEB) of the ARIANE 5. The sensor nodes in the network do not require real-time clock (RTC) hardware to store and timestamp each measurement performed by the sensors. Only the measurement sequence information, the previous time (clock) information, the measurement data and its related protocol information are sent back to the Access Point (AP). This leads to less data transmission and less energy and time required by the sensor nodes to operate, and thus to longer battery lifetime. Visible Light Communication (VLC) is used to provide energy, to synchronize time and to deliver commands to the sensor nodes in the network. By employing a star network topology and using part of the solar cell as a receiver, the conventional (RF/infrared) receiver can be omitted, reducing the amount of hardware and the energy consumption. An infrared transmitter is deployed on the sensor node to minimize electromagnetic interference in the launcher; it does not require a complicated circuit compared with an RF transmitter.
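A sketch of the timestamp-free scheme as described: nodes report only a sequence number, the previously delivered clock reference and the raw samples, and the access point reconstructs per-sample timestamps from the known sampling interval. Field names and the 20 Hz rate are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class NodeFrame:
    node_id: int
    last_sync_time: float   # clock reference previously delivered via the VLC downlink (s)
    first_seq: int          # sequence number of the first sample in this frame
    samples: list           # raw measurements, no per-sample timestamps on the node

SAMPLE_PERIOD = 0.050  # 20 Hz sampling, assumed known to the access point

def reconstruct_timestamps(frame: NodeFrame):
    """Access-point side: attach timestamps using sequence numbers and the sync reference."""
    return [
        (frame.last_sync_time + (frame.first_seq + i) * SAMPLE_PERIOD, value)
        for i, value in enumerate(frame.samples)
    ]

frame = NodeFrame(node_id=3, last_sync_time=120.0, first_seq=40, samples=[0.91, 0.93, 0.92])
for t, v in reconstruct_timestamps(frame):
    print(f"t = {t:.3f} s, value = {v}")
```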
High-throughput real-time quantitative reverse transcription PCR.
Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F
2006-02-01
Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected DeltaCt method, and the comparative cycle time, or DeltaDeltaCt method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
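The two relative quantitation calculations named above can be written out explicitly; the sketch below implements the comparative ΔΔCt calculation (assuming roughly 100% efficiency, i.e. doubling per cycle) and a Pfaffl-style efficiency-corrected ratio, with made-up Ct values.

```python
def ddct_fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Comparative ΔΔCt method, assuming ~100% amplification efficiency (doubling per cycle)."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2 ** -(d_ct_treated - d_ct_control)

def efficiency_corrected_ratio(ct_target_control, ct_target_treated, eff_target,
                               ct_ref_control, ct_ref_treated, eff_ref):
    """Efficiency-corrected ratio: E_target^ΔCt(target) / E_ref^ΔCt(reference)."""
    return (eff_target ** (ct_target_control - ct_target_treated) /
            eff_ref ** (ct_ref_control - ct_ref_treated))

# made-up Ct values: target gene vs a housekeeping reference, treated vs control
print(ddct_fold_change(22.0, 18.0, 25.0, 18.2))            # 2**2.8, roughly 7-fold induction
print(efficiency_corrected_ratio(25.0, 22.0, 1.95, 18.2, 18.0, 2.0))  # roughly 6.5-fold
```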
Balint, B; Ivanović, Z; Petakov, M; Taseski, J; Jovcić, G; Stojanović, N; Milenković, P
1999-03-01
The efficiency of five different cryopreservation protocols (our original controlled-rate and noncontrolled-rate protocols) was evaluated on the basis of the recovery after thawing of very primitive pluripotent hemopoietic stem cells (MRA(CFU-GM), pluripotent progenitors (CFU-Sd12) and committed granulocyte-monocyte progenitors (CFU-GM) in mouse bone marrow. Although the nucleated cell recovery and viability determined immediately after the thawing and washing of the cells were found to be similar, whether controlled-rate or noncontrolled-rate cryopreservation protocols were used, the recovery of MRA(CFU-GM), CFU-Sd12 and CFU-GM varied depending on the type of protocol and the cryoprotector (DMSO) concentrations used. It was shown that the controlled-rate protocol was more efficient, enabling better MRA(CFU-GM), CFU-Sd12 and CFU-GM recovery from frozen samples. The most efficient was the controlled-rate protocol of cryopreservation designed to compensate for the release of fusion heat, which enabled a better survival of CFU-Sd12 and CFU-GM when combined with a lower (5%) DMSO concentration. On the contrary, a satisfactory survival rate of very primitive stem cells (MRA(CFU-GM)) was achieved only when 10% DMSO was included with a five-step protocol of cryopreservation. These results point to adequately used controlled-rate freezing as essential for a highly efficient cryopreservation of some of the categories of hematopoietic stem and progenitor cells. At the same time, it was obvious that a higher DMSO concentration was necessary for the cryopreservation of very primitive stem cells, but not, however, for more mature progenitor cells (CFU-S, CFU-GM). These results imply the existence of a mechanism that decreases the intracellular concentration of DMSO in primitive MRA cells, which is not the case for less primitive progenitors.
Efficient genome editing of differentiated renal epithelial cells.
Hofherr, Alexis; Busch, Tilman; Huber, Nora; Nold, Andreas; Bohn, Albert; Viau, Amandine; Bienaimé, Frank; Kuehn, E Wolfgang; Arnold, Sebastian J; Köttgen, Michael
2017-02-01
Recent advances in genome editing technologies have enabled the rapid and precise manipulation of genomes, including the targeted introduction, alteration, and removal of genomic sequences. However, respective methods have been described mainly in non-differentiated or haploid cell types. Genome editing of well-differentiated renal epithelial cells has been hampered by a range of technological issues, including optimal design, efficient expression of multiple genome editing constructs, attainable mutation rates, and best screening strategies. Here, we present an easily implementable workflow for the rapid generation of targeted heterozygous and homozygous genomic sequence alterations in renal cells using transcription activator-like effector nucleases (TALENs) and the clustered regularly interspaced short palindromic repeat (CRISPR) system. We demonstrate the versatility of established protocols by generating novel cellular models for studying autosomal dominant polycystic kidney disease (ADPKD). Furthermore, we show that cell culture-validated genetic modifications can be readily applied to mouse embryonic stem cells (mESCs) for the generation of corresponding mouse models. The described procedure for efficient genome editing can be applied to any cell type to study physiological and pathophysiological functions in the context of precisely engineered genotypes.
Approaches for improving thermostability characteristics in cellulases.
Anbar, Michael; Bayer, Edward A
2012-01-01
Many efforts have been invested to reduce the cost of biofuel production to substitute renewable sources of energy for fossil-based fuels. At the forefront of these efforts are the initiatives to convert plant-derived cellulosic material to biofuels. Although significant improvements have been achieved recently in cellulase engineering in both efficiency and cost reduction, complete degradation of lignocellulosic material still requires very long periods of time and high enzyme loads. Thermostable cellulases offer many advantages in the bioconversion process, which include increase in specific activity, higher levels of stability, inhibition of microbial growth, increase in mass transfer rate due to lower fluid viscosity, and greater flexibility in the bioprocess. Besides rational design methods, which require deep understanding of protein structure-function relationship, two of the major methods for improvement in specific cellulase properties are directed evolution and knowledge-based library design based on multiple sequence alignments. In this chapter, we provide protocols for constructing and screening of improved thermostable cellulases. Modifications of these protocols may also be used for screening for other improved properties of cellulases such as pH tolerance, high salt, and more. Copyright © 2012 Elsevier Inc. All rights reserved.
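The multiple-sequence-alignment-based library design step mentioned above can be illustrated with a small consensus scan: for each alignment column, the most frequent residue among homologs is compared with the query, and back-to-consensus substitutions are proposed as candidate stabilizing mutations. The alignment and the frequency threshold are toy assumptions.

```python
from collections import Counter

def consensus_candidates(alignment, query_name, min_freq=0.6):
    """Suggest positions where the query differs from a strong column consensus."""
    query = alignment[query_name]
    others = [seq for name, seq in alignment.items() if name != query_name]
    suggestions = []
    for col, q_res in enumerate(query):
        counts = Counter(seq[col] for seq in others if seq[col] != "-")
        if not counts:
            continue
        res, n = counts.most_common(1)[0]
        if res != q_res and n / sum(counts.values()) >= min_freq:
            suggestions.append((col + 1, q_res, res))   # 1-based position, query -> consensus
    return suggestions

toy_msa = {                      # pre-aligned homologous cellulase fragments (toy data)
    "query": "MKTAYLGQNV",
    "homo1": "MKSAYLGENV",
    "homo2": "MKSAYIGENV",
    "homo3": "MKSAYLGENV",
}
print(consensus_candidates(toy_msa, "query"))
```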
Alam, Pravej; Khan, Zainul Abdeen; Abdin, Malik Zainul; Khan, Jawaid A; Ahmad, Parvaiz; Elkholy, Shereen F; Sharaf-Eldin, Mahmoud A
2017-05-01
Catharanthus roseus is an important medicinal plant known for its pharmacological qualities, including antimicrobial, anticancer, antifeedant, antisterility and antidiabetic activities. More than 130 bioactive compounds, such as vinblastine, vindoline and vincristine, are synthesized in this plant. Extensive studies have been carried out to optimize regeneration and transformation protocols, but most of the protocols described are laborious and time-consuming. Because regeneration and genetic transformation are so involved, production of these bioactive molecules remains low and is not feasible to commercialize worldwide. Here we have optimized an efficient protocol for regeneration and transformation that shortens the time scale and enhances transformation frequency through Agrobacterium- and sonication-assisted transformation (SAAT). In this study, hypocotyl explants responded best for maximal production of transformed shoots. The callus percentage recorded was 52% with 1.0 mg L-1 BAP and 0.5 mg L-1 NAA, while an 80% shoot percentage was obtained with 4.0 mg L-1 BAP and 0.05 mg L-1 NAA. Microscopic studies revealed that GFP expression was clearly localized in leaf tissue of C. roseus after transformation with the pRepGFP0029 construct; transformation efficiency was therefore assessed on the basis of GFP localization. The transformation efficiency of the SAAT method was 6.0%, compared with 3.5% for the conventional method. Further, PCR analysis confirmed the integration of the nptII gene in the transformed plantlets of C. roseus.
Wenkel, Evelyn; Janka, Rolf; Geppert, Christian; Kaemmerer, Nadine; Hartmann, Arndt; Uder, Michael; Hammon, Matthias; Brand, Michael
2017-02-01
Purpose The aim was to evaluate a minimum echo time (minTE) protocol for breast magnetic resonance imaging (MRI) in patients with breast lesions compared to a standard TE (nTE) time protocol. Methods Breasts of 144 women were examined with a 1.5 Tesla MRI scanner. Additionally to the standard gradient-echo sequence with nTE (4.8 ms), a variant with minimum TE (1.2 ms) was used in an interleaved fashion which leads to a better temporal resolution and should reduce the scan time by approximately 50 %. Lesion sizes were measured and the signal-to-noise ratio (SNR) as well as the contrast-to-noise ratio (CNR) were calculated. Subjective confidence was evaluated using a 3-point scale before looking at the nTE sequences (1 = very sure that I can identify a lesion and classify it, 2 = quite sure that I can identify a lesion and classify it, 3 = definitely want to see nTE for final assessment) and the subjective image quality of all examinations was evaluated using a four-grade scale (1 = sharp, 2 = slight blur, 3 = moderate blur and 4 = severe blur/not evaluable) for lesion and skin sharpness. Lesion morphology and contrast enhancement were also evaluated. Results With minTE sequences, no lesion was rated with "definitely want to see nTE sequences for final assessment". The difference of the longitudinal and transverse diameter did not differ significantly (p > 0.05). With minTE, lesions and skin were rated to be significantly more blurry (p < 0.01 for lesions and p < 0.05 for skin). There was no difference between both sequences with respect to SNR, CNR, lesion morphology, contrast enhancement and detection of multifocal disease. Conclusion Dynamic breast MRI with a minTE protocol is feasible without a major loss of information (SNR, CNR, lesion morphology, contrast enhancement and lesion sizes) and the temporal resolution can be increased by a factor of 2 using minTE sequences. Key points · Increase of temporal resolution for a better in-flow curve.. · Dynamic breast MRI with a shorter TE time is possible without relevant loss of information.. · Possible decrease of the overall scan time.. Citation Format · Wenkel E, Janka R, Geppert C et al. Breast MRI at Very Short TE (minTE): Image Analysis of minTE Sequences on Non-Fat-Saturated, Subtracted T1-Weighted Images. Fortschr Röntgenstr 2017; 189: 137 - 145. © Georg Thieme Verlag KG Stuttgart · New York.
Saleem, Kashif; Derhab, Abdelouahid; Orgun, Mehmet A.; Al-Muhtadi, Jalal; Rodrigues, Joel J. P. C.; Khalil, Mohammed Sayim; Ali Ahmed, Adel
2016-01-01
The deployment of intelligent remote surveillance systems depends on wireless sensor networks (WSNs) composed of various miniature resource-constrained wireless sensor nodes. The development of routing protocols for WSNs is a major challenge because of their severe resource constraints, ad hoc topology and dynamic nature. Among those proposed routing protocols, the biology-inspired self-organized secure autonomous routing protocol (BIOSARP) involves an artificial immune system (AIS) that requires a certain amount of time to build up knowledge of neighboring nodes. The AIS algorithm uses this knowledge to distinguish between self and non-self neighboring nodes. The knowledge-building phase is a critical period in the WSN lifespan and requires active security measures. This paper proposes an enhanced BIOSARP (E-BIOSARP) that incorporates a random key encryption mechanism in a cost-effective manner to provide active security measures in WSNs. A detailed description of E-BIOSARP is presented, followed by an extensive security and performance analysis to demonstrate its efficiency. A scenario with E-BIOSARP is implemented in network simulator 2 (ns-2) and is populated with malicious nodes for analysis. Furthermore, E-BIOSARP is compared with state-of-the-art secure routing protocols in terms of processing time, delivery ratio, energy consumption, and packet overhead. The findings show that the proposed mechanism can efficiently protect WSNs from selective forwarding, brute-force or exhaustive key search, spoofing, eavesdropping, replaying or altering of routing information, cloning, acknowledgment spoofing, HELLO flood attacks, and Sybil attacks. PMID:27043572
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shea, S; Diak, A; Surucu, M
2015-06-15
Purpose: To investigate the effect of readout bandwidth and voxel size on the appearance of distortion artifacts caused by a titanium brachytherapy applicator. Methods: An acrylic phantom was constructed to rigidly hold an MR-conditional, titanium Fletcher-Suit-Delclos-style applicator set (Varian Medical Systems) for imaging on CT (Philips Brilliance) and 1.5T MRI (Siemens Magnetom Aera). Several variants of MRI parameters were tried for 2D T2-weighted turbo spin echo imaging in comparison against the standard clinical protocol, with the criteria to keep relative SNR loss less than 20% and imaging time as short as possible. Two 3D sequences were also used for comparison with similar parameters. The applicator tandem was segmented on axial CT images (0.4×0.4×1.5 mm³ resolution) and the CT images were registered to the 3D MR images in Eclipse (Varian). The applicator volume was then overlaid on all MRI sets in 3D-Slicer, and distances were measured from the tandem tip to the MRI artifact edge in right/left/superior and anterior/posterior/superior directions from coronal and sagittal 2D acquisitions, respectively, or 3D data reformats. Artifact regions were also manually contoured in coronal/sagittal orientations for area measurements. Results: As would be expected, reductions in voxel size and increases in readout bandwidth reduced artifact size (average maximum artifact length decreased by 0.95 mm and average maximum area decreased by 0.27 cm²). Interestingly, bandwidth increases yielded reductions in area (0.19 cm²) and in distance measurements (1 mm) even with voxel increases, as compared to a standard protocol. This could be useful when high-performance protocols are not feasible due to long imaging times. Conclusion: We have characterized artifacts caused by a cervical brachytherapy applicator across multiple sequence parameters at 1.5T. Future work will focus on finalizing an optimal protocol that balances artifact reduction with imaging time and then testing this new protocol in patients.
Guo, Rui; Wen, Qiaoyan; Jin, Zhengping; Zhang, Hua
2013-01-01
Sensor networks have opened up new opportunities in healthcare systems, which can transmit patient's condition to health professional's hand-held devices in time. The patient's physiological signals are very sensitive and the networks are extremely vulnerable to many attacks. It must be ensured that patient's privacy is not exposed to unauthorized entities. Therefore, the control of access to healthcare systems has become a crucial challenge. An efficient and secure authentication protocol will thus be needed in wireless medical sensor networks. In this paper, we propose a certificateless authentication scheme without bilinear pairing while providing patient anonymity. Compared with other related protocols, the proposed scheme needs less computation and communication cost and preserves stronger security. Our performance evaluations show that this protocol is more practical for healthcare system in wireless medical sensor networks. PMID:23710147
Optimizing a dynamical decoupling protocol for solid-state electronic spin ensembles in diamond
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farfurnik, D.; Jarmola, A.; Pham, L. M.
2015-08-24
In this study, we demonstrate significant improvements of the spin coherence time of a dense ensemble of nitrogen-vacancy (NV) centers in diamond through optimized dynamical decoupling (DD). Cooling the sample down to 77 K suppresses longitudinal spin relaxation T1 effects and DD microwave pulses are used to increase the transverse coherence time T2 from ~0.7 ms up to ~30 ms. Furthermore, we extend previous work of single-axis (Carr-Purcell-Meiboom-Gill) DD towards the preservation of arbitrary spin states. Following a theoretical and experimental characterization of pulse and detuning errors, we compare the performance of various DD protocols. We also identify that the optimal control scheme for preserving an arbitrary spin state is a recursive protocol, the concatenated version of the XY8 pulse sequence. The improved spin coherence might have an immediate impact on improvements of the sensitivities of ac magnetometry. Moreover, the protocol can be used on denser diamond samples to increase coherence times up to NV-NV interaction time scales, a major step towards the creation of quantum collective NV spin states.
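For reference, the pulse-phase bookkeeping of the XY8 sequence and one common way of writing its concatenated (recursive) version can be generated as below; this sketch only lists pulse phases and does not simulate the spin dynamics or pulse errors discussed in the study.

```python
XY8 = [0, 90, 0, 90, 90, 0, 90, 0]   # pulse phases in degrees: X Y X Y Y X Y X (all pi pulses)

def concatenated_xy8(level):
    """Recursively nest XY8 blocks: level 0 is bare free evolution, level 1 is plain XY8.

    Each free-evolution interval of the outer sequence is filled with the lower-level block,
    one common way of defining concatenated dynamical decoupling.
    """
    if level == 0:
        return []
    inner = concatenated_xy8(level - 1)
    sequence = []
    for phase in XY8:
        sequence.extend(inner)      # interval before the pulse
        sequence.append(phase)
    sequence.extend(inner)          # trailing interval
    return sequence

for lvl in (1, 2):
    seq = concatenated_xy8(lvl)
    print(f"level {lvl}: {len(seq)} pi pulses, first 12 phases: {seq[:12]}")
```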
ProDeGe: A computational protocol for fully automated decontamination of genomes
Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott; ...
2015-06-09
Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies. On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.
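ProDeGe's actual classifier combines homology and feature-based methods; as a hedged stand-in, the toy sketch below flags contigs whose k-mer composition deviates strongly from the assembly-wide average, which captures the flavor of the feature-based screen only. Thresholds and contigs are invented.

```python
from collections import Counter
from math import sqrt

def kmer_profile(seq, k=4):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {kmer: n / total for kmer, n in counts.items()}

def cosine_distance(p, q):
    keys = set(p) | set(q)
    dot = sum(p.get(x, 0) * q.get(x, 0) for x in keys)
    norm = sqrt(sum(v * v for v in p.values())) * sqrt(sum(v * v for v in q.values()))
    return 1 - dot / norm if norm else 1.0

def flag_contaminants(contigs, cutoff=0.5):
    """Flag contigs whose k-mer composition is far from the assembly-wide average profile."""
    profiles = {name: kmer_profile(seq) for name, seq in contigs.items()}
    mean = Counter()
    for prof in profiles.values():
        mean.update(prof)
    mean = {k: v / len(profiles) for k, v in mean.items()}
    return {name: cosine_distance(prof, mean) > cutoff for name, prof in profiles.items()}

contigs = {
    "contig_1": "ATGCATGCATGCATGCATGCATGC" * 4,
    "contig_2": "ATGCATGCATGAATGCATGCATGC" * 4,
    "contig_3": "GGGGGGCCCCCCGGGGGGCCCCCC" * 4,   # compositionally distinct: candidate contaminant
}
print(flag_contaminants(contigs))
```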
ProDeGe: A computational protocol for fully automated decontamination of genomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tennessen, Kristin; Andersen, Evan; Clingenpeel, Scott
Single amplified genomes and genomes assembled from metagenomes have enabled the exploration of uncultured microorganisms at an unprecedented scale. However, both these types of products are plagued by contamination. Since these genomes are now being generated in a high-throughput manner and sequences from them are propagating into public databases to drive novel scientific discoveries, rigorous quality controls and decontamination protocols are urgently needed. Here, we present ProDeGe (Protocol for fully automated Decontamination of Genomes), the first computational protocol for fully automated decontamination of draft genomes. ProDeGe classifies sequences into two classes—clean and contaminant—using a combination of homology and feature-based methodologies.more » On average, 84% of sequence from the non-target organism is removed from the data set (specificity) and 84% of the sequence from the target organism is retained (sensitivity). Lastly, the procedure operates successfully at a rate of ~0.30 CPU core hours per megabase of sequence and can be applied to any type of genome sequence.« less
Energy-efficient boarder node medium access control protocol for wireless sensor networks.
Razaque, Abdul; Elleithy, Khaled M
2014-03-12
This paper introduces the design, implementation, and performance analysis of the scalable and mobility-aware hybrid protocol named boarder node medium access control (BN-MAC) for wireless sensor networks (WSNs), which leverages the characteristics of scheduled and contention-based MAC protocols. Like contention-based MAC protocols, BN-MAC achieves high channel utilization, network adaptability under heavy traffic and mobility, and low latency and overhead. Like schedule-based MAC protocols, BN-MAC reduces idle listening time, emissions, and collision handling at low cost at one-hop neighbor nodes and achieves high channel utilization under heavy network loads. BN-MAC is particularly designed for region-wise WSNs. Each region is controlled by a boarder node (BN), which is of paramount importance. The BN coordinates with the remaining nodes within and beyond the region. Unlike other hybrid MAC protocols, BN-MAC incorporates three promising models that further reduce the energy consumption, idle listening time, overhearing, and congestion to improve the throughput and reduce the latency. One of the models used with BN-MAC is automatic active and sleep (AAS), which reduces the ideal listening time. When nodes finish their monitoring process, AAS lets them automatically go into the sleep state to avoid the idle listening state. Another model used in BN-MAC is the intelligent decision-making (IDM) model, which helps the nodes sense the nature of the environment. Based on the nature of the environment, the nodes decide whether to use the active or passive mode. This decision power of the nodes further reduces energy consumption because the nodes turn off the radio of the transceiver in the passive mode. The third model is the least-distance smart neighboring search (LDSNS), which determines the shortest efficient path to the one-hop neighbor and also provides cross-layering support to handle the mobility of the nodes. The BN-MAC also incorporates a semi-synchronous feature with a low duty cycle, which is advantageous for reducing the latency and energy consumption for several WSN application areas to improve the throughput. BN-MAC uses a unique window slot size to enhance the contention resolution issue for improved throughput. BN-MAC also prefers to communicate within a one-hop destination using Anycast, which maintains load balancing to maintain network reliability. BN-MAC is introduced with the goal of supporting four major application areas: monitoring and behavioral areas, controlling natural disasters, human-centric applications, and tracking mobility and static home automation devices from remote places. These application areas require a congestion-free mobility-supported MAC protocol to guarantee reliable data delivery. BN-MAC was evaluated using network simulator-2 (ns2) and compared with other hybrid MAC protocols, such as Zebra medium access control (Z-MAC), advertisement-based MAC (A-MAC), Speck-MAC, adaptive duty cycle SMAC (ADC-SMAC), and low-power real-time medium access control (LPR-MAC). The simulation results indicate that BN-MAC is a robust and energy-efficient protocol that outperforms other hybrid MAC protocols in the context of quality of service (QoS) parameters, such as energy consumption, latency, throughput, channel access time, successful delivery rate, coverage efficiency, and average duty cycle.
Energy-Efficient Boarder Node Medium Access Control Protocol for Wireless Sensor Networks
Razaque, Abdul; Elleithy, Khaled M.
2014-01-01
This paper introduces the design, implementation, and performance analysis of the scalable and mobility-aware hybrid protocol named boarder node medium access control (BN-MAC) for wireless sensor networks (WSNs), which leverages the characteristics of scheduled and contention-based MAC protocols. Like contention-based MAC protocols, BN-MAC achieves high channel utilization, network adaptability under heavy traffic and mobility, and low latency and overhead. Like schedule-based MAC protocols, BN-MAC reduces idle listening time, emissions, and collision handling at low cost at one-hop neighbor nodes and achieves high channel utilization under heavy network loads. BN-MAC is particularly designed for region-wise WSNs. Each region is controlled by a boarder node (BN), which is of paramount importance. The BN coordinates with the remaining nodes within and beyond the region. Unlike other hybrid MAC protocols, BN-MAC incorporates three promising models that further reduce the energy consumption, idle listening time, overhearing, and congestion to improve the throughput and reduce the latency. One of the models used with BN-MAC is automatic active and sleep (AAS), which reduces the ideal listening time. When nodes finish their monitoring process, AAS lets them automatically go into the sleep state to avoid the idle listening state. Another model used in BN-MAC is the intelligent decision-making (IDM) model, which helps the nodes sense the nature of the environment. Based on the nature of the environment, the nodes decide whether to use the active or passive mode. This decision power of the nodes further reduces energy consumption because the nodes turn off the radio of the transceiver in the passive mode. The third model is the least-distance smart neighboring search (LDSNS), which determines the shortest efficient path to the one-hop neighbor and also provides cross-layering support to handle the mobility of the nodes. The BN-MAC also incorporates a semi-synchronous feature with a low duty cycle, which is advantageous for reducing the latency and energy consumption for several WSN application areas to improve the throughput. BN-MAC uses a unique window slot size to enhance the contention resolution issue for improved throughput. BN-MAC also prefers to communicate within a one-hop destination using Anycast, which maintains load balancing to maintain network reliability. BN-MAC is introduced with the goal of supporting four major application areas: monitoring and behavioral areas, controlling natural disasters, human-centric applications, and tracking mobility and static home automation devices from remote places. These application areas require a congestion-free mobility-supported MAC protocol to guarantee reliable data delivery. BN-MAC was evaluated using network simulator-2 (ns2) and compared with other hybrid MAC protocols, such as Zebra medium access control (Z-MAC), advertisement-based MAC (A-MAC), Speck-MAC, adaptive duty cycle SMAC (ADC-SMAC), and low-power real-time medium access control (LPR-MAC). The simulation results indicate that BN-MAC is a robust and energy-efficient protocol that outperforms other hybrid MAC protocols in the context of quality of service (QoS) parameters, such as energy consumption, latency, throughput, channel access time, successful delivery rate, coverage efficiency, and average duty cycle. PMID:24625737
NASA Astrophysics Data System (ADS)
Yang, YuGuang; Zhang, YuChen; Xu, Gang; Chen, XiuBo; Zhou, Yi-Hua; Shi, WeiMin
2018-03-01
Li et al. first proposed a quantum hash function (QHF) in a quantum-walk architecture. In their scheme, two two-particle interactions, i.e., I interaction and π-phase interaction are introduced and the choice of I or π-phase interactions at each iteration depends on a message bit. In this paper, we propose an efficient QHF by dense coding of coin operators in discrete-time quantum walk. Compared with existing QHFs, our protocol has the following advantages: the efficiency of the QHF can be doubled and even more; only one particle is enough and two-particle interactions are unnecessary so that quantum resources are saved. It is a clue to apply the dense coding technique to quantum cryptographic protocols, especially to the applications with restricted quantum resources.
An optimized protocol for generation and analysis of Ion Proton sequencing reads for RNA-Seq.
Yuan, Yongxian; Xu, Huaiqian; Leung, Ross Ka-Kit
2016-05-26
Previous studies compared running cost, time and other performance measures of popular sequencing platforms. However, comprehensive assessment of library construction and analysis protocols for Proton sequencing platform remains unexplored. Unlike Illumina sequencing platforms, Proton reads are heterogeneous in length and quality. When sequencing data from different platforms are combined, this can result in reads with various read length. Whether the performance of the commonly used software for handling such kind of data is satisfactory is unknown. By using universal human reference RNA as the initial material, RNaseIII and chemical fragmentation methods in library construction showed similar result in gene and junction discovery number and expression level estimated accuracy. In contrast, sequencing quality, read length and the choice of software affected mapping rate to a much larger extent. Unspliced aligner TMAP attained the highest mapping rate (97.27 % to genome, 86.46 % to transcriptome), though 47.83 % of mapped reads were clipped. Long reads could paradoxically reduce mapping in junctions. With reference annotation guide, the mapping rate of TopHat2 significantly increased from 75.79 to 92.09 %, especially for long (>150 bp) reads. Sailfish, a k-mer based gene expression quantifier attained highly consistent results with that of TaqMan array and highest sensitivity. We provided for the first time, the reference statistics of library preparation methods, gene detection and quantification and junction discovery for RNA-Seq by the Ion Proton platform. Chemical fragmentation performed equally well with the enzyme-based one. The optimal Ion Proton sequencing options and analysis software have been evaluated.
Sharma, Davinder; Golla, Naresh; Singh, Dheer; Onteru, Suneel K
2018-03-01
The next-generation sequencing (NGS) based RNA sequencing (RNA-Seq) and transcriptome profiling offers an opportunity to unveil complex biological processes. Successful RNA-Seq and transcriptome profiling requires a large amount of high-quality RNA. However, NGS-quality RNA isolation is extremely difficult from recalcitrant adipose tissue (AT) with high lipid content and low cell numbers. Further, the amount and biochemical composition of AT lipid varies depending upon the animal species which can pose different degree of resistance to RNA extraction. Currently available approaches may work effectively in one species but can be almost unproductive in another species. Herein, we report a two step protocol for the extraction of NGS quality RNA from AT across a broad range of animal species. © 2017 Wiley Periodicals, Inc.
WEAMR — A Weighted Energy Aware Multipath Reliable Routing Mechanism for Hotline-Based WSNs
Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung
2013-01-01
Reliable source to sink communication is the most important factor for an efficient routing protocol especially in domains of military, healthcare and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission critical applications. The protocol reduces the number of average hops from source to destination and provides unmatched reliability as compared to well known reactive ad hoc protocols i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on weighted cost calculation and intelligently selects the best possible paths for data transmissions. The path cost calculation considers end to end number of hops, latency and minimum energy node value in the path. In case of path failure path recalculation is done efficiently with minimum latency and control packets overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and higher packet delivery success ratio compared to AODV and AOMDV. The use of multipath also increases overall life time of WSN network using optimum energy available paths between sender and receiver in WDNs. PMID:23669714
WEAMR-a weighted energy aware multipath reliable routing mechanism for hotline-based WSNs.
Tufail, Ali; Qamar, Arslan; Khan, Adil Mehmood; Baig, Waleed Akram; Kim, Ki-Hyung
2013-05-13
Reliable source to sink communication is the most important factor for an efficient routing protocol especially in domains of military, healthcare and disaster recovery applications. We present weighted energy aware multipath reliable routing (WEAMR), a novel energy aware multipath routing protocol which utilizes hotline-assisted routing to meet such requirements for mission critical applications. The protocol reduces the number of average hops from source to destination and provides unmatched reliability as compared to well known reactive ad hoc protocols i.e., AODV and AOMDV. Our protocol makes efficient use of network paths based on weighted cost calculation and intelligently selects the best possible paths for data transmissions. The path cost calculation considers end to end number of hops, latency and minimum energy node value in the path. In case of path failure path recalculation is done efficiently with minimum latency and control packets overhead. Our evaluation shows that our proposal provides better end-to-end delivery with less routing overhead and higher packet delivery success ratio compared to AODV and AOMDV. The use of multipath also increases overall life time of WSN network using optimum energy available paths between sender and receiver in WDNs.
A MAC Protocol for Medical Monitoring Applications of Wireless Body Area Networks
Shu, Minglei; Yuan, Dongfeng; Zhang, Chongqing; Wang, Yinglong; Chen, Changfang
2015-01-01
Targeting the medical monitoring applications of wireless body area networks (WBANs), a hybrid medium access control protocol using an interrupt mechanism (I-MAC) is proposed to improve the energy and time slot utilization efficiency and to meet the data delivery delay requirement at the same time. Unlike existing hybrid MAC protocols, a superframe structure with a longer length is adopted to avoid unnecessary beacons. The time slots are mostly allocated to nodes with periodic data sources. Short interruption slots are inserted into the superframe to convey the urgent data and to guarantee the real-time requirements of these data. During these interruption slots, the coordinator can break the running superframe and start a new superframe. A contention access period (CAP) is only activated when there are more data that need to be delivered. Experimental results show the effectiveness of the proposed MAC protocol in WBANs with low urgent traffic. PMID:26046596
Decoding DNA labels by melting curve analysis using real-time PCR.
Balog, József A; Fehér, Liliána Z; Puskás, László G
2017-12-01
Synthetic DNA has been used as an authentication code for a diverse number of applications. However, existing decoding approaches are based on either DNA sequencing or the determination of DNA length variations. Here, we present a simple alternative protocol for labeling different objects using a small number of short DNA sequences that differ in their melting points. Code amplification and decoding can be done in two steps using quantitative PCR (qPCR). To obtain a DNA barcode with high complexity, we defined 8 template groups, each having 4 different DNA templates, yielding 158 (>2.5 billion) combinations of different individual melting temperature (Tm) values and corresponding ID codes. The reproducibility and specificity of the decoding was confirmed by using the most complex template mixture, which had 32 different products in 8 groups with different Tm values. The industrial applicability of our protocol was also demonstrated by labeling a drone with an oil-based paint containing a predefined DNA code, which was then successfully decoded. The method presented here consists of a simple code system based on a small number of synthetic DNA sequences and a cost-effective, rapid decoding protocol using a few qPCR reactions, enabling a wide range of authentication applications.
Detection of DNA Methylation by Whole-Genome Bisulfite Sequencing.
Li, Qing; Hermanson, Peter J; Springer, Nathan M
2018-01-01
DNA methylation plays an important role in the regulation of the expression of transposons and genes. Various methods have been developed to assay DNA methylation levels. Bisulfite sequencing is considered to be the "gold standard" for single-base resolution measurement of DNA methylation levels. Coupled with next-generation sequencing, whole-genome bisulfite sequencing (WGBS) allows DNA methylation to be evaluated at a genome-wide scale. Here, we described a protocol for WGBS in plant species with large genomes. This protocol has been successfully applied to assay genome-wide DNA methylation levels in maize and barley. This protocol has also been successfully coupled with sequence capture technology to assay DNA methylation levels in a targeted set of genomic regions.
Ureba, A; Salguero, F J; Barbeiro, A R; Jimenez-Ortega, E; Baeza, J A; Miras, H; Linares, R; Perucha, M; Leal, A
2014-08-01
The authors present a hybrid direct multileaf collimator (MLC) aperture optimization model exclusively based on sequencing of patient imaging data to be implemented on a Monte Carlo treatment planning system (MC-TPS) to allow the explicit radiation transport simulation of advanced radiotherapy treatments with optimal results in efficient times for clinical practice. The planning system (called CARMEN) is a full MC-TPS, controlled through aMATLAB interface, which is based on the sequencing of a novel map, called "biophysical" map, which is generated from enhanced image data of patients to achieve a set of segments actually deliverable. In order to reduce the required computation time, the conventional fluence map has been replaced by the biophysical map which is sequenced to provide direct apertures that will later be weighted by means of an optimization algorithm based on linear programming. A ray-casting algorithm throughout the patient CT assembles information about the found structures, the mass thickness crossed, as well as PET values. Data are recorded to generate a biophysical map for each gantry angle. These maps are the input files for a home-made sequencer developed to take into account the interactions of photons and electrons with the MLC. For each linac (Axesse of Elekta and Primus of Siemens) and energy beam studied (6, 9, 12, 15 MeV and 6 MV), phase space files were simulated with the EGSnrc/BEAMnrc code. The dose calculation in patient was carried out with the BEAMDOSE code. This code is a modified version of EGSnrc/DOSXYZnrc able to calculate the beamlet dose in order to combine them with different weights during the optimization process. Three complex radiotherapy treatments were selected to check the reliability of CARMEN in situations where the MC calculation can offer an added value: A head-and-neck case (Case I) with three targets delineated on PET/CT images and a demanding dose-escalation; a partial breast irradiation case (Case II) solved with photon and electron modulated beams (IMRT + MERT); and a prostatic bed case (Case III) with a pronounced concave-shaped PTV by using volumetric modulated arc therapy. In the three cases, the required target prescription doses and constraints on organs at risk were fulfilled in a short enough time to allow routine clinical implementation. The quality assurance protocol followed to check CARMEN system showed a high agreement with the experimental measurements. A Monte Carlo treatment planning model exclusively based on maps performed from patient imaging data has been presented. The sequencing of these maps allows obtaining deliverable apertures which are weighted for modulation under a linear programming formulation. The model is able to solve complex radiotherapy treatments with high accuracy in an efficient computation time.
Hasinoff, Samuel W; Kutulakos, Kiriakos N
2011-11-01
In this paper, we consider the problem of imaging a scene with a given depth of field at a given exposure level in the shortest amount of time possible. We show that by 1) collecting a sequence of photos and 2) controlling the aperture, focus, and exposure time of each photo individually, we can span the given depth of field in less total time than it takes to expose a single narrower-aperture photo. Using this as a starting point, we obtain two key results. First, for lenses with continuously variable apertures, we derive a closed-form solution for the globally optimal capture sequence, i.e., that collects light from the specified depth of field in the most efficient way possible. Second, for lenses with discrete apertures, we derive an integer programming problem whose solution is the optimal sequence. Our results are applicable to off-the-shelf cameras and typical photography conditions, and advocate the use of dense, wide-aperture photo sequences as a light-efficient alternative to single-shot, narrow-aperture photography.
Using SQL Databases for Sequence Similarity Searching and Analysis.
Pearson, William R; Mackey, Aaron J
2017-09-13
Relational databases can integrate diverse types of information and manage large sets of similarity search results, greatly simplifying genome-scale analyses. By focusing on taxonomic subsets of sequences, relational databases can reduce the size and redundancy of sequence libraries and improve the statistical significance of homologs. In addition, by loading similarity search results into a relational database, it becomes possible to explore and summarize the relationships between all of the proteins in an organism and those in other biological kingdoms. This unit describes how to use relational databases to improve the efficiency of sequence similarity searching and demonstrates various large-scale genomic analyses of homology-related data. It also describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. The unit also introduces search_demo, a database that stores sequence similarity search results. The search_demo database is then used to explore the evolutionary relationships between E. coli proteins and proteins in other organisms in a large-scale comparative genomic analysis. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
Robust DNA Isolation and High-throughput Sequencing Library Construction for Herbarium Specimens.
Saeidi, Saman; McKain, Michael R; Kellogg, Elizabeth A
2018-03-08
Herbaria are an invaluable source of plant material that can be used in a variety of biological studies. The use of herbarium specimens is associated with a number of challenges including sample preservation quality, degraded DNA, and destructive sampling of rare specimens. In order to more effectively use herbarium material in large sequencing projects, a dependable and scalable method of DNA isolation and library preparation is needed. This paper demonstrates a robust, beginning-to-end protocol for DNA isolation and high-throughput library construction from herbarium specimens that does not require modification for individual samples. This protocol is tailored for low quality dried plant material and takes advantage of existing methods by optimizing tissue grinding, modifying library size selection, and introducing an optional reamplification step for low yield libraries. Reamplification of low yield DNA libraries can rescue samples derived from irreplaceable and potentially valuable herbarium specimens, negating the need for additional destructive sampling and without introducing discernible sequencing bias for common phylogenetic applications. The protocol has been tested on hundreds of grass species, but is expected to be adaptable for use in other plant lineages after verification. This protocol can be limited by extremely degraded DNA, where fragments do not exist in the desired size range, and by secondary metabolites present in some plant material that inhibit clean DNA isolation. Overall, this protocol introduces a fast and comprehensive method that allows for DNA isolation and library preparation of 24 samples in less than 13 h, with only 8 h of active hands-on time with minimal modifications.
Best, Katharine; Oakes, Theres; Heather, James M.; Shawe-Taylor, John; Chain, Benny
2015-01-01
The polymerase chain reaction (PCR) is one of the most widely used techniques in molecular biology. In combination with High Throughput Sequencing (HTS), PCR is widely used to quantify transcript abundance for RNA-seq, and in the context of analysis of T and B cell receptor repertoires. In this study, we combine DNA barcoding with HTS to quantify PCR output from individual target molecules. We develop computational tools that simulate both the PCR branching process itself, and the subsequent subsampling which typically occurs during HTS sequencing. We explore the influence of different types of heterogeneity on sequencing output, and compare them to experimental results where the efficiency of amplification is measured by barcodes uniquely identifying each molecule of starting template. Our results demonstrate that the PCR process introduces substantial amplification heterogeneity, independent of primer sequence and bulk experimental conditions. This heterogeneity can be attributed both to inherited differences between different template DNA molecules, and the inherent stochasticity of the PCR process. The results demonstrate that PCR heterogeneity arises even when reaction and substrate conditions are kept as constant as possible, and therefore single molecule barcoding is essential in order to derive reproducible quantitative results from any protocol combining PCR with HTS. PMID:26459131
Intelligent QoS routing algorithm based on improved AODV protocol for Ad Hoc networks
NASA Astrophysics Data System (ADS)
Huibin, Liu; Jun, Zhang
2016-04-01
Mobile Ad Hoc Networks were playing an increasingly important part in disaster reliefs, military battlefields and scientific explorations. However, networks routing difficulties are more and more outstanding due to inherent structures. This paper proposed an improved cuckoo searching-based Ad hoc On-Demand Distance Vector Routing protocol (CSAODV). It elaborately designs the calculation methods of optimal routing algorithm used by protocol and transmission mechanism of communication-package. In calculation of optimal routing algorithm by CS Algorithm, by increasing QoS constraint, the found optimal routing algorithm can conform to the requirements of specified bandwidth and time delay, and a certain balance can be obtained among computation spending, bandwidth and time delay. Take advantage of NS2 simulation software to take performance test on protocol in three circumstances and validate the feasibility and validity of CSAODV protocol. In results, CSAODV routing protocol is more adapt to the change of network topological structure than AODV protocol, which improves package delivery fraction of protocol effectively, reduce the transmission time delay of network, reduce the extra burden to network brought by controlling information, and improve the routing efficiency of network.
Shahrisa, Aziz; Teimuri-Mofrad, Reza; Gholamhosseini-Nazari, Mahdi
2015-02-01
A variety of organocatalysts has been screened for the synthesis of arylaminonaphthols. It has been shown that (N,N-dimethylethanolamine) is a highly efficient organocatalyst for the direct synthesis of a novel class of arylaminonaphthols via three-component condensation of 2-naphthol, aldehydes, and arylamines under solvent-free conditions. Mild, one-pot, and green reaction conditions, relatively short reaction times and good yields make this protocol highly significant. 25 new compounds have been synthesized by this method.
Grueneisen, Johannes; Sawicki, Lino Morris; Wetter, Axel; Kirchner, Julian; Kinner, Sonja; Aktas, Bahriye; Forsting, Michael; Ruhlmann, Verena; Umutlu, Lale
2017-04-01
To investigate the diagnostic value of different MR sequences and 18F-FDG PET data for whole-body restaging of breast cancer patients utilizing PET/MRI. A total of 36 patients with suspected tumor recurrence of breast cancer based on clinical follow-up or abnormal findings in follow-up examinations (e.g. CT, MRI) were prospectively enrolled in this study. All patients underwent a PET/CT and subsequently an additional PET/MR scan. Two readers were instructed to identify the occurrence of a tumor relapse in subsequent MR and PET/MR readings, utilizing different MR sequence constellations for each session. The diagnostic confidence for the determination of a malignant or benign lesion was qualitatively rated (3-point ordinal scale) for each lesion in the different reading sessions and the lesion conspicuity (4-point ordinal scale) for the three different MR sequences was additionally evaluated. Tumor recurrence was present in 25/36 (69%) patients. All three PET/MRI readings showed a significantly higher accuracy as well as higher confidence levels for the detection of recurrent breast cancer lesions when compared to MRI alone (p<0.05). Furthermore, all three PET/MR sequence constellations showed comparable diagnostic accuracy for the identification of a breast cancer recurrence (p>0.05), yet the highest confidence levels were obtained, when all three MR sequences were used for image interpretation. Moreover, contrast-enhanced T1-weighted VIBE imaging showed significantly higher values for the delineation of malignant and benign lesions when compared to T2w HASTE and diffusion-weighted imaging. Integrated PET/MRI provides superior restaging of breast cancer patients over MRI alone. Facing the need for appropriate and efficient whole-body PET/MR protocols, our results show the feasibility of fast and morphologically adequate PET/MR protocols. However, considering an equivalent accuracy for the detection of breast cancer recurrences in the three PET/MR readings, the application of contrast-agent and the inclusion of DWI in the study protocol seems to be debatable. Copyright © 2017 Elsevier B.V. All rights reserved.
Steiner, S; Vogl, T J; Fischer, P; Steger, W; Neuhaus, P; Keck, H
1995-08-01
The aim of our study was to evaluate a T2-weighted turbo-spinecho sequence in comparison to a T2-weighted spinecho sequence in imaging focal liver lesions. In our study 35 patients with suspected focal liver lesions were examined. Standardised imaging protocol included a conventional T2-weighted SE sequence (TR/TE = 2000/90/45, acquisition time = 10.20) as well as a T2-weighted TSE sequence (TR/TE = 4700/90, acquisition time = 6.33). Calculation of S/N and C/N ratio as a basis of quantitative evaluation was done using standard methods. A diagnostic score was implemented to enable qualitative assessment. In 7% (n = 2) the TSE sequence enabled detection of further liver lesions showing a size of less than 1 cm in diameter. Comparing anatomical details the TSE sequence was superior. S/N and C/N ratio of anatomic and pathologic structures of the TSE sequence were higher compared to results of the SE sequence. Our results indicate that the T2-weighted turbo-spinecho sequence is well appropriate for imaging focal liver lesions, and leads to reduction of imaging time.
A Tree Based Broadcast Scheme for (m, k)-firm Real-Time Stream in Wireless Sensor Networks.
Park, HoSung; Kim, Beom-Su; Kim, Kyong Hoon; Shah, Babar; Kim, Ki-Il
2017-11-09
Recently, various unicast routing protocols have been proposed to deliver measured data from the sensor node to the sink node within the predetermined deadline in wireless sensor networks. In parallel with their approaches, some applications demand the specific service, which is based on broadcast to all nodes within the deadline, the feasible real-time traffic model and improvements in energy efficiency. However, current protocols based on either flooding or one-to-one unicast cannot meet the above requirements entirely. Moreover, as far as the authors know, there is no study for the real-time broadcast protocol to support the application-specific traffic model in WSN yet. Based on the above analysis, in this paper, we propose a new ( m , k )-firm-based Real-time Broadcast Protocol (FRBP) by constructing a broadcast tree to satisfy the ( m , k )-firm, which is applicable to the real-time model in resource-constrained WSNs. The broadcast tree in FRBP is constructed by the distance-based priority scheme, whereas energy efficiency is improved by selecting as few as nodes on a tree possible. To overcome the unstable network environment, the recovery scheme invokes rapid partial tree reconstruction in order to designate another node as the parent on a tree according to the measured ( m , k )-firm real-time condition and local states monitoring. Finally, simulation results are given to demonstrate the superiority of FRBP compared to the existing schemes in terms of average deadline missing ratio, average throughput and energy consumption.
An Energy-Efficient MAC Protocol for Medical Emergency Monitoring Body Sensor Networks
Zhang, Chongqing; Wang, Yinglong; Liang, Yongquan; Shu, Minglei; Chen, Changfang
2016-01-01
Medical emergency monitoring body sensor networks (BSNs) monitor the occurrence of medical emergencies and are helpful for the daily care of the elderly and chronically ill people. Such BSNs are characterized by rare traffic when there is no emergency occurring, high real-time and reliable requirements of emergency data and demand for a fast wake-up mechanism for waking up all nodes when an emergency happens. A beacon-enabled MAC protocol is specially designed to meet the demands of medical emergency monitoring BSNs. The rarity of traffic is exploited to improve energy efficiency. By adopting a long superframe structure to avoid unnecessary beacons and allocating most of the superframe to be inactive periods, the duty cycle is reduced to an extremely low level to save energy. Short active time slots are interposed into the superframe and shared by all of the nodes to deliver the emergency data in a low-delay and reliable way to meet the real-time and reliable requirements. The interposition slots can also be used by the coordinator to broadcast network demands to wake-up all nodes in a low-delay and energy-efficient way. Experiments display that the proposed MAC protocol works well in BSNs with low emergency data traffic. PMID:26999145
Sequence Polishing Library (SPL) v10.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberortner, Ernst
The Sequence Polishing Library (SPL) is a suite of software tools in order to automate "Design for Synthesis and Assembly" workflows. Specifically: The SPL "Converter" tool converts files among the following sequence data exchange formats: CSV, FASTA, GenBank, and Synthetic Biology Open Language (SBOL); The SPL "Juggler" tool optimizes the codon usages of DNA coding sequences according to an optimization strategy, a user-specific codon usage table and genetic code. In addition, the SPL "Juggler" can translate amino acid sequences into DNA sequences.:The SPL "Polisher" verifies NA sequences against DNA synthesis constraints, such as GC content, repeating k-mers, and restriction sites.more » In case of violations, the "Polisher" reports the violations in a comprehensive manner. The "Polisher" tool can also modify the violating regions according to an optimization strategy, a user-specific codon usage table and genetic code;The SPL "Partitioner" decomposes large DNA sequences into smaller building blocks with partial overlaps that enable an efficient assembly. The "Partitioner" enables the user to configure the characteristics of the overlaps, which are mostly determined by the utilized assembly protocol, such as length, GC content, or melting temperature.« less
Genetic Characterization of a Panel of Diverse HIV-1 Isolates at Seven International Sites
Chen, Yue; Sanchez, Ana M.; Sabino, Ester; Hunt, Gillian; Ledwaba, Johanna; Hackett, John; Swanson, Priscilla; Hewlett, Indira; Ragupathy, Viswanath; Vikram Vemula, Sai; Zeng, Peibin; Tee, Kok-Keng; Chow, Wei Zhen; Ji, Hezhao; Sandstrom, Paul; Denny, Thomas N.; Busch, Michael P.; Gao, Feng
2016-01-01
HIV-1 subtypes and drug resistance are routinely tested by many international surveillance groups. However, results from different sites often vary. A systematic comparison of results from multiple sites is needed to determine whether a standardized protocol is required for consistent and accurate data analysis. A panel of well-characterized HIV-1 isolates (N = 50) from the External Quality Assurance Program Oversight Laboratory (EQAPOL) was assembled for evaluation at seven international sites. This virus panel included seven subtypes, six circulating recombinant forms (CRFs), nine unique recombinant forms (URFs) and three group O viruses. Seven viruses contained 10 major drug resistance mutations (DRMs). HIV-1 isolates were prepared at a concentration of 107 copies/ml and compiled into blinded panels. Subtypes and DRMs were determined with partial or full pol gene sequences by conventional Sanger sequencing and/or Next Generation Sequencing (NGS). Subtype and DRM results were reported and decoded for comparison with full-length genome sequences generated by EQAPOL. The partial pol gene was amplified by RT-PCR and sequenced for 89.4%-100% of group M viruses at six sites. Subtyping results of majority of the viruses (83%-97.9%) were correctly determined for the partial pol sequences. All 10 major DRMs in seven isolates were detected at these six sites. The complete pol gene sequence was also obtained by NGS at one site. However, this method missed six group M viruses and sequences contained host chromosome fragments. Three group O viruses were only characterized with additional group O-specific RT-PCR primers employed by one site. These results indicate that PCR protocols and subtyping tools should be standardized to efficiently amplify diverse viruses and more consistently assign virus genotypes, which is critical for accurate global subtype and drug resistance surveillance. Targeted NGS analysis of partial pol sequences can serve as an alternative approach, especially for detection of low-abundance DRMs. PMID:27314585
Genetic Characterization of a Panel of Diverse HIV-1 Isolates at Seven International Sites.
Hora, Bhavna; Keating, Sheila M; Chen, Yue; Sanchez, Ana M; Sabino, Ester; Hunt, Gillian; Ledwaba, Johanna; Hackett, John; Swanson, Priscilla; Hewlett, Indira; Ragupathy, Viswanath; Vikram Vemula, Sai; Zeng, Peibin; Tee, Kok-Keng; Chow, Wei Zhen; Ji, Hezhao; Sandstrom, Paul; Denny, Thomas N; Busch, Michael P; Gao, Feng
2016-01-01
HIV-1 subtypes and drug resistance are routinely tested by many international surveillance groups. However, results from different sites often vary. A systematic comparison of results from multiple sites is needed to determine whether a standardized protocol is required for consistent and accurate data analysis. A panel of well-characterized HIV-1 isolates (N = 50) from the External Quality Assurance Program Oversight Laboratory (EQAPOL) was assembled for evaluation at seven international sites. This virus panel included seven subtypes, six circulating recombinant forms (CRFs), nine unique recombinant forms (URFs) and three group O viruses. Seven viruses contained 10 major drug resistance mutations (DRMs). HIV-1 isolates were prepared at a concentration of 107 copies/ml and compiled into blinded panels. Subtypes and DRMs were determined with partial or full pol gene sequences by conventional Sanger sequencing and/or Next Generation Sequencing (NGS). Subtype and DRM results were reported and decoded for comparison with full-length genome sequences generated by EQAPOL. The partial pol gene was amplified by RT-PCR and sequenced for 89.4%-100% of group M viruses at six sites. Subtyping results of majority of the viruses (83%-97.9%) were correctly determined for the partial pol sequences. All 10 major DRMs in seven isolates were detected at these six sites. The complete pol gene sequence was also obtained by NGS at one site. However, this method missed six group M viruses and sequences contained host chromosome fragments. Three group O viruses were only characterized with additional group O-specific RT-PCR primers employed by one site. These results indicate that PCR protocols and subtyping tools should be standardized to efficiently amplify diverse viruses and more consistently assign virus genotypes, which is critical for accurate global subtype and drug resistance surveillance. Targeted NGS analysis of partial pol sequences can serve as an alternative approach, especially for detection of low-abundance DRMs.
Proactive Byzantine Quorum Systems
NASA Astrophysics Data System (ADS)
Alchieri, Eduardo A. P.; Bessani, Alysson Neves; Pereira, Fernando Carlos; da Silva Fraga, Joni
Byzantine Quorum Systems is a replication technique used to ensure availability and consistency of replicates data even in presence of arbitrary faults. This paper presents a Byzantine Quorum Systems protocol that provides atomic semantics despite the existence of Byzantine clients and servers. Moreover, this protocol is integrated with a protocol for proactive recovery of servers. In that way, the system tolerates any number of failures during its lifetime, since no more than f out of n servers fail during a small interval of time between recoveries. All solutions proposed in this paper can be used on asynchronous systems, which requires no time assumptions. The proposed quorum system read and write protocols have been implemented and their efficiency is demonstrated through some experiments carried out in the Emulab platform.
Unexpected substrate specificity of T4 DNA ligase revealed by in vitro selection
NASA Technical Reports Server (NTRS)
Harada, Kazuo; Orgel, Leslie E.
1993-01-01
We have used in vitro selection techniques to characterize DNA sequences that are ligated efficiently by T4 DNA ligase. We find that the ensemble of selected sequences ligates about 50 times as efficiently as the random mixture of sequences used as the input for selection. Surprisingly many of the selected sequences failed to produce a match at or close to the ligation junction. None of the 20 selected oligomers that we sequenced produced a match two bases upstream from the ligation junction.
Logan, Grace; Freimanis, Graham L; King, David J; Valdazo-González, Begoña; Bachanek-Bankowska, Katarzyna; Sanderson, Nicholas D; Knowles, Nick J; King, Donald P; Cottam, Eleanor M
2014-09-30
Next-Generation Sequencing (NGS) is revolutionizing molecular epidemiology by providing new approaches to undertake whole genome sequencing (WGS) in diagnostic settings for a variety of human and veterinary pathogens. Previous sequencing protocols have been subject to biases such as those encountered during PCR amplification and cell culture, or are restricted by the need for large quantities of starting material. We describe here a simple and robust methodology for the generation of whole genome sequences on the Illumina MiSeq. This protocol is specific for foot-and-mouth disease virus (FMDV) or other polyadenylated RNA viruses and circumvents both the use of PCR and the requirement for large amounts of initial template. The protocol was successfully validated using five FMDV positive clinical samples from the 2001 epidemic in the United Kingdom, as well as a panel of representative viruses from all seven serotypes. In addition, this protocol was successfully used to recover 94% of an FMDV genome that had previously been identified as cell culture negative. Genome sequences from three other non-FMDV polyadenylated RNA viruses (EMCV, ERAV, VESV) were also obtained with minor protocol amendments. We calculated that a minimum coverage depth of 22 reads was required to produce an accurate consensus sequence for FMDV O. This was achieved in 5 FMDV/O/UKG isolates and the type O FMDV from the serotype panel with the exception of the 5' genomic termini and area immediately flanking the poly(C) region. We have developed a universal WGS method for FMDV and other polyadenylated RNA viruses. This method works successfully from a limited quantity of starting material and eliminates the requirement for genome-specific PCR amplification. This protocol has the potential to generate consensus-level sequences within a routine high-throughput diagnostic environment.
Genome Editing in Mice Using TALE Nucleases.
Wefers, Benedikt; Brandl, Christina; Ortiz, Oskar; Wurst, Wolfgang; Kühn, Ralf
2016-01-01
Gene engineering for generating targeted mouse mutants is a key technology for biomedical research. Using TALENs as sequence-specific nucleases to induce targeted double-strand breaks, the mouse genome can be directly modified in zygotes in a single step without the need for embryonic stem cells. By embryo microinjection of TALEN mRNAs and targeting vectors, knockout and knock-in alleles can be generated fast and efficiently. In this chapter we provide protocols for the application of TALENs in mouse zygotes.
NASA Astrophysics Data System (ADS)
Rodrigues, Diego S.; Mancera, Paulo F. A.; Pinho, Suani T. R.
2016-12-01
Despite the current and increasingly successful fight against cancer, there are some important questions concerning the efficiency of its treatment - in particular, the design of oncology chemotherapy protocols. Seeking efficiency, schedules based on more frequent, low-doses of drugs, known as metronomic chemotherapy, have been proposed as an alternative to the classical standard protocol of chemotherapy administration. The in silico approach may be very useful for providing a comparative analysis of these two kinds of protocols. In so doing, we found that metronomic schedules are more effective in eliminating tumour cells mainly due to their chemotherapeutic action on endothelial cells and that more frequent, low drug doses also entail outcomes in which the survival time of patient is increased.
ASCOT: a text mining-based web-service for efficient search and assisted creation of clinical trials
2012-01-01
Clinical trials are mandatory protocols describing medical research on humans and among the most valuable sources of medical practice evidence. Searching for trials relevant to some query is laborious due to the immense number of existing protocols. Apart from search, writing new trials includes composing detailed eligibility criteria, which might be time-consuming, especially for new researchers. In this paper we present ASCOT, an efficient search application customised for clinical trials. ASCOT uses text mining and data mining methods to enrich clinical trials with metadata, that in turn serve as effective tools to narrow down search. In addition, ASCOT integrates a component for recommending eligibility criteria based on a set of selected protocols. PMID:22595088
Korkontzelos, Ioannis; Mu, Tingting; Ananiadou, Sophia
2012-04-30
Clinical trials are mandatory protocols describing medical research on humans and among the most valuable sources of medical practice evidence. Searching for trials relevant to some query is laborious due to the immense number of existing protocols. Apart from search, writing new trials includes composing detailed eligibility criteria, which might be time-consuming, especially for new researchers. In this paper we present ASCOT, an efficient search application customised for clinical trials. ASCOT uses text mining and data mining methods to enrich clinical trials with metadata, that in turn serve as effective tools to narrow down search. In addition, ASCOT integrates a component for recommending eligibility criteria based on a set of selected protocols.
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, R.
1992-01-01
The key elements in the second year (1991-92) of our project are: (1) implementation of the distributed system prototype; (2) successful passing of the candidacy examination and a PhD proposal acceptance by the funded student; (3) design of storage efficient schemes for replicated distributed systems; and (4) modeling of gracefully degrading reliable computing systems. In the third year of the project (1992-93), we propose to: (1) complete the testing of the prototype; (2) enhance the functionality of the modules by enabling the experimentation with more complex protocols; (3) use the prototype to verify the theoretically predicted performance of locking protocols, etc.; and (4) work on issues related to real-time distributed systems. This should result in efficient protocols for these systems.
Chevrier, Sandy; Boidot, Romain
2014-10-06
The widespread use of Next Generation Sequencing has opened up new avenues for cancer research and diagnosis. NGS will bring huge amounts of new data on cancer, and especially cancer genetics. Current knowledge and future discoveries will make it necessary to study a huge number of genes that could be involved in a genetic predisposition to cancer. In this regard, we developed a Nextera design to study 11 complete genes involved in DNA damage repair. This protocol was developed to safely study 11 genes (ATM, BARD1, BRCA1, BRCA2, BRIP1, CHEK2, PALB2, RAD50, RAD51C, RAD80, and TP53) from promoter to 3'-UTR in 24 patients simultaneously. This protocol, based on transposase technology and gDNA enrichment, gives a great advantage in terms of time for the genetic diagnosis thanks to sample multiplexing. This protocol can be safely used with blood gDNA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
J. Dentz, F. Conlin, D. Podorson, and K. Alaigh
2014-08-01
In this project, Building America team ARIES worked with two public housing authorities (PHA) to develop packages of energy efficiency retrofit measures the PHAs can cost effectively implement at the time when units are refurbished between occupancies.
Camargo, Vinícius da Silva; Santana, Bruna Nicoleti; Ferrari, Elis Domingos; Nakamura, Alex Akira; Nagata, Walter Bertequini; Nardi, Ana Rita Moraes; Meireles, Marcelo Vasconcelos
2018-01-01
This study used several diagnostic methods to examine the occurrence of and molecularly characterize Cryptosporidium spp. in captive canaries (Serinus canaria) in southern and southeastern Brazil. A total of 498 fecal samples were purified by centrifugal-flotation using Sheather's solution. Cryptosporidium spp. diagnosis was performed using three diagnostic methods: malachite green negative staining, nested PCR targeting the 18S rRNA gene, followed by sequencing the amplified fragments, and duplex real-time PCR targeting the 18S rRNA specific to detect Cryptosporidium galli and Cryptosporidium avian genotype III. The overall positivity for Cryptosporidium spp. (total samples positive in at least one protocol) from the microscopic analysis, nested PCR and duplex real-time PCR protocol results was 13.3% (66/498). The positivity rates were 2.0% (10/498) and 4.6% (23/498) for Cryptosporidium spp. by microscopy and nested PCR, respectively. Sequencing of 20 samples amplified by nested PCR identified C. galli (3.0%; 15/498), Cryptosporidium avian genotype I (0.8%; 4/498) and Cryptosporidium avium (0.2%; 1/498). Duplex real-time PCR revealed a positivity of 7.8% (39/498) for C. galli and 2.4% (12/498) for avian genotype III. Malachite green negative staining differed significantly from nested PCR in detecting Cryptosporidium spp. Duplex real-time PCR was more sensitive than nested PCR/sequencing for detecting gastric Cryptosporidium in canaries.
Seo, Eunhui; Kang, Hwansu; Lim, Oh-Kyung; Jun, Hee-Sook
2018-05-24
Mature skeletal muscle cells cannot be expanded in culture systems. Therefore, it is difficult to construct an in vitro model for muscle diseases. To establish an efficient protocol for myogenic differentiation of human adipose tissue-derived stem cells (hADSCs), we investigated whether addition of IL-6 and/or myocyte-conditioned media (CM) to conventional differentiation media can shorten the differentiation period. hADSCs were differentiated to myocytes using the conventional protocol or modified with the addition of 25 pg/mL IL-6 and/or C2C12 CM (25% v / v ). The expression of MyoD and myogenine mRNA was significantly higher at 5⁻6 days after differentiation using the modified protocol than with the conventional protocol. mRNA and protein expression of myosin heavy chain, a marker of myotubes, was significantly upregulated at 28 and 42 days of differentiation using the modified protocol, and the level achieved after a 4-week differentiation period was similar to that achieved at 6 weeks using the conventional protocol. The expression of p-STAT3 was significantly increased when the modified protocol was used. Similarly, addition of colivelin, a STAT3 activator, instead of IL-6 and C2C12 CM, promoted the myogenic differentiation of ADSCs. The modified protocol improved differentiation efficiency and reduced the time required for differentiation of myocytes. It might be helpful to save cost and time when preparing myocytes for cell therapies and drug discovery.
Quantum Correlations in Nonlocal Boson Sampling.
Shahandeh, Farid; Lund, Austin P; Ralph, Timothy C
2017-09-22
Determination of the quantum nature of correlations between two spatially separated systems plays a crucial role in quantum information science. Of particular interest is the questions of if and how these correlations enable quantum information protocols to be more powerful. Here, we report on a distributed quantum computation protocol in which the input and output quantum states are considered to be classically correlated in quantum informatics. Nevertheless, we show that the correlations between the outcomes of the measurements on the output state cannot be efficiently simulated using classical algorithms. Crucially, at the same time, local measurement outcomes can be efficiently simulated on classical computers. We show that the only known classicality criterion violated by the input and output states in our protocol is the one used in quantum optics, namely, phase-space nonclassicality. As a result, we argue that the global phase-space nonclassicality inherent within the output state of our protocol represents true quantum correlations.
Detection of Bacterial Pathogens from Broncho-Alveolar Lavage by Next-Generation Sequencing.
Leo, Stefano; Gaïa, Nadia; Ruppé, Etienne; Emonet, Stephane; Girard, Myriam; Lazarevic, Vladimir; Schrenzel, Jacques
2017-09-20
The applications of whole-metagenome shotgun sequencing (WMGS) in routine clinical analysis are still limited. A combination of a DNA extraction procedure, sequencing, and bioinformatics tools is essential for the removal of human DNA and for improving bacterial species identification in a timely manner. We tackled these issues with a broncho-alveolar lavage (BAL) sample from an immunocompromised patient who had developed severe chronic pneumonia. We extracted DNA from the BAL sample with protocols based either on sequential lysis of human and bacterial cells or on the mechanical disruption of all cells. Metagenomic libraries were sequenced on Illumina HiSeq platforms. Microbial community composition was determined by k-mer analysis or by mapping to taxonomic markers. Results were compared to those obtained by conventional clinical culture and molecular methods. Compared to mechanical cell disruption, a sequential lysis protocol resulted in a significantly increased proportion of bacterial DNA over human DNA and higher sequence coverage of Mycobacterium abscessus , Corynebacterium jeikeium and Rothia dentocariosa , the bacteria reported by clinical microbiology tests. In addition, we identified anaerobic bacteria not searched for by the clinical laboratory. Our results further support the implementation of WMGS in clinical routine diagnosis for bacterial identification.
Modularity of Protein Folds as a Tool for Template-Free Modeling of Structures.
Vallat, Brinda; Madrid-Aliste, Carlos; Fiser, Andras
2015-08-01
Predicting the three-dimensional structure of proteins from their amino acid sequences remains a challenging problem in molecular biology. While the current structural coverage of proteins is almost exclusively provided by template-based techniques, the modeling of the rest of the protein sequences increasingly requires template-free methods. However, template-free modeling methods are much less reliable and are usually applicable only to smaller proteins, leaving much room for improvement. We present here a novel computational method that uses a library of supersecondary structure fragments, known as Smotifs, to model protein structures. The library of Smotifs has saturated over time, providing a theoretical foundation for efficient modeling. The method relies on weak sequence signals from remotely related protein structures to create a library of Smotif fragments specific to the target protein sequence. This Smotif library is exploited in a fragment assembly protocol to sample decoys, which are assessed by a composite scoring function. Since the Smotif fragments are larger than the ones used in other fragment-based methods, the proposed modeling algorithm, SmotifTF, can employ an exhaustive sampling during decoy assembly. SmotifTF successfully predicts the overall fold of the target proteins in about 50% of the test cases and performs competitively compared to other state-of-the-art prediction methods, especially when the sequence signal to remote homologs is diminishing. Smotif-based modeling is complementary to current prediction methods and provides a promising direction in addressing the structure prediction problem, especially when targeting larger proteins for modeling.
Comparison of two automatic methods for the assessment of brachial artery flow-mediated dilation.
Faita, Francesco; Masi, Stefano; Loukogeorgakis, Stavros; Gemignani, Vincenzo; Okorie, Mike; Bianchini, Elisabetta; Charakida, Marietta; Demi, Marcello; Ghiadoni, Lorenzo; Deanfield, John Eric
2011-01-01
Brachial artery flow-mediated dilation (FMD) is associated with risk factors and provides information on cardiovascular prognosis. Despite the large effort to standardize the methodology, the FMD examination is still characterized by problems of reproducibility and reliability that can be partially overcome with the use of automatic systems. We developed real-time software for the assessment of brachial FMD (FMD Studio, Institute of Clinical Physiology, Pisa, Italy) from ultrasound images. The aim of this study is to compare our system with another automatic method (Brachial Analyzer, MIA LLC, IA, USA), which is currently considered a reference method in FMD assessment. The agreement between systems was assessed as follows. Protocol 1: mean baseline (Basal) and maximal (Max) brachial artery diameter after forearm ischemia, and FMD, calculated as the maximal percentage diameter increase, were evaluated in 60 recorded FMD sequences. Protocol 2: values of diameter and FMD were evaluated in 618 frames extracted from 12 sequences. All biases are negligible and the standard deviations of the differences are satisfactory (protocol 1: -0.27 ± 0.59%; protocol 2: -0.26 ± 0.61%) for FMD measurements. Analysis times were reduced (-33%) when FMD Studio was used. Examinations rejected because of poor quality amounted to 2% with FMD Studio and 5% with the Brachial Analyzer. In conclusion, the compared systems show an excellent degree of agreement and can be used interchangeably. Thus, the use of a system with real-time functionality could represent a reference method for assessing endothelial function in clinical trials.
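As a rough illustration of the quantities compared above, the sketch below (our own, using hypothetical paired measurements rather than the study's data) computes FMD as the maximal percentage diameter increase and then the bias and standard deviation of the between-system differences.

    # Hedged sketch, not the FMD Studio or Brachial Analyzer software:
    # FMD as percentage diameter increase, plus bias/SD of between-system differences.
    import statistics

    def fmd_percent(baseline_mm, max_mm):
        """FMD = maximal percentage diameter increase over baseline."""
        return (max_mm - baseline_mm) / baseline_mm * 100.0

    # Hypothetical paired measurements (mm) for four examinations.
    baselines = [3.80, 4.10, 3.60, 4.00]
    max_system_a = [4.00, 4.40, 3.77, 4.25]
    max_system_b = [4.01, 4.41, 3.79, 4.26]

    fmd_a = [fmd_percent(b, m) for b, m in zip(baselines, max_system_a)]
    fmd_b = [fmd_percent(b, m) for b, m in zip(baselines, max_system_b)]
    diffs = [a - b for a, b in zip(fmd_a, fmd_b)]
    bias = statistics.mean(diffs)   # systematic between-system difference (%)
    sd = statistics.stdev(diffs)    # spread of the differences (%)
    print(f"bias = {bias:.2f}%, SD = {sd:.2f}%")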
Wu, Jun-Zheng; Liu, Qin; Geng, Xiao-Shan; Li, Kai-Mian; Luo, Li-Juan; Liu, Jin-Ping
2017-03-14
Cassava (Manihot esculenta Crantz) is a major crop extensively cultivated in the tropics as both an important source of calories and a promising source for biofuel production. Although stable gene expression has been used for transgenic breeding and gene function studies, a quick, easy and large-scale transformation platform is urgently needed for gene functional characterization, especially now that the full cassava genome has been sequenced. Fully expanded leaves from in vitro plantlets of Manihot esculenta were used to optimize the concentrations of cellulase R-10 and macerozyme R-10 for obtaining protoplasts with the highest yield and viability. Then, the optimum conditions (PEG4000 concentration and transfection time) were determined for cassava protoplast transient gene expression. In addition, the reliability of the established protocol was confirmed for subcellular protein localization. In this work we optimized the main influencing factors and developed an efficient mesophyll protoplast isolation and PEG-mediated transient gene expression system in cassava. A suitable enzyme digestion system was established with the combination of 1.6% cellulase R-10 and 0.8% macerozyme R-10 for 16 h of digestion in the dark at 25 °C, resulting in a high yield (4.4 × 10⁷ protoplasts/g FW) and viability (92.6%) of mesophyll protoplasts. The maximum transfection efficiency (70.8%) was obtained by incubating the protoplast/vector DNA mixture with 25% PEG4000 for 10 min. We validated the applicability of the system for studying the subcellular localization of MeSTP7 (an H⁺/monosaccharide cotransporter) with our transient expression protocol and a heterologous Arabidopsis transient gene expression system. The optimized mesophyll protoplast isolation and transient gene expression protocol will facilitate large-scale characterization of genes and pathways in cassava.
Protocol compliance and time management in blunt trauma resuscitation.
Spanjersberg, W R; Bergs, E A; Mushkudiani, N; Klimek, M; Schipper, I B
2009-01-01
To study advanced trauma life support (ATLS) protocol adherence prospectively in trauma resuscitation and to analyse time management of daily multidisciplinary trauma resuscitation at a level 1 trauma centre, for both moderately and severely injured patients. All victims of severe blunt trauma were consecutively included. Patients with a revised trauma score (RTS) of 12 were resuscitated by a "minor trauma" team and patients with an RTS of less than 12 were resuscitated by a "severe trauma" team. Digital video recordings were used to analyse protocol compliance and time management during initial assessment. From 1 May to 1 September 2003, 193 resuscitations were included. The "minor trauma" team assessed 119 patients, with a mean injury severity score (ISS) of 7 (range 1-45). Overall protocol compliance was 42%, ranging from 0% for thoracic percussion to 93% for thoracic auscultation. The median resuscitation time was 45.9 minutes (range 39.7-55.9). The "severe trauma" team assessed 74 patients, with a mean ISS of 22 (range 1-59). Overall protocol compliance was 53%, ranging from 4% for thoracic percussion to 95% for thoracic auscultation. The median resuscitation time was 34.8 minutes (range 21.6-44.1). The results showed current trauma resuscitation to be ATLS-like, with sometimes very low protocol compliance rates. The timing of the secondary survey and radiology, and thus overall time efficiency, remains a challenge in all trauma patients. To assess the effect of trauma resuscitation protocols on outcome, protocol adherence needs to be improved.
Chang, Shy-Shin; Hsu, Hsung-Ling; Cheng, Ju-Chien; Tseng, Ching-Ping
2011-01-01
Background Bacterial DNA contamination in PCR reagents has been a long-standing problem that hampers the adoption of broad-range PCR in clinical and applied microbiology, particularly in the detection of low-abundance bacteria. Although several DNA decontamination protocols have been reported, they all suffer from compromised PCR efficiency or detection limits. To date, no satisfactory solution has been found. Methodology/Principal Findings We herein describe a method that solves this long-standing problem by employing a broad-range primer extension-PCR (PE-PCR) strategy that obviates the need for DNA decontamination. In this method, we first devise a fusion probe having a 3′-end complementary to the template bacterial sequence and a 5′-end non-bacterial tag sequence. We then hybridize the probes to template DNA, carry out primer extension and remove the excess probes using an optimized enzyme mix of Klenow DNA polymerase and exonuclease I. This strategy allows the templates to be distinguished from the PCR reagent contaminants and selectively amplified by PCR. To prove the concept, we spiked the PCR reagents with Staphylococcus aureus genomic DNA and applied PE-PCR to amplify template bacterial DNA. The spiking DNA neither interfered with template DNA amplification nor caused false positives in the reaction. Broad-range PE-PCR amplification of the 16S rRNA gene was also validated and minute quantities of template DNA (10–100 fg) were detectable without false positives. When adapted to real-time and high-resolution melting (HRM) analytical platforms, the unique melting profiles of the PE-PCR product can be used as molecular fingerprints to further identify individual bacterial species. Conclusions/Significance Broad-range PE-PCR is simple, efficient, and completely obviates the need to decontaminate PCR reagents. When coupled with real-time and HRM analyses, it offers a new avenue for bacterial species identification with a limited source of bacterial DNA, making it suitable for use in clinical and applied microbiology laboratories. PMID:21637859
Real-Time DNA Sequencing in the Antarctic Dry Valleys Using the Oxford Nanopore Sequencer
Johnson, Sarah S.; Zaikova, Elena; Goerlitz, David S.; Bai, Yu; Tighe, Scott W.
2017-01-01
The ability to sequence DNA outside of the laboratory setting has enabled novel research questions to be addressed in the field in diverse areas, ranging from environmental microbiology to viral epidemics. Here, we demonstrate the application of offline DNA sequencing of environmental samples using a hand-held nanopore sequencer in a remote field location: the McMurdo Dry Valleys, Antarctica. Sequencing was performed using a MK1B MinION sequencer from Oxford Nanopore Technologies (ONT; Oxford, United Kingdom) that was equipped with software to operate without internet connectivity. One-direction (1D) genomic libraries were prepared using portable field techniques on DNA isolated from desiccated microbial mats. By adequately insulating the sequencer and laptop, it was possible to run the sequencing protocol for up to 2½ h under arduous conditions. PMID:28337073
High-throughput sequencing of forensic genetic samples using punches of FTA cards with buccal swabs.
Kampmann, Marie-Louise; Buchard, Anders; Børsting, Claus; Morling, Niels
2016-01-01
Here, we demonstrate that punches from buccal swab samples preserved on FTA cards can be used for high-throughput DNA sequencing, also known as massively parallel sequencing (MPS). We typed 44 reference samples with the HID-Ion AmpliSeq Identity Panel using washed 1.2 mm punches from FTA cards with buccal swabs and compared the results with those obtained with DNA extracted using the EZ1 DNA Investigator Kit. Concordant profiles were obtained for all samples. Our protocol includes simple punch, wash, and PCR steps, reducing cost and hands-on time in the laboratory. Furthermore, it facilitates automation of DNA sequencing.
Synthesis of (+)-dumetorine and congeners by using flow chemistry technologies.
Riva, Elena; Rencurosi, Anna; Gagliardi, Stefania; Passarella, Daniele; Martinelli, Marisa
2011-05-23
An efficient total synthesis of the natural alkaloid (+)-dumetorine by using flow technology is described. The process entailed five separate steps starting from the enantiopure (S)-2-(piperidin-2-yl)ethanol 4 with 29% overall yield. Most of the reactions were carried out by exploiting solvent superheating and by using packed columns of immobilized reagents or scavengers to minimize handling. New protocols for performing classical reactions under continuous flow are disclosed: the ring-closing metathesis reaction with a novel polyethylene glycol-supported Hoveyda catalyst and the unprecedented flow deprotection/Eschweiler-Clarke methylation sequence. The new protocols developed for the synthesis of (+)-dumetorine were applied to the synthesis of its simplified natural congeners (-)-sedamine and (+)-sedridine. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Fast discovery and visualization of conserved regions in DNA sequences using quasi-alignment.
Nagar, Anurag; Hahsler, Michael
2013-01-01
Next Generation Sequencing techniques are producing enormous amounts of biological sequence data, and their analysis has become a major computational problem. Currently, most analysis, especially the identification of conserved regions, relies heavily on Multiple Sequence Alignment and its various heuristics such as progressive alignment, whose run time grows with the square of the number and the length of the aligned sequences and requires significant computational resources. In this work, we present a method to efficiently discover regions of high similarity across multiple sequences without performing expensive sequence alignment. The method is based on approximating edit distance between segments of sequences using p-mer frequency counts. Then, efficient high-throughput data stream clustering is used to group highly similar segments into so-called quasi-alignments. Quasi-alignments have numerous applications such as identifying species and their taxonomic class from sequences, comparing sequences for similarities, and, as in this paper, discovering conserved regions across related sequences. In this paper, we show that quasi-alignments can be used to discover highly similar segments across multiple sequences from related or different genomes efficiently and accurately. Experiments on a large number of unaligned 16S rRNA sequences obtained from the Greengenes database show that the method is able to identify conserved regions which agree with known hypervariable regions in 16S rRNA. Furthermore, the experiments show that the proposed method scales well for large data sets with a run time that grows only linearly with the number and length of sequences, whereas for existing multiple sequence alignment heuristics the run time grows super-linearly. Quasi-alignment-based algorithms can detect highly similar regions and conserved areas across multiple sequences. Since the run time is linear and the sequences are converted into a compact clustering model, we are able to identify conserved regions quickly or even interactively using a standard PC. Our method has many potential applications such as finding characteristic signature sequences for families of organisms and studying conserved and variable regions in, for example, 16S rRNA.
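To make the alignment-free idea above concrete, the following sketch (our own simplification; the function names, the choice p = 3 and the Manhattan distance are illustrative, not the published implementation) compares sequence segments by their p-mer frequency profiles as a cheap proxy for edit distance.

    # Hedged sketch of the alignment-free comparison idea, not the paper's code:
    # p-mer frequency profiles approximate edit distance between segments.
    from itertools import product

    def pmer_profile(seq, p=3):
        counts = {''.join(k): 0 for k in product("ACGT", repeat=p)}
        for i in range(len(seq) - p + 1):
            kmer = seq[i:i + p]
            if kmer in counts:
                counts[kmer] += 1
        return counts

    def profile_distance(a, b, p=3):
        """Manhattan distance between p-mer count vectors."""
        pa, pb = pmer_profile(a, p), pmer_profile(b, p)
        return sum(abs(pa[k] - pb[k]) for k in pa)

    seg1 = "ACGTACGTGGCTAACGT"
    seg2 = "ACGTACGTGGTTAACGT"   # one substitution: small profile distance
    seg3 = "TTTTGGGGCCCCAAAATT"  # unrelated segment: large profile distance
    print(profile_distance(seg1, seg2), profile_distance(seg1, seg3))

Segments whose profile distance falls below a threshold would then be grouped by a data stream clustering step into quasi-alignments, as described above.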
Easi-CRISPR for creating knock-in and conditional knockout mouse models using long ssDNA donors.
Miura, Hiromi; Quadros, Rolen M; Gurumurthy, Channabasavaiah B; Ohtsuka, Masato
2018-01-01
CRISPR/Cas9-based genome editing can easily generate knockout mouse models by disrupting the gene sequence, but its efficiency for creating models that require either insertion of exogenous DNA (knock-in) or replacement of genomic segments is very poor. The majority of mouse models used in research involve knock-in (reporters or recombinases) or gene replacement (e.g., conditional knockout alleles containing exons flanked by LoxP sites). A few methods for creating such models have been reported that use double-stranded DNA as donors, but their efficiency is typically 1-10% and therefore not suitable for routine use. We recently demonstrated that long single-stranded DNAs (ssDNAs) serve as very efficient donors, both for insertion and for gene replacement. We call this method efficient additions with ssDNA inserts-CRISPR (Easi-CRISPR) because it is a highly efficient technology (efficiency is typically 30-60% and reaches as high as 100% in some cases). The protocol takes ∼2 months to generate the founder mice.
Optimal protocols for slowly driven quantum systems.
Zulkowski, Patrick R; DeWeese, Michael R
2015-09-01
The design of efficient quantum information processing will rely on optimal nonequilibrium transitions of driven quantum systems. Building on a recently developed geometric framework for computing optimal protocols for classical systems driven in finite time, we construct a general framework for optimizing the average information entropy for driven quantum systems. Geodesics on the parameter manifold endowed with a positive semidefinite metric correspond to protocols that minimize the average information entropy production in finite time. We use this framework to explicitly compute the optimal entropy production for a simple two-state quantum system coupled to a heat bath of bosonic oscillators, which has applications to quantum annealing.
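In our own notation, and only as a schematic restatement of the geometric framework described above (not a formula quoted from the paper), the average excess entropy production of a slow protocol \lambda(t) of duration \tau is approximately a quadratic form in the control velocities, and protocols that follow geodesics of the metric g minimize it:

    \[
      \langle \Delta S \rangle \;\approx\; \int_{0}^{\tau}
        \dot{\lambda}(t)^{\mathsf{T}}\, g\bigl(\lambda(t)\bigr)\, \dot{\lambda}(t)\, \mathrm{d}t ,
      \qquad g(\lambda) \succeq 0 ,
    \]

so that, for a fixed duration \tau, the entropy-minimizing protocol traces a geodesic of g on the parameter manifold.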
Stimulated Raman adiabatic passage in a three-level superconducting circuit.
Kumar, K S; Vepsäläinen, A; Danilin, S; Paraoanu, G S
2016-02-23
The adiabatic manipulation of quantum states is a powerful technique that opened up new directions in quantum engineering--enabling tests of fundamental concepts such as geometrical phases and topological transitions, and holding the promise of alternative models of quantum computation. Here we benchmark the stimulated Raman adiabatic passage for circuit quantum electrodynamics by employing the first three levels of a transmon qubit. In this ladder configuration, we demonstrate a population transfer efficiency >80% between the ground state and the second excited state using two adiabatic Gaussian-shaped control microwave pulses. By doing quantum tomography at successive moments during the Raman pulses, we investigate the transfer of the population in time domain. Furthermore, we show that this protocol can be reversed by applying a third adiabatic pulse, we study a hybrid nondiabatic-adiabatic sequence, and we present experimental results for a quasi-degenerate intermediate level.
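The toy calculation below (ours, not the experiment's control software) illustrates the counterintuitive STIRAP pulse ordering: with the Stokes pulse peaking before the pump pulse, the mixing angle theta, with tan(theta) = Omega_pump/Omega_Stokes, rotates the adiabatic dark state from the ground state to the target state. The Gaussian widths and timings are arbitrary.

    # Hedged sketch of the standard STIRAP dark-state picture (toy parameters).
    import math

    def gaussian(t, t0, sigma, amp=1.0):
        return amp * math.exp(-((t - t0) ** 2) / (2 * sigma ** 2))

    # Counterintuitive ordering: Stokes pulse peaks before the pump pulse.
    t_stokes, t_pump, sigma = -1.0, 1.0, 1.0

    for t in [-4, -2, 0, 2, 4]:
        omega_p = gaussian(t, t_pump, sigma)
        omega_s = gaussian(t, t_stokes, sigma)
        theta = math.atan2(omega_p, omega_s)   # mixing angle
        # Dark state |D> = cos(theta)|1> - sin(theta)|3>; adiabatic following
        # moves population from |1> to |3> without populating the middle level.
        print(f"t={t:+d}  pop_ground={math.cos(theta)**2:.2f}  pop_target={math.sin(theta)**2:.2f}")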
Lee, Ju Seok; Chen, Junghuei; Deaton, Russell; Kim, Jin-Woo
2014-01-01
Genetic material extracted from in situ microbial communities has high promise as an indicator of biological system status. However, the challenge is to access genomic information from all organisms at the population or community scale to monitor the biosystem's state. Hence, there is a need for a better diagnostic tool that provides a holistic view of a biosystem's genomic status. Here, we introduce an in vitro methodology for genomic pattern classification of biological samples that taps large amounts of genetic information from all genes present and uses that information to detect changes in genomic patterns and classify them. We developed a biosensing protocol, termed Biological Memory, that has in vitro computational capabilities to "learn" and "store" genomic sequence information directly from genomic samples without knowledge of their explicit sequences, and that discovers differences in vitro between previously unknown inputs and learned memory molecules. The Memory protocol was designed and optimized based upon (1) common in vitro recombinant DNA operations using 20-base random probes, including polymerization, nuclease digestion, and magnetic bead separation, to capture a snapshot of the genomic state of a biological sample as a DNA memory and (2) the thermal stability of DNA duplexes between new input and the memory to detect similarities and differences. For efficient read out, a microarray was used as an output method. When the microarray-based Memory protocol was implemented to test its capability and sensitivity using genomic DNA from two model bacterial strains, i.e., Escherichia coli K12 and Bacillus subtilis, results indicate that the Memory protocol can "learn" input DNA, "recall" similar DNA, differentiate between dissimilar DNA, and detect relatively small concentration differences in samples. This study demonstrated not only the in vitro information processing capabilities of DNA, but also its promise as a genomic pattern classifier that could access information from all organisms in a biological system without explicit genomic information. The Memory protocol has high potential for many applications, including in situ biomonitoring of ecosystems, screening for diseases, biosensing of pathological features in water and food supplies, and non-biological information processing of memory devices, among many others.
2011-01-01
Background Quantitative noninvasive imaging of myocardial mechanics in mice enables studies of the roles of individual genes in cardiac function. We sought to develop comprehensive three-dimensional methods for imaging myocardial mechanics in mice. Methods A 3D cine DENSE pulse sequence was implemented on a 7T small-bore scanner. The sequence used three-point phase cycling for artifact suppression and a stack-of-spirals k-space trajectory for efficient data acquisition. A semi-automatic 2D method was adapted for 3D image segmentation, and automated 3D methods to calculate strain, twist, and torsion were employed. A scan protocol that covered the majority of the left ventricle in a scan time of less than 25 minutes was developed, and seven healthy C57Bl/6 mice were studied. Results Using these methods, multiphase normal and shear strains were measured, as were myocardial twist and torsion. Peak end-systolic values for the normal strains at the mid-ventricular level were 0.29 ± 0.17, -0.13 ± 0.03, and -0.18 ± 0.14 for Err, Ecc, and Ell, respectively. Peak end-systolic values for the shear strains were 0.00 ± 0.08, 0.04 ± 0.12, and 0.03 ± 0.07 for Erc, Erl, and Ecl, respectively. The peak end-systolic normalized torsion was 5.6 ± 0.9°. Conclusions Using a 3D cine DENSE sequence tailored for cardiac imaging in mice at 7 T, a comprehensive assessment of 3D myocardial mechanics can be achieved with a scan time of less than 25 minutes and an image analysis time of approximately 1 hour. PMID:22208954
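For readers unfamiliar with the strain components reported above, the Green-Lagrange strain tensor computed from a measured displacement field u(X) is the standard definition below (our addition, not a formula reproduced from the paper); its normal and shear components in the local radial, circumferential and longitudinal directions correspond to Err, Ecc, Ell and Erc, Erl, Ecl.

    \[
      F = I + \frac{\partial u}{\partial X}, \qquad
      E = \tfrac{1}{2}\bigl(F^{\mathsf{T}} F - I\bigr),
    \]

with normal components $E_{rr}, E_{cc}, E_{ll}$ and shear components $E_{rc}, E_{rl}, E_{cl}$ expressed in the radial-circumferential-longitudinal cardiac coordinate system.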
Zhang, Zheshen; Mower, Jacob; Englund, Dirk; Wong, Franco N C; Shapiro, Jeffrey H
2014-03-28
High-dimensional quantum key distribution (HDQKD) offers the possibility of a high secure-key rate with high photon-information efficiency. We consider HDQKD based on the time-energy entanglement produced by spontaneous parametric down-conversion and show that it is secure against collective attacks. Its security rests upon visibility data, obtained from Franson and conjugate-Franson interferometers, that probe photon-pair frequency correlations and arrival-time correlations. From these measurements, an upper bound can be established on the eavesdropper's Holevo information by translating the Gaussian-state security analysis for continuous-variable quantum key distribution so that it applies to our protocol. We show that visibility data from just the Franson interferometer provides a weaker, but nonetheless useful, secure-key rate lower bound. To handle multiple-pair emissions, we incorporate the decoy-state approach into our protocol. Our results show that over a 200-km transmission distance in optical fiber, time-energy entanglement HDQKD could permit a 700-bit/sec secure-key rate and a photon information efficiency of 2 secure-key bits per photon coincidence in the key-generation phase using receivers with a 15% system efficiency.
Van Ooteghem, Karen; Frank, James S; Allard, Fran; Horak, Fay B
2010-08-01
Postural motor learning for dynamic balance tasks has been demonstrated in healthy older adults (Van Ooteghem et al. in Exp Brain Res 199(2):185-193, 2009). The purpose of this study was to investigate the type of knowledge (general or specific) obtained with balance training in this age group and to examine whether embedding perturbation regularities within a balance task masks specific learning. Two groups of older adults maintained balance on a translating platform that oscillated with variable amplitude and constant frequency. One group was trained using an embedded-sequence (ES) protocol which contained the same 15-s sequence of variable amplitude oscillations in the middle of each trial. A second group was trained using a looped-sequence (LS) protocol which contained a 15-s sequence repeated three times to form each trial. All trials were 45 s. Participants were not informed of any repetition. To examine learning, participants performed a retention test following a 24-h delay. LS participants also completed a transfer task. Specificity of learning was examined by comparing performance for repeated versus random sequences (ES) and training versus transfer sequences (LS). Performance was measured by deriving spatial and temporal measures of whole body center of mass (COM) and trunk orientation. Both groups improved performance with practice as characterized by reduced COM displacement, improved COM-platform phase relationships, and decreased angular trunk motion. Furthermore, improvements reflected general rather than specific postural motor learning regardless of training protocol (ES or LS). This finding is similar to young adults (Van Ooteghem et al. in Exp Brain Res 187(4):603-611, 2008) and indicates that age does not influence the type of learning which occurs for balance control.
Pyicos: a versatile toolkit for the analysis of high-throughput sequencing data
Althammer, Sonja; González-Vallinas, Juan; Ballaré, Cecilia; Beato, Miguel; Eyras, Eduardo
2011-01-01
Motivation: High-throughput sequencing (HTS) has revolutionized gene regulation studies and is now fundamental for the detection of protein–DNA and protein–RNA binding, as well as for measuring RNA expression. With increasing variety and sequencing depth of HTS datasets, the need for more flexible and memory-efficient tools to analyse them is growing. Results: We describe Pyicos, a powerful toolkit for the analysis of mapped reads from diverse HTS experiments: ChIP-Seq, either punctuated or broad signals, CLIP-Seq and RNA-Seq. We prove the effectiveness of Pyicos to select for significant signals and show that its accuracy is comparable and sometimes superior to that of methods specifically designed for each particular type of experiment. Pyicos facilitates the analysis of a variety of HTS datatypes through its flexibility and memory efficiency, providing a useful framework for data integration into models of regulatory genomics. Availability: Open-source software, with tutorials and protocol files, is available at http://regulatorygenomics.upf.edu/pyicos or as a Galaxy server at http://regulatorygenomics.upf.edu/galaxy Contact: eduardo.eyras@upf.edu Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:21994224
TreeMAC: Localized TDMA MAC protocol for real-time high-data-rate sensor networks
Song, W.-Z.; Huang, R.; Shirazi, B.; Husent, R.L.
2009-01-01
Earlier sensor network MAC protocols focus on energy conservation in low-duty-cycle applications, while some recent applications involve real-time high-data-rate signals. This motivates us to design an innovative localized TDMA MAC protocol to achieve high throughput and low congestion in data collection sensor networks, besides energy conservation. TreeMAC divides a time cycle into frames and each frame into slots. Each parent determines its children's frame assignment based on their relative bandwidth demand, and each node calculates its own slot assignment based on its hop count to the sink. This 2-dimensional frame-slot assignment algorithm has the following theoretical properties. Firstly, given any node, at any time slot, there is at most one active sender in its neighborhood (including itself). Secondly, packet scheduling with TreeMAC is bufferless, which minimizes the probability of network congestion. Thirdly, the data throughput to the gateway is at least 1/3 of the optimum assuming reliable links. Our experiments on a 24-node testbed demonstrate that the TreeMAC protocol significantly improves network throughput and energy efficiency compared to TinyOS's default CSMA MAC protocol and a recent TDMA MAC protocol, Funneling-MAC [8]. © 2009 IEEE.
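The sketch below is our reading of the two-dimensional frame/slot idea, not the TreeMAC source code: frames within a cycle are split among children in proportion to bandwidth demand, and a node's slot follows from its hop count. The proportional-split rule, the three-slots-per-frame choice and all names are assumptions for illustration.

    # Hedged sketch of a frame/slot assignment in the spirit described above.
    def assign_frames(total_frames, child_demands):
        """Split a cycle's frames among children in proportion to demand
        (rounding adjustments to stay within the cycle are omitted here)."""
        total = sum(child_demands.values())
        return {c: max(1, round(total_frames * d / total)) for c, d in child_demands.items()}

    def slot_index(hop_count, slots_per_frame=3):
        # Nodes at different hop distances reuse slots so that, in a given
        # neighbourhood, at most one sender is active per slot.
        return hop_count % slots_per_frame

    print(assign_frames(12, {"A": 2, "B": 1, "C": 1}))
    print([slot_index(h) for h in range(1, 7)])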
Reliability of Vibrating Mesh Technology.
Gowda, Ashwin A; Cuccia, Ann D; Smaldone, Gerald C
2017-01-01
For delivery of inhaled aerosols, vibrating mesh systems are more efficient than jet nebulizers and do not require added gas flow. We assessed the reliability of a vibrating mesh nebulizer (Aerogen Solo, Aerogen Ltd, Galway, Ireland) suitable for use in mechanical ventilation. An initial observational study was performed with 6 nebulizers to determine run time and efficiency using normal saline and distilled water. Nebulizers were run until cessation of aerosol production was noted, with residual volume and run time recorded. Three controllers were used to assess the impact of the controller on nebulizer function. Following the observational study, a more detailed experimental protocol was performed using 20 nebulizers. For this analysis, 2 controllers were used, and the time to cessation of aerosol production was noted. Gravimetric techniques were used to measure residual volume. Total nebulization time and residual volume were recorded. Failure was defined as premature cessation of aerosol production, represented by a residual volume of > 10% of the nebulizer charge. In the initial observational protocol, an unexpected sporadic failure rate of 25% was noted in 55 experimental runs. In the experimental protocol, a failure rate of 30% was noted in 40 experimental runs. Failed runs in the experimental protocol exhibited a wide range of retained volumes, averaging (mean ± SD) 36 ± 21.3%, compared with 3.2 ± 1.5% (P = .001) in successful runs. Small but significant differences existed in nebulization time between controllers. Aerogen Solo nebulization was often randomly interrupted, with a wide range of retained volumes. Copyright © 2017 by Daedalus Enterprises.
Rusbridge, Clare; Long, Sam; Jovanovik, Jelena; Milne, Marjorie; Berendt, Mette; Bhatti, Sofie F M; De Risio, Luisa; Farqhuar, Robyn G; Fischer, Andrea; Matiasek, Kaspar; Muñana, Karen; Patterson, Edward E; Pakozdy, Akos; Penderis, Jacques; Platt, Simon; Podell, Michael; Potschka, Heidrun; Stein, Veronika M; Tipold, Andrea; Volk, Holger A
2015-08-28
Epilepsy is one of the most common chronic neurological diseases in veterinary practice. Magnetic resonance imaging (MRI) is regarded as an important diagnostic test to reach the diagnosis of idiopathic epilepsy. However, given that the diagnosis requires the exclusion of other differentials for seizures, the parameters for MRI examination should allow the detection of subtle lesions which may not be obvious with existing techniques. In addition, there are several differentials for idiopathic epilepsy in humans, for example some focal cortical dysplasias, which may only be apparent with special sequences, imaging planes and/or particular techniques used in performing the MRI scan. As a result, there is a need to standardize MRI examination in veterinary patients with techniques that reliably diagnose subtle lesions, identify post-seizure changes, and allow for future identification of underlying causes of seizures not yet apparent in the veterinary literature. There is a need for a standardized veterinary epilepsy-specific MRI protocol which will facilitate more detailed examination of areas susceptible to generating and perpetuating seizures, is cost efficient, simple to perform and can be adapted for both low- and high-field scanners. Standardisation of imaging will improve clinical communication and uniformity of case definition between research studies. A six- to seven-sequence epilepsy-specific MRI protocol for veterinary patients is proposed, and more advanced MR and functional imaging techniques are reviewed.
Design and assessment of engineered CRISPR-Cpf1 and its use for genome editing.
Li, Bin; Zeng, Chunxi; Dong, Yizhou
2018-05-01
Cpf1, a CRISPR endonuclease discovered in Prevotella and Francisella 1 bacteria, offers an alternative platform for CRISPR-based genome editing beyond the commonly used CRISPR-Cas9 system originally discovered in Streptococcus pyogenes. This protocol enables the design of engineered CRISPR-Cpf1 components, both CRISPR RNAs (crRNAs) to guide the endonuclease and Cpf1 mRNAs to express the endonuclease protein, and provides experimental procedures for effective genome editing using this system. We also describe quantification of genome-editing activity and off-target effects of the engineered CRISPR-Cpf1 in human cell lines using both T7 endonuclease I (T7E1) assay and targeted deep sequencing. This protocol enables rapid construction and identification of engineered crRNAs and Cpf1 mRNAs to enhance genome-editing efficiency using the CRISPR-Cpf1 system, as well as assessment of target specificity within 2 months. This protocol may also be appropriate for fine-tuning other types of CRISPR systems.
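As a minimal illustration of how editing activity is commonly quantified from targeted deep-sequencing data (a generic sketch, not the protocol's analysis scripts; the reads and the reference window are hypothetical), the fraction of amplicon reads that no longer match the reference window around the cut site is reported as the editing frequency.

    # Hedged sketch: editing frequency as the fraction of reads whose target
    # window differs from the reference (read classification is simplified).
    def editing_frequency(reads, reference_window):
        edited = sum(1 for r in reads if reference_window not in r)
        return edited / len(reads) if reads else 0.0

    # Hypothetical amplicon reads around a Cpf1 cut site.
    ref = "TTTGACTGGCACGTAGCT"            # reference sequence spanning the target
    reads = [
        "AAATTTGACTGGCACGTAGCTCCA",       # unedited
        "AAATTTGACTGGCGTAGCTCCA",         # small deletion near the cut site
        "AAATTTGACTGGCACGTAGCTCCA",       # unedited
        "AAATTTGACTGGTTCACGTAGCTCCA",     # small insertion
    ]
    print(f"on-target editing: {editing_frequency(reads, ref):.0%}")

The same calculation applied to amplicons from predicted off-target sites gives the off-target frequencies compared in the assessment step.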
SCTP as scalable video coding transport
Ortiz, Jordi; Graciá, Eduardo Martínez; Skarmeta, Antonio F.
2013-12-01
This study presents an evaluation of the Stream Control Transmission Protocol (SCTP) for the transport of scalable video coding (SVC) streams, proposed by MPEG as an extension to H.264/AVC. The two technologies fit together well. On the one hand, SVC makes it easy to split the bitstream into substreams carrying different video layers, each of different importance for the reconstruction of the complete video sequence at the receiver end. On the other hand, SCTP includes features, such as multi-streaming and multi-homing capabilities, that allow the SVC layers to be transported robustly and efficiently. Several transmission strategies supported on baseline SCTP and its concurrent multipath transfer (CMT) extension are compared with classical solutions based on the Transmission Control Protocol (TCP) and the Real-time Transport Protocol (RTP). Using ns-2 simulations, it is shown that CMT-SCTP outperforms TCP and RTP in error-prone networking environments. The comparison is established according to several performance measurements, including delay, throughput, packet loss, and peak signal-to-noise ratio of the received video.
Temporal logics and real time expert systems.
Blom, J A
1996-10-01
This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first-order or modal logic based) temporal logics are frequently not suitable. This is especially true in real-time expert systems, where a fixed (and usually small) response time must be guaranteed. One such expert system, Fagan's VM, is reviewed, and a delineation is given of how to formally describe and reason about time in medical protocols. It is shown that Petri net theory is a useful tool for checking the correctness of formalised protocols.
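A minimal sketch of the Petri-net idea mentioned above, using a made-up medical-protocol step rather than anything from Fagan's VM: a transition is enabled only when all of its input places carry tokens, so the firing rule can be used to check that a formalised protocol step cannot occur before its preconditions are met.

    # Hedged toy Petri net (places, tokens, one transition); not from the paper.
    def enabled(marking, transition):
        return all(marking.get(p, 0) >= n for p, n in transition["inputs"].items())

    def fire(marking, transition):
        m = dict(marking)
        for p, n in transition["inputs"].items():
            m[p] -= n
        for p, n in transition["outputs"].items():
            m[p] = m.get(p, 0) + n
        return m

    # Hypothetical protocol step: adjust ventilator settings only when the
    # patient is ventilated and a blood gas result is available.
    adjust_settings = {
        "inputs":  {"patient_ventilated": 1, "blood_gas_available": 1},
        "outputs": {"patient_ventilated": 1, "settings_adjusted": 1},
    }

    marking = {"patient_ventilated": 1, "blood_gas_available": 1}
    if enabled(marking, adjust_settings):
        marking = fire(marking, adjust_settings)
    print(marking)  # blood gas consumed, settings_adjusted token produced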
Generation of 2A-linked multicistronic cassettes by recombinant PCR.
Szymczak-Workman, Andrea L; Vignali, Kate M; Vignali, Dario A A
2012-02-01
The need for reliable, multicistronic vectors for multigene delivery is at the forefront of biomedical technology. It is now possible to express multiple proteins from a single open reading frame (ORF) using 2A peptide-linked multicistronic vectors. These small sequences, when cloned between genes, allow for efficient, stoichiometric production of discrete protein products within a single vector through a novel "cleavage" event within the 2A peptide sequence. Expression of more than two genes using conventional approaches has several limitations, most notably imbalanced protein expression and large size. The use of 2A peptide sequences alleviates these concerns. They are small (18-22 amino acids) and have divergent amino-terminal sequences, which minimizes the chance for homologous recombination and allows for multiple, different 2A peptide sequences to be used within a single vector. Importantly, separation of genes placed between 2A peptide sequences is nearly 100%, which allows for stoichiometric and concordant expression of the genes, regardless of the order of placement within the vector. This protocol describes the use of recombinant polymerase chain reaction (PCR) to connect multiple 2A-linked protein sequences. The final construct is subcloned into an expression vector.
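The design logic (not the wet-lab recombinant PCR itself) can be summarised in a short sketch; the linker string below is a placeholder, not a real T2A/P2A/E2A/F2A sequence, and the ORFs are toy examples. The points illustrated are that upstream stop codons are removed, the 2A linker must preserve the reading frame, and the finished cassette is a single ORF.

    # Hedged sketch of assembling a 2A-linked multicistronic cassette in silico.
    STOPS = {"TAA", "TAG", "TGA"}

    def join_with_2a(orfs, linker_2a):
        """orfs: coding sequences including their stop codons; linker_2a: a 2A
        peptide CDS (placeholder here - use a real 2A sequence in practice)."""
        assert len(linker_2a) % 3 == 0, "2A linker must preserve the reading frame"
        parts = [orf[:-3] for orf in orfs[:-1]]          # strip upstream stop codons
        cassette = linker_2a.join(parts + [orfs[-1]])    # keep only the final stop
        codons = [cassette[i:i + 3] for i in range(0, len(cassette) - 3, 3)]
        assert not any(c in STOPS for c in codons), "internal stop codon introduced"
        return cassette

    gene_a = "ATGGCTGCTTAA"             # toy ORF A
    gene_b = "ATGGGTGGATGA"             # toy ORF B
    linker = "GGAAGCGGAGCTACTAAC"       # hypothetical stand-in, NOT a real 2A sequence
    print(join_with_2a([gene_a, gene_b], linker))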
High density FTA plates serve as efficient long-term sample storage for HLA genotyping.
Lange, V; Arndt, K; Schwarzelt, C; Boehme, I; Giani, A S; Schmidt, A H; Ehninger, G; Wassmuth, R
2014-02-01
Storage of dried blood spots (DBS) on high-density FTA® plates could constitute an appealing alternative to frozen storage. However, it remains controversial whether DBS are suitable for high-resolution sequencing of human leukocyte antigen (HLA) alleles. Therefore, we extracted DNA from DBS that had been stored for up to 4 years, using six different methods. We identified those extraction methods that recovered sufficient high-quality DNA for reliable high-resolution HLA sequencing. Further, we confirmed that frozen whole blood samples that had been stored for several years can be transferred to filter paper without compromising HLA genotyping upon extraction. In conclusion, DNA derived from high-density FTA® plates is suitable for high-resolution HLA sequencing, provided that appropriate extraction protocols are employed. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Sachsenröder, Jana; Twardziok, Sven; Hammerl, Jens A; Janczyk, Pawel; Wrede, Paul; Hertwig, Stefan; Johne, Reimar
2012-01-01
Animal faeces comprise a community of many different microorganisms including bacteria and viruses. Only scarce information is available about the diversity of viruses present in the faeces of pigs. Here we describe a protocol, which was optimized for the purification of the total fraction of viral particles from pig faeces. The genomes of the purified DNA and RNA viruses were simultaneously amplified by PCR and subjected to deep sequencing followed by bioinformatic analyses. The efficiency of the method was monitored using a process control consisting of three bacteriophages (T4, M13 and MS2) with different morphology and genome types. Defined amounts of the bacteriophages were added to the sample and their abundance was assessed by quantitative PCR during the preparation procedure. The procedure was applied to a pooled faecal sample of five pigs. From this sample, 69,613 sequence reads were generated. All of the added bacteriophages were identified by sequence analysis of the reads. In total, 7.7% of the reads showed significant sequence identities with published viral sequences. They mainly originated from bacteriophages (73.9%) and mammalian viruses (23.9%); 0.8% of the sequences showed identities to plant viruses. The most abundant detected porcine viruses were kobuvirus, rotavirus C, astrovirus, enterovirus B, sapovirus and picobirnavirus. In addition, sequences with identities to the chimpanzee stool-associated circular ssDNA virus were identified. Whole genome analysis indicates that this virus, tentatively designated as pig stool-associated circular ssDNA virus (PigSCV), represents a novel pig virus. The established protocol enables the simultaneous detection of DNA and RNA viruses in pig faeces including the identification of so far unknown viruses. It may be applied in studies investigating aetiology, epidemiology and ecology of diseases. The implemented process control serves as quality control, ensures comparability of the method and may be used for further method optimization.
Curated eutherian third party data gene data sets.
Premzl, Marko
2016-03-01
The freely available eutherian genomic sequence data sets have advanced the scientific field of genomics. Of note, future revisions of gene data sets are expected, owing to the incompleteness of public eutherian genomic sequence assemblies and to potential genomic sequence errors. The eutherian comparative genomic analysis protocol was proposed as guidance in protecting against potential genomic sequence errors in public eutherian genomic sequences. The protocol was applicable to updates of 7 major eutherian gene data sets, including 812 complete coding sequences deposited in the European Nucleotide Archive as curated third-party data gene data sets.
SVANET: A smart vehicular ad hoc network for efficient data transmission with wireless sensors.
Sahoo, Prasan Kumar; Chiang, Ming-Jer; Wu, Shih-Lin
2014-11-25
Wireless sensors can sense events such as accidents or icy roads and can forward rescue/warning messages through intermediate vehicles so that the necessary help can be provided. In this paper, we propose a smart vehicular ad hoc network (SVANET) architecture that uses wireless sensors to detect events and vehicles to transmit the safety and non-safety messages efficiently, using different service channels and one control channel with different priorities. We have developed a data transmission protocol for vehicles on the highway, in which data can be forwarded by vehicles if they are connected with each other, or otherwise by nearby wireless sensors. Our data transmission protocol is designed to increase driving safety, to prevent accidents and to utilize channels efficiently by adjusting the control and service channel time intervals dynamically. In addition, our protocol can transmit information to vehicles in advance, so that drivers can choose an alternate route in case of traffic congestion. For various data sharing, we design a method that selects a few leader nodes among vehicles running along a highway to broadcast data efficiently. Simulation results show that our protocol can outperform the existing standard in terms of end-to-end packet delivery ratio and latency.
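One ingredient we read from the abstract, sketched with our own segmentation rule and tie-breaking (not the SVANET specification): electing one leader vehicle per highway segment so that broadcasts are not repeated by every node.

    # Hedged sketch of per-segment leader election for highway broadcasting.
    def select_leaders(vehicle_positions_m, segment_len_m=300.0):
        leaders = {}
        for vid, pos in vehicle_positions_m.items():
            seg = int(pos // segment_len_m)
            center = (seg + 0.5) * segment_len_m
            best = leaders.get(seg)
            # Keep the vehicle closest to the segment centre as that segment's leader.
            if best is None or abs(pos - center) < abs(vehicle_positions_m[best] - center):
                leaders[seg] = vid
        return leaders

    positions = {"car1": 120.0, "car2": 260.0, "car3": 410.0, "car4": 590.0}
    print(select_leaders(positions))   # one leader per 300 m segment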
Quir, Kevin J.; Gin, Jonathan W.; Nguyen, Danh H.; Nguyen, Huy; Nakashima, Michael A.; Moision, Bruce E.
2012-01-01
A decoder was developed that decodes a serially concatenated pulse-position modulation (SCPPM) encoded information sequence. The decoder takes as input a sequence of four-bit log-likelihood ratios (LLRs) for each PPM slot in a codeword via a XAUI 10-Gb/s quad optical fiber interface. If the decoder is unavailable, it passes the LLRs on to the next decoder via a XAUI 10-Gb/s quad optical fiber interface. Otherwise, it decodes the sequence and outputs the information bits through a 1-Gb/s Ethernet UDP/IP (User Datagram Protocol/Internet Protocol) interface. The throughput of a single decoder unit is 150 Mb/s at an average of four decoding iterations; by connecting a number of decoder units in series, a decoding rate equal to the aggregate of their individual rates is achieved. The unit is controlled through a 1-Gb/s Ethernet UDP/IP interface. This ground station decoder was developed to demonstrate a deep-space optical communication link capability, and is unique in its scalable design to achieve real-time SCPPM decoding at the aggregate data rate.
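A back-of-the-envelope helper based only on the throughput figure quoted above (150 Mb/s per unit at four iterations; the function and constant names are ours): how many chained decoder units a target aggregate rate would require.

    # Hedged sketch of the scaling arithmetic, not flight or ground software.
    import math

    UNIT_RATE_MBPS = 150.0   # single-unit throughput quoted above at 4 iterations

    def units_required(target_rate_mbps):
        return math.ceil(target_rate_mbps / UNIT_RATE_MBPS)

    for target in (150, 600, 1000):
        print(f"{target} Mb/s -> {units_required(target)} decoder unit(s) in series")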
Analysis of delay reducing and fuel saving sequencing and spacing algorithms for arrival traffic
Neuman, Frank; Erzberger, Heinz
1991-01-01
The air traffic control subsystem that performs sequencing and spacing is discussed. The function of the sequencing and spacing algorithms is to automatically plan the most efficient landing order and to assign optimally spaced landing times to all arrivals. Several algorithms are described and their statistical performance is examined. Sequencing brings order to an arrival sequence for aircraft. First-come-first-served sequencing (FCFS) establishes a fair order, based on estimated times of arrival, and determines proper separations. Because of the randomness of the arriving traffic, gaps will remain in the sequence of aircraft. Delays are reduced by time-advancing the leading aircraft of each group while still preserving the FCFS order. Tightly spaced groups of aircraft, containing a mix of heavy and large aircraft, remain. Spacing requirements differ for different types of aircraft trailing each other. Traffic is reordered slightly to take advantage of this spacing criterion, thus shortening the groups and reducing average delays. For heavy traffic, delays for different traffic samples vary widely, even when the same set of statistical parameters is used to produce each sample. This report supersedes NASA TM-102795 on the same subject. It includes a new method of time-advance as well as an efficient method of sequencing and spacing for two dependent runways.
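A toy version of the FCFS scheduling step described above (our own sketch; the separation values and weight classes are illustrative, not the report's parameters): aircraft are ordered by estimated time of arrival, and each landing time is pushed back just enough to respect the leader/follower separation requirement.

    # Hedged sketch of FCFS sequencing with pairwise separation constraints.
    SEPARATION_S = {                      # required spacing in seconds: (leader, follower)
        ("heavy", "heavy"): 90, ("heavy", "large"): 120,
        ("large", "heavy"): 70, ("large", "large"): 80,
    }

    def fcfs_schedule(arrivals):
        """arrivals: list of (flight_id, eta_seconds, weight_class)."""
        ordered = sorted(arrivals, key=lambda a: a[1])   # first come, first served
        schedule, prev = [], None
        for flight, eta, wclass in ordered:
            slot = eta
            if prev is not None:
                slot = max(eta, prev[1] + SEPARATION_S[(prev[2], wclass)])
            schedule.append((flight, slot, wclass))
            prev = (flight, slot, wclass)
        return schedule

    arrivals = [("AC12", 0, "heavy"), ("AC07", 30, "large"), ("AC31", 40, "heavy")]
    for flight, t, _ in fcfs_schedule(arrivals):
        print(flight, "lands at t =", t, "s")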
2010-01-01
Research in plant molecular biology involves DNA purification on a daily basis. Although different commercial kits enable convenient extraction of high-quality DNA from E. coli cells, PCR and agarose gel samples as well as plant tissues, each kit is designed for a particular type of DNA extraction work, and the cost of purchasing these kits over the long run can be considerable. Furthermore, a simple method for the isolation of binary plasmids from Agrobacterium tumefaciens cells with satisfactory yield is lacking. Here we describe an easy protocol using a homemade silicon dioxide matrix and seven simple solutions for DNA extraction from E. coli and A. tumefaciens cells, PCR and restriction digests, agarose gel slices, and plant tissues. Compared with the commercial kits, this protocol allows rapid DNA purification from diverse sources with comparable yield and purity at negligible cost. Following this protocol, we have demonstrated: (1) DNA fragments as small as a MYC-epitope tag coding sequence can be successfully recovered from an agarose gel slice; (2) Miniprep DNA from E. coli can be eluted with as little as 5 μl water, leading to high DNA concentrations (>1 μg/μl) for efficient biolistic bombardment of Arabidopsis seedlings, polyethylene glycol (PEG)-mediated Arabidopsis protoplast transfection and maize protoplast electroporation; (3) Binary plasmid DNA prepared from A. tumefaciens is suitable for verification by restriction analysis without the need for large scale propagation; (4) High-quality genomic DNA is readily isolated from several plant species including Arabidopsis, tobacco and maize. Thus, the silicon dioxide matrix-based DNA purification protocol offers an easy, efficient and economical way to extract DNA for various purposes in plant research. PMID:20180960
A multiple-alignment based primer design algorithm for genetically highly variable DNA targets
2013-01-01
Background Primer design for highly variable DNA sequences is difficult, and experimental success requires attention to many interacting constraints. The advent of next-generation sequencing methods allows the investigation of rare variants otherwise hidden deep in large populations, but requires attention to population diversity and primer localization in relatively conserved regions, in addition to recognized constraints typically considered in primer design. Results Design constraints include degenerate sites to maximize population coverage, matching of melting temperatures, optimizing de novo sequence length, finding optimal bio-barcodes to allow efficient downstream analyses, and minimizing risk of dimerization. To facilitate primer design addressing these and other constraints, we created a novel computer program (PrimerDesign) that automates this complex procedure. We show its powers and limitations and give examples of successful designs for the analysis of HIV-1 populations. Conclusions PrimerDesign is useful for researchers who want to design DNA primers and probes for analyzing highly variable DNA populations. It can be used to design primers for PCR, RT-PCR, Sanger sequencing, next-generation sequencing, and other experimental protocols targeting highly variable DNA samples. PMID:23965160
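Two of the design constraints listed above can be illustrated in a few lines (a generic sketch, not PrimerDesign's code): collapsing an alignment column into an IUPAC degenerate base to maximize population coverage, and a crude melting-temperature estimate using the Wallace rule.

    # Hedged sketch: degenerate consensus bases and a simple Tm estimate.
    IUPAC = {frozenset("A"): "A", frozenset("C"): "C", frozenset("G"): "G", frozenset("T"): "T",
             frozenset("AG"): "R", frozenset("CT"): "Y", frozenset("GC"): "S", frozenset("AT"): "W",
             frozenset("GT"): "K", frozenset("AC"): "M", frozenset("CGT"): "B", frozenset("AGT"): "D",
             frozenset("ACT"): "H", frozenset("ACG"): "V", frozenset("ACGT"): "N"}

    def degenerate_consensus(aligned_seqs):
        """Collapse each alignment column into one IUPAC degenerate base."""
        return "".join(IUPAC[frozenset(col)] for col in zip(*aligned_seqs))

    def wallace_tm(primer):
        """Wallace rule: Tm ~ 2(A+T) + 4(G+C); degenerate bases are ignored in this sketch."""
        at = sum(primer.count(b) for b in "AT")
        gc = sum(primer.count(b) for b in "GC")
        return 2 * at + 4 * gc

    population = ["ACGTTGCA", "ACGTTGCA", "ACATTGCA", "ACGTTGTA"]   # toy alignment
    print(degenerate_consensus(population), wallace_tm("ACGTTGCA"))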
Proteome analysis of Aspergillus ochraceus.
Rizwan, Muhammad; Miller, Ingrid; Tasneem, Fareeha; Böhm, Josef; Gemeiner, Manfred; Razzazi-Fazeli, Ebrahim
2010-08-01
Genome sequencing for many important fungi has begun during recent years; however, there is still some deficiency in proteome profiling of aspergilli. To obtain a comprehensive overview of proteins and their expression, a proteomic approach based on 2D gel electrophoresis and MALDI-TOF/TOF mass spectrometry was used to investigate A. ochraceus. The cell walls of fungi are exceptionally resistant to destruction, therefore two lysis protocols were tested: (1) lysis via manual grinding using liquid nitrogen, and (2) mechanical lysis via rapid agitation with glass beads using MagNalyser. Mechanical grinding with mortar and pestle using liquid nitrogen was found to be a more efficient extraction method for our purpose, resulting in extracts with higher protein content and a clear band pattern in SDS-PAGE. Two-dimensional electrophoresis gave a complex spot pattern comprising proteins of a broad range of isoelectric points and molecular masses. The most abundant spots were subjected to mass spectrometric analysis. We could identify 31 spots representing 26 proteins, most of them involved in metabolic processes and response to stress. Seventeen spots were identified by de novo sequencing due to a lack of DNA and protein database sequences of A. ochraceus. The proteins identified in our study have been reported for the first time in A. ochraceus and this represents the first proteomic approach with identification of major proteins, when the fungus was grown under submerged culture.
Cryopreservation of Human Pluripotent Stem Cells in Defined Medium
Liu, Weiwei; Chen, Guokai
2014-01-01
This protocol describes a cryopreservation procedure that uses an enzyme-free dissociation method to harvest cells and preserves them in albumin-free, chemically defined E8 medium for human pluripotent stem cells (hPSCs). Dissociation with EDTA/PBS produces small cell aggregates that allow high survival efficiency in passaging and cryopreservation. Preservation in E8 medium eliminates serum and other animal products, and is suitable for the increasing demand for high-quality hPSCs in translational research. In combination with the special features of EDTA/PBS dissociation, this protocol allows efficient cryopreservation in a more time-saving manner. PMID:25366897
Vanz, Ana Ls; Renard, Gaby; Palma, Mario S; Chies, Jocelei M; Dalmora, Sérgio L; Basso, Luiz A; Santos, Diógenes S
2008-04-04
Biopharmaceutical drugs are mainly recombinant proteins produced by biotechnological tools. The patents of many biopharmaceuticals have expired, and biosimilars are thus currently being developed. Human granulocyte colony stimulating factor (hG-CSF) is a hematopoietic cytokine that acts on cells of the neutrophil lineage causing proliferation and differentiation of committed precursor cells and activation of mature neutrophils. Recombinant hG-CSF has been produced in genetically engineered Escherichia coli (Filgrastim) and successfully used to treat cancer patients suffering from chemotherapy-induced neutropenia. Filgrastim is a 175 amino acid protein, containing an extra N-terminal methionine, which is needed for expression in E. coli. Here we describe a simple and low-cost process that is amenable to scaling-up for the production and purification of homogeneous and active recombinant hG-CSF expressed in E. coli cells. Here we describe cloning of the human granulocyte colony-stimulating factor coding DNA sequence, protein expression in E. coli BL21(DE3) host cells in the absence of isopropyl-beta-D-thiogalactopyranoside (IPTG) induction, efficient isolation and solubilization of inclusion bodies by a multi-step washing procedure, and a purification protocol using a single cationic exchange column. Characterization of homogeneous rhG-CSF by size exclusion and reverse phase chromatography showed similar yields to the standard. The immunoassay and N-terminal sequencing confirmed the identity of rhG-CSF. The biological activity assay, in vivo, showed an equivalent biological effect (109.4%) to the standard reference rhG-CSF. The homogeneous rhG-CSF protein yield was 3.2 mg of bioactive protein per liter of cell culture. The recombinant protein expression in the absence of IPTG induction is advantageous since cost is reduced, and the protein purification protocol using a single chromatographic step should reduce cost even further for large scale production. The physicochemical, immunological and biological analyses showed that this protocol can be useful to develop therapeutic bioproducts. In summary, the combination of different experimental strategies presented here allowed an efficient and cost-effective protocol for rhG-CSF production. These data may be of interest to biopharmaceutical companies interested in developing biosimilars and healthcare community.
Vanz, Ana LS; Renard, Gaby; Palma, Mario S; Chies, Jocelei M; Dalmora, Sérgio L; Basso, Luiz A; Santos, Diógenes S
2008-01-01
Background Biopharmaceutical drugs are mainly recombinant proteins produced by biotechnological tools. The patents of many biopharmaceuticals have expired, and biosimilars are thus currently being developed. Human granulocyte colony stimulating factor (hG-CSF) is a hematopoietic cytokine that acts on cells of the neutrophil lineage causing proliferation and differentiation of committed precursor cells and activation of mature neutrophils. Recombinant hG-CSF has been produced in genetically engineered Escherichia coli (Filgrastim) and successfully used to treat cancer patients suffering from chemotherapy-induced neutropenia. Filgrastim is a 175 amino acid protein, containing an extra N-terminal methionine, which is needed for expression in E. coli. Here we describe a simple and low-cost process that is amenable to scaling-up for the production and purification of homogeneous and active recombinant hG-CSF expressed in E. coli cells. Results Here we describe cloning of the human granulocyte colony-stimulating factor coding DNA sequence, protein expression in E. coli BL21(DE3) host cells in the absence of isopropyl-β-D-thiogalactopyranoside (IPTG) induction, efficient isolation and solubilization of inclusion bodies by a multi-step washing procedure, and a purification protocol using a single cationic exchange column. Characterization of homogeneous rhG-CSF by size exclusion and reverse phase chromatography showed similar yields to the standard. The immunoassay and N-terminal sequencing confirmed the identity of rhG-CSF. The biological activity assay, in vivo, showed an equivalent biological effect (109.4%) to the standard reference rhG-CSF. The homogeneous rhG-CSF protein yield was 3.2 mg of bioactive protein per liter of cell culture. Conclusion The recombinant protein expression in the absence of IPTG induction is advantageous since cost is reduced, and the protein purification protocol using a single chromatographic step should reduce cost even further for large-scale production. The physicochemical, immunological and biological analyses showed that this protocol can be useful to develop therapeutic bioproducts. In summary, the combination of different experimental strategies presented here allowed the development of an efficient and cost-effective protocol for rhG-CSF production. These data may be of interest to biopharmaceutical companies developing biosimilars and to the healthcare community. PMID:18394164
A Tree Based Broadcast Scheme for (m, k)-firm Real-Time Stream in Wireless Sensor Networks
Park, HoSung; Kim, Beom-Su; Kim, Kyong Hoon; Shah, Babar; Kim, Ki-Il
2017-01-01
Recently, various unicast routing protocols have been proposed to deliver measured data from sensor nodes to the sink node within a predetermined deadline in wireless sensor networks. In parallel, some applications demand a specific service based on broadcasting to all nodes within the deadline, a feasible real-time traffic model, and improved energy efficiency. However, current protocols based on either flooding or one-to-one unicast cannot meet all of these requirements. Moreover, as far as the authors know, no study has yet addressed a real-time broadcast protocol supporting such an application-specific traffic model in WSNs. Based on this analysis, in this paper we propose a new (m, k)-firm-based Real-time Broadcast Protocol (FRBP) that constructs a broadcast tree to satisfy the (m, k)-firm constraint, a real-time model applicable to resource-constrained WSNs. The broadcast tree in FRBP is constructed by a distance-based priority scheme, whereas energy efficiency is improved by selecting as few nodes on the tree as possible. To cope with unstable network environments, a recovery scheme invokes rapid partial tree reconstruction to designate another node as the parent on the tree, according to the measured (m, k)-firm real-time condition and local state monitoring. Finally, simulation results demonstrate the superiority of FRBP over existing schemes in terms of average deadline missing ratio, average throughput and energy consumption. PMID:29120404
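As a concrete illustration of the (m, k)-firm constraint that such a broadcast tree must satisfy, the sketch below tracks, per stream, whether at least m of the last k packets met their deadlines; the class and its use as a trigger for tree maintenance are illustrative assumptions, not FRBP's actual implementation.

```python
from collections import deque

class MKFirmMonitor:
    """Sliding-window check of an (m, k)-firm constraint: among any k
    consecutive packets, at least m must meet their deadlines."""

    def __init__(self, m: int, k: int):
        assert 0 < m <= k
        self.m, self.k = m, k
        self.window = deque(maxlen=k)  # 1 = deadline met, 0 = missed

    def record(self, deadline_met: bool) -> None:
        self.window.append(1 if deadline_met else 0)

    def misses_tolerated(self) -> int:
        # How many additional misses the current window can absorb; a negative
        # value means the (m, k)-firm condition is violated and a recovery
        # action (e.g., partial tree reconstruction) would be warranted.
        misses = len(self.window) - sum(self.window)
        return (self.k - self.m) - misses

monitor = MKFirmMonitor(m=3, k=5)
for met in [True, True, False, True, False]:
    monitor.record(met)
print(monitor.misses_tolerated())  # -> 0: one more miss breaks the constraint
```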
Protocol Processing for 100 Gbit/s and Beyond - A Soft Real-Time Approach in Hardware and Software
NASA Astrophysics Data System (ADS)
Büchner, Steffen; Lopacinski, Lukasz; Kraemer, Rolf; Nolte, Jörg
2017-09-01
Protocol processing for 100 Gbit/s wireless communication stresses every part of a communication system to its limits. The efficient use of upcoming transmission technology at 100 Gbit/s and beyond requires rethinking the way protocols are processed by the communication endpoints. This paper summarizes the achievements of the project End2End100. We present a comprehensive soft real-time stream processing approach that allows the protocol designer to develop, analyze, and plan scalable protocols for ultra-high data rates of 100 Gbit/s and beyond. Furthermore, we present an ultra-low-power, adaptable, and massively parallelized FEC (Forward Error Correction) scheme that detects and corrects bit errors at line rate with an energy consumption between 1 pJ/bit and 13 pJ/bit. The evaluation results discussed in this publication show that our comprehensive approach allows end-to-end communication with very low protocol processing overhead.
Dutta, Debargh; Gunasekera, Devi; Ragni, Margaret V; Pratt, Kathleen P
2016-12-27
The most frequent mutations resulting in hemophilia A are intron 22 and intron 1 gene inversions, which together cause ∼50% of severe hemophilia A cases. We report a simple and accurate RNA-based assay to detect these mutations in patients and heterozygous carriers. The assays do not require specialized equipment or expensive reagents; therefore, they may provide useful and economical protocols that could be standardized for central laboratory testing. RNA is purified from a blood sample, and reverse transcription nested polymerase chain reaction (RT-NPCR) reactions amplify DNA fragments with the F8 sequence spanning the exon 22 to 23 splice site (intron 22 inversion test) or the exon 1 to 2 splice site (intron 1 inversion test). These sequences will be amplified only from F8 RNA without an intron 22 or intron 1 inversion mutation, respectively. Additional RT-NPCR reactions are then carried out to amplify the inverted sequences extending from F8 exon 19 to the first in-frame stop codon within intron 22 or a chimeric transcript containing F8 exon 1 and the VBP1 gene. These latter two products are produced only by individuals with an intron 22 or intron 1 inversion mutation, respectively. The intron 22 inversion mutations may be further classified (eg, as type 1 or type 2, reflecting the specific homologous recombination sites) by the standard DNA-based "inverse-shifting" PCR assay if desired. Efficient Bcl I and T4 DNA ligase enzymes that cleave and ligate DNA in minutes were used, which is a substantial improvement over previous protocols that required overnight incubations. These protocols can accurately detect F8 inversion mutations via same-day testing of patient samples.
Design of SIP transformation server for efficient media negotiation
NASA Astrophysics Data System (ADS)
Pack, Sangheon; Paik, Eun Kyoung; Choi, Yanghee
2001-07-01
Voice over IP (VoIP) is one of the advanced services supported by next-generation mobile communication. VoIP must support various coexisting media formats and terminals. This heterogeneous environment may prevent diverse users from establishing VoIP sessions among themselves. To solve this problem, an efficient media negotiation mechanism is required. In this paper, we propose an efficient media negotiation architecture using a transformation server and an Intelligent Location Server (ILS). The transformation server is an extended Session Initiation Protocol (SIP) proxy server. It can modify an unacceptable session INVITE message into an acceptable one using the ILS. The ILS is a directory server based on the Lightweight Directory Access Protocol (LDAP) that keeps users' location information and available media information. The proposed architecture eliminates the unnecessary response and re-INVITE messages of the standard SIP architecture. It takes only 1.5 round-trip times to negotiate two different media types, while the standard media negotiation mechanism takes 2.5 round-trip times. The extra processing time in message handling is negligible in comparison to the reduced round-trip time. The experimental results show that the session setup time in the proposed architecture is less than the setup time in standard SIP. These results verify that the proposed media negotiation mechanism is more efficient in solving diversity problems.
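As a rough illustration of the negotiation step performed by the transformation server, the sketch below rewrites an offered codec set using callee capabilities looked up from a directory. The codec names, the dictionary standing in for the ILS/LDAP lookup, and the function name are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical stand-in for the ILS lookup used by the transformation server.
CALLEE_CAPABILITIES = {                      # what the ILS would return per user
    "sip:bob@example.com": {"PCMU", "G729"},
}

def transform_offer(callee: str, offered_codecs: set[str]) -> set[str]:
    """Return a codec set acceptable to the callee.

    If the original offer already intersects the callee's capabilities it is
    kept; otherwise the server substitutes codecs the callee supports, which
    implies transcoding between the two sides.
    """
    supported = CALLEE_CAPABILITIES.get(callee, set())
    common = offered_codecs & supported
    return common if common else supported

print(transform_offer("sip:bob@example.com", {"OPUS", "AMR"}))
# -> {'PCMU', 'G729'} (set order may vary)
```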
Cryopreservation of Fish Spermatogonial Cells: The Future of Natural History Collections.
Hagedorn, Mary M; Daly, Jonathan P; Carter, Virginia L; Cole, Kathleen S; Jaafar, Zeehan; Lager, Claire V A; Parenti, Lynne R
2018-04-18
As global biodiversity declines, the value of biological collections increases. Cryopreserved diploid spermatogonial cells meet two goals: they yield high-quality molecular sequence data, and they can regenerate new individuals, potentially countering species extinction. Cryopreserved spermatogonial cells that allow for such mitigative measures are not currently in natural history museum collections because there are no standard protocols to collect them. Vertebrate specimens, especially fishes, are traditionally formalin-fixed and alcohol-preserved, which makes them ideal for morphological studies and as museum vouchers, but inadequate for molecular sequence data. Molecular studies of fishes routinely use tissues preserved in ethanol; yet tissues preserved in this way may yield degraded sequences over time. As an alternative to tissue fixation methods, we assessed and compared previously published cryopreservation methods by gating and counting fish testicular cells with flow cytometry to identify presumptive A-type spermatogonia. Here we describe a protocol to cryopreserve tissues that yields a high percentage of viable spermatogonial cells from the testes of Asterropteryx semipunctata, a marine goby. Material cryopreserved using this protocol represents the first frozen, post-thaw-viable spermatogonial cells of fishes archived in a natural history museum, providing better-quality material for re-derivation of species and for DNA preservation and analysis.
NASA Technical Reports Server (NTRS)
Burleigh, Scott C.
2011-01-01
Zero-Copy Objects System software enables application data to be encapsulated in layers of communication protocol without being copied. Indirect referencing enables application source data, either in memory or in a file, to be encapsulated in place within an unlimited number of protocol headers and/or trailers. Zero-copy objects (ZCOs) are abstract data access representations designed to minimize I/O (input/output) in the encapsulation of application source data within one or more layers of communication protocol structure. They are constructed within the heap space of a Simple Data Recorder (SDR) data store to which all participating layers of the stack must have access. Each ZCO contains general information enabling access to the core source data object (an item of application data), together with (a) a linked list of zero or more specific extents that reference portions of this source data object, and (b) linked lists of protocol header and trailer capsules. The concatenation of the headers (in ascending stack sequence), the source data object extents, and the trailers (in descending stack sequence) constitute the transmitted data object constructed from the ZCO. This scheme enables a source data object to be encapsulated in a succession of protocol layers without ever having to be copied from a buffer at one layer of the protocol stack to an encapsulating buffer at a lower layer of the stack. For large source data objects, the savings in copy time and reduction in memory consumption may be considerable.
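The sketch below models the ZCO layout described above: extents that reference slices of a source data object, plus header and trailer capsule lists that are only concatenated when the wire image is produced. It is a simplified Python analogue under the usual header/trailer nesting assumption, not the actual SDR/ION implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Extent:
    offset: int   # byte offset into the source data object
    length: int

@dataclass
class ZeroCopyObject:
    source: bytes                                  # stands in for the SDR-resident source object
    extents: list = field(default_factory=list)    # portions of the source to transmit
    headers: list = field(default_factory=list)    # capsules appended as each lower layer encapsulates
    trailers: list = field(default_factory=list)

    def encapsulate(self, header: bytes, trailer: bytes = b"") -> None:
        # Each protocol layer only adds capsules; the source data itself is
        # never copied into a per-layer buffer.
        self.headers.append(header)
        if trailer:
            self.trailers.append(trailer)

    def transmit_image(self) -> bytes:
        # Materialize the wire image only at transmission time: outermost
        # header first, then the referenced extents, then trailers with the
        # outermost last (standard nesting assumed).
        body = b"".join(self.source[e.offset:e.offset + e.length] for e in self.extents)
        return b"".join(reversed(self.headers)) + body + b"".join(self.trailers)

zco = ZeroCopyObject(source=b"application payload bytes", extents=[Extent(0, 11), Extent(12, 13)])
zco.encapsulate(b"[L4]", b"[/L4]")
zco.encapsulate(b"[L3]")
print(zco.transmit_image())  # b"[L3][L4]applicationpayload bytes[/L4]"
```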
Quick, Joshua; Grubaugh, Nathan D; Pullan, Steven T; Claro, Ingra M; Smith, Andrew D; Gangavarapu, Karthik; Oliveira, Glenn; Robles-Sikisaka, Refugio; Rogers, Thomas F; Beutler, Nathan A; Burton, Dennis R; Lewis-Ximenez, Lia Laura; de Jesus, Jaqueline Goes; Giovanetti, Marta; Hill, Sarah C; Black, Allison; Bedford, Trevor; Carroll, Miles W; Nunes, Marcio; Alcantara, Luiz Carlos; Sabino, Ester C; Baylis, Sally A; Faria, Nuno R; Loose, Matthew; Simpson, Jared T; Pybus, Oliver G; Andersen, Kristian G; Loman, Nicholas J
2017-06-01
Genome sequencing has become a powerful tool for studying emerging infectious diseases; however, genome sequencing directly from clinical samples (i.e., without isolation and culture) remains challenging for viruses such as Zika, for which metagenomic sequencing methods may generate insufficient numbers of viral reads. Here we present a protocol for generating coding-sequence-complete genomes, comprising an online primer design tool, a novel multiplex PCR enrichment protocol, optimized library preparation methods for the portable MinION sequencer (Oxford Nanopore Technologies) and the Illumina range of instruments, and a bioinformatics pipeline for generating consensus sequences. The MinION protocol does not require an Internet connection for analysis, making it suitable for field applications with limited connectivity. Our method relies on multiplex PCR for targeted enrichment of viral genomes from samples containing as few as 50 genome copies per reaction. Viral consensus sequences can be achieved in 1-2 d by starting with clinical samples and following a simple laboratory workflow. This method has been successfully used by several groups studying Zika virus evolution and is facilitating an understanding of the spread of the virus in the Americas. The protocol can be used to sequence other viral genomes using the online Primal Scheme primer designer software. It is suitable for sequencing either RNA or DNA viruses in the field during outbreaks or as an inexpensive, convenient method for use in the lab.
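The multiplex PCR enrichment rests on tiling the target genome with overlapping amplicons split across two primer pools so that neighbouring amplicons do not interfere. The sketch below only computes such tiling coordinates under assumed amplicon length and overlap values; the actual Primal Scheme tool additionally optimizes primer positions and sequences thermodynamically.

```python
def tile_amplicons(genome_length: int, amplicon_len: int = 400, overlap: int = 100):
    """Return (start, end, pool) windows tiling a genome with overlapping
    amplicons, alternating between two PCR pools as in tiling schemes."""
    step = amplicon_len - overlap
    windows, start, i = [], 0, 0
    while start + amplicon_len < genome_length:
        windows.append((start, start + amplicon_len, 1 + i % 2))
        start += step
        i += 1
    # Final amplicon is anchored to the genome end so the tail is covered.
    windows.append((max(0, genome_length - amplicon_len), genome_length, 1 + i % 2))
    return windows

# e.g. a ~10.8 kb Zika-sized genome with 400 bp amplicons overlapping by 100 bp
print(len(tile_amplicons(10800)))  # -> 36 amplicons across the two pools
```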
"First generation" automated DNA sequencing technology.
Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M
2011-10-01
Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.
Abal-Fabeiro, J L; Maside, X; Llovo, J; Bello, X; Torres, M; Treviño, M; Moldes, L; Muñoz, A; Carracedo, A; Bartolomé, C
2014-04-01
The epidemiological study of human cryptosporidiosis requires the characterization of species and subtypes involved in human disease in large sample collections. Molecular genotyping is costly and time-consuming, making the implementation of low-cost, highly efficient technologies increasingly necessary. Here, we designed a protocol based on MALDI-TOF mass spectrometry for the high-throughput genotyping of a panel of 55 single nucleotide variants (SNVs) selected as markers for the identification of common gp60 subtypes of four Cryptosporidium species that infect humans. The method was applied to a panel of 608 human and 63 bovine isolates and the results were compared with control samples typed by Sanger sequencing. The method allowed the identification of species in 610 specimens (90·9%) and gp60 subtype in 605 (90·2%). It displayed excellent performance, with sensitivity and specificity values of 87·3 and 98·0%, respectively. Up to nine genotypes from four different Cryptosporidium species (C. hominis, C. parvum, C. meleagridis and C. felis) were detected in humans; the most common ones were C. hominis subtype Ib, and C. parvum IIa (61·3 and 28·3%, respectively). 96·5% of the bovine samples were typed as IIa. The method performs as well as the widely used Sanger sequencing and is more cost-effective and less time consuming.
Sun, Bing; Shen, Feng; McCalla, Stephanie E; Kreutz, Jason E; Karymov, Mikhail A; Ismagilov, Rustem F
2013-02-05
Here we used a SlipChip microfluidic device to evaluate the performance of digital reverse transcription-loop-mediated isothermal amplification (dRT-LAMP) for quantification of HIV viral RNA. Tests are needed for monitoring HIV viral load to control the emergence of drug resistance and to diagnose acute HIV infections. In resource-limited settings, in vitro measurement of HIV viral load in a simple format is especially needed, and single-molecule counting using a digital format could provide a potential solution. We showed here that when one-step dRT-LAMP is used for quantification of HIV RNA, the digital count is lower than expected and is limited by the yield of desired cDNA. We were able to overcome the limitations by developing a microfluidic protocol to manipulate many single molecules in parallel through a two-step digital process. In the first step we compartmentalize the individual RNA molecules (based on Poisson statistics) and perform reverse transcription on each RNA molecule independently to produce DNA. In the second step, we perform the LAMP amplification on all individual DNA molecules in parallel. Using this new protocol, we increased the absolute efficiency (the ratio between the concentration calculated from the actual count and the expected concentration) of dRT-LAMP 10-fold, from ∼2% to ∼23%, by (i) using a more efficient reverse transcriptase, (ii) introducing RNase H to break up the DNA:RNA hybrid, and (iii) adding only the BIP primer during the RT step. We also used this two-step method to quantify HIV RNA purified from four patient samples and found that in some cases, the quantification results were highly sensitive to the sequence of the patient's HIV RNA. We learned the following three lessons from this work: (i) digital amplification technologies, including dLAMP and dPCR, may give adequate dilution curves and yet have low efficiency, thereby providing quantification values that underestimate the true concentration. Careful validation is essential before a method is considered to provide absolute quantification; (ii) the sensitivity of dLAMP to the sequence of the target nucleic acid necessitates additional validation with patient samples carrying the full spectrum of mutations; (iii) for multistep digital amplification chemistries, such as a combination of reverse transcription with amplification, microfluidic devices may be used to decouple these steps from one another and to perform them under different, individually optimized conditions for improved efficiency.
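The quantification step in any digital assay, including dRT-LAMP, converts the fraction of positive compartments into a concentration through Poisson statistics, and the "absolute efficiency" discussed above is the ratio of that estimate to the expected input. A minimal sketch follows; the compartment volume is a placeholder parameter, not a SlipChip specification.

```python
import math

def poisson_concentration(positive: int, total: int, volume_per_well_ul: float) -> float:
    """Estimate copies per microliter from a digital assay:
    lambda = -ln(fraction of negative compartments)."""
    if positive >= total:
        raise ValueError("all compartments positive; sample too concentrated")
    lam = -math.log(1.0 - positive / total)   # mean copies per compartment
    return lam / volume_per_well_ul

def absolute_efficiency(measured_copies_per_ul: float, expected_copies_per_ul: float) -> float:
    """Ratio of the digitally counted concentration to the expected input
    concentration (the ~2% vs ~23% figures discussed in the abstract)."""
    return measured_copies_per_ul / expected_copies_per_ul

measured = poisson_concentration(positive=120, total=1280, volume_per_well_ul=0.005)
print(measured, absolute_efficiency(measured, expected_copies_per_ul=100.0))
```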
Fischer, Carlos N; Campos, Victor De A; Barella, Victor H
2018-05-01
Profile hidden Markov models (pHMMs) have been used to search for transposable elements (TEs) in genomes. For training pHMMs aimed at searching for TEs of the retrotransposon class, the conventional protocol is to use the whole internal nucleotide portions of these elements as representative sequences. To further explore the potential of pHMMs in such a search, we propose five alternative ways, beyond the conventional protocol, to obtain the sets of representative TE sequences. In this study, we focus on the Bel-PAO, Copia, Gypsy, and DIRS superfamilies from the retrotransposon class. We compared the pHMMs of all six protocols. The test results show that, for each TE superfamily, the pHMMs of at least two of the proposed protocols performed better than the conventional one, and that the number of correct predictions provided by the latter can be improved by considering the results of one or more of the alternative protocols together.
Mitra, Abhishek; Skrzypczak, Magdalena; Ginalski, Krzysztof; Rowicka, Maga
2015-01-01
Sequencing microRNA, reduced representation sequencing, Hi-C technology and any method requiring the use of in-house barcodes result in sequencing libraries with low initial sequence diversity. Sequencing such data on the Illumina platform typically produces low quality data due to the limitations of the Illumina cluster calling algorithm. Moreover, even in the case of diverse samples, these limitations are causing substantial inaccuracies in multiplexed sample assignment (sample bleeding). Such inaccuracies are unacceptable in clinical applications, and in some other fields (e.g. detection of rare variants). Here, we discuss how both problems with quality of low-diversity samples and sample bleeding are caused by incorrect detection of clusters on the flowcell during initial sequencing cycles. We propose simple software modifications (Long Template Protocol) that overcome this problem. We present experimental results showing that our Long Template Protocol remarkably increases data quality for low diversity samples, as compared with the standard analysis protocol; it also substantially reduces sample bleeding for all samples. For comprehensiveness, we also discuss and compare experimental results from alternative approaches to sequencing low diversity samples. First, we discuss how the low diversity problem, if caused by barcodes, can be avoided altogether at the barcode design stage. Second and third, we present modified guidelines, which are more stringent than the manufacturer’s, for mixing low diversity samples with diverse samples and lowering cluster density, which in our experience consistently produces high quality data from low diversity samples. Fourth and fifth, we present rescue strategies that can be applied when sequencing results in low quality data and when there is no more biological material available. In such cases, we propose that the flowcell be re-hybridized and sequenced again using our Long Template Protocol. Alternatively, we discuss how analysis can be repeated from saved sequencing images using the Long Template Protocol to increase accuracy. PMID:25860802
Efficient model checking of network authentication protocol based on SPIN
NASA Astrophysics Data System (ADS)
Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan
2013-03-01
Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying protocols with model checking technology, this paper first proposes a universal formal description method for protocols. Combined with the model checker SPIN, the method can conveniently verify the properties of a protocol. Through several modeling simplification strategies, we can model protocols efficiently and reduce the state space of the model. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the described method, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and can be applied to other authentication protocols.
Marcacci, Maurilia; Ancora, Massimo; Mangone, Iolanda; Teodori, Liana; Di Sabatino, Daria; De Massis, Fabrizio; Camma', Cesare; Savini, Giovanni; Lorusso, Alessio
2014-06-01
Dynamic surveillance and characterization of circulating canine distemper virus (CDV) strains are essential to guard against possible vaccine breakthrough events. This study describes the setup of a fast and robust next-generation sequencing (NGS) Ion PGM™ protocol that was used to obtain the complete genome sequence of a CDV isolate (CDV2784/2013). CDV2784/2013 is the prototype of the CDV strains responsible for severe clinical distemper in dogs and wolves in Italy during 2013. CDV2784/2013 was isolated in cell culture, and total RNA was used for NGS sample preparation. A total of 112.3 Mb of reads were assembled de novo using MIRA version 4.0rc4, which yielded 403 contigs with 12.1% coverage. The whole genome (15,690 bp) was recovered successfully and compared to existing CDV whole genomes. CDV2784/2013 was shown to have 92% nt identity with the Onderstepoort vaccine strain. This study describes for the first time a fast and robust Ion PGM™ platform-based whole-genome amplification protocol for non-segmented negative-stranded RNA viruses starting from total cell-purified RNA. Additionally, this is the first study reporting the whole-genome analysis of an Arctic lineage strain that is known to circulate widely in Europe, Asia and the USA. Copyright © 2014 Elsevier B.V. All rights reserved.
El-Assaad, Atlal; Dawy, Zaher; Nemer, Georges; Hajj, Hazem; Kobeissy, Firas H
2017-01-01
Degradomics is a novel discipline that involves determining the protease/substrate fragmentation profile, called the substrate degradome, and has recently been applied in different fields. A major application of degradomics is in the field of biomarkers, where the breakdown products (BDPs) of different proteases have been investigated. Among the major proteases assessed, calpain and caspase have been associated with the execution phases of pro-apoptotic and pro-necrotic cell death, generating caspase/calpain-specific cleaved fragments. The distinction between calpain and caspase protein fragments has been used to distinguish injury mechanisms. Advanced proteomics technology has been used to identify these BDPs experimentally. However, it has been a challenge to identify these BDPs with high precision and efficiency, especially when targeting a number of proteins at one time. In this chapter, we present a novel bioinformatic detection method that identifies BDPs accurately and efficiently, with validation against experimental data. This method predicts consensus sequence occurrences and their variants in a large set of experimentally detected protein sequences based on state-of-the-art sequence matching and alignment algorithms. After detection, the method generates all the potential fragments cleaved by a specific protease. This space- and time-efficient algorithm is flexible enough to handle the different orientations that the consensus sequence and the protein sequence can take before cleaving. It is O(mn) in space complexity and O(Nmn) in time complexity, with N the number of protein sequences, m the length of the consensus sequence, and n the length of each protein sequence. Ultimately, this knowledge will feed into the development of a novel tool for researchers to detect diverse types of selected BDPs as putative disease markers, contributing to the diagnosis and treatment of related disorders.
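To make the core idea concrete, the sketch below scans a protein sequence for a wildcard consensus motif and emits the resulting breakdown products. The motif syntax, the example 'LLxx' motif, and cleavage C-terminal to the motif are illustrative assumptions; this does not reproduce the authors' O(Nmn) alignment-based algorithm.

```python
import re

def cleavage_fragments(protein: str, consensus: str):
    """Split a protein sequence at every occurrence of a consensus cleavage
    motif and return the predicted breakdown products (BDPs).

    `consensus` uses 'x' as a wildcard position; cleavage is assumed to occur
    C-terminal to the matched motif.
    """
    pattern = re.compile(consensus.replace("x", "."))
    cut_sites = [m.end() for m in pattern.finditer(protein)]
    bounds = [0] + cut_sites + [len(protein)]
    return [protein[a:b] for a, b in zip(bounds, bounds[1:]) if a != b]

# Illustrative calpain-like motif and toy sequence
print(cleavage_fragments("MKTLLAAGEVLLKKDSTR", "LLxx"))
# -> ['MKTLLAA', 'GEVLLKK', 'DSTR']
```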
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ureba, A.; Salguero, F. J.; Barbeiro, A. R.
Purpose: The authors present a hybrid direct multileaf collimator (MLC) aperture optimization model exclusively based on sequencing of patient imaging data to be implemented on a Monte Carlo treatment planning system (MC-TPS) to allow the explicit radiation transport simulation of advanced radiotherapy treatments with optimal results in efficient times for clinical practice. Methods: The planning system (called CARMEN) is a full MC-TPS, controlled through a MATLAB interface, which is based on the sequencing of a novel map, called “biophysical” map, which is generated from enhanced image data of patients to achieve a set of segments actually deliverable. In order to reduce the required computation time, the conventional fluence map has been replaced by the biophysical map which is sequenced to provide direct apertures that will later be weighted by means of an optimization algorithm based on linear programming. A ray-casting algorithm throughout the patient CT assembles information about the found structures, the mass thickness crossed, as well as PET values. Data are recorded to generate a biophysical map for each gantry angle. These maps are the input files for a home-made sequencer developed to take into account the interactions of photons and electrons with the MLC. For each linac (Axesse of Elekta and Primus of Siemens) and energy beam studied (6, 9, 12, 15 MeV and 6 MV), phase space files were simulated with the EGSnrc/BEAMnrc code. The dose calculation in patient was carried out with the BEAMDOSE code. This code is a modified version of EGSnrc/DOSXYZnrc able to calculate the beamlet dose in order to combine them with different weights during the optimization process. Results: Three complex radiotherapy treatments were selected to check the reliability of CARMEN in situations where the MC calculation can offer an added value: A head-and-neck case (Case I) with three targets delineated on PET/CT images and a demanding dose-escalation; a partial breast irradiation case (Case II) solved with photon and electron modulated beams (IMRT + MERT); and a prostatic bed case (Case III) with a pronounced concave-shaped PTV by using volumetric modulated arc therapy. In the three cases, the required target prescription doses and constraints on organs at risk were fulfilled in a short enough time to allow routine clinical implementation. The quality assurance protocol followed to check CARMEN system showed a high agreement with the experimental measurements. Conclusions: A Monte Carlo treatment planning model exclusively based on maps performed from patient imaging data has been presented. The sequencing of these maps allows obtaining deliverable apertures which are weighted for modulation under a linear programming formulation. The model is able to solve complex radiotherapy treatments with high accuracy in an efficient computation time.
A Lightweight Protocol for Secure Video Streaming
Morkevicius, Nerijus; Bagdonas, Kazimieras
2018-01-01
The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing “Fog Node-End Device” layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard. PMID:29757988
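A minimal sketch of the packet-level idea follows: embedding a sequence number and an HMAC tag in each datagram so the receiver can verify source authenticity and integrity. The key handling, field sizes and the omission of the symmetric-cipher encryption step are simplifications, not the protocol's actual packet format.

```python
import hmac, hashlib, struct

KEY = b"pre-shared-session-key"   # illustrative; real key management is out of scope

def build_packet(seq: int, payload: bytes) -> bytes:
    """Prepend a 64-bit sequence number and append an HMAC-SHA256 tag.
    (A symmetric cipher such as AES would additionally encrypt `payload`
    in the actual protocol; that step is omitted in this sketch.)"""
    header = struct.pack("!Q", seq)
    tag = hmac.new(KEY, header + payload, hashlib.sha256).digest()
    return header + payload + tag

def verify_packet(packet: bytes):
    """Check the tag in constant time and return (sequence number, payload)."""
    header, payload, tag = packet[:8], packet[8:-32], packet[-32:]
    expected = hmac.new(KEY, header + payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    (seq,) = struct.unpack("!Q", header)
    return seq, payload

seq, data = verify_packet(build_packet(42, b"video frame bytes"))
print(seq, data)
```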
A Lightweight Protocol for Secure Video Streaming.
Venčkauskas, Algimantas; Morkevicius, Nerijus; Bagdonas, Kazimieras; Damaševičius, Robertas; Maskeliūnas, Rytis
2018-05-14
The Internet of Things (IoT) introduces many new challenges which cannot be solved using traditional cloud and host computing models. A new architecture known as fog computing is emerging to address these technological and security gaps. Traditional security paradigms focused on providing perimeter-based protections and client/server point to point protocols (e.g., Transport Layer Security (TLS)) are no longer the best choices for addressing new security challenges in fog computing end devices, where energy and computational resources are limited. In this paper, we present a lightweight secure streaming protocol for the fog computing "Fog Node-End Device" layer. This protocol is lightweight, connectionless, supports broadcast and multicast operations, and is able to provide data source authentication, data integrity, and confidentiality. The protocol is based on simple and energy efficient cryptographic methods, such as Hash Message Authentication Codes (HMAC) and symmetrical ciphers, and uses modified User Datagram Protocol (UDP) packets to embed authentication data into streaming data. Data redundancy could be added to improve reliability in lossy networks. The experimental results summarized in this paper confirm that the proposed method efficiently uses energy and computational resources and at the same time provides security properties on par with the Datagram TLS (DTLS) standard.
Wan, Haisu; Li, Yongwen; Fan, Yu; Meng, Fanrong; Chen, Chen; Zhou, Qinghua
2012-01-15
Site-directed mutagenesis has become routine in molecular biology. However, many mutants can still be very difficult to create. Complicated chimerical mutations, tandem repeats, inverted sequences, GC-rich regions, and/or heavy secondary structures can cause inefficient or incorrect binding of the mutagenic primer to the target sequence and affect the subsequent amplification. In theory, these problems can be avoided by introducing the mutations into the target sequence using mutagenic fragments and so removing the need for primer-template annealing. The cassette mutagenesis uses the mutagenic fragment in its protocol; however, in most cases it needs to perform two rounds of mutagenic primer-based mutagenesis to introduce suitable restriction enzyme sites into templates and is not suitable for routine mutagenesis. Here we describe a highly efficient method in which the template except the region to be mutated is amplified by polymerase chain reaction (PCR) and the type IIs restriction enzyme-digested PCR product is directly ligated with the mutagenic fragment. Our method requires no assistance of mutagenic primers. We have used this method to create various types of difficult-to-make mutants with mutagenic frequencies of nearly 100%. Our protocol has many advantages over the prevalent QuikChange method and is a valuable tool for studies on gene structure and function. Copyright © 2011 Elsevier Inc. All rights reserved.
Won, Jongho; Ma, Chris Y. T.; Yau, David K. Y.; ...
2016-06-01
Smart meters are integral to demand response in emerging smart grids, by reporting the electricity consumption of users to serve application needs. But reporting real-time usage information for individual households raises privacy concerns. Existing techniques to guarantee differential privacy (DP) of smart meter users either are not fault tolerant or achieve (possibly partial) fault tolerance at high communication overheads. In this paper, we propose a fault-tolerant protocol for smart metering that can handle general communication failures while ensuring DP with significantly improved efficiency and lower errors compared with the state of the art. Our protocol handles fail-stop faults proactively by using a novel design of future ciphertexts, and distributes trust among the smart meters by sharing secret keys among them. We prove the DP properties of our protocol and analyze its advantages in fault tolerance, accuracy, and communication efficiency relative to competing techniques. We illustrate our analysis by simulations driven by real-world traces of electricity consumption.
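The differential-privacy guarantee in such metering protocols ultimately comes from noise calibrated to the query sensitivity and the privacy budget ε. The sketch below shows only that Laplace-mechanism step for an aggregate sum; the paper's encryption, secret-key sharing among meters and future-ciphertext machinery are not modelled.

```python
import random

def dp_noisy_sum(readings, epsilon: float, sensitivity: float) -> float:
    """Return the sum of meter readings with Laplace(0, sensitivity/epsilon)
    noise added, the standard mechanism behind differential privacy for
    aggregate queries."""
    scale = sensitivity / epsilon
    # Laplace noise generated as the difference of two exponential draws.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return sum(readings) + noise

# e.g. hourly kWh readings from a group of households, epsilon = 0.5,
# with sensitivity set to an assumed per-household consumption bound
print(dp_noisy_sum([0.8, 1.2, 0.5, 2.1], epsilon=0.5, sensitivity=3.0))
```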
DOE Office of Scientific and Technical Information (OSTI.GOV)
Won, Jongho; Ma, Chris Y. T.; Yau, David K. Y.
Smart meters are integral to demand response in emerging smart grids, by reporting the electricity consumption of users to serve application needs. But reporting real-time usage information for individual households raises privacy concerns. Existing techniques to guarantee differential privacy (DP) of smart meter users either are not fault tolerant or achieve (possibly partial) fault tolerance at high communication overheads. In this paper, we propose a fault-tolerant protocol for smart metering that can handle general communication failures while ensuring DP with significantly improved efficiency and lower errors compared with the state of the art. Our protocol handles fail-stop faults proactively by using a novel design of future ciphertexts, and distributes trust among the smart meters by sharing secret keys among them. We prove the DP properties of our protocol and analyze its advantages in fault tolerance, accuracy, and communication efficiency relative to competing techniques. We illustrate our analysis by simulations driven by real-world traces of electricity consumption.
A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks.
Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang
2017-08-08
Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and decodes packets jointly from the encoded packets received from multiple potential nodes in the entire network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time routing maintenance protocol to update the route when inefficient relay nodes are detected. Extensive simulations in an underwater environment using Network Simulator 3 (NS-3) show that NCRP significantly improves network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs.
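The benefit of network coding at a relay can be seen in its simplest XOR form: one coded transmission serves several receivers that each already hold one of the original packets. This toy example only illustrates the principle; NCRP's actual coding and joint decoding over multiple potential forwarders is more elaborate.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length packets."""
    return bytes(x ^ y for x, y in zip(a, b))

# A relay combines two packets into a single coded transmission.
p1, p2 = b"sensor reading A", b"sensor reading B"
coded = xor_bytes(p1, p2)

# A sink that already overheard p1 recovers p2 from the one coded packet,
# halving the number of relay transmissions needed for this exchange.
assert xor_bytes(coded, p1) == p2
```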
De Tobel, Jannick; Hillewig, Elke; Bogaert, Stephanie; Deblaere, Karel; Verstraete, Koenraad
2017-03-01
Established dental age estimation methods in sub-adults study the development of third molar root apices on radiographs. In living individuals, however, avoiding ionising radiation is expedient. Studying dental development with magnetic resonance imaging complies with this requirement and adds the advantage of imaging in three dimensions. The aim was to develop an MRI protocol to visualise all third molars for forensic age estimation, with particular attention to the development of the root apex. Ex vivo scans of porcine jaws and in vivo scans of 10 volunteers aged 17-25 years were performed to select adequate sequences. Studied parameters were T1 vs T2 weighting, ultrashort echo time (UTE), fat suppression, in-plane resolution, slice thickness, 3D imaging, signal-to-noise ratio, and acquisition time. A bilateral four-channel flexible surface coil was used. Two observers evaluated the suitability of the images. T2-weighted images were preferred to T1-weighted images. To clearly distinguish root apices in (almost) fully developed third molars, an in-plane resolution of 0.33 × 0.33 mm² was deemed necessary. Taking acquisition time limits into account, only a T2 FSE sequence with a slice thickness of 2 mm generated images with sufficient resolution and contrast. UTE, thinner-slice T2 FSE and T2 3D FSE sequences could not achieve the desired resolution within 6.5 minutes. Three-Tesla MRI of the third molars is a feasible technique for forensic age estimation, in which a T2 FSE sequence can provide the desired in-plane resolution within a clinically acceptable acquisition time.
An Efficient Data-Gathering Routing Protocol for Underwater Wireless Sensor Networks.
Javaid, Nadeem; Ilyas, Naveed; Ahmad, Ashfaq; Alrajeh, Nabil; Qasim, Umar; Khan, Zahoor Ali; Liaqat, Tayyaba; Khan, Majid Iqbal
2015-11-17
Most applications of underwater wireless sensor networks (UWSNs) demand reliable data delivery over a longer period in an efficient and timely manner. However, the harsh and unpredictable underwater environment makes routing more challenging as compared to terrestrial WSNs. Most of the existing schemes deploy mobile sensors or a mobile sink (MS) to maximize data gathering. However, the relatively high deployment cost prevents their usage in most applications. Thus, this paper presents an autonomous underwater vehicle (AUV)-aided efficient data-gathering (AEDG) routing protocol for reliable data delivery in UWSNs. To prolong the network lifetime, AEDG employs an AUV for data collection from gateways and uses a shortest path tree (SPT) algorithm while associating sensor nodes with the gateways. The AEDG protocol also limits the number of associated nodes with the gateway nodes to minimize the network energy consumption and to prevent the gateways from overloading. Moreover, gateways are rotated with the passage of time to balance the energy consumption of the network. To prevent data loss, AEDG allows dynamic data collection at the AUV depending on the limited number of member nodes that are associated with each gateway. We also develop a sub-optimal elliptical trajectory of AUV by using a connected dominating set (CDS) to further facilitate network throughput maximization. The performance of the AEDG is validated via simulations, which demonstrate the effectiveness of AEDG in comparison to two existing UWSN routing protocols in terms of the selected performance metrics.
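A rough sketch of the association step described above: a multi-source shortest-path computation from the gateways, followed by enforcement of a per-gateway member limit. The graph representation, cost metric and greedy capacity handling are assumptions for illustration; AEDG's actual metric, gateway rotation and AUV trajectory are not modelled.

```python
import heapq

def associate_nodes(graph, gateways, capacity):
    """Associate sensor nodes with gateways along shortest paths, capping the
    number of members per gateway. `graph` maps node -> {neighbor: cost}."""
    # Multi-source Dijkstra: distance and nearest gateway for every node.
    dist, owner = {}, {}
    pq = [(0.0, g, g) for g in gateways]
    heapq.heapify(pq)
    while pq:
        d, node, gw = heapq.heappop(pq)
        if node in dist:
            continue
        dist[node], owner[node] = d, gw
        for nb, cost in graph[node].items():
            if nb not in dist:
                heapq.heappush(pq, (d + cost, nb, gw))
    # Enforce the per-gateway member limit in order of increasing distance.
    # (Overflow nodes would be reassigned to the next-nearest gateway in a
    # fuller implementation; that step is omitted here.)
    members = {g: [] for g in gateways}
    for node in sorted(dist, key=dist.get):
        if node in gateways:
            continue
        g = owner[node]
        if len(members[g]) < capacity:
            members[g].append(node)
    return members

graph = {
    "g1": {"a": 1.0, "b": 2.0},
    "a": {"g1": 1.0, "b": 1.0, "c": 2.0},
    "b": {"g1": 2.0, "a": 1.0, "c": 1.0},
    "c": {"a": 2.0, "b": 1.0},
}
print(associate_nodes(graph, gateways={"g1"}, capacity=2))  # -> {'g1': ['a', 'b']}
```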
An Efficient Data-Gathering Routing Protocol for Underwater Wireless Sensor Networks
Javaid, Nadeem; Ilyas, Naveed; Ahmad, Ashfaq; Alrajeh, Nabil; Qasim, Umar; Khan, Zahoor Ali; Liaqat, Tayyaba; Khan, Majid Iqbal
2015-01-01
Most applications of underwater wireless sensor networks (UWSNs) demand reliable data delivery over a longer period in an efficient and timely manner. However, the harsh and unpredictable underwater environment makes routing more challenging as compared to terrestrial WSNs. Most of the existing schemes deploy mobile sensors or a mobile sink (MS) to maximize data gathering. However, the relatively high deployment cost prevents their usage in most applications. Thus, this paper presents an autonomous underwater vehicle (AUV)-aided efficient data-gathering (AEDG) routing protocol for reliable data delivery in UWSNs. To prolong the network lifetime, AEDG employs an AUV for data collection from gateways and uses a shortest path tree (SPT) algorithm while associating sensor nodes with the gateways. The AEDG protocol also limits the number of associated nodes with the gateway nodes to minimize the network energy consumption and to prevent the gateways from overloading. Moreover, gateways are rotated with the passage of time to balance the energy consumption of the network. To prevent data loss, AEDG allows dynamic data collection at the AUV depending on the limited number of member nodes that are associated with each gateway. We also develop a sub-optimal elliptical trajectory of AUV by using a connected dominating set (CDS) to further facilitate network throughput maximization. The performance of the AEDG is validated via simulations, which demonstrate the effectiveness of AEDG in comparison to two existing UWSN routing protocols in terms of the selected performance metrics. PMID:26593924
Adaptive efficient compression of genomes
2012-01-01
Modern high-throughput sequencing technologies are able to generate DNA sequences at an ever-increasing rate. In parallel with the decreasing experimental time and cost necessary to produce DNA sequences, the computational requirements for analysis and storage of the sequences are steeply increasing. Compression is a key technology for dealing with this challenge. Recently, referential compression schemes, which store only the differences between a to-be-compressed input and a known reference sequence, have gained a lot of interest in this field. However, the memory requirements of current algorithms are high and run times are often slow. In this paper, we propose an adaptive, parallel and highly efficient referential sequence compression method which allows fine-tuning of the trade-off between required memory and compression speed. When using 12 MB of memory, our method is on par with the best previous algorithms for human genomes in terms of compression ratio (400:1) and compression speed. In contrast, it compresses a complete human genome in just 11 seconds when provided with 9 GB of main memory, which is almost three times faster than the best competitor while using less main memory. PMID:23146997
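To illustrate what "storing only the differences to a reference" means, here is a toy referential codec: the target is emitted as copy operations (reference position, length) plus literal runs, found greedily through a k-mer index of the reference. The compact on-disk encoding, indexing strategy and parallelization of the actual method are not represented.

```python
def referential_compress(reference: str, target: str, k: int = 16):
    """Encode `target` relative to `reference` as a list of operations:
    ('M', ref_pos, length) copies a stretch from the reference,
    ('L', literal) emits bases verbatim. A toy greedy scheme only."""
    index = {}
    for i in range(len(reference) - k + 1):
        index.setdefault(reference[i:i + k], i)   # first occurrence of each k-mer
    ops, i, literal = [], 0, []
    while i < len(target):
        pos = index.get(target[i:i + k])
        if pos is None:
            literal.append(target[i]); i += 1
            continue
        if literal:
            ops.append(("L", "".join(literal))); literal = []
        length = k
        while (pos + length < len(reference) and i + length < len(target)
               and reference[pos + length] == target[i + length]):
            length += 1                            # extend the match as far as possible
        ops.append(("M", pos, length)); i += length
    if literal:
        ops.append(("L", "".join(literal)))
    return ops

def referential_decompress(reference: str, ops) -> str:
    out = []
    for op in ops:
        if op[0] == "M":
            _, pos, length = op
            out.append(reference[pos:pos + length])
        else:
            out.append(op[1])
    return "".join(out)

ref = "ACGTACGTTTGACCAGGTTACCAGGATCCGTA"
tgt = ref[:10] + "T" + ref[11:]                    # one substitution vs the reference
assert referential_decompress(ref, referential_compress(ref, tgt, k=8)) == tgt
```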
Serrano-Silva, N; Calderón-Ezquerro, M C
2018-04-01
The identification of airborne bacteria has traditionally been performed by retrieval in culture media, but the bacterial diversity in the air is underestimated using this method because many bacteria are not readily cultured. Advances in DNA sequencing technology have produced a broad knowledge of genomics and metagenomics, which can greatly improve our ability to identify and study the diversity of airborne bacteria. However, researchers are facing several challenges, particularly the efficient retrieval of low-density microorganisms from the air and the lack of standardized protocols for sample collection and processing. In this study, we tested three methods for sampling bioaerosols - a Durham-type spore trap (Durham), a seven-day recording volumetric spore trap (HST), and a high-throughput 'Jet' spore and particle sampler (Jet) - and recovered metagenomic DNA for 16S rDNA sequencing. Samples were simultaneously collected with the three devices during one week, and the sequencing libraries were analyzed. A simple and efficient method for collecting bioaerosols and extracting good quality DNA for high-throughput sequencing was standardized. The Durham sampler collected preferentially Cyanobacteria, the HST Actinobacteria, Proteobacteria and Firmicutes, and the Jet mainly Proteobacteria and Firmicutes. The HST sampler collected the largest amount of airborne bacterial diversity. More experiments are necessary to select the right sampler, depending on study objectives, which may require monitoring and collecting specific airborne bacteria. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sproul, John S; Maddison, David R
2017-11-01
Despite advances that allow DNA sequencing of old museum specimens, sequencing small-bodied, historical specimens can be challenging and unreliable as many contain only small amounts of fragmented DNA. Dependable methods to sequence such specimens are especially critical if the specimens are unique. We attempt to sequence small-bodied (3-6 mm) historical specimens (including nomenclatural types) of beetles that have been housed, dried, in museums for 58-159 years, and for which few or no suitable replacement specimens exist. To better understand ideal approaches of sample preparation and produce preparation guidelines, we compared different library preparation protocols using low amounts of input DNA (1-10 ng). We also explored low-cost optimizations designed to improve library preparation efficiency and sequencing success of historical specimens with minimal DNA, such as enzymatic repair of DNA. We report successful sample preparation and sequencing for all historical specimens despite our low-input DNA approach. We provide a list of guidelines related to DNA repair, bead handling, reducing adapter dimers and library amplification. We present these guidelines to facilitate more economical use of valuable DNA and enable more consistent results in projects that aim to sequence challenging, irreplaceable historical specimens. © 2017 John Wiley & Sons Ltd.
Chung, Jongsuk; Son, Dae-Soon; Jeon, Hyo-Jeong; Kim, Kyoung-Mee; Park, Gahee; Ryu, Gyu Ha; Park, Woong-Yang; Park, Donghyun
2016-01-01
Targeted capture massively parallel sequencing is increasingly being used in clinical settings, and as costs continue to decline, use of this technology may become routine in health care. However, a limited amount of tissue has often been a challenge in meeting quality requirements. To offer a practical guideline for the minimum amount of input DNA for targeted sequencing, we optimized and evaluated the performance of targeted sequencing depending on the input DNA amount. First, using various amounts of input DNA, we compared commercially available library construction kits and selected Agilent’s SureSelect-XT and KAPA Biosystems’ Hyper Prep kits as the kits most compatible with targeted deep sequencing using Agilent’s SureSelect custom capture. Then, we optimized the adapter ligation conditions of the Hyper Prep kit to improve library construction efficiency and adapted multiplexed hybrid selection to reduce the cost of sequencing. In this study, we systematically evaluated the performance of the optimized protocol depending on the amount of input DNA, ranging from 6.25 to 200 ng, suggesting the minimal input DNA amounts based on coverage depths required for specific applications. PMID:27220682
Pollier, Jacob; González-Guzmán, Miguel; Ardiles-Diaz, Wilson; Geelen, Danny; Goossens, Alain
2011-01-01
cDNA-Amplified Fragment Length Polymorphism (cDNA-AFLP) is a commonly used technique for genome-wide expression analysis that does not require prior sequence knowledge. Typically, quantitative expression data and sequence information are obtained for a large number of differentially expressed gene tags. However, most of the gene tags do not correspond to full-length (FL) coding sequences, which is a prerequisite for subsequent functional analysis. A medium-throughput screening strategy, based on integration of polymerase chain reaction (PCR) and colony hybridization, was developed that allows in parallel screening of a cDNA library for FL clones corresponding to incomplete cDNAs. The method was applied to screen for the FL open reading frames of a selection of 163 cDNA-AFLP tags from three different medicinal plants, leading to the identification of 109 (67%) FL clones. Furthermore, the protocol allows for the use of multiple probes in a single hybridization event, thus significantly increasing the throughput when screening for rare transcripts. The presented strategy offers an efficient method for the conversion of incomplete expressed sequence tags (ESTs), such as cDNA-AFLP tags, to FL-coding sequences.
An energy-aware routing protocol for query-based applications in wireless sensor networks.
Ahvar, Ehsan; Ahvar, Shohreh; Lee, Gyu Myoung; Crespi, Noel
2014-01-01
Wireless sensor networks (WSNs) typically operate under energy consumption restrictions. Designing an energy-aware routing protocol can significantly reduce energy consumption in WSNs. Energy-aware routing protocols can be classified into two categories: energy savers and energy balancers. Energy-saving protocols are used to minimize the overall energy consumed by a WSN, while energy-balancing protocols attempt to distribute the consumption of energy efficiently throughout the network. In general terms, energy-saving protocols are not necessarily good at balancing energy consumption, and energy-balancing protocols are not always good at reducing energy consumption. In this paper, we propose an energy-aware routing protocol (ERP) for query-based applications in WSNs, which offers a good trade-off between traditional energy-balancing and energy-saving objectives and supports soft real-time packet delivery. This is achieved by means of fuzzy sets and learning automata techniques, along with zonal broadcasting to decrease total energy consumption.
An Energy-Aware Routing Protocol for Query-Based Applications in Wireless Sensor Networks
Crespi, Noel
2014-01-01
Wireless sensor networks (WSNs) typically operate under energy consumption restrictions. Designing an energy-aware routing protocol can significantly reduce energy consumption in WSNs. Energy-aware routing protocols can be classified into two categories: energy savers and energy balancers. Energy-saving protocols are used to minimize the overall energy consumed by a WSN, while energy-balancing protocols attempt to distribute the consumption of energy efficiently throughout the network. In general terms, energy-saving protocols are not necessarily good at balancing energy consumption, and energy-balancing protocols are not always good at reducing energy consumption. In this paper, we propose an energy-aware routing protocol (ERP) for query-based applications in WSNs, which offers a good trade-off between traditional energy-balancing and energy-saving objectives and supports soft real-time packet delivery. This is achieved by means of fuzzy sets and learning automata techniques, along with zonal broadcasting to decrease total energy consumption. PMID:24696640
Multicore and GPU algorithms for Nussinov RNA folding
2014-01-01
Background One segment of an RNA sequence might be paired with another segment of the same RNA sequence through hydrogen bonding. This two-dimensional structure is called the RNA sequence's secondary structure. Several algorithms have been proposed to predict an RNA sequence's secondary structure; these are referred to as RNA folding algorithms. Results We develop cache-efficient, multicore, and GPU algorithms for RNA folding using Nussinov's algorithm. Conclusions Our cache-efficient algorithm provides a speedup between 1.6 and 3.0 relative to a naive, straightforward single-core code. The multicore version of the cache-efficient single-core algorithm provides a speedup, relative to the naive single-core algorithm, between 7.5 and 14.0 on a 6-core hyperthreaded CPU. Our GPU algorithm for the NVIDIA C2050 is up to 1582 times as fast as the naive single-core algorithm and between 5.1 and 11.2 times as fast as the fastest previously known GPU algorithm for Nussinov RNA folding. PMID:25082539
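For readers unfamiliar with the underlying recurrence, the sketch below is a plain single-core Nussinov dynamic program that maximizes the number of nested base pairs subject to a minimum hairpin loop length. It is the textbook O(n³) formulation that the paper's cache-efficient, multicore and GPU variants accelerate, not the authors' optimized code.

```python
def nussinov(seq: str, min_loop: int = 3) -> int:
    """Maximum number of nested base pairs in `seq` (Nussinov recurrence).
    `min_loop` enforces a minimum hairpin loop length between paired bases."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):            # fill by increasing subsequence length
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                    # case 1: base i left unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)           # case 2: i pairs with j
            for k in range(i + min_loop + 1, j):   # case 3: i pairs with some k < j
                if (seq[i], seq[k]) in pairs:
                    best = max(best, dp[i + 1][k - 1] + 1 + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))  # -> 3 nested pairs
```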
Zhu, Yun J; Fitch, Maureen M M; Moore, Paul H
2006-01-01
Transgenic papaya plants were initially obtained using particle bombardment, a method with poor efficiency in producing intact, single-copy insertions of transgenes. Single-copy gene insertion was improved using Agrobacterium tumefaciens. With the progress being made in genome sequencing and gene discovery, more efficient transformation methods are needed to study the function of these genes. We describe a protocol for Agrobacterium-mediated transformation using carborundum-wounded papaya embryogenic calli. This method should enable high-throughput transformation: on average, 10 to 20% of the callus clusters co-cultivated with Agrobacterium produced at least one plant that was positive by polymerase chain reaction (PCR), histochemical staining, or Southern blot hybridization. Plants regenerated from the callus clusters in 9 to 13 months.
Dinçer, Alp; Yildiz, Erdem; Kohan, Saeed; Memet Özek, M
2011-01-01
The aim of the study is to evaluate the efficiency of turbo spin-echo (TSE), three-dimensional constructive interference in the steady state (3D CISS) and cine phase contrast (cine PC) sequences in determining flow through the endoscopic third ventriculostomy (ETV) fenestration, and to determine the effect of various TSE sequence parameters. The study was approved by our institutional review board, and informed consent was obtained from all patients. Two groups of patients were included: group I (24 patients with a good clinical outcome after ETV) and group II (22 patients with hydrocephalus evaluated preoperatively). The imaging protocol for both groups was identical. TSE T2 with various sequence parameters and imaging planes, and 3D CISS, followed by cine PC, were obtained. Flow void was graded on a four-point scale. The sensitivity, specificity, accuracy, and positive and negative predictive values of the sequences were calculated. Bidirectional flow through the fenestration was detected in all group I patients by cine PC. Stroke volumes through the fenestration in group I ranged from 10 to 160.8 ml/min. There was no correlation between the presence of reversed flow and flow void grading, nor between the stroke volumes and flow void grading. The sensitivity of 3D CISS was low, and 2 mm sagittal TSE T2, nearly equal to cine PC, provided the best result. Cine PC and TSE T2 both allow confident assessment of the flow through the fenestration, but the sequence parameters significantly affect the efficiency of TSE T2.
Martineau, Christine; Li, Xuejing; Lalancette, Cindy; Perreault, Thérèse; Fournier, Eric; Tremblay, Julien; Gonzales, Milagros; Yergeau, Étienne; Quach, Caroline
2018-06-13
Serratia marcescens is an environmental bacterium commonly associated with outbreaks in neonatal intensive care units (NICU). Investigation of S. marcescens outbreaks requires efficient recovery and typing of clinical and environmental isolates. In this study, we described how the use of next-generation sequencing applications, such as bacterial whole-genome sequencing (WGS) and bacterial community profiling, could improve S. marcescens outbreak investigation. Phylogenomic links and potential antibiotic resistance genes and plasmids in S. marcescens isolates were investigated using WGS, while bacterial communities and relative abundances of Serratia in environmental samples were assessed using sequencing of bacterial phylogenetic marker genes (16S rRNA and gyrB genes). Typing results obtained using WGS for the ten S. marcescens isolates recovered during a NICU outbreak investigation were highly consistent with those from pulsed-field gel electrophoresis (PFGE), the current gold standard typing method for this bacterium. WGS also allowed for the identification of genes associated with antibiotic resistance in all isolates, while no plasmid was detected. Sequencing of the 16S rRNA and gyrB genes both showed higher relative abundances of Serratia in environmental sampling sites that were in close contact with infected babies. Much lower relative abundances of Serratia were observed following disinfection of a room, indicating that the protocol used was efficient. Variations in the bacterial community composition and structure following room disinfection and between sampling sites were also identified through 16S rRNA gene sequencing. Globally, results from this study highlight the potential for next-generation sequencing tools to improve and facilitate outbreak investigation. Copyright © 2018 American Society for Microbiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fortes, Raphael; Rigolin, Gustavo, E-mail: rigolin@ifi.unicamp.br
We push the limits of the direct use of partially entangled pure states to perform quantum teleportation by presenting several protocols in many different scenarios that achieve the optimal efficiency possible. We review and put in a single formalism the three major strategies known to date that allow one to use partially entangled states for direct quantum teleportation (no distillation strategies permitted) and compare their efficiencies in real-world implementations. We show how one can improve the efficiency of many direct teleportation protocols by combining these techniques. We then develop new teleportation protocols employing multipartite partially entangled states. The three techniques are also used here in order to achieve the highest efficiency possible. Finally, we prove the upper bound for the optimal success rate for protocols based on partially entangled Bell states and show that some of the protocols developed here achieve such a bound. Highlights: •Optimal direct teleportation protocols using partially entangled states directly. •We put in a single formalism all strategies of direct teleportation. •We extend these techniques to multipartite partially entangled states. •We give upper bounds for the optimal efficiency of these protocols.
40 CFR 63.3965 - How do I determine the emission capture system efficiency?
Code of Federal Regulations, 2010 CFR
2010-07-01
...; coating solvent flash-off, curing, and drying occurs within the capture system; and the removal or... spray booth and a curing oven. (b) Measuring capture efficiency. If the capture system does not meet... surface preparation activities and drying and curing time. (c) Liquid-to-uncaptured-gas protocol using a...
Detection of KIT Genotype in Pigs by TaqMan MGB Real-Time Quantitative Polymerase Chain Reaction.
Li, Xiuxiu; Li, Xiaoning; Luo, Rongrong; Wang, Wenwen; Wang, Tao; Tang, Hui
2018-05-01
The dominant white phenotype in domestic pigs is caused by two mutations in the KIT gene: a 450 kb duplication containing the entire KIT gene together with flanking sequences and one splice mutation with a G:A substitution in intron 17. The purpose of this study was to establish a simple, rapid method to determine KIT genotype in pigs. First, to detect KIT copy number variation (CNV), primers for exon 2 of the KIT gene, along with a TaqMan minor groove binder (MGB) probe, were designed. The single-copy gene, estrogen receptor (ESR), was used as an internal control. A real-time fluorescence-based quantitative PCR (FQ-PCR) protocol was developed to accurately detect KIT CNVs. Second, to detect the splice mutation ratio of the G:A substitution in intron 17, a 175 bp region, including the target mutation, was amplified from genomic DNA. Based on the sequence of the resulting amplified fragment, an MGB probe set was designed to detect the ratio of splice mutation to normal using FQ-PCR. A series of parallel amplification curves with the same internal distances were obtained using gradually diluted DNA as templates. The CT values among dilutions were significantly different (p < 0.001) and the coefficients of variation from each dilution were low (from 0.13% to 0.26%). The amplification efficiencies for KIT and ESR were approximately equal, indicating ESR was an appropriate control gene. Furthermore, use of the MGB probe set resulted in detection of the target mutation at a high resolution and stability; standard curves illustrated that the amplification efficiencies of KIT1 (G) and KIT2 (A) were approximately equal (98.8% and 97.2%). In conclusion, a simple, rapid method, with high specificity and stability, for the detection of the KIT genotype in pigs was established using TaqMan MGB probe real-time quantitative PCR.
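The abstract above quantifies KIT copy number by real-time PCR against the single-copy ESR reference gene. One common way to turn such Ct/Cq values into a relative copy-number estimate is the comparative 2^-ΔΔCt calculation sketched below; this is a generic illustration under the usual assumption of near-100% amplification efficiency (which the abstract reports for both targets), not the authors' exact calibration procedure, and the Ct values are invented.

```python
# Comparative 2^-ddCt estimate of KIT copy number relative to a calibrator
# sample, normalized to the single-copy ESR gene. All Ct values are hypothetical.
def relative_copy_number(ct_kit, ct_esr, ct_kit_cal, ct_esr_cal):
    d_ct_sample = ct_kit - ct_esr              # KIT normalized to ESR (test sample)
    d_ct_calibrator = ct_kit_cal - ct_esr_cal  # same normalization (calibrator)
    return 2 ** -(d_ct_sample - d_ct_calibrator)

# A duplicated KIT locus should come out near 2x the single-copy calibrator.
print(round(relative_copy_number(24.1, 25.0, 25.2, 25.1), 2))  # -> 2.0
```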
Robinson, P; Hodgson, R; Grainger, A J
2015-01-01
Objective: To assess whether a single isotropic three-dimensional (3D) fast spin echo (FSE) proton density fat-saturated (PD FS) sequence reconstructed in three planes could replace the three PD (FS) sequences in our standard protocol at 1.5 T (Siemens Avanto, Erlangen, Germany). Methods: A 3D FSE PD water excitation sequence was included in the protocol for 95 consecutive patients referred for routine knee MRI. This was used to produce offline reconstructions in axial, sagittal and coronal planes. Two radiologists independently assessed each case twice, once using the standard MRI protocol and once replacing the standard PD (FS) sequences with reconstructions from the 3D data set. Following scoring, each observer reviewed the 3D data set and performed multiplanar reformats to see if this altered confidence. The menisci, ligaments and cartilage were assessed, and statistical analysis was performed using the standard sequence as the reference standard. Results: The reporting accuracy was as follows: medial meniscus (MM) = 90.9%, lateral meniscus (LM) = 93.7%, anterior cruciate ligament (ACL) = 98.9% and cartilage surfaces = 85.8%. Inter-reader agreement for the standard protocol was MM kappa = 0.91, LM = 0.89, ACL = 0.98 and cartilage = 0.84; for the 3D protocol it was MM = 0.86, LM = 0.77, ACL = 0.94 and cartilage = 0.64. Conclusion: A 3D PD FSE sequence reconstructed in three planes gives reduced accuracy and decreased concordance among readers compared with conventional sequences when evaluating the menisci and cartilage with a 1.5-T MRI scanner. Advances in knowledge: Using the existing 1.5-T MR systems, a 3D FSE sequence should not replace two-dimensional sequences. PMID:26067920
A fast sequence assembly method based on compressed data structures.
Liang, Peifeng; Zhang, Yancong; Lin, Kui; Hu, Jinglu
2014-01-01
Assembling a large genome using next-generation sequencing reads requires large computer memory and a long execution time. To reduce these requirements, a memory- and time-efficient assembler, called FMJ-Assembler, is presented, obtained by applying an FM-index within JR-Assembler, where FM stands for the FMR-index derived from the FM-index and the BWT, and J for jumping extension. The FMJ-Assembler uses an expanded FM-index and the BWT to compress read data and save memory, and its jumping-extension method makes it faster in CPU time. An extensive comparison of the FMJ-Assembler with current assemblers shows that it achieves better or comparable overall assembly quality while requiring less memory and less CPU time. All these advantages indicate that the FMJ-Assembler will be an efficient assembly method for next-generation sequencing technology.
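For orientation, the two ingredients the assembler builds on, the Burrows-Wheeler transform (BWT) and FM-index backward search, can be written down compactly. The Python sketch below constructs a BWT naively and counts exact occurrences of a read; it illustrates the underlying idea only (real assemblers build the index in compressed form over gigabase genomes) and is not FMJ-Assembler code. The toy genome and read are arbitrary.

```python
# Naive BWT construction plus FM-index backward search for exact read counting.
def bwt(text):
    text += "$"                                   # unique terminator
    rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
    return "".join(rot[-1] for rot in rotations)

def fm_index(bwt_str):
    # C[c]: number of characters in the text strictly smaller than c.
    counts = {}
    for c in bwt_str:
        counts[c] = counts.get(c, 0) + 1
    c_table, total = {}, 0
    for c in sorted(counts):
        c_table[c] = total
        total += counts[c]
    # occ[c][i]: occurrences of c in bwt_str[:i].
    occ = {c: [0] * (len(bwt_str) + 1) for c in counts}
    for i, ch in enumerate(bwt_str):
        for c in occ:
            occ[c][i + 1] = occ[c][i] + (1 if c == ch else 0)
    return c_table, occ

def count_occurrences(pattern, bwt_str, c_table, occ):
    lo, hi = 0, len(bwt_str)                      # backward search over the pattern
    for c in reversed(pattern):
        if c not in c_table:
            return 0
        lo = c_table[c] + occ[c][lo]
        hi = c_table[c] + occ[c][hi]
        if lo >= hi:
            return 0
    return hi - lo

genome = "ACGTACGTGA"
b = bwt(genome)
print(count_occurrences("ACGT", b, *fm_index(b)))  # -> 2
```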
A Review of Recommendations for Sequencing Receptive and Expressive Language Instruction
ERIC Educational Resources Information Center
Petursdottir, Anna Ingeborg; Carr, James E.
2011-01-01
We review recommendations for sequencing instruction in receptive and expressive language objectives in early and intensive behavioral intervention (EIBI) programs. Several books recommend completing receptive protocols before introducing corresponding expressive protocols. However, this recommendation has little empirical support, and some…
Structator: fast index-based search for RNA sequence-structure patterns
2011-01-01
Background The secondary structure of RNA molecules is intimately related to their function and often more conserved than the sequence. Hence, the important task of searching databases for RNAs requires matching sequence-structure patterns. Unfortunately, current tools for this task have, in the best case, a running time that is only linear in the size of sequence databases. Furthermore, established index data structures for fast sequence matching, like suffix trees or arrays, cannot benefit from the complementarity constraints introduced by the secondary structure of RNAs. Results We present a novel method and readily applicable software for time-efficient matching of RNA sequence-structure patterns in sequence databases. Our approach is based on affix arrays, a recently introduced index data structure, preprocessed from the target database. Affix arrays support bidirectional pattern search, which is required for efficiently handling the structural constraints of the pattern. Structural patterns like stem-loops can be matched inside out, such that the loop region is matched first and then the pairing bases on the boundaries are matched consecutively. This allows base pairing information to be exploited for search space reduction and leads to an expected running time that is sublinear in the size of the sequence database. The incorporation of a new chaining approach in the search of RNA sequence-structure patterns enables the description of molecules folding into complex secondary structures with multiple ordered patterns. The chaining approach removes spurious matches from the set of intermediate results, in particular of patterns with little specificity. In benchmark experiments on the Rfam database, our method runs up to two orders of magnitude faster than previous methods. Conclusions The presented method's sublinear expected running time makes it well suited for RNA sequence-structure pattern matching in large sequence databases. RNA molecules containing several stem-loop substructures can be described by multiple sequence-structure patterns, and their matches are efficiently handled by a novel chaining method. Beyond our algorithmic contributions, we provide with Structator a complete and robust open-source software solution for index-based search of RNA sequence-structure patterns. The Structator software is available at http://www.zbh.uni-hamburg.de/Structator. PMID:21619640
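The inside-out matching strategy described above (match the loop first, then verify the pairing stem positions consecutively) can be illustrated with a naive scan that ignores affix arrays entirely. The sketch below only demonstrates why base-pairing constraints prune candidate sites; the loop motif, stem length, and test sequence are arbitrary choices, and this is not Structator code.

```python
# Find hairpins with a given loop motif and a fully complementary stem,
# loop-first and then extending the stem outwards (inside-out matching).
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def find_stem_loops(seq, loop="GAAA", stem_len=4):
    hits = []
    for i in range(len(seq) - len(loop) + 1):
        if seq[i:i + len(loop)] != loop:          # match the loop region first
            continue
        left, right = i - 1, i + len(loop)
        ok = True
        for _ in range(stem_len):                 # then check pairing bases consecutively
            if left < 0 or right >= len(seq) or (seq[left], seq[right]) not in PAIRS:
                ok = False
                break
            left, right = left - 1, right + 1
        if ok:
            hits.append(i - stem_len)             # start position of the hairpin
    return hits

print(find_stem_loops("UUAGCGCGAAAGCGCUAA"))      # -> [3]
```

Each failed pairing check discards a candidate immediately, which is the pruning effect that makes structure-aware search faster than matching the primary sequence alone.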
US stock market efficiency over weekly, monthly, quarterly and yearly time scales
NASA Astrophysics Data System (ADS)
Rodriguez, E.; Aguilar-Cornejo, M.; Femat, R.; Alvarez-Ramirez, J.
2014-11-01
In financial markets, the weak form of the efficient market hypothesis implies that price returns are serially uncorrelated sequences. In other words, prices should follow a random walk behavior. Recent developments in evolutionary economic theory (Lo, 2004) have tailored the concept of the adaptive market hypothesis (AMH) by proposing that market efficiency is not an all-or-none concept, but rather a characteristic that varies continuously over time and across markets. Within the AMH framework, this work considers the Dow Jones Industrial Average (DJIA) for studying the deviations from random walk behavior over time. It is found that market efficiency also varies over different time scales, from weeks to years. The well-known detrended fluctuation analysis was used for the characterization of the serial correlations of the return sequences. The empirical results showed that interday and intraday returns are more serially correlated than overnight returns. Also, some insights into the presence of business cycles (e.g., Juglar and Kuznets) are provided in terms of time variations of the scaling exponent.
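The scaling-exponent analysis mentioned above can be reproduced in outline with a few lines of standard detrended fluctuation analysis: build the cumulative profile, detrend it linearly in windows of increasing size, and take the log-log slope of the fluctuation function. The sketch below uses white noise and arbitrary window sizes purely for illustration; it is not the paper's DJIA data or exact settings.

```python
# Minimal first-order detrended fluctuation analysis (DFA) sketch.
import numpy as np

def dfa_exponent(returns, scales):
    profile = np.cumsum(returns - np.mean(returns))   # integrated profile
    fluctuations = []
    for s in scales:
        n_windows = len(profile) // s
        rms = []
        for w in range(n_windows):
            seg = profile[w * s:(w + 1) * s]
            x = np.arange(s)
            trend = np.polyval(np.polyfit(x, seg, 1), x)   # local linear trend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    # Slope of log F(s) vs log s is the scaling exponent.
    slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return slope

rng = np.random.default_rng(0)
returns = rng.normal(size=4000)                       # white-noise "returns"
print(dfa_exponent(returns, scales=[16, 32, 64, 128, 256]))  # close to 0.5
```

An exponent near 0.5 indicates uncorrelated returns (consistent with weak-form efficiency), while persistent deviations from 0.5 correspond to the time-varying correlations the study tracks.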
High-throughput full-length single-cell mRNA-seq of rare cells.
Ooi, Chin Chun; Mantalas, Gary L; Koh, Winston; Neff, Norma F; Fuchigami, Teruaki; Wong, Dawson J; Wilson, Robert J; Park, Seung-Min; Gambhir, Sanjiv S; Quake, Stephen R; Wang, Shan X
2017-01-01
Single-cell characterization techniques, such as mRNA-seq, have been applied to a diverse range of applications in cancer biology, yielding great insight into mechanisms leading to therapy resistance and tumor clonality. While single-cell techniques can yield a wealth of information, a common bottleneck is the lack of throughput, with many current processing methods being limited to the analysis of small volumes of single cell suspensions with cell densities on the order of 10^7 per mL. In this work, we present a high-throughput full-length mRNA-seq protocol incorporating a magnetic sifter and magnetic nanoparticle-antibody conjugates for rare cell enrichment, and Smart-seq2 chemistry for sequencing. We evaluate the efficiency and quality of this protocol with a simulated circulating tumor cell system, whereby non-small-cell lung cancer cell lines (NCI-H1650 and NCI-H1975) are spiked into whole blood, before being enriched for single-cell mRNA-seq by EpCAM-functionalized magnetic nanoparticles and the magnetic sifter. We obtain high-efficiency (> 90%) capture and release of these simulated rare cells via the magnetic sifter, with reproducible transcriptome data. In addition, while mRNA-seq data are typically used only for gene expression analysis, we demonstrate the use of full-length mRNA-seq chemistries like Smart-seq2 to facilitate variant analysis of expressed genes. This enables the use of mRNA-seq data for differentiating cells in a heterogeneous population by both their phenotypic and variant profile. In a simulated heterogeneous mixture of circulating tumor cells in whole blood, we utilize this high-throughput protocol to differentiate these heterogeneous cells by both their phenotype (lung cancer versus white blood cells) and mutational profile (H1650 versus H1975 cells) in a single sequencing run. This high-throughput method can help facilitate single-cell analysis of rare cell populations, such as circulating tumor or endothelial cells, with demonstrably high-quality transcriptomic data.
Mellors, L J; Gibbs, C L; Barclay, C J
2001-05-01
The results of previous studies suggest that the maximum mechanical efficiency of rat papillary muscles is lower during a contraction protocol involving sinusoidal length changes than during one involving afterloaded isotonic contractions. The aim of this study was to compare directly the efficiency of isolated rat papillary muscle preparations in isotonic and sinusoidal contraction protocols. Experiments were performed in vitro (27 degrees C) using left ventricular papillary muscles from adult rats. Each preparation performed three contraction protocols: (i) low-frequency afterloaded isotonic contractions (10 twitches at 0.2 Hz), (ii) sinusoidal length change contractions with phasic stimulation (40 twitches at 2 Hz) and (iii) high-frequency afterloaded isotonic contractions (40 twitches at 2 Hz). The first two protocols resembled those used in previous studies and the third combined the characteristics of the first two. The parameters for each protocol were adjusted to those that gave maximum efficiency. For the afterloaded isotonic protocols, the afterload was set to 0.3 of the maximum developed force. The sinusoidal length change protocol incorporated a cycle amplitude of +/-5% resting length and a stimulus phase of -10 degrees. Measurements of force output, muscle length change and muscle temperature change were used to calculate the work and heat produced during and after each protocol. Net mechanical efficiency was defined as the proportion of the energy (enthalpy) liberated by the muscle that appeared as work. The efficiency in the low-frequency, isotonic contraction protocol was 21.1+/-1.4% (mean +/- s.e.m., N=6) and that in the sinusoidal protocol was 13.2+/-0.7%, consistent with previous results. This difference was not due to the higher frequency or greater number of twitches because efficiency in the high-frequency, isotonic protocol was 21.5+/-1.0%. Although these results apparently confirm that efficiency is protocol-dependent, additional experiments designed to measure work output unambiguously indicated that the method used to calculate work output in isotonic contractions overestimated actual work output. When net work output, which excludes work done by parallel elastic elements, rather than total work output was used to determine efficiency in afterloaded isotonic contractions, efficiency was similar to that for sinusoidal contractions. The maximum net mechanical efficiency of rat papillary muscles performing afterloaded isotonic or sinusoidal length change contractions was between 10 and 15%.
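The efficiency definition used above is simple enough to state as arithmetic: efficiency is the fraction of the liberated enthalpy (work plus heat) that appears as work, and the reported discrepancy shrinks once the work attributable to parallel elastic elements is excluded. The toy numbers below are invented solely to show the effect of that correction; they are not the paper's measurements.

```python
def net_efficiency(work, heat):
    # Efficiency = work / enthalpy, with enthalpy = work + heat liberated.
    return work / (work + heat)

total_work, heat = 2.1, 8.0      # arbitrary illustrative values (e.g., mJ)
parallel_elastic_work = 0.9      # hypothetical share done by passive elements

print(round(net_efficiency(total_work, heat), 3))                          # ~0.21
print(round(net_efficiency(total_work - parallel_elastic_work, heat), 3))  # ~0.13
```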
NASA Astrophysics Data System (ADS)
Yang, YuGuang; Liu, ZhiChao; Chen, XiuBo; Zhou, YiHua; Shi, WeiMin
2017-12-01
Quantum channel noise may cause the user to obtain a wrong answer and thus misunderstand the database holder for existing QKD-based quantum private query (QPQ) protocols. In addition, an outside attacker may conceal his attack by exploiting the channel noise. We propose a new, robust QPQ protocol based on four-qubit decoherence-free (DF) states. In contrast to existing QPQ protocols against channel noise, only an alternative fixed sequence of single-qubit measurements is needed by the user (Alice) to measure the received DF states. This property makes it easy to implement the proposed protocol by exploiting current technologies. Moreover, to retain the advantage of flexible database queries, we reconstruct Alice's measurement operators so that Alice needs only conditioned sequences of single-qubit measurements.
Rona, Z; Klebermass, K; Cardona, F; Czaba, C D; Brugger, P C; Weninger, M; Pollak, A; Prayer, D
2010-09-01
To assess the utility of an MRI-compatible incubator (INC) by comparing neonatal MRI examinations performed with and without it. In a retrospective study, the clinical and radiological aspects of 129 neonatal MRI examinations during a 3 year period were analyzed. Routine protocols, including fast spin-echo T2-weighted (w) sequences, axial T1w, gradient-echo and diffusion sequences, and 3D T1 gradient-echo sequences, were performed; angiography and spectroscopy were added in some cases. Diffusion-tensor imaging was done in 50% of the babies examined in the INC and in 26% of those examined without the INC. Sequences adapted from fetal MR protocols were used in infants younger than 32 gestational weeks. The benefit of the MR information with respect to further management was evaluated. The number of examinations increased (from 30 to 99), while the mean age (from 43 to 38.8 weeks of gestational age) and weight (from 3308 to 2766 g) decreased significantly with the use of the MR-compatible incubator. The mean imaging time decreased (from 34.43 to 30.29 min), with a mean of one additionally performed sequence in the INC group. All infants received sedatives according to our anaesthetic protocol preceding imaging, but a repeated dose was never necessary with the INC (compared with 10% of cases without the INC). Regarding all cases, MR-based changes in clinical management were initiated in 58%, while in 57% of cases the initial ultrasound diagnosis was changed or further specified. The use of the INC enables MR access for unstable infants with suspected CNS problems, whose management is improved by MR information in a significantly higher percentage of cases than without the INC. Copyright (c) 2010 European Paediatric Neurology Society. Published by Elsevier Ltd. All rights reserved.
Heuvelink, Annet; Hassan, Abdulwahed Ahmed; van Weering, Hilmar; van Engelen, Erik; Bülte, Michael; Akineden, Ömer
2017-05-01
Mycobacterium avium subsp. paratuberculosis (MAP) is a robust microorganism which causes Johne's disease (JD), an incurable chronic enteritis of cattle. A target of control programmes for JD is to accurately detect MAP-infected cattle early to reduce disease transmission. The present study evaluated the efficacy of two different cultural procedures and a TaqMan real-time PCR assay for detection of subclinical paratuberculosis in dairy herds. Sixty-one faecal samples were collected from two Dutch dairy herds (n = 40 and n = 21, respectively) which were known to be MAP-ELISA positive. All individual samples were assessed using two different cultural protocols in two different laboratories. The first cultural protocol (first laboratory) included a decontamination step with 0.75% hexadecylpyridinium chloride (HPC) followed by inoculation on Herrold's egg yolk media (HEYM). The second protocol (second laboratory) comprised a decontamination step using 4% NaOH and malachite green-oxalic acid followed by inoculation on two media, HEYM and, in parallel, modified Löwenstein-Jensen media (mLJ). For the TaqMan real-time PCR assay, all faecal samples were tested in two different laboratories using TaqMan® MAP (Johne's) reagents (Life Technologies). The cultural procedures revealed positive reactions in 1.64% of the samples for cultivation protocol 1 and in 6.56 and 8.20% of the samples for cultivation protocol 2, respectively. The TaqMan real-time PCR performed in the two laboratories yielded 13.11 and 19.76% positive reactions, respectively. The kappa test showed a proportional agreement of 0.54 between the mLJ media (second laboratory) and the TaqMan® real-time PCR method (second laboratory). In conclusion, the TaqMan real-time PCR could be a highly useful and efficient assay for the detection of subclinical paratuberculosis in dairy cattle, leading to an improvement in the efficiency of MAP control strategies.
Silva, A T; Paiva, L V; Andrade, A C; Barduche, D
2013-05-21
Brazil possesses the most modern and productive coffee growing farms in the world, but technological development is desired to cope with the increasing world demand. One way to increase Brazilian coffee growing productivity is wide scale production of clones with superior genotypes, which can be obtained with in vitro propagation technique, or from tissue culture. These procedures can generate thousands of clones. However, the methodologies for in vitro cultivation are genotype-dependent, which leads to an almost empirical development of specific protocols for each species. Therefore, molecular markers linked to the biochemical events of somatic embryogenesis would greatly facilitate the development of such protocols. In this context, sequences potentially involved in embryogenesis processes in the coffee plant were identified in silico from libraries generated by the Brazilian Coffee Genome Project. Through these in silico analyses, we identified 15 EST-contigs related to the embryogenesis process. Among these, 5 EST-contigs (3605, 9850, 13686, 17240, and 17265) could readily be associated with plant embryogenesis. Sequence analysis of EST-contig 3605, 9850, and 17265 revealed similarity to a polygalacturonase, to a cysteine-proteinase, and to an allergenine, respectively. Results also show that EST-contig 17265 sequences presented similarity to an expansin. Finally, analysis of EST-contig 17240 revealed similarity to a protein of unknown function, but it grouped in the similarity dendrogram with the WUSCHEL transcription factor. The data suggest that these EST-contigs are related to the embryogenic process and have potential as molecular markers to increase methodological efficiency in obtaining coffee plant embryogenic materials.
A dyadic protocol for training complex skills: a replication using female participants.
Sanchez-Ku, M L; Arthur, W
2000-01-01
The effectiveness and efficiency of the active interlocked modeling (AIM) dyadic protocol in training complex skills have been extensively demonstrated. However, past evaluation studies have all used male participants exclusively. Consequently, the present study investigated the generalizability of the effectiveness and efficiency gains to women. We randomly assigned 108 female participants to either the AIM-dyad condition or a standard individual control training condition. The results supported the robustness and viability of the AIM protocol. Although their overall performance was lower than that obtained for men in previous studies, women trained in the AIM-dyad condition performed as well as those trained in the individual condition. Thus, the efficiency gains associated with the AIM-dyad protocol, which result from the ability to train two people simultaneously to reach the same performance level as a single person with no increase in training time or machine cost, are generalizable to female participants. The applied and basic research implications of the present study are discussed within the context of well-documented male/female differences in the performance of complex psychomotor tasks. For instance, given the number of women entering the workforce and the significant proportion of women in professions previously deemed to be male-dominated (e.g., air navigation), it is reassuring to know that sex differences in task performance do not necessarily imply sex differences in the effectiveness of training protocols.
The role of MRI in axillary lymph node imaging in breast cancer patients: a systematic review.
Kuijs, V J L; Moossdorff, M; Schipper, R J; Beets-Tan, R G H; Heuts, E M; Keymeulen, K B M I; Smidt, M L; Lobbes, M B I
2015-04-01
To assess whether MRI can exclude axillary lymph node metastasis, potentially replacing sentinel lymph node biopsy (SLNB), and consequently eliminating the risk of SLNB-associated morbidity. PubMed, Cochrane, Medline and Embase databases were searched for relevant publications up to July 2014. Studies were selected based on predefined inclusion and exclusion criteria and independently assessed by two reviewers using a standardised extraction form. Sixteen eligible studies were selected from 1,372 publications identified by the search. A dedicated axillary protocol [sensitivity 84.7 %, negative predictive value (NPV) 95.0 %] was superior to a standard protocol covering both the breast and axilla simultaneously (sensitivity 82.0 %, NPV 82.6 %). Dynamic, contrast-enhanced MRI had a lower median sensitivity (60.0 %) and NPV (80.0 %) compared to non-enhanced T1w/T2w sequences (88.4, 94.7 %), diffusion-weighted imaging (84.2, 90.6 %) and ultrasmall superparamagnetic iron oxide (USPIO)-enhanced T2*w sequences (83.0, 95.9 %). The most promising results seem to be achievable when using non-enhanced T1w/T2w and USPIO-enhanced T2*w sequences in combination with a dedicated axillary protocol (sensitivity 84.7 % and NPV 95.0 %). The diagnostic performance of some MRI protocols for excluding axillary lymph node metastases approaches the NPV needed to replace SLNB. However, current observations are based on studies with heterogeneous study designs and limited populations. • Some axillary MRI protocols approach the NPV of an SLNB procedure. • Dedicated axillary MRI is more accurate than protocols also covering the breast. • T1w/T2w protocols combined with USPIO-enhanced sequences are the most promising sequences.
Improved Protocols for Illumina Sequencing
Bronner, Iraad F.; Quail, Michael A.; Turner, Daniel J.; Swerdlow, Harold
2013-01-01
In this unit, we describe a set of improvements we have made to the standard Illumina protocols to make the sequencing process more reliable in a high-throughput environment, reduce amplification bias, narrow the distribution of insert sizes, and reliably obtain high yields of data. PMID:19582764
Using machine learning for sequence-level automated MRI protocol selection in neuroradiology.
Brown, Andrew D; Marotta, Thomas R
2018-05-01
Incorrect imaging protocol selection can lead to important clinical findings being missed, contributing to both wasted health care resources and patient harm. We present a machine learning method for analyzing the unstructured text of clinical indications and patient demographics from magnetic resonance imaging (MRI) orders to automatically protocol MRI procedures at the sequence level. We compared 3 machine learning models - support vector machine, gradient boosting machine, and random forest - to a baseline model that predicted the most common protocol for all observations in our test set. The gradient boosting machine model significantly outperformed the baseline and demonstrated the best performance of the 3 models in terms of accuracy (95%), precision (86%), recall (80%), and Hamming loss (0.0487). This demonstrates the feasibility of automating sequence selection by applying machine learning to MRI orders. Automated sequence selection has important safety, quality, and financial implications and may facilitate improvements in the quality and safety of medical imaging service delivery.
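As a rough illustration of the approach described above (free-text order information in, protocol label out), the Python sketch below vectorizes indication text with TF-IDF and fits a gradient boosting classifier with scikit-learn. The tiny in-line dataset, label names, and feature choices are invented for illustration; the authors' actual features, labels, and tuning are not reproduced here.

```python
# Toy text-to-protocol classifier: TF-IDF features + gradient boosting.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import make_pipeline

orders = [
    "headache, rule out mass lesion",
    "seizure, first episode",
    "pituitary adenoma follow up",
    "suspected acute stroke",
]
protocols = ["brain_routine", "brain_epilepsy", "pituitary", "stroke"]  # hypothetical labels

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),          # unstructured indication text -> features
    GradientBoostingClassifier(random_state=0),   # one label per order in this toy setup
)
model.fit(orders, protocols)
print(model.predict(["new onset seizure in adult"]))
```

The Hamming loss reported in the abstract suggests the real task is treated as multi-label (one order may map to several sequences), which this single-label toy setup does not capture.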
Li, Dongmei; Fan, Qing-Hai; Waite, David W.; Gunawardana, Disna; George, Sherly; Kumarasinghe, Lalith
2015-01-01
Spider mites of the genus Tetranychus are difficult to identify due to their limited diagnostic characters. Many of them are morphologically similar and males are needed for species-level identification. Tetranychus urticae is a common interception and non-regulated pest at New Zealand's borders; however, most of the intercepted specimens are females and identification was left at Tetranychus sp. Consequently, the shipments need to be fumigated. DNA sequencing and PCR-restriction fragment length polymorphism (PCR-RFLP) protocols could be used to facilitate accurate identification. However, in the context of border security practiced in New Zealand, insect identifications are required to be provided within four hours of receiving the samples; thus, those molecular methods are not sufficient to meet this requirement. Therefore, a real-time PCR TaqMan assay was developed for identification of T. urticae by amplification of a 142 bp Internal Transcribed Spacer (ITS) 1 sequence. The developed assay is rapid, detects all life stages of T. urticae within three hours, and does not react with closely related species. Plasmid DNA containing the ITS1 sequence of T. urticae was serially diluted and used as standards in the real-time PCR assay. The quantification cycle (Cq) values of the assay showed a strong linear relationship with T. urticae DNA content, with a regression coefficient of 0.99 and an efficiency of 98%. The detection limit was estimated to be ten copies of the T. urticae target region. The assay was validated against a range of T. urticae specimens from various countries and hosts in a blind panel test. Therefore, application of the assay in New Zealand will reduce unnecessary fumigation and benefit both importers and exporters. It is expected that the implementation of this real-time PCR assay would have wide applications in diagnostic and research agencies worldwide. PMID:26147599
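The 98% efficiency and 0.99 regression coefficient quoted above come from a standard-curve calculation that is easy to make explicit: regress Cq on log10 of the serially diluted copy number and convert the slope via efficiency = 10^(-1/slope) - 1. The Cq values in the sketch below are invented for illustration and are not the paper's data.

```python
# Amplification efficiency from a serial-dilution standard curve.
import numpy as np

copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])     # 10-fold plasmid dilutions
cq = np.array([17.0, 20.4, 23.8, 27.2, 30.6, 34.0])   # hypothetical Cq values

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
r_squared = np.corrcoef(np.log10(copies), cq)[0, 1] ** 2
efficiency = 10 ** (-1.0 / slope) - 1                 # 1.0 means 100% efficient

print(f"slope={slope:.2f}, R^2={r_squared:.2f}, efficiency={efficiency:.1%}")
# A slope near -3.32 corresponds to ~100% efficiency; the abstract reports 98%
# with a regression coefficient of 0.99.
```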
Ong, Emily; Ho, Christopher; Miles, Peter
2011-03-01
To compare the efficiency of orthodontic archwire sequences produced by three manufacturers. Prospective, randomized clinical trial with three parallel groups. Private orthodontic practice in Caloundra, QLD, Australia. One hundred and thirty-two consecutive patients were randomized to one of three archwire sequence groups: (i) 3M Unitek, 0·014 inch Nitinol, 0·017 inch × 0·017 inch heat activated Ni-Ti; (ii) GAC international, 0·014 inch Sentalloy, 0·016 × 0·022 inch Bioforce; and (iii) Ormco corporation, 0·014 inch Damon Copper Ni-Ti, 0·014 × 0·025 inch Damon Copper Ni-Ti. All patients received 0·018 × 0·025 inch slot Victory Series™ brackets. Mandibular impressions were taken before the insertion of each archwire. Patients completed discomfort surveys according to a seven-point Likert Scale at 4 h, 24 h, 3 days and 7 days after the insertion of each archwire. Efficiency was measured by time required to reach the working archwire, mandibular anterior alignment and level of discomfort. No significant differences were found in the reduction of irregularity between the archwire sequences at any time-point (T1: P = 0·12; T2: P = 0·06; T3: P = 0·21) or in the time to reach the working archwire (P = 0·28). No significant differences were found in the overall discomfort scores between the archwire sequences (4 h: P = 0·30; 24 h: P = 0·18; 3 days: P = 0·53; 7 days: P = 0·47). When the time-points were analysed individually, the 3M Unitek archwire sequence induced significantly less discomfort than GAC and Ormco archwires 24 h after the insertion of the third archwire (P = 0·02). This could possibly be attributed to the progression in archwire material and archform. The archwire sequences were similar in alignment efficiency and overall discomfort. Progression in archwire dimension and archform may contribute to discomfort levels. This study provides clinical justification for three common archwire sequences in 0·018 × 0·025 inch slot brackets.
PCR Amplification Strategies towards full-length HIV-1 Genome sequencing.
Liu, Chao Chun; Ji, Hezhao
2018-06-26
The advent of next generation sequencing has enabled greater resolution of viral diversity and improved feasibility of full viral genome sequencing allowing routine HIV-1 full genome sequencing in both research and diagnostic settings. Regardless of the sequencing platform selected, successful PCR amplification of the HIV-1 genome is essential for sequencing template preparation. As such, full HIV-1 genome amplification is a crucial step in dictating the successful and reliable sequencing downstream. Here we reviewed existing PCR protocols leading to HIV-1 full genome sequencing. In addition to the discussion on basic considerations on relevant PCR design, the advantages as well as the pitfalls of published protocols were reviewed. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Genome sequencing in microfabricated high-density picolitre reactors.
Margulies, Marcel; Egholm, Michael; Altman, William E; Attiya, Said; Bader, Joel S; Bemben, Lisa A; Berka, Jan; Braverman, Michael S; Chen, Yi-Ju; Chen, Zhoutao; Dewell, Scott B; Du, Lei; Fierro, Joseph M; Gomes, Xavier V; Godwin, Brian C; He, Wen; Helgesen, Scott; Ho, Chun Heen; Ho, Chun He; Irzyk, Gerard P; Jando, Szilveszter C; Alenquer, Maria L I; Jarvie, Thomas P; Jirage, Kshama B; Kim, Jong-Bum; Knight, James R; Lanza, Janna R; Leamon, John H; Lefkowitz, Steven M; Lei, Ming; Li, Jing; Lohman, Kenton L; Lu, Hong; Makhijani, Vinod B; McDade, Keith E; McKenna, Michael P; Myers, Eugene W; Nickerson, Elizabeth; Nobile, John R; Plant, Ramona; Puc, Bernard P; Ronan, Michael T; Roth, George T; Sarkis, Gary J; Simons, Jan Fredrik; Simpson, John W; Srinivasan, Maithreyan; Tartaro, Karrie R; Tomasz, Alexander; Vogt, Kari A; Volkmer, Greg A; Wang, Shally H; Wang, Yong; Weiner, Michael P; Yu, Pengguang; Begley, Richard F; Rothberg, Jonathan M
2005-09-15
The proliferation of large-scale DNA-sequencing projects in recent years has driven a search for alternative methods to reduce time and cost. Here we describe a scalable, highly parallel sequencing system with raw throughput significantly greater than that of state-of-the-art capillary electrophoresis instruments. The apparatus uses a novel fibre-optic slide of individual wells and is able to sequence 25 million bases, at 99% or better accuracy, in one four-hour run. To achieve an approximately 100-fold increase in throughput over current Sanger sequencing technology, we have developed an emulsion method for DNA amplification and an instrument for sequencing by synthesis using a pyrosequencing protocol optimized for solid support and picolitre-scale volumes. Here we show the utility, throughput, accuracy and robustness of this system by shotgun sequencing and de novo assembly of the Mycoplasma genitalium genome with 96% coverage at 99.96% accuracy in one run of the machine.
Protocols development for security and privacy of radio frequency identification systems
NASA Astrophysics Data System (ADS)
Sabbagha, Fatin
There are benefits to adopting radio frequency identification (RFID) technology, although there are methods of attack that can compromise the system. This research determined how that may happen and what possible solutions can keep it from happening. Protocols were developed to implement better security, and new topologies were developed to handle the problems of key management. Previously proposed protocols focused on providing mutual authentication and privacy between readers and tags; however, those protocols are still vulnerable to attack. These protocols were analyzed and the disadvantages of each one shown. Previous works assumed that the channels between readers and servers were secure. In the proposed protocols, a compromised reader is considered, along with how to prevent tags from being read by that reader. The new protocols provide mutual authentication between readers and tags and, at the same time, remove the compromised reader from the system. Three protocols are proposed. In the first protocol, mutual authentication is achieved and a compromised reader is not allowed in the network. In the second protocol, the number of times a reader contacts the server is reduced. The third protocol provides authentication and privacy between tags and readers using a trusted third party. The developed topology is implemented in Python and simulated to evaluate its efficiency in terms of processing time. The three protocols are implemented in C and compiled for the MSP430 microcontroller using IAR Embedded Workbench, an integrated development environment with a C/C++ compiler used to generate faster code and to debug the microcontroller. In summary, the goal of this research is to find solutions to the problems of previously proposed protocols, handle a compromised reader, and solve key management problems.
Mann, David L; Abernethy, Bruce; Farrow, Damian; Davis, Mark; Spratford, Wayne
2010-05-01
This article describes a new automated method for the controlled occlusion of vision during natural tasks. The method permits the time course of the presence or absence of visual information to be linked to identifiable events within the task of interest. An example application is presented in which the method is used to examine the ability of cricket batsmen to pick up useful information from the prerelease movement patterns of the opposing bowler. Two key events, separated by a consistent within-action time lag, were identified in the cricket bowling action sequence: the penultimate foot strike prior to ball release (Event 1) and the subsequent moment of ball release (Event 2). Force-plate registration of Event 1 was then used as a trigger to facilitate automated occlusion of vision using liquid crystal occlusion goggles at time points relative to Event 2. Validation demonstrated that, compared with existing approaches based on manual triggering, this method of occlusion permitted considerable gains in temporal precision and a reduction in the number of unusable trials. The result is a more efficient and accurate protocol for examining anticipation that preserves the important natural coupling between perception and action.
Optimization and evaluation of single-cell whole-genome multiple displacement amplification.
Spits, C; Le Caignec, C; De Rycke, M; Van Haute, L; Van Steirteghem, A; Liebaers, I; Sermon, K
2006-05-01
The scarcity of genomic DNA can be a limiting factor in some fields of genetic research. One of the methods developed to overcome this difficulty is whole genome amplification (WGA). Recently, multiple displacement amplification (MDA) has proved very efficient in the WGA of small DNA samples and pools of cells, the reaction being catalyzed by the phi29 or the Bst DNA polymerases. The aim of the present study was to develop a reliable, efficient, and fast protocol for MDA at the single-cell level. We first compared the efficiency of phi29 and Bst polymerases on DNA samples and single cells. The phi29 polymerase accurately generated, in a short time and from a single cell, sufficient DNA for a large set of tests, whereas the Bst enzyme showed a low efficiency and a high error rate. A single-cell protocol was optimized using the phi29 polymerase and was evaluated on 60 single cells; the obtained DNA was assessed by 22 locus-specific PCRs. This new protocol can be useful for many applications involving minute quantities of starting material, such as forensic DNA analysis, prenatal and preimplantation genetic diagnosis, or cancer research. (c) 2006 Wiley-Liss, Inc.
First Time Authentication for Airborne Networks (FAAN)
2010-01-01
Zhelyabovskaya, Olga B.; Berlin, Yuri A.; Birikh, Klara R.
2004-01-01
In bacterial expression systems, translation initiation is usually the rate-limiting and the least predictable stage of protein synthesis. The efficiency of a translation initiation site can vary dramatically depending on the sequence context, which is why many standard expression vectors provide very poor expression levels for some genes. This notion persuaded us to develop an artificial genetic selection protocol, which allows one to find, for a given target gene, an individual, efficient ribosome binding site from a random pool. In order to create the Darwinian pressure necessary for the genetic selection, we designed a system based on translational coupling, in which microorganism survival in the presence of antibiotic depends on expression of the target gene, while putting no special requirements on this gene. Using this system we obtained superproducing constructs for the human protein RACK1 (receptor for activated C kinase). PMID:15034151
NASA Astrophysics Data System (ADS)
Blank, M.; Laine, A. O.; Jürss, K.; Bastrop, R.
2008-06-01
Studies of Marenzelleria species were often hampered by identification uncertainties when using morphological characters only. A newly developed PCR/RFLP protocol allows a more efficient discrimination of the three species Marenzelleria viridis, Marenzelleria neglecta and Marenzelleria arctia currently known for the Baltic Sea. The protocol is based on PCR amplification of two mitochondrial DNA gene segments (16S, COI) followed by digestion with restriction enzymes. As it is faster and cheaper than PCR/sequencing protocols used so far, the protocol is recommended for large-scale analyses. The markers allow an undoubted determination of species irrespective of life stage or condition of the worms in the samples. The protocol was validated on about 950 specimens sampled at more than 30 sites of the Baltic and the North Sea, and on specimens from populations of the North American east coast. Besides this test we used mitochondrial DNA sequences (16S, COI, Cytb) and starch gel electrophoresis to further investigate the distribution of the three Marenzelleria species in the Baltic Sea. The results show that M. viridis (formerly genetic type I or M. cf. wireni) occurred in the Öresund area, in the south western as well as in the eastern Baltic Sea, where it is found sympatric with M. neglecta. Allozyme electrophoresis indicated an introduction by range expansion from the North Sea. The second species, M. arctia, was only found in the northern Baltic Sea, where it sometimes occurred sympatric with M. neglecta or M. viridis. For Baltic M. arctia, the most probable way of introduction is by ship ballast water from the European Arctic. There is an urgent need for a new genetic analysis of all Marenzelleria populations of the Baltic Sea to unravel the current distribution of the three species.
Hu, Simin; Guo, Zhiling; Li, Tao; Carpenter, Edward J; Liu, Sheng; Lin, Senjie
2014-01-01
Knowledge of in situ copepod diet diversity is crucial for accurately describing pelagic food web structure but is challenging to achieve due to the lack of an easily applicable methodology. To enable analysis with whole copepod-derived DNAs, we developed a copepod-excluding 18S rDNA-based PCR protocol. Although it is effective in depressing amplification of copepod 18S rDNA, its applicability for detecting diverse eukaryotes in both mono- and mixed-species samples has not been demonstrated. Moreover, the protocol suffers from the problem that sequences from symbiotic ciliates are overrepresented in the retrieved 18S rDNA libraries. In this study, we designed a blocking primer to make a combined primer set (copepod/symbiotic ciliate-excluding eukaryote-common: CEEC) to depress PCR amplification of symbiotic ciliate sequences while maximizing the range of eukaryotes amplified. We first examined the specificity and efficacy of CEEC by PCR-amplifying DNAs from 16 copepod species, 37 representative organisms that are potential prey of copepods and a natural microplankton sample, and then evaluated the efficiency in reconstructing diet composition by detecting the food of both lab-reared and field-collected copepods. Our results showed that the CEEC primer set can successfully amplify 18S rDNA from a wide range of isolated species and mixed-species samples while depressing amplification of DNA from copepods and the targeted symbiotic ciliates, indicating the universality of CEEC in specifically detecting prey of copepods. All of the predetermined food items offered to copepods in the laboratory were successfully retrieved, suggesting that the CEEC-based protocol can accurately reconstruct the diets of copepods without interference from the copepods and their associated ciliates present in the DNA samples. Our initial application to analyzing the food composition of field-collected copepods uncovered diverse prey species, including species currently known as copepod prey and others previously unsuspected. While testing is required, this protocol provides a useful strategy for depicting the in situ dietary composition of copepods.
Henninger, B; Raithel, E; Kranewitter, C; Steurer, M; Jaschke, W; Kremser, C
2018-05-01
To prospectively evaluate a prototypical 3D turbo-spin-echo proton-density-weighted sequence with compressed sensing and a free-stop scan mode for preventing motion artefacts (3D-PD-CS-SPACE free-stop) for knee imaging in a clinical setting. 80 patients underwent 3T magnetic resonance imaging (MRI) of the knee with our 2D routine protocol and with 3D-PD-CS-SPACE free-stop. In the case of a scan stop caused by motion (images are calculated nevertheless), the sequence was repeated without the free-stop mode. All scans were evaluated by 2 radiologists concerning image quality of the 3D-PD-CS-SPACE (with and without free-stop). Important knee structures were further assessed in a lesion-based analysis and compared to our reference 2D-PD-fs sequences. Image quality of the 3D-PD-CS-SPACE free-stop was found optimal in 47/80, slightly compromised in 21/80, moderately compromised in 10/80 and severely compromised in 2/80. In 29/80, the free-stop scan mode stopped the 3D-PD-CS-SPACE due to subject motion, with a slight increase in image quality at longer effective acquisition times. Compared to the 3D-PD-CS-SPACE with free-stop, the image quality of the acquired 3D-PD-CS-SPACE without free-stop was found equal in 6/29, slightly improved in 13/29, improved with equal contours in 8/29, and improved with sharper contours in 2/29. The lesion-based analysis showed high agreement between the results from the 3D-PD-CS-SPACE free-stop and our 2D-PD-fs routine protocol (overall agreement 96.25%-100%, Cohen's Kappa 0.883-1, p < 0.001). 3D-PD-CS-SPACE free-stop is a reliable alternative to standard 2D-PD-fs protocols with acceptable acquisition times. Copyright © 2018 Elsevier B.V. All rights reserved.
The Simulation of Real-time Scalable Coherent Interface
NASA Technical Reports Server (NTRS)
Li, Qiang; Grant, Terry; Grover, Radhika S.
1997-01-01
Scalable Coherent Interface (SCI, IEEE/ANSI Std 1596-1992) (SCI1, SCI2) is a high-performance interconnect for shared-memory multiprocessor systems. In this project we investigate an SCI real-time protocol (RTSCI1) using Directed Flow Control Symbols. We studied the issues of efficient generation of control symbols and created a simulation model of the protocol on a ring-based SCI system. This report presents the results of the study. The project has been implemented using SES/Workbench. The details that follow encompass aspects of both SCI and flow control protocols, as well as the effect of realistic client/server processing delay. The report is organized as follows. Section 2 provides a description of the simulation model. Section 3 describes the protocol implementation details. The next three sections of the report elaborate on the workload, results and conclusions. Appended to the report are a description of the tool, SES/Workbench, used in our simulation, and internal details of our implementation of the protocol.
Service discovery with routing protocols for MANETs
NASA Astrophysics Data System (ADS)
Gu, Xuemai; Shi, Shuo
2005-11-01
Service discovery is becoming an important topic as its use throughout the Internet becomes more widespread. In Mobile Ad hoc Networks (MANETs), the routing protocol is particularly important because a MANET is a special kind of network. To find a path to a destination node, nodes send packets to every other node, creating substantial overhead traffic and consuming much time. Even though a variety of routing protocols have been developed for use in MANETs, they are insufficient for reducing overhead traffic and time. In this paper, we propose SDRP, a new service discovery protocol combined with routing policies in MANETs. The protocol operates over a distributed network. We describe a service by a unique ID number and use a group-cast routing policy for advertisement and request. The group-cast routing policy decreases network traffic and finds the destination node efficiently. In addition, the nodes included in the reply path also cache the advertisement information, so that the next time a node looks up the service it can locate it quickly, minimizing the discovery time. Finally, we compare SDRP with both Flood and MAODV in terms of overhead and average delay. Simulation results show that SDRP achieves shorter response times and accommodates even highly mobile network environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jason L. Wright; Milos Manic
Time synchronization and event time correlation are important in wireless sensor networks. In particular, time is used to create a sequence of events or a time line to answer questions of cause and effect. Time is also used as a basis for determining the freshness of received packets and the validity of cryptographic certificates. This paper presents a secure method of time synchronization and event time correlation for TESLA-based hierarchical wireless sensor networks. The method demonstrates that events in a TESLA network can be accurately timestamped by adding only a few pieces of data to the existing protocol.
NASA Astrophysics Data System (ADS)
Mehic, M.; Fazio, P.; Voznak, M.; Partila, P.; Komosny, D.; Tovarek, J.; Chmelikova, Z.
2016-05-01
A mobile ad hoc network is a collection of mobile nodes which communicate without a fixed backbone or centralized infrastructure. Due to the frequent mobility of nodes, routes connecting two distant nodes may change. Therefore, it is not possible to establish a priori fixed paths for message delivery through the network. Because of its importance, routing is the most studied problem in mobile ad hoc networks. In addition, if Quality of Service (QoS) is demanded, one must guarantee the QoS not only over a single hop but over an entire wireless multi-hop path, which may not be a trivial task. In turn, this requires the propagation of QoS information within the network. The key to the support of QoS reporting is QoS routing, which provides path QoS information at each source. To support QoS for real-time traffic one needs to know not only the minimum delay on the path to the destination but also the bandwidth available on it. Therefore, throughput, end-to-end delay, and routing overhead are traditional performance metrics used to evaluate the performance of a routing protocol. To obtain additional information about a link, most link-quality metrics are based on calculating link loss probabilities by broadcasting probe packets. In this paper, we address the problem of including multiple routing metrics in existing routing packets that are broadcast through the network. We evaluate the efficiency of this approach with a modified version of the DSDV routing protocol in the ns-3 simulator.
Medium Access Control Protocols for Cognitive Radio Ad Hoc Networks: A Survey
Islam, A. K. M. Muzahidul; Baharun, Sabariah; Mansoor, Nafees
2017-01-01
New wireless network paradigms will demand higher spectrum use and availability to cope with emerging data-hungry devices. Traditional static spectrum allocation policies cause spectrum scarcity, and new paradigms such as Cognitive Radio (CR) and new protocols and techniques need to be developed in order to have efficient spectrum usage. Medium Access Control (MAC) protocols are accountable for recognizing free spectrum, scheduling available resources and coordinating the coexistence of heterogeneous systems and users. This paper provides an ample review of the state-of-the-art MAC protocols, which mainly focuses on Cognitive Radio Ad Hoc Networks (CRAHN). First, a description of the cognitive radio fundamental functions is presented. Next, MAC protocols are divided into three groups, which are based on their channel access mechanism, namely time-slotted protocol, random access protocol and hybrid protocol. In each group, a detailed and comprehensive explanation of the latest MAC protocols is presented, as well as the pros and cons of each protocol. A discussion on future challenges for CRAHN MAC protocols is included with a comparison of the protocols from a functional perspective. PMID:28926952
Performance evaluation of the intra compression in the video coding standards
NASA Astrophysics Data System (ADS)
Abramowski, Andrzej
2015-09-01
The article presents a comparison of the Intra prediction algorithms in the current state-of-the-art video coding standards, including MJPEG 2000, VP8, VP9, H.264/AVC and H.265/HEVC. The effectiveness of techniques employed by each standard is evaluated in terms of compression efficiency and average encoding time. The compression efficiency is measured using BD-PSNR and BD-RATE metrics with H.265/HEVC results as an anchor. Tests are performed on a set of video sequences composed of sequences gathered by the Joint Collaborative Team on Video Coding during the development of the H.265/HEVC standard and 4K sequences provided by the Ultra Video Group. According to the results, H.265/HEVC provides significant bit-rate savings at the expense of computational complexity, while VP9 may be regarded as a compromise between efficiency and required encoding time.
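The BD-RATE metric used above is a standard Bjøntegaard delta computation: fit log-rate as a cubic function of PSNR for each codec, integrate over the overlapping PSNR range, and convert the mean log-rate difference into a percentage bit-rate change against the anchor. The sketch below follows that recipe with invented rate-distortion points; it is not the authors' measurement pipeline.

```python
# Bjontegaard delta-rate (BD-RATE) between a test codec and an anchor.
import numpy as np

def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
    log_r1, log_r2 = np.log10(rate_anchor), np.log10(rate_test)
    p1 = np.polyfit(psnr_anchor, log_r1, 3)       # log-rate as a cubic in PSNR
    p2 = np.polyfit(psnr_test, log_r2, 3)
    lo = max(min(psnr_anchor), min(psnr_test))    # overlapping PSNR interval
    hi = min(max(psnr_anchor), max(psnr_test))
    int1 = np.polyval(np.polyint(p1), hi) - np.polyval(np.polyint(p1), lo)
    int2 = np.polyval(np.polyint(p2), hi) - np.polyval(np.polyint(p2), lo)
    avg_diff = (int2 - int1) / (hi - lo)          # mean log10 rate difference
    return (10 ** avg_diff - 1) * 100             # percent rate change vs anchor

# Hypothetical rate-distortion points (kbit/s, dB) for an anchor and a test codec.
anchor_rate = [1000, 2000, 4000, 8000]
anchor_psnr = [34.0, 36.5, 38.8, 40.9]
test_rate   = [ 900, 1750, 3400, 6600]
test_psnr   = [34.1, 36.6, 38.9, 41.0]
print(f"BD-RATE = {bd_rate(anchor_rate, anchor_psnr, test_rate, test_psnr):.1f}%")
```

A negative BD-RATE means the test codec needs less bit rate than the anchor for the same quality, which is how the bit-rate savings quoted in the abstract are expressed.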
The Traffic Adaptive Data Dissemination (TrAD) Protocol for both Urban and Highway Scenarios.
Tian, Bin; Hou, Kun Mean; Zhou, Haiying
2016-06-21
The worldwide economic cost of road crashes and injuries is estimated to be US$518 billion per year and the annual congestion cost in France is estimated to be €5.9 billion. Vehicular Ad hoc Networks (VANETs) are one solution to improve transport features such as traffic safety, traffic jam and infotainment on wheels, where a great number of event-driven messages need to be disseminated in a timely way in a region of interest. In comparison with traditional wireless networks, VANETs have to consider the highly dynamic network topology and lossy links due to node mobility. Inter-Vehicle Communication (IVC) protocols are the keystone of VANETs. According to our survey, most of the proposed IVC protocols focus on either highway or urban scenarios, but not on both. Furthermore, too few protocols, considering both scenarios, can achieve high performance. In this paper, an infrastructure-less Traffic Adaptive data Dissemination (TrAD) protocol which takes into account road traffic and network traffic status for both highway and urban scenarios will be presented. TrAD has double broadcast suppression techniques and is designed to adapt efficiently to the irregular road topology. The performance of the TrAD protocol was evaluated quantitatively by means of realistic simulations taking into account different real road maps, traffic routes and vehicular densities. The obtained simulation results show that TrAD is more efficient in terms of packet delivery ratio, number of transmissions and delay in comparison with the performance of three well-known reference protocols. Moreover, TrAD can also tolerate a reasonable degree of GPS drift and still achieve efficient data dissemination.
The Traffic Adaptive Data Dissemination (TrAD) Protocol for both Urban and Highway Scenarios
Tian, Bin; Hou, Kun Mean; Zhou, Haiying
2016-01-01
The worldwide economic cost of road crashes and injuries is estimated to be US$518 billion per year and the annual congestion cost in France is estimated to be €5.9 billion. Vehicular Ad hoc Networks (VANETs) are one solution to improve transport features such as traffic safety, traffic congestion and infotainment on wheels, where a great number of event-driven messages need to be disseminated in a timely way in a region of interest. In comparison with traditional wireless networks, VANETs have to consider the highly dynamic network topology and lossy links due to node mobility. Inter-Vehicle Communication (IVC) protocols are the keystone of VANETs. According to our survey, most of the proposed IVC protocols focus on either highway or urban scenarios, but not on both. Furthermore, few of the protocols that consider both scenarios achieve high performance. In this paper, an infrastructure-less Traffic Adaptive data Dissemination (TrAD) protocol, which takes into account road traffic and network traffic status for both highway and urban scenarios, is presented. TrAD uses double broadcast suppression techniques and is designed to adapt efficiently to irregular road topology. The performance of the TrAD protocol was evaluated quantitatively by means of realistic simulations taking into account different real road maps, traffic routes and vehicular densities. The obtained simulation results show that TrAD is more efficient in terms of packet delivery ratio, number of transmissions and delay in comparison with three well-known reference protocols. Moreover, TrAD can also tolerate a reasonable degree of GPS drift and still achieve efficient data dissemination. PMID:27338393
A Byzantine-Fault Tolerant Self-Stabilizing Protocol for Distributed Clock Synchronization Systems
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2006-01-01
Embedded distributed systems have become an integral part of safety-critical computing applications, necessitating system designs that incorporate fault-tolerant clock synchronization in order to achieve ultra-reliable assurance levels. Many efficient clock synchronization protocols do not, however, address Byzantine failures, and most protocols that do tolerate Byzantine failures do not self-stabilize. The Byzantine self-stabilizing clock synchronization algorithms that do exist in the literature are based either on unjustifiably strong assumptions about the initial synchrony of the nodes or on the existence of a common pulse at the nodes. The Byzantine self-stabilizing clock synchronization protocol presented here does not rely on any assumptions about the initial state of the clocks. Furthermore, there is neither a central clock nor an externally generated pulse system. The proposed protocol converges deterministically, is scalable, and self-stabilizes in a short amount of time. The convergence time is linear with respect to the self-stabilization period. Proofs of the correctness of the protocol as well as the results of formal verification efforts are reported.
DNA barcode-based delineation of putative species: efficient start for taxonomic workflows
Kekkonen, Mari; Hebert, Paul D N
2014-01-01
The analysis of DNA barcode sequences with varying techniques for cluster recognition provides an efficient approach for recognizing putative species (operational taxonomic units, OTUs). This approach accelerates and improves taxonomic workflows by exposing cryptic species and decreasing the risk of synonymy. This study tested the congruence of OTUs resulting from the application of three analytical methods (ABGD, BIN, GMYC) to sequence data for Australian hypertrophine moths. OTUs supported by all three approaches were viewed as robust, but 20% of the OTUs were only recognized by one or two of the methods. These OTUs were examined for three criteria to clarify their status. Monophyly and diagnostic nucleotides were both uninformative, but information on ranges was useful as sympatric sister OTUs were viewed as distinct, while allopatric OTUs were merged. This approach revealed 124 OTUs of Hypertrophinae, a more than twofold increase from the currently recognized 51 species. Because this analytical protocol is both fast and repeatable, it provides a valuable tool for establishing a basic understanding of species boundaries that can be validated with subsequent studies. PMID:24479435
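As a minimal sketch of the congruence step described above, the snippet below treats each delimitation method's output as a partition of specimens into OTUs and keeps only the clusters recovered identically by all three methods; the method outputs and specimen labels are invented, and real input would come from ABGD, BIN and GMYC runs.

```python
from collections import defaultdict

def partition_to_clusters(assignments):
    """Turn {specimen: OTU label} into a set of frozensets of specimens."""
    clusters = defaultdict(set)
    for specimen, otu in assignments.items():
        clusters[otu].add(specimen)
    return {frozenset(members) for members in clusters.values()}

# Illustrative assignments from three methods (stand-ins for ABGD, BIN, GMYC output).
abgd = {"s1": "A", "s2": "A", "s3": "B", "s4": "C"}
bins = {"s1": "x", "s2": "x", "s3": "y", "s4": "y"}
gmyc = {"s1": 1, "s2": 1, "s3": 2, "s4": 3}

methods = [partition_to_clusters(m) for m in (abgd, bins, gmyc)]
robust = methods[0] & methods[1] & methods[2]        # OTUs recovered by all three methods
conflicting = (methods[0] | methods[1] | methods[2]) - robust
print("robust OTUs:", robust)
print("OTUs needing range/diagnostic review:", conflicting)
```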
Prescreening of microbial populations for the assessment of sequencing potential.
Hanning, Irene B; Ricke, Steven C
2011-01-01
Next-generation sequencing (NGS) is a powerful tool that can be utilized to profile and compare microbial populations. By amplifying a target gene present in all bacteria and subsequently sequencing the amplicons, the bacterial genera present in the populations can be identified and compared. In some scenarios, little to no difference may exist among the microbial populations being compared, in which case a prescreening method would be practical to determine which microbial populations are suitable for further analysis by NGS. Denaturing gradient gel electrophoresis (DGGE) is considerably cheaper than NGS, and the data comparing microbial populations are ready to be viewed immediately after electrophoresis. DGGE follows essentially the same initial methodology as NGS by targeting and amplifying the 16S rRNA gene. However, as opposed to sequencing amplicons, DGGE amplicons are analyzed by electrophoresis. By prescreening microbial populations with DGGE, more efficient use of NGS methods can be accomplished. In this chapter, we outline the protocol for DGGE targeting the same gene (16S rRNA) that would be targeted for NGS to compare and determine differences in microbial populations from a wide range of ecosystems.
Emerman, Amy B; Bowman, Sarah K; Barry, Andrew; Henig, Noa; Patel, Kruti M; Gardner, Andrew F; Hendrickson, Cynthia L
2017-07-05
Next-generation sequencing (NGS) is a powerful tool for genomic studies, translational research, and clinical diagnostics that enables the detection of single nucleotide polymorphisms, insertions and deletions, copy number variations, and other genetic variations. Target enrichment technologies improve the efficiency of NGS by only sequencing regions of interest, which reduces sequencing costs while increasing coverage of the selected targets. Here we present NEBNext Direct®, a hybridization-based, target-enrichment approach that addresses many of the shortcomings of traditional target-enrichment methods. This approach features a simple, 7-hr workflow that uses enzymatic removal of off-target sequences to achieve a high specificity for regions of interest. Additionally, unique molecular identifiers are incorporated for the identification and filtering of PCR duplicates. The same protocol can be used across a wide range of input amounts, input types, and panel sizes, enabling NEBNext Direct to be broadly applicable across a wide variety of research and diagnostic needs. © 2017 by John Wiley & Sons, Inc.
Ong, Hui San; Rahim, Mohd Syafiq; Firdaus-Raih, Mohd; Ramlan, Effirul Ikhwan
2015-01-01
The unique programmability of nucleic acids offers an alternative for constructing excitable and functional nanostructures. This work introduces an autonomous protocol to construct DNA Tetris shapes (L-Shape, B-Shape, T-Shape and I-Shape) using modular DNA blocks. The protocol exploits the rich number of sequence combinations available from the nucleic acid alphabet, thus allowing diversity to be applied in designing various DNA nanostructures. Instead of a deterministic set of sequences corresponding to a particular design, the protocol promotes a large pool of DNA shapes that can assemble to conform to any desired structure. By utilising evolutionary programming in the design stage, DNA blocks are subjected to processes such as sequence insertion, deletion and base shifting in order to enrich the diversity of the resulting shapes based on a set of cascading filters. The optimisation algorithm allows mutation to be exerted indefinitely on the candidate sequences until they comply with all four fitness criteria. Candidates generated by the protocol are in agreement with the filter cascades and thermodynamic simulation. Further validation using gel electrophoresis indicated the formation of the designed shapes, supporting the plausibility of constructing DNA nanostructures in a more hierarchical, modular and interchangeable manner.
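The mutate-until-the-filters-pass loop described above can be sketched as follows; the two filters shown (GC window, no long homopolymers) are simplified stand-ins for the paper's cascading filters and thermodynamic checks, and the sequence length and thresholds are arbitrary.

```python
import random

BASES = "ACGT"

def mutate(seq):
    """Apply one random edit: substitution, insertion, deletion, or a circular base shift."""
    i = random.randrange(len(seq))
    op = random.choice(["sub", "ins", "del", "shift"])
    if op == "sub":
        return seq[:i] + random.choice(BASES) + seq[i + 1:]
    if op == "ins":
        return seq[:i] + random.choice(BASES) + seq[i:]
    if op == "del" and len(seq) > 20:
        return seq[:i] + seq[i + 1:]
    return seq[1:] + seq[0]

def passes_filters(seq):
    """Simplified cascading filters: GC content window and no homopolymer of five or more bases."""
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    no_homopolymer = not any(base * 5 in seq for base in BASES)
    return 0.4 <= gc <= 0.6 and no_homopolymer

seq = "".join(random.choice(BASES) for _ in range(40))
while not passes_filters(seq):          # mutate indefinitely until all filters pass
    seq = mutate(seq)
print("candidate block:", seq)
```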
Vignion-Dewalle, Anne-Sophie; Baert, Gregory; Devos, Laura; Thecua, Elise; Vicentini, Claire; Mortier, Laurent; Mordon, Serge
2017-09-01
Photodynamic therapy (PDT) is an emerging treatment modality for various diseases, especially for dermatological conditions. Although the standard PDT protocol for the treatment of actinic keratoses in Europe has been shown to be effective, treatment-associated pain is often observed in patients. Different modifications to this protocol, aimed at decreasing pain, have been investigated. A decrease in fluence rate seems to be a promising solution. Moreover, it has been suggested that light fractionation significantly increases the efficacy of PDT. Based on a flexible light-emitting textile, the FLEXITHERALIGHT device specifically provides a fractionated illumination at a fluence rate more than six times lower than that of the standard protocol. In a recently completed clinical trial of PDT for the treatment of actinic keratosis, the non-inferiority of a protocol involving illumination with the FLEXITHERALIGHT device after a short incubation time, referred to as the FLEXITHERALIGHT protocol, was assessed against the standard protocol. In this paper, we propose a comparison, through mathematical modeling, of the two above-mentioned 635 nm red light protocols delivering 37 J/cm2 in the PDT treatment of actinic keratosis: the standard protocol and the FLEXITHERALIGHT one. This mathematical model, which differs slightly from the one we have already published, enables the local damage induced by the therapy to be estimated. The comparison performed in terms of the local damage induced by the therapy demonstrates that the FLEXITHERALIGHT protocol, with its lower fluence rate, light fractionation and shorter incubation time, is somewhat less efficient than the standard protocol. Nevertheless, from the clinical trial results, the FLEXITHERALIGHT protocol results in non-inferior response rates compared to the standard protocol. This finding raises the question of whether the PDT local damage achieved by the FLEXITHERALIGHT protocol (respectively, the standard protocol) is sufficient (respectively, excessive) to destroy actinic keratosis cells. Lasers Surg. Med. 49:686-697, 2017. © 2017 Wiley Periodicals, Inc.
2014-01-01
This article attempts to define terminology and to describe a process for writing adaptive, early-phase study protocols that are transparent, intuitive and uniform. It provides a step-by-step guide, giving templates from projects that received regulatory authorisation and were successfully performed in the UK. During adaptive studies, evolving data are used to modify the trial design and conduct within the protocol-defined remit. Adaptations within that remit are documented using non-substantial protocol amendments, which do not require regulatory or ethical review. This approach is efficient for gathering relevant data in exploratory early-phase studies, and it is ethical as well as time- and cost-effective. PMID:24980283
Assisted closed-loop optimization of SSVEP-BCI efficiency
Fernandez-Vargas, Jacobo; Pfaff, Hanns U.; Rodríguez, Francisco B.; Varona, Pablo
2012-01-01
We designed a novel assisted closed-loop optimization protocol to improve the efficiency of brain-computer interfaces (BCI) based on steady state visually evoked potentials (SSVEP). In traditional paradigms, the control over BCI performance depends completely on the subjects' ability to learn from the given feedback cues. By contrast, in the proposed protocol both the subject and the machine share information and control over the BCI goal. Generally, the innovative assistance consists of the delivery of online information together with the online adaptation of BCI stimulus properties. In our case, this adaptive optimization process is realized by (1) a closed-loop search for the best set of SSVEP flicker frequencies and (2) feedback of actual SSVEP magnitudes to both the subject and the machine. These closed-loop interactions between subject and machine are evaluated in real time by continuous measurement of their efficiencies, which are used as online criteria to adapt the BCI control parameters. The proposed protocol aims to compensate for variability in possibly unknown subject state and trait dimensions. In a study with N = 18 subjects, we found significant evidence that our protocol outperformed classic SSVEP-BCI control paradigms. Evidence is presented that it indeed takes interindividual variability into account: e.g., under the new protocol, baseline resting-state EEG measures predict subjects' BCI performance. This paper illustrates the promising potential of assisted closed-loop protocols in BCI systems. Their applicability might also be expanded to innovative uses, e.g., as possible new diagnostic/therapeutic tools for clinical contexts and as new paradigms for basic research. PMID:23443214
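A toy sketch of the closed-loop frequency search described above is shown below, assuming a placeholder measure_ssvep_magnitude function in place of real-time EEG feature extraction; the simulated subject response curve, candidate range and explore/exploit split are all invented for illustration.

```python
import random

def measure_ssvep_magnitude(freq_hz):
    """Stand-in for online EEG feature extraction: a noisy, subject-specific
    resonance curve peaking near an (unknown) preferred frequency."""
    preferred = 12.0                                  # hidden subject trait
    return max(0.0, 1.0 - 0.08 * abs(freq_hz - preferred)) + random.gauss(0, 0.05)

candidates = [f / 2 for f in range(12, 41)]           # 6.0 to 20.0 Hz in 0.5 Hz steps
scores = {}
for trial in range(60):                               # closed-loop search
    if trial < 30:
        freq = random.choice(candidates)              # explore candidate flicker frequencies
    else:
        freq = max(scores, key=scores.get)            # exploit the current best estimate
    m = measure_ssvep_magnitude(freq)
    scores[freq] = 0.7 * scores.get(freq, m) + 0.3 * m    # running estimate fed back online
best = sorted(scores, key=scores.get, reverse=True)[:4]
print("flicker frequencies selected for this subject:", best)
```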
NASA Astrophysics Data System (ADS)
Marukame, Takao; Nishi, Yoshifumi; Yasuda, Shin-ichi; Tanamoto, Tetsufumi
2018-04-01
The use of memristive devices for creating artificial neurons is promising for brain-inspired computing from the viewpoints of computation architecture and learning protocol. We present an energy-efficient multiplier accumulator based on a memristive array architecture incorporating both analog and digital circuitries. The analog circuitry is used to full advantage for neural networks, as demonstrated by the spike-timing-dependent plasticity (STDP) in fabricated AlOx/TiOx-based metal-oxide memristive devices. STDP protocols for controlling periodic analog resistance with long-range stability were experimentally verified using a variety of voltage amplitudes and spike timings.
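For readers unfamiliar with STDP, the sketch below implements the standard pair-based rule (potentiate when the presynaptic spike precedes the postsynaptic spike, depress otherwise); the amplitudes and time constant are generic textbook values, not the parameters of the fabricated devices.

```python
import math

def stdp_update(w, t_pre, t_post, a_plus=0.02, a_minus=0.021, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update; spike times and tau share the same unit (ms)."""
    dt = t_post - t_pre
    if dt >= 0:
        w += a_plus * math.exp(-dt / tau)    # pre before post: potentiation
    else:
        w -= a_minus * math.exp(dt / tau)    # post before pre: depression
    return min(max(w, w_min), w_max)         # clip to the device's resistance range

w = 0.5
for dt in (5, 10, -5, -15):                  # relative spike timings in ms
    w = stdp_update(w, t_pre=0.0, t_post=float(dt))
    print(f"dt = {dt:+d} ms -> weight = {w:.3f}")
```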
Quick, Josh; Grubaugh, Nathan D; Pullan, Steven T; Claro, Ingra M; Smith, Andrew D; Gangavarapu, Karthik; Oliveira, Glenn; Robles-Sikisaka, Refugio; Rogers, Thomas F; Beutler, Nathan A; Burton, Dennis R; Lewis-Ximenez, Lia Laura; de Jesus, Jaqueline Goes; Giovanetti, Marta; Hill, Sarah; Black, Allison; Bedford, Trevor; Carroll, Miles W; Nunes, Marcio; Alcantara, Luiz Carlos; Sabino, Ester C; Baylis, Sally A; Faria, Nuno; Loose, Matthew; Simpson, Jared T; Pybus, Oliver G; Andersen, Kristian G; Loman, Nicholas J
2018-01-01
Genome sequencing has become a powerful tool for studying emerging infectious diseases; however, genome sequencing directly from clinical samples without isolation remains challenging for viruses such as Zika, where metagenomic sequencing methods may generate insufficient numbers of viral reads. Here we present a protocol for generating coding-sequence complete genomes comprising an online primer design tool, a novel multiplex PCR enrichment protocol, optimised library preparation methods for the portable MinION sequencer (Oxford Nanopore Technologies) and the Illumina range of instruments, and a bioinformatics pipeline for generating consensus sequences. The MinION protocol does not require an internet connection for analysis, making it suitable for field applications with limited connectivity. Our method relies on multiplex PCR for targeted enrichment of viral genomes from samples containing as few as 50 genome copies per reaction. Viral consensus sequences can be achieved starting with clinical samples in 1-2 days following a simple laboratory workflow. This method has been successfully used by several groups studying Zika virus evolution and is facilitating an understanding of the spread of the virus in the Americas. PMID:28538739
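To make the tiling idea behind such a multiplex amplicon scheme concrete, the snippet below lays out overlapping amplicon windows across a genome of roughly Zika length; the amplicon length and overlap are placeholder values, not the parameters produced by the published primer design tool.

```python
def tile_amplicons(genome_len, amplicon_len=400, overlap=100):
    """Lay out overlapping amplicon windows covering a genome end to end."""
    step = amplicon_len - overlap
    windows, start = [], 0
    while start + amplicon_len < genome_len:
        windows.append((start, start + amplicon_len))
        start += step
    # Final window flush with the 3' end so the last bases are still covered.
    windows.append((max(0, genome_len - amplicon_len), genome_len))
    return windows

scheme = tile_amplicons(10769)          # approximate Zika genome length in nucleotides
print(len(scheme), "amplicons; first three:", scheme[:3])
```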
Evaluating Protocol Lifecycle Time Intervals in HIV/AIDS Clinical Trials
Schouten, Jeffrey T.; Dixon, Dennis; Varghese, Suresh; Cope, Marie T.; Marci, Joe; Kagan, Jonathan M.
2014-01-01
Background: Identifying efficacious interventions for the prevention and treatment of human diseases depends on the efficient development and implementation of controlled clinical trials. Essential to reducing the time and burden of completing the clinical trial lifecycle is determining which aspects take the longest, delay other stages, and may lead to better resource utilization without diminishing scientific quality, safety, or the protection of human subjects. Purpose: In this study we modeled time-to-event data to explore relationships between clinical trial protocol development and implementation times, as well as to identify potential correlates of prolonged development and implementation. Methods: We obtained time interval and participant accrual data from 111 interventional clinical trials initiated between 2006 and 2011 by NIH’s HIV/AIDS Clinical Trials Networks. We determined the time (in days) required to complete defined phases of clinical trial protocol development and implementation. Kaplan-Meier estimates were used to assess the rates at which protocols reached specified terminal events, stratified by study purpose (therapeutic, prevention) and phase group (pilot/phase I, phase II, and phase III/IV). We also examined several potential correlates of prolonged development and implementation intervals. Results: Even though phase grouping did not determine development or implementation times of either therapeutic or prevention studies, overall we observed wide variation in protocol development times. Moreover, we detected a trend toward phase III/IV therapeutic protocols exhibiting longer development (median 2.5 years) and implementation times (>3 years). We also found that protocols exceeding the median number of days for completing the development interval had significantly longer implementation. Limitations: The use of a relatively small set of protocols may have limited our ability to detect differences across phase groupings. Some timing effects present for a specific study phase may have been masked by combining protocols into phase groupings. The presence of informative censoring, such as withdrawal of some protocols from development if they began showing signs of lost interest among investigators, complicates interpretation of Kaplan-Meier estimates. Because this study constitutes a retrospective examination over an extended period of time, it does not allow for the precise identification of relative factors impacting timing. Conclusions: Delays not only increase the time and cost to complete clinical trials, but they also diminish their usefulness by failing to answer research questions in time. We believe that research analyzing the time spent traversing defined intervals across the clinical trial protocol development and implementation continuum can stimulate business process analyses and reengineering efforts that could lead to reductions in the time from clinical trial concept to results, thereby accelerating progress in clinical research. PMID:24980279
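To make the survival analysis concrete, here is a minimal product-limit (Kaplan-Meier) estimator applied to invented protocol development durations with right-censoring; the durations and censoring flags are illustrative and not taken from the study.

```python
def kaplan_meier(durations, observed):
    """Product-limit survival estimate: list of (time, S(t)) at each event time."""
    at_risk = len(durations)
    surv, curve = 1.0, []
    for t, event in sorted(zip(durations, observed)):
        if event:                         # a protocol completed development at time t
            surv *= (at_risk - 1) / at_risk
            curve.append((t, surv))
        at_risk -= 1                      # censored protocols simply leave the risk set
    return curve

# Days to complete protocol development; 0 marks protocols still in development (censored).
days     = [410, 530, 610, 700, 820, 900, 1010, 1200]
complete = [1,   1,   0,   1,   1,   0,   1,    1]
for t, s in kaplan_meier(days, complete):
    print(f"day {t}: estimated fraction still in development = {s:.2f}")
```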
A novel method for efficient archiving and retrieval of biomedical images using MPEG-7
NASA Astrophysics Data System (ADS)
Meyer, Joerg; Pahwa, Ash
2004-10-01
Digital archiving and efficient retrieval of radiological scans have become critical steps in contemporary medical diagnostics. Since more and more images and image sequences (single scans or video) from various modalities (CT/MRI/PET/digital X-ray) are now available in digital formats (e.g., DICOM-3), hospitals and radiology clinics need to implement efficient protocols capable of managing the enormous amounts of data generated daily in a typical clinical routine. We present a method that appears to be a viable way to eliminate the tedious step of manually annotating image and video material for database indexing. MPEG-7 is a new framework that standardizes the way images are characterized in terms of color, shape, and other abstract, content-related criteria. A set of standardized descriptors that are automatically generated from an image is used to compare an image to other images in a database, and to compute the distance between two images for a given application domain. Text-based database queries can be replaced with image-based queries using MPEG-7. Consequently, image queries can be conducted without any prior knowledge of the keys that were used as indices in the database. Since the decoding and matching steps are not part of the MPEG-7 standard, this method also enables searches that were not planned by the time the keys were generated.
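The image-based query idea can be illustrated with a crude stand-in for an MPEG-7 color descriptor: a quantized joint RGB histogram compared by L1 distance. Real MPEG-7 descriptors (scalable color, edge histogram, and so on) are more elaborate; the random arrays below merely stand in for decoded scans.

```python
import numpy as np

def color_descriptor(image, bins=4):
    """Quantize each RGB channel into `bins` levels and return a normalized
    joint-color histogram (a crude analogue of an MPEG-7 color descriptor)."""
    q = (image.astype(np.float64) / 256.0 * bins).astype(int).clip(0, bins - 1)
    codes = q[..., 0] * bins * bins + q[..., 1] * bins + q[..., 2]
    hist = np.bincount(codes.ravel(), minlength=bins ** 3).astype(np.float64)
    return hist / hist.sum()

def descriptor_distance(d1, d2):
    return float(np.abs(d1 - d2).sum())          # L1 distance between descriptors

rng = np.random.default_rng(0)
scan_a = rng.integers(0, 256, size=(64, 64, 3))  # placeholder "images"
scan_b = rng.integers(0, 256, size=(64, 64, 3))
print(descriptor_distance(color_descriptor(scan_a), color_descriptor(scan_b)))
```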
Cell-Penetrating Peptide-Mediated Delivery of Cas9 Protein and Guide RNA for Genome Editing.
Suresh, Bharathi; Ramakrishna, Suresh; Kim, Hyongbum
2017-01-01
The clustered, regularly interspaced, short palindromic repeat (CRISPR)-associated (Cas) system represents an efficient tool for genome editing. It consists of two components: the Cas9 protein and a guide RNA. To date, delivery of these two components has been achieved using either plasmid or viral vectors or direct delivery of protein and RNA. Plasmid- and virus-free direct delivery of Cas9 protein and guide RNA has several advantages over the conventional plasmid-mediated approach. Direct delivery results in shorter exposure time at the cellular level, which in turn leads to lower toxicity and fewer off-target mutations with reduced host immune responses, whereas plasmid- or viral vector-mediated delivery can result in uncontrolled integration of the vector sequence into the host genome and unwanted immune responses. Cell-penetrating peptide (CPP), a peptide that has an intrinsic ability to translocate across cell membranes, has been adopted as a means of achieving efficient Cas9 protein and guide RNA delivery. We developed a method for treating human cell lines with CPP-conjugated recombinant Cas9 protein and CPP-complexed guide RNAs that leads to endogenous gene disruption. Here we describe a protocol for preparing an efficient CPP-conjugated recombinant Cas9 protein and CPP-complexed guide RNAs, as well as treatment methods to achieve safe genome editing in human cell lines.
Entanglement distillation for quantum communication network with atomic-ensemble memories.
Li, Tao; Yang, Guo-Jian; Deng, Fu-Guo
2014-10-06
Atomic ensembles are effective memory nodes for quantum communication network due to the long coherence time and the collective enhancement effect for the nonlinear interaction between an ensemble and a photon. Here we investigate the possibility of achieving the entanglement distillation for nonlocal atomic ensembles by the input-output process of a single photon as a result of cavity quantum electrodynamics. We give an optimal entanglement concentration protocol (ECP) for two-atomic-ensemble systems in a partially entangled pure state with known parameters and an efficient ECP for the systems in an unknown partially entangled pure state with a nondestructive parity-check detector (PCD). For the systems in a mixed entangled state, we introduce an entanglement purification protocol with PCDs. These entanglement distillation protocols have high fidelity and efficiency with current experimental techniques, and they are useful for quantum communication network with atomic-ensemble memories.
A Novel Cross-Layer Routing Protocol Based on Network Coding for Underwater Sensor Networks
Wang, Hao; Wang, Shilian; Bu, Renfei; Zhang, Eryang
2017-01-01
Underwater wireless sensor networks (UWSNs) have attracted increasing attention in recent years because of their numerous applications in ocean monitoring, resource discovery and tactical surveillance. However, the design of reliable and efficient transmission and routing protocols is a challenge due to the low acoustic propagation speed and complex channel environment in UWSNs. In this paper, we propose a novel cross-layer routing protocol based on network coding (NCRP) for UWSNs, which utilizes network coding and cross-layer design to greedily forward data packets to sink nodes efficiently. The proposed NCRP takes full advantage of multicast transmission and decodes packets jointly with encoded packets received from multiple potential nodes across the entire network. The transmission power is optimized in our design to extend the life cycle of the network. Moreover, we design a real-time routing maintenance protocol to update the route when inefficient relay nodes are detected. Substantial simulations in an underwater environment with Network Simulator 3 (NS-3) show that NCRP significantly improves network performance in terms of energy consumption, end-to-end delay and packet delivery ratio compared with other routing protocols for UWSNs. PMID:28786915
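As a toy illustration of the network-coding idea that such protocols build on, the snippet below XORs two packets at a relay so that a single coded transmission serves two receivers; the real protocol operates over generations of packets and acoustic links, and the packet contents here are invented.

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bitwise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Two source packets heading to the sink through a shared relay.
p1 = b"sensor-12:temp=4.1C "
p2 = b"sensor-07:depth=813m"

coded = xor_bytes(p1, p2)        # relay broadcasts one coded packet instead of two

# A sink that already overheard p1 recovers p2 from the single coded transmission.
recovered_p2 = xor_bytes(coded, p1)
assert recovered_p2 == p2
print(recovered_p2.decode())
```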
NASA Astrophysics Data System (ADS)
Armeli Minicante, S.; Ambrosi, E.; Back, M.; Barichello, J.; Cattaruzza, E.; Gonella, F.; Scantamburlo, E.; Trave, E.
2016-07-01
Seaweeds are a reserve of natural dyes (chlorophylls a, b and c), characterized by low cost and easy supply, without potential environmental load in terms of land subtraction, and also complying with the requirements of an efficient waste management policy. In particular, the brown seaweed Undaria pinnatifida is a species largely present in the Venice Lagoon area, for which a removal strategy is currently mandatory. In this paper, we set up an eco-protocol for the best extraction and preparation procedures of the pigment, with the aim of finding an easy and affordable method for chlorophyll c extraction, exploring at the same time the possibility of using these algae within local sustainable management integrated strategies, among which the possible use of chlorophylls as a dye source in dye-sensitized solar cells (DSSCs) is investigated. Experimental results suggest that the developed protocols are useful for optimizing chlorophyll c extraction, as shown by optical absorption spectroscopy measurements. The DSSCs built with the chlorophyll extracted by the proposed eco-protocol exhibit solar energy conversion efficiencies similar to those obtained following extraction protocols with larger environmental impacts.
1-RAAP: An Efficient 1-Round Anonymous Authentication Protocol for Wireless Body Area Networks
Liu, Jingwei; Zhang, Lihuan; Sun, Rong
2016-01-01
Thanks to the rapid technological convergence of wireless communications, medical sensors and cloud computing, Wireless Body Area Networks (WBANs) have emerged as a novel networking paradigm enabling ubiquitous Internet services, allowing people to receive medical care, monitor health status in real-time, analyze sports data and even enjoy online entertainment remotely. However, because of the mobility and openness of wireless communications, WBANs are inevitably exposed to a large set of potential attacks, significantly undermining their utility and impeding their widespread deployment. To prevent attackers from threatening legitimate WBAN users or abusing WBAN services, an efficient and secure authentication protocol termed 1-Round Anonymous Authentication Protocol (1-RAAP) is proposed in this paper. In particular, 1-RAAP preserves anonymity, mutual authentication, non-repudiation and some other desirable security properties, while only requiring users to perform several low cost computational operations. More importantly, 1-RAAP is provably secure thanks to its design basis, which is resistant to attacks on anonymity in the random oracle model. To validate the computational efficiency of 1-RAAP, a set of comprehensive comparative studies between 1-RAAP and other authentication protocols is conducted, and the results clearly show that 1-RAAP achieves the best performance in terms of computational overhead. PMID:27213384
Detection of Proteins on Blot Membranes
Goldman, Aaron; Harper, Sandra; Speicher, David W.
2017-01-01
Staining of blot membranes enables the visualization of bound proteins. Proteins are usually transferred to blot membranes by electroblotting, by direct spotting of protein solutions, or by contact blots. Staining allows the efficiency of transfer to the membrane to be monitored. This unit describes protocols for staining proteins after electroblotting from polyacrylamide gels to blot membranes such as polyvinylidene difluoride (PVDF), nitrocellulose, or nylon membranes. The same methods can be used if proteins are directly spotted, either manually or using robotics. Protocols are included for seven general protein stains (amido black, Coomassie blue, Ponceau S, colloidal gold, colloidal silver, India ink, and MemCode) and three fluorescent protein stains (fluorescamine, IAEDANS, and SYPRO Ruby). Also included is an in-depth discussion of the different blot membrane types and the compatibility of different protein stains with downstream applications, such as immunoblotting or N-terminal Edman sequencing. PMID:27801518
Whole-genome multiple displacement amplification from single cells.
Spits, Claudia; Le Caignec, Cédric; De Rycke, Martine; Van Haute, Lindsey; Van Steirteghem, André; Liebaers, Inge; Sermon, Karen
2006-01-01
Multiple displacement amplification (MDA) is a recently described method of whole-genome amplification (WGA) that has proven efficient in the amplification of small amounts of DNA, including DNA from single cells. Compared with PCR-based WGA methods, MDA generates DNA with a higher molecular weight and shows better genome coverage. This protocol was developed for preimplantation genetic diagnosis, and details a method for performing single-cell MDA using the phi29 DNA polymerase. It can also be useful for the amplification of other minute quantities of DNA, such as from forensic material or microdissected tissue. The protocol includes the collection and lysis of single cells, and all materials and steps involved in the MDA reaction. The whole procedure takes 3 h and generates 1-2 microg of DNA from a single cell, which is suitable for multiple downstream applications, such as sequencing, short tandem repeat analysis or array comparative genomic hybridization.
Efficiency and security problems of anonymous key agreement protocol based on chaotic maps
NASA Astrophysics Data System (ADS)
Yoon, Eun-Jun
2012-07-01
In 2011, Niu and Wang proposed an anonymous key agreement protocol based on chaotic maps [Niu Y, Wang X. An anonymous key agreement protocol based on chaotic maps. Commun Nonlinear Sci Simulat 2011;16(4):1986-92]. Niu-Wang's protocol not only achieves session key agreement between a server and a user, but also allows the user to interact anonymously with the server. Nevertheless, this paper points out that Niu-Wang's protocol has the following efficiency and security problems: (1) the protocol has a computational efficiency problem when a trusted third party decrypts the message sent by the user; (2) the protocol is vulnerable to a Denial of Service (DoS) attack based on illegal message modification by an attacker.
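The chaotic-map primitive such protocols rest on is the semigroup property of Chebyshev polynomials, T_a(T_b(x)) = T_ab(x), which enables a Diffie-Hellman-style exchange. The floating-point toy below shows only that primitive; it is not Niu-Wang's full protocol and omits the trusted-third-party and anonymity machinery whose cost and security are discussed above.

```python
import math

def chebyshev(n: int, x: float) -> float:
    """T_n(x) = cos(n * arccos(x)) for x in [-1, 1]."""
    return math.cos(n * math.acos(x))

x = 0.53                     # public seed value
a, b = 8321, 5477            # private integers of the two parties

ta = chebyshev(a, x)         # sent by the user
tb = chebyshev(b, x)         # sent by the server

key_user = chebyshev(a, tb)  # T_a(T_b(x)) = T_ab(x)
key_server = chebyshev(b, ta)
print(abs(key_user - key_server) < 1e-6)   # both sides derive the same value
```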
Primary Airway Epithelial Cell Gene Editing Using CRISPR-Cas9.
Everman, Jamie L; Rios, Cydney; Seibold, Max A
2018-01-01
The adaptation of the clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR associated endonuclease 9 (CRISPR-Cas9) machinery from prokaryotic organisms has resulted in a gene editing system that is highly versatile, easily constructed, and can be leveraged to generate human cells knocked out (KO) for a specific gene. While standard transfection techniques can be used for the introduction of CRISPR-Cas9 expression cassettes to many cell types, delivery by this method is not efficient in many primary cell types, including primary human airway epithelial cells (AECs). More efficient delivery in AECs can be achieved through lentiviral-mediated transduction, allowing the CRISPR-Cas9 system to be integrated into the genome of the cell, resulting in stable expression of the nuclease machinery and increasing editing rates. In parallel, advancements have been made in the culture, expansion, selection, and differentiation of AECs, which allow the robust generation of a bulk edited AEC population from transduced cells. Applying these methods, we detail here our latest protocol to generate mucociliary epithelial cultures knocked out for a specific gene from donor-isolated primary human basal airway epithelial cells. This protocol includes methods to: (1) design and generate lentivirus which targets a specific gene for KO with CRISPR-Cas9 machinery, (2) efficiently transduce AECs, (3) culture and select for a bulk edited AEC population, (4) molecularly screen AECs for Cas9 cutting and specific sequence edits, and (5) further expand and differentiate edited cells to a mucociliary airway epithelial culture. The AEC knockouts generated using this protocol provide an excellent primary cell model system with which to characterize the function of genes involved in airway dysfunction and disease.
Time- and Cost-Efficient Identification of T-DNA Insertion Sites through Targeted Genomic Sequencing
Lepage, Étienne; Zampini, Éric; Boyle, Brian; Brisson, Normand
2013-01-01
Forward genetic screens enable the unbiased identification of genes involved in biological processes. In Arabidopsis, several mutant collections are publicly available, which greatly facilitates such practice. Most of these collections were generated by agrotransformation of a T-DNA at random sites in the plant genome. However, precise mapping of T-DNA insertion sites in mutants isolated from such screens is a laborious and time-consuming task. Here we report a simple, low-cost and time-efficient approach to precisely map T-DNA insertions simultaneously in many different mutants. By combining sequence capture, next-generation sequencing and 2D-PCR pooling, we developed a new method that allowed the rapid localization of T-DNA insertion sites in 55 out of 64 mutant plants isolated in a screen for gyrase inhibition hypersensitivity. PMID:23951038
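The 2D-pooling step can be sketched as follows: a plant is implicated when both its row pool and its column pool test positive for a given T-DNA/genome junction. The grid layout and the positive pools below are invented, and ambiguous cases (several positive rows and columns) would still need a confirming PCR on individual plants.

```python
from itertools import product

def decode_2d_pools(positive_rows, positive_cols):
    """Return grid coordinates (row, column) consistent with the positive pools."""
    return [(r, c) for r, c in product(sorted(positive_rows), sorted(positive_cols))]

# Hypothetical PCR results for one junction sequence on an 8 x 8 grid of mutants.
print("mutant carrying this insertion:", decode_2d_pools({2}, {5}))

# With two junctions detected in the same run, candidates become ambiguous
# (2 rows x 2 columns = 4 wells) and need individual confirmation.
print("ambiguous candidates:", decode_2d_pools({2, 7}, {1, 5}))
```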
General method for rapid purification of native chromatin fragments.
Kuznetsov, Vyacheslav I; Haws, Spencer A; Fox, Catherine A; Denu, John M
2018-05-24
Biochemical, proteomic and epigenetic studies of chromatin rely on the efficient ability to isolate native nucleosomes in high yield and purity. However, isolation of native chromatin suitable for many downstream experiments remains a challenging task. This is especially true for the budding yeast Saccharomyces cerevisiae, which continues to serve as an important model organism for the study of chromatin structure and function. Here, we developed a time- and cost-efficient universal protocol for isolation of native chromatin fragments from yeast, insect, and mammalian cells. The resulting protocol preserves histone posttranslational modification in the native chromatin state, and is applicable for both parallel multi-sample spin-column purification and large scale isolation. This protocol is based on the efficient and stable purification of polynucleosomes, features a combination of optimized cell lysis and purification conditions, three options for chromatin fragmentation, and a novel ion-exchange chromatographic purification strategy. The procedure will aid chromatin researchers interested in isolating native chromatin material for biochemical studies, and as a mild, acid- and detergent-free sample preparation method for mass-spectrometry analysis. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
NASA Astrophysics Data System (ADS)
Borjas, Zulema; Esteve-Núñez, Abraham; Ortiz, Juan Manuel
2017-07-01
Microbial Desalination Cells (MDCs) constitute an innovative technology in which a microbial fuel cell and electrodialysis merge in the same device to obtain fresh water from saline water with no energy-associated cost for the user. In this work, an anodic biofilm of the electroactive bacterium Geobacter sulfurreducens was able to efficiently convert the acetate present in synthetic wastewater into electric current (j = 0.32 mA cm-2) able to desalinate water. Moreover, we implemented an efficient start-up protocol in which desalination of up to 90% occurred within a single desalination cycle (water production: 0.308 L m-2 h-1, initial salinity: 9 mS cm-1, final salinity: <1 mS cm-1) using a filter press-based MDC prototype without any energy supply (excluding peristaltic pump energy). This start-up protocol is not only time-optimized but also simplifies operational procedures, making it a more feasible strategy for future scaling-up of MDCs, either as a single process or as a pre-treatment method combined with other well-established desalination technologies such as reverse osmosis (RO) or reverse electrodialysis.
Yu, Qiang; Wei, Dingbang; Huo, Hongwei
2018-06-18
Given a set of t n-length DNA sequences, q satisfying 0 < q ≤ 1, and l and d satisfying 0 ≤ d < l < n, the quorum planted motif search (qPMS) finds l-length strings that occur in at least qt input sequences with up to d mismatches and is mainly used to locate transcription factor binding sites in DNA sequences. Existing qPMS algorithms have been able to efficiently process small standard datasets (e.g., t = 20 and n = 600), but they are too time consuming to process large DNA datasets, such as ChIP-seq datasets that contain thousands of sequences or more. We analyze the effects of t and q on the time performance of qPMS algorithms and find that a large t or a small q causes a longer computation time. Based on this information, we improve the time performance of existing qPMS algorithms by selecting a sample sequence set D' with a small t and a large q from the large input dataset D and then executing qPMS algorithms on D'. A sample sequence selection algorithm named SamSelect is proposed. The experimental results on both simulated and real data show (1) that SamSelect can select D' efficiently and (2) that the qPMS algorithms executed on D' can find implanted or real motifs in a significantly shorter time than when executed on D. We improve the ability of existing qPMS algorithms to process large DNA datasets from the perspective of selecting high-quality sample sequence sets so that the qPMS algorithms can find motifs in a short time in the selected sample sequence set D', rather than take an unfeasibly long time to search the original sequence set D. Our motif discovery method is an approximate algorithm.
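A brute-force reference check of the (l, d, q) quorum condition is sketched below; it is fine for toy inputs but far too slow for ChIP-seq scale data, which is exactly why selecting a smaller, higher-quorum sample set matters. The sequences and motif are invented.

```python
def within_d(a: str, b: str, d: int) -> bool:
    """True if two equal-length strings differ in at most d positions."""
    return sum(x != y for x, y in zip(a, b)) <= d

def occurs(motif: str, seq: str, d: int) -> bool:
    l = len(motif)
    return any(within_d(motif, seq[i:i + l], d) for i in range(len(seq) - l + 1))

def is_quorum_motif(motif, sequences, d, q):
    """True if `motif` occurs (up to d mismatches) in at least q*t of the t sequences."""
    hits = sum(occurs(motif, s, d) for s in sequences)
    return hits >= q * len(sequences)

seqs = ["ACGTTGCATGTCGCATGATGCATGAGAGCT",
        "AGTTGCAGGGATGATGCATGCAAGGGCTTC",
        "TTGCATGATGCATGCCCTAGGTAGGACCGT"]
print(is_quorum_motif("ATGCATGA", seqs, d=1, q=0.9))
```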
Cooperative Energy Harvesting-Adaptive MAC Protocol for WBANs
Esteves, Volker; Antonopoulos, Angelos; Kartsakli, Elli; Puig-Vidal, Manel; Miribel-Català, Pere; Verikoukis, Christos
2015-01-01
In this paper, we introduce a cooperative medium access control (MAC) protocol, named cooperative energy harvesting (CEH)-MAC, that adapts its operation to the energy harvesting (EH) conditions in wireless body area networks (WBANs). In particular, the proposed protocol exploits the EH information in order to set an idle time that allows the relay nodes to charge their batteries and complete the cooperation phase successfully. Extensive simulations have shown that CEH-MAC significantly improves the network performance in terms of throughput, delay and energy efficiency compared to the cooperative operation of the baseline IEEE 802.15.6 standard. PMID:26029950
Revisiting Deng et al.'s Multiparty Quantum Secret Sharing Protocol
NASA Astrophysics Data System (ADS)
Hwang, Tzonelih; Hwang, Cheng-Chieh; Yang, Chun-Wei; Li, Chuan-Ming
2011-09-01
The multiparty quantum secret sharing protocol [Deng et al. in Chin. Phys. Lett. 23: 1084-1087, 2006] is revisited in this study. It is found that the performance of Deng et al.'s protocol can be much improved by using the techniques of block-transmission and decoy single photons. As a result, the qubit efficiency is improved 2.4 times and only one classical communication, a public discussion, and two quantum communications between each agent and the secret holder are needed rather than n classical communications, n public discussions, and 3n/2 quantum communications required in the original scheme.
SMART on FHIR Genomics: facilitating standardized clinico-genomic apps.
Alterovitz, Gil; Warner, Jeremy; Zhang, Peijin; Chen, Yishen; Ullman-Cullere, Mollie; Kreda, David; Kohane, Isaac S
2015-11-01
Supporting clinical decision support for personalized medicine will require linking genome and phenome variants to a patient's electronic health record (EHR), at times on a vast scale. Clinico-genomic data standards will be needed to unify how genomic variant data are accessed from different sequencing systems. A specification for the basis of a clinico-genomic standard, building upon the current Health Level Seven International Fast Healthcare Interoperability Resources (FHIR®) standard, was developed. An FHIR application programming interface (API) layer was attached to proprietary sequencing platforms and EHRs in order to expose gene variant data for presentation to the end-user. Three representative apps based on the SMART platform were built to test end-to-end feasibility, including integration of genomic and clinical data. Successful design, deployment, and use of the API were demonstrated, and the specification was adopted by the HL7 Clinical Genomics Workgroup. Feasibility was shown through the development of three apps by various types of users with different backgrounds and locations. This prototyping work suggests that an entirely data (and web) standards-based approach could prove both effective and efficient for advancing personalized medicine. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
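As a rough illustration of how a client app might pull variant-related data over an FHIR REST API, the sketch below queries a hypothetical FHIR server for a patient's laboratory Observations and keeps genetics-flavoured ones. The base URL, patient identifier and filtering rule are placeholders and do not reproduce the resources or profiles actually used by SMART on FHIR Genomics.

```python
import requests

FHIR_BASE = "https://fhir.example.org/baseR4"   # placeholder server, not a real endpoint

def genomic_observations(patient_id: str):
    """Fetch Observation resources for a patient and keep genetics-flavoured ones.
    The category/code filtering here is illustrative, not a fixed profile."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"subject": f"Patient/{patient_id}", "category": "laboratory"},
        headers={"Accept": "application/fhir+json"},
        timeout=10,
    )
    resp.raise_for_status()
    bundle = resp.json()
    return [entry["resource"] for entry in bundle.get("entry", [])
            if "genet" in str(entry["resource"].get("code", "")).lower()]

# print(genomic_observations("example-patient-123"))   # requires a reachable FHIR server
```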
Reverse transcription polymerase chain reaction protocols for cloning small circular RNAs.
Navarro, B; Daròs, J A; Flores, R
1998-07-01
A generally applicable protocol is described for cloning small circular RNAs that requires only minimal amounts (approximately 50 ng) of template of unknown sequence. Both cDNA strands are synthesized, in two consecutive reactions catalyzed by reverse transcriptase and DNA polymerase, respectively, with a 26-mer primer whose six 3'-terminal positions are totally degenerate. The cDNAs are then PCR-amplified using a 20-mer primer containing the non-degenerate sequence of the previous primer, cloned and sequenced. This information permits the synthesis of one or more pairs of specific, adjacent primers for obtaining full-length cDNA clones by a protocol that is also described.
When seconds count: A study of communication variables in the opening segment of emergency calls.
Penn, Claire; Koole, Tom; Nattrass, Rhona
2017-09-01
The opening sequence of an emergency call influences the efficiency of the ambulance dispatch time. The greeting sequences in 105 calls to a South African emergency service were analysed. Initial results suggested the advantage of a specific two-part opening sequence. An on-site experiment aimed at improving call efficiency was conducted during one shift (1100 calls). Results indicated reduced conversational repairs and a significant reduction of 4 seconds in mean call length. Implications for systems and training are derived.
A Modified Protocol for High-Quality RNA Extraction from Oleoresin-Producing Adult Pines.
de Lima, Júlio César; Füller, Thanise Nogueira; de Costa, Fernanda; Rodrigues-Corrêa, Kelly C S; Fett-Neto, Arthur G
2016-01-01
RNA extraction with good yield and quality is a fundamental step for transcriptome analyses through high-throughput sequencing technologies and microarrays, and also for northern blots, RT-PCR, and RT-qPCR. Even though many specific protocols designed for plants with a high content of secondary metabolites have been developed, these are often expensive, time-consuming, and not suitable for a wide range of tissues. Here we present a modification of a previously described method using the commercially available Concert™ Plant RNA Reagent (Invitrogen) buffer for field-grown adult pine trees with high oleoresin content.
2008-01-01
...the voting protocols for good performance while meeting the reliability requirements of data delivery in a high-assurance setting. Two metrics quantify the effectiveness of voting protocols: Data Transfer Efficiency (DTE) and Time-to-Complete (TTC) data delivery. DTE captures the network bandwidth...
A space-efficient algorithm for local similarities.
Huang, X Q; Hardison, R C; Miller, W
1990-10-01
Existing dynamic-programming algorithms for identifying similar regions of two sequences require time and space proportional to the product of the sequence lengths. Often this space requirement is more limiting than the time requirement. We describe a dynamic-programming local-similarity algorithm that needs only space proportional to the sum of the sequence lengths. The method can also find repeats within a single long sequence. To illustrate the algorithm's potential, we discuss comparison of a 73,360 nucleotide sequence containing the human beta-like globin gene cluster and a corresponding 44,594 nucleotide sequence for rabbit, a problem well beyond the capabilities of other dynamic-programming software.
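The core space-saving idea can be sketched with a score-only Smith-Waterman pass that keeps just two rolling rows, so memory is proportional to the shorter sequence; recovering the aligned regions themselves requires the divide-and-conquer extension the paper describes, which is not shown here.

```python
def local_score(a: str, b: str, match=1, mismatch=-1, gap=-2):
    """Best local-alignment score using only two rows of the DP matrix."""
    if len(b) > len(a):
        a, b = b, a                      # keep the rolling rows on the shorter sequence
    prev = [0] * (len(b) + 1)
    best = 0
    for i in range(1, len(a) + 1):
        curr = [0] * (len(b) + 1)
        for j in range(1, len(b) + 1):
            sub = match if a[i - 1] == b[j - 1] else mismatch
            curr[j] = max(0, prev[j - 1] + sub, prev[j] + gap, curr[j - 1] + gap)
            best = max(best, curr[j])
        prev = curr
    return best

print(local_score("ACACACTAGTGGGC", "AGCACACATT"))
```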
zUMIs - A fast and flexible pipeline to process RNA sequencing data with UMIs.
Parekh, Swati; Ziegenhain, Christoph; Vieth, Beate; Enard, Wolfgang; Hellmann, Ines
2018-06-01
Single-cell RNA-sequencing (scRNA-seq) experiments typically analyze hundreds or thousands of cells after amplification of the cDNA. The high throughput is made possible by the early introduction of sample-specific bar codes (BCs), and the amplification bias is alleviated by unique molecular identifiers (UMIs). Thus, the ideal analysis pipeline for scRNA-seq data needs to efficiently tabulate reads according to both BC and UMI. zUMIs is a pipeline that can handle both known and random BCs and also efficiently collapse UMIs, either just for exon mapping reads or for both exon and intron mapping reads. If BC annotation is missing, zUMIs can accurately detect intact cells from the distribution of sequencing reads. Another unique feature of zUMIs is the adaptive downsampling function that facilitates dealing with hugely varying library sizes but also allows the user to evaluate whether the library has been sequenced to saturation. To illustrate the utility of zUMIs, we analyzed a single-nucleus RNA-seq dataset and show that more than 35% of all reads map to introns. Also, we show that these intronic reads are informative about expression levels, significantly increasing the number of detected genes and improving the cluster resolution. zUMIs' flexibility makes it possible to accommodate data generated with any of the major scRNA-seq protocols that use BCs and UMIs, and it is the most feature-rich, fast, and user-friendly pipeline to process such scRNA-seq data.
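A minimal sketch of the tabulation step is shown below: reads are grouped by (cell barcode, gene), then UMIs are counted after optionally merging UMIs that differ by a single mismatch. This is only the generic idea, not zUMIs' actual implementation, and the reads are invented.

```python
from collections import defaultdict

def hamming1(a, b):
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def count_umis(reads, collapse_close=True):
    """reads: iterable of (cell_barcode, gene, umi). Returns {(bc, gene): molecule count}."""
    umis = defaultdict(set)
    for bc, gene, umi in reads:
        umis[(bc, gene)].add(umi)
    counts = {}
    for key, umi_set in umis.items():
        kept = []
        for u in sorted(umi_set):
            if collapse_close and any(hamming1(u, k) for k in kept):
                continue                  # treat 1-mismatch UMIs as PCR/sequencing errors
            kept.append(u)
        counts[key] = len(kept)
    return counts

reads = [("ACGT", "GeneA", "AACCGG"), ("ACGT", "GeneA", "AACCGG"),
         ("ACGT", "GeneA", "AACCGT"),      # likely an error copy of the first UMI
         ("ACGT", "GeneB", "TTGGCC"), ("TTTT", "GeneA", "GGCCAA")]
print(count_umis(reads))
```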
Preoperative magnetic resonance imaging protocol for endoscopic cranial base image-guided surgery.
Grindle, Christopher R; Curry, Joseph M; Kang, Melissa D; Evans, James J; Rosen, Marc R
2011-01-01
Despite the increasing utilization of image-guided surgery, no radiology protocols for obtaining magnetic resonance (MR) imaging of adequate quality are available in the current literature. At our institution, more than 300 endonasal cranial base procedures, including pituitary, extended pituitary, and other anterior skull base procedures, have been performed in the past 3 years. To facilitate and optimize preoperative evaluation and assessment, there was a need to develop a magnetic resonance protocol. A retrospective technical assessment was performed. Through a collaborative effort between the otolaryngology, neurosurgery, and neuroradiology departments at our institution, a skull base MR image-guided surgery (IGS) protocol was developed with several aims in mind. First, it was necessary to generate diagnostic images useful for the more frequently seen pathologies to improve workflow and limit the expense and inefficiency of case-specific MR studies. Second, it was necessary to generate sequences useful for IGS, preferably using sequences that best highlight the lesion. Currently, at our institution, all MR images used for IGS are obtained using this protocol as part of preoperative planning. The protocol that has been developed allows for thin-cut precontrast and postcontrast axial cuts that can be used to plan intraoperative image guidance. It also obtains a thin-cut T2 axial series that can be compiled separately for intraoperative imaging, or may be fused with computed tomographic images for combined modality. The outlined protocol obtains image sequences effective for diagnostic and operative purposes for image-guided surgery using both T1 and T2 sequences. Copyright © 2011 Elsevier Inc. All rights reserved.
Neurologic 3D MR Spectroscopic Imaging with Low-Power Adiabatic Pulses and Fast Spiral Acquisition
Gagoski, Borjan A.; Sorensen, A. Gregory
2012-01-01
Purpose: To improve clinical three-dimensional (3D) MR spectroscopic imaging with more accurate localization and faster acquisition schemes. Materials and Methods: Institutional review board approval and patient informed consent were obtained. Data were acquired with a 3-T MR imager and a 32-channel head coil in phantoms, five healthy volunteers, and five patients with glioblastoma. Excitation was performed with localized adiabatic spin-echo refocusing (LASER) by using adiabatic gradient-offset independent adiabaticity wideband uniform rate and smooth truncation (GOIA-W[16,4]) pulses with 3.5-msec duration, 20-kHz bandwidth, 0.81-kHz amplitude, and 45-msec echo time. Interleaved constant-density spirals simultaneously encoded one frequency and two spatial dimensions. Conventional phase encoding (PE) (1-cm3 voxels) was performed after LASER excitation and was the reference standard. Spectra acquired with spiral encoding at similar and higher spatial resolution and with shorter imaging time were compared with those acquired with PE. Metabolite levels were fitted with software, and Bland-Altman analysis was performed. Results: Clinical 3D MR spectroscopic images were acquired four times faster with spiral protocols than with the elliptical PE protocol at low spatial resolution (1 cm3). Higher-spatial-resolution images (0.39 cm3) were acquired twice as fast with spiral protocols compared with the low-spatial-resolution elliptical PE protocol. A minimum signal-to-noise ratio (SNR) of 5 was obtained with spiral protocols under these conditions and was considered clinically adequate to reliably distinguish metabolites from noise. The apparent SNR loss was not linear with decreasing voxel sizes because of longer local T2* times. Improvement of spectral line width from 4.8 Hz to 3.5 Hz was observed at high spatial resolution. The Bland-Altman agreement between spiral and PE data is characterized by narrow 95% confidence intervals for their differences (0.12, 0.18 of their means). GOIA-W(16,4) pulses minimize chemical-shift displacement error to 2.1%, reduce nonuniformity of excitation to 5%, and eliminate the need for outer volume suppression. Conclusion: The proposed adiabatic spiral 3D MR spectroscopic imaging sequence can be performed in a standard clinical MR environment. Improvements in image quality and imaging time could enable more routine acquisition of spectroscopic data than is possible with current pulse sequences. © RSNA, 2011 PMID:22187628
Entanglement-secured single-qubit quantum secret sharing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scherpelz, P.; Resch, R.; Berryrieser, D.
In single-qubit quantum secret sharing, a secret is shared between N parties via manipulation and measurement of one qubit at a time. Each qubit is sent to all N parties in sequence; the secret is encoded in the first participant's preparation of the qubit state and the subsequent participants' choices of state rotation or measurement basis. We present a protocol for single-qubit quantum secret sharing using polarization entanglement of photon pairs produced in type-I spontaneous parametric downconversion. We investigate the protocol's security against eavesdropping attack under common experimental conditions: a lossy channel for photon transmission, and imperfect preparation of the initial qubit state. A protocol which exploits entanglement between photons, rather than simply polarization correlation, is more robustly secure. We implement the entanglement-based secret-sharing protocol with 87% secret-sharing fidelity, limited by the purity of the entangled state produced by our present apparatus. We demonstrate a photon-number splitting eavesdropping attack, which achieves no success against the entanglement-based protocol while showing the predicted rate of success against a correlation-based protocol.
Françoso, Elaine; Gomes, Fernando; Arias, Maria Cristina
2016-07-01
Nuclear mitochondrial DNA insertions (NUMTs) are mitochondrial DNA sequences that have been transferred into the nucleus and are recognized by the presence of indels and stop codons. Although NUMTs have been identified in a diverse range of species, their discovery was frequently accidental. Here, our initial goal was to develop and standardize a simple method for isolating NUMTs from the nuclear genome of a single bee. Subsequently, we tested our new protocol by determining whether the indels and stop codons of the cytochrome c oxidase subunit I (COI) sequence of Melipona flavolineata are of nuclear origin. The new protocol successfully demonstrated the presence of a COI NUMT. In addition to NUMT investigations, the protocol described here will also be very useful for studying mitochondrial mutations related to diseases and for sequencing complete mitochondrial genomes with high read coverage by Next-Generation technology.
A Secure and Efficient Handover Authentication Protocol for Wireless Networks
Wang, Weijia; Hu, Lei
2014-01-01
Handover authentication protocols are a promising access control technology in the fields of WLANs and mobile wireless sensor networks. In this paper, we first review an efficient handover authentication protocol, named PairHand, together with its known security attacks and improvements. Then, we present an improved key recovery attack using a linear combination method and reanalyze its feasibility against the improved PairHand protocol. Finally, we present a new handover authentication protocol that not only achieves the same desirable efficiency features as PairHand but also enjoys provable security in the random oracle model. PMID:24971471
Molecular Test to Assign Individuals within the Cacopsylla pruni Complex
Peccoud, Jean; Labonne, Gérard; Sauvion, Nicolas
2013-01-01
Crop protection requires the accurate identification of disease vectors, a task that can be made difficult when these vectors encompass cryptic species. Here we developed a rapid molecular diagnostic test to identify individuals of Cacopsylla pruni (Scopoli, 1763) (Hemiptera: Psyllidae), the main vector of the European stone fruit yellows phytoplasma. This psyllid encompasses two highly divergent genetic groups that are morphologically similar and that are characterized by genotyping several microsatellite markers, a costly and time-consuming protocol. With the aim of developing species-specific PCR primers, we sequenced the Internal Transcribed Spacer 2 (ITS2) on a collection of C. pruni samples from France and other European countries. ITS2 sequences showed that the two genetic groups represent two highly divergent clades. This enabled us to develop specific primers for the assignment of individuals to either genetic group in a single PCR, based on ITS2 amplicon size. All previously assigned individuals yielded bands of expected sizes, and the PCR proved efficient on a larger sample of 799 individuals. Because none appeared heterozygous at the ITS2 locus (i.e., none produced two bands), we inferred that the genetic groups of C. pruni, whose distribution is partly sympatric, constitute biological species that have not exchanged genes for an extended period of time. Other psyllid species (Cacopsylla, Psylla, Triozidae and Aphalaridae) failed to yield any amplicon. These primers are therefore unlikely to produce false positives and allow rapid assignment of C. pruni individuals to either cryptic species. PMID:23977301
A rapid and efficient assay for extracting DNA from fungi
Griffin, Dale W.; Kellogg, C.A.; Peak, K.K.; Shinn, E.A.
2002-01-01
Aims: A method for the rapid extraction of fungal DNA from small quantities of tissue in a batch-processing format was investigated. Methods and Results: Tissue (< 3.0 mg) was scraped from freshly-grown fungal isolates. The tissue was suspended in buffer AP1 and subjected to seven rounds of freeze/thaw using a crushed dry ice/ethanol bath and a boiling water bath. After a 30 min boiling step, the tissue was quickly ground against the wall of the microfuge tube using a sterile pipette tip. The Qiagen DNeasy Plant Tissue Kit protocol was then used to purify the DNA for PCR/sequencing applications. Conclusions: The method allowed batch DNA extraction from multiple fungal isolates using a simple yet rapid and reliable assay. Significance and Impact of the Study: Use of this assay will allow researchers to obtain DNA from fungi quickly for use in molecular assays that previously required specialized instrumentation, was time-consuming or was not conducive to batch processing.
Rakesh Minocha; Gabriela Martinez; Benjamin Lyons; Stephanie Long
2009-01-01
Despite the availability of several protocols for the extraction of chlorophylls and carotenoids from foliage of forest trees, information regarding their respective extraction efficiencies is scarce. We compared the efficiencies of acetone, ethanol, dimethyl sulfoxide (DMSO), and N, N-dimethylformamide (DMF) over a range of incubation times for the extraction of...
Ren, Peng; Qian, Jiansheng
2016-01-01
This study proposes a novel power-efficient and anti-fading clustering based on a cross-layer that is specific to the time-varying fading characteristics of channels in the monitoring of coal mine faces with wireless sensor networks. The number of active sensor nodes and a sliding window are set up such that the optimal number of cluster heads (CHs) is selected in each round. Based on a stable expected number of CHs, we explore the channel efficiency between nodes and the base station by using a probe frame and the joint surplus energy in assessing the CH selection. Moreover, the sending power of a node in different periods is regulated by the signal fade margin method. The simulation results demonstrate that compared with several common algorithms, the power-efficient and fading-aware clustering with a cross-layer (PEAFC-CL) protocol features a stable network topology and adaptability under signal time-varying fading, which effectively prolongs the lifetime of the network and reduces network packet loss, thus making it more applicable to the complex and variable environment characteristic of a coal mine face. PMID:27338380
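As a rough illustration of the kind of per-round cluster-head selection the abstract describes, ranking nodes by residual energy combined with a probed channel-quality estimate, the following sketch uses invented weightings and node data; it is not the published PEAFC-CL algorithm:

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    residual_energy: float   # joules remaining
    channel_quality: float   # 0..1, e.g. derived from a probe-frame SNR estimate

def select_cluster_heads(nodes, n_cluster_heads, w_energy=0.6, w_channel=0.4):
    """Rank nodes by a weighted score of residual energy and channel quality
    and return the top-k as cluster heads for this round."""
    max_e = max(n.residual_energy for n in nodes) or 1.0
    def score(n):
        return w_energy * (n.residual_energy / max_e) + w_channel * n.channel_quality
    return sorted(nodes, key=score, reverse=True)[:n_cluster_heads]

nodes = [Node(i, residual_energy=1.0 + 0.1 * i, channel_quality=0.5 + 0.05 * (i % 5))
         for i in range(20)]
for ch in select_cluster_heads(nodes, n_cluster_heads=4):
    print(ch.node_id, round(ch.residual_energy, 2), ch.channel_quality)
```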
EON: a component-based approach to automation of protocol-directed therapy.
Musen, M A; Tu, S W; Das, A K; Shahar, Y
1996-01-01
Provision of automated support for planning protocol-directed therapy requires a computer program to take as input clinical data stored in an electronic patient-record system and to generate as output recommendations for therapeutic interventions and laboratory testing that are defined by applicable protocols. This paper presents a synthesis of research carried out at Stanford University to model the therapy-planning task and to demonstrate a component-based architecture for building protocol-based decision-support systems. We have constructed general-purpose software components that (1) interpret abstract protocol specifications to construct appropriate patient-specific treatment plans; (2) infer from time-stamped patient data higher-level, interval-based, abstract concepts; (3) perform time-oriented queries on a time-oriented patient database; and (4) allow acquisition and maintenance of protocol knowledge in a manner that facilitates efficient processing both by humans and by computers. We have implemented these components in a computer system known as EON. Each of the components has been developed, evaluated, and reported independently. We have evaluated the integration of the components as a composite architecture by implementing T-HELPER, a computer-based patient-record system that uses EON to offer advice regarding the management of patients who are following clinical trial protocols for AIDS or HIV infection. A test of the reuse of the software components in a different clinical domain demonstrated rapid development of a prototype application to support protocol-based care of patients who have breast cancer. PMID:8930854
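Component (2) above, temporal abstraction, can be illustrated with a short sketch that collapses time-stamped values into labelled intervals; the threshold, label and sample data are invented for illustration and are not taken from EON:

```python
def abstract_intervals(samples, predicate, label):
    """Turn time-stamped (time, value) samples into maximal intervals where
    predicate(value) holds, e.g. 'WBC low' episodes from individual counts."""
    intervals, start, prev_t = [], None, None
    for t, value in sorted(samples):
        if predicate(value):
            start = t if start is None else start
            prev_t = t
        elif start is not None:
            intervals.append((label, start, prev_t))
            start = None
    if start is not None:
        intervals.append((label, start, prev_t))
    return intervals

# Hypothetical white-blood-cell counts (day, 10^9 cells/L)
wbc = [(1, 5.1), (3, 2.4), (5, 1.9), (8, 2.2), (10, 4.8), (12, 2.1)]
print(abstract_intervals(wbc, lambda v: v < 2.5, "WBC_LOW"))
# -> [('WBC_LOW', 3, 8), ('WBC_LOW', 12, 12)]
```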
Nair, Shalima S; Luu, Phuc-Loi; Qu, Wenjia; Maddugoda, Madhavi; Huschtscha, Lily; Reddel, Roger; Chenevix-Trench, Georgia; Toso, Martina; Kench, James G; Horvath, Lisa G; Hayes, Vanessa M; Stricker, Phillip D; Hughes, Timothy P; White, Deborah L; Rasko, John E J; Wong, Justin J-L; Clark, Susan J
2018-05-28
Comprehensive genome-wide DNA methylation profiling is critical to gain insights into epigenetic reprogramming during development and disease processes. Among the different genome-wide DNA methylation technologies, whole genome bisulphite sequencing (WGBS) is considered the gold standard for assaying genome-wide DNA methylation at single base resolution. However, the high sequencing cost to achieve the optimal depth of coverage limits its application in both basic and clinical research. To achieve 15× coverage of the human methylome, using WGBS, requires approximately three lanes of 100-bp-paired-end Illumina HiSeq 2500 sequencing. It is important, therefore, for advances in sequencing technologies to be developed to enable cost-effective high-coverage sequencing. In this study, we provide an optimised WGBS methodology, from library preparation to sequencing and data processing, to enable 16-20× genome-wide coverage per single lane of HiSeq X Ten, HCS 3.3.76. To process and analyse the data, we developed a WGBS pipeline (METH10X) that is fast and can call SNPs. We performed WGBS on both high-quality intact DNA and degraded DNA from formalin-fixed paraffin-embedded tissue. First, we compared different library preparation methods on the HiSeq 2500 platform to identify the best method for sequencing on the HiSeq X Ten. Second, we optimised the PhiX and genome spike-ins to achieve higher quality and coverage of WGBS data on the HiSeq X Ten. Third, we performed integrated whole genome sequencing (WGS) and WGBS of the same DNA sample in a single lane of HiSeq X Ten to improve data output. Finally, we compared methylation data from the HiSeq 2500 and HiSeq X Ten and found high concordance (Pearson r > 0.9). Together we provide a systematic, efficient and complete approach to perform and analyse WGBS on the HiSeq X Ten. Our protocol allows for large-scale WGBS studies at reasonable processing time and cost on the HiSeq X Ten platform.
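The coverage figures quoted above follow from simple depth arithmetic (total mapped bases divided by genome size). A sketch with assumed per-lane yields and usable-read fractions (the specific numbers are illustrative, not the authors' exact values):

```python
def mean_coverage(read_pairs, read_length_bp, genome_size_bp, mapped_fraction=1.0):
    """Expected sequencing depth: total usable mapped bases divided by genome size."""
    total_bases = read_pairs * 2 * read_length_bp * mapped_fraction
    return total_bases / genome_size_bp

HUMAN_GENOME = 3.1e9

# Assumed ~400 million read pairs per HiSeq X Ten lane, 2 x 150 bp, ~50% usable after bisulfite QC:
print(round(mean_coverage(400e6, 150, HUMAN_GENOME, mapped_fraction=0.5), 1), "x per X Ten lane")

# Assumed ~200 million read pairs per HiSeq 2500 lane, 2 x 100 bp, ~40% usable:
print(round(mean_coverage(200e6, 100, HUMAN_GENOME, mapped_fraction=0.4), 1), "x per 2500 lane")
```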
Yu, Jie; Ren, Yan; Xi, XiaoXia; Huang, Weiqiang; Zhang, Heping
2017-01-01
Teat disinfection pre- and post-milking is important for the overall health and hygiene of dairy cows. The objective of this study was to evaluate the efficacy of a novel probiotic lactobacilli-based teat disinfectant based on changes in somatic cell count (SCC) and profiling of the bacterial community. A total of 69 raw milk samples were obtained from eleven Holstein-Friesian dairy cows over 12 days of teat dipping in China. Single molecule, real-time sequencing technology (SMRT) was employed to profile changes in the bacterial community during the cleaning protocol and to compare the efficacy of probiotic lactic acid bacteria (LAB) and commercial teat disinfectants. The SCC gradually decreased following the cleaning protocol and the SCC of the LAB group was slightly lower than that of the commercial disinfectant (CD) group. Our SMRT sequencing results indicate that raw milk from both the LAB and CD groups contained diverse microbial populations that changed over the course of the cleaning protocol. The relative abundances of some species were significantly changed during the cleaning process, which may explain the observed bacterial community differences. Collectively, these results suggest that the LAB disinfectant could reduce mastitis-associated bacteria and improve the microbial environment of the cow teat. It could be used as an alternative to chemical pre- and post-milking teat disinfectants to maintain healthy teats and udders. In addition, the Pacific Biosciences SMRT sequencing with the full-length 16S ribosomal RNA gene was shown to be a powerful tool for monitoring changes in the bacterial population during the cleaning protocol. PMID:29018412
Traboulsee, A.; Simon, J.H.; Stone, L.; Fisher, E.; Jones, D.E.; Malhotra, A.; Newsome, S.D.; Oh, J.; Reich, D.S.; Richert, N.; Rammohan, K.; Khan, O.; Radue, E.-W.; Ford, C.; Halper, J.; Li, D.
2016-01-01
SUMMARY An international group of neurologists and radiologists developed revised guidelines for standardized brain and spinal cord MR imaging for the diagnosis and follow-up of MS. A brain MR imaging with gadolinium is recommended for the diagnosis of MS. A spinal cord MR imaging is recommended if the brain MR imaging is nondiagnostic or if the presenting symptoms are at the level of the spinal cord. A follow-up brain MR imaging with gadolinium is recommended to demonstrate dissemination in time and ongoing clinically silent disease activity while on treatment, to evaluate unexpected clinical worsening, to re-assess the original diagnosis, and as a new baseline before starting or modifying therapy. A routine brain MR imaging should be considered every 6 months to 2 years for all patients with relapsing MS. The brain MR imaging protocol includes 3D T1-weighted, 3D T2-FLAIR, 3D T2-weighted, post-single-dose gadolinium-enhanced T1-weighted sequences, and a DWI sequence. The progressive multifocal leukoencephalopathy surveillance protocol includes FLAIR and DWI sequences only. The spinal cord MR imaging protocol includes sagittal T1-weighted and proton attenuation, STIR or phase-sensitive inversion recovery, axial T2- or T2*-weighted imaging through suspicious lesions, and, in some cases, postcontrast gadolinium-enhanced T1-weighted imaging. The clinical question being addressed should be provided in the requisition for the MR imaging. The radiology report should be descriptive, with results referenced to previous studies. MR imaging studies should be permanently retained and available. The current revision incorporates new clinical information and imaging techniques that have become more available. PMID:26564433
Near-optimal protocols in complex nonequilibrium transformations
Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...
2016-08-29
The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.
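To make the setting concrete, the sketch below estimates the work done on a small 2D Ising model as an external field is ramped linearly to invert the magnetization, using Metropolis dynamics. The parameters and the single fixed ramp are illustrative only; the paper's contribution is sampling whole ensembles of low-dissipation protocols, which this sketch does not attempt:

```python
import numpy as np

rng = np.random.default_rng(1)
L, J, beta = 16, 1.0, 0.6               # lattice size, coupling, inverse temperature

def metropolis_sweep(spins, h):
    """One sweep of single-spin-flip Metropolis updates at field h."""
    for _ in range(spins.size):
        i, j = rng.integers(L, size=2)
        nb = spins[(i + 1) % L, j] + spins[(i - 1) % L, j] + spins[i, (j + 1) % L] + spins[i, (j - 1) % L]
        dE = 2 * spins[i, j] * (J * nb + h)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1

def work_for_linear_ramp(n_steps=200, h_max=2.0):
    """Ramp the field from +h_max to -h_max; the work of each field increment is -M*dh."""
    spins = np.ones((L, L), dtype=int)  # start magnetized 'up', aligned with +h_max
    work = 0.0
    hs = np.linspace(h_max, -h_max, n_steps)
    for h_old, h_new in zip(hs[:-1], hs[1:]):
        work += -spins.sum() * (h_new - h_old)   # work done on the system by changing h
        metropolis_sweep(spins, h_new)           # let the system respond at the new field
    return work, spins.sum() / spins.size        # total work and final magnetization per spin

w, m = work_for_linear_ramp()
print(f"work for one realization: {w:.1f}, final magnetization per spin: {m:.2f}")
```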
Practical State Machine Replication with Confidentiality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, Sisi; Zhang, Haibin
2016-01-01
We study how to enable arbitrary randomized algorithms in Byzantine fault-tolerant (BFT) settings. We formalize a randomized BFT protocol and provide a simple and efficient construction that can be built on any existing BFT protocols while adding practically no overhead. We go one step further to revisit a confidential BFT protocol (Yin et al., SOSP '03). We show that their scheme is potentially susceptible to safety and confidentiality attacks. We then present a new protocol that is secure in the stronger model we formalize, by extending the idea of a randomized BFT protocol. Our protocol uses only efficient symmetric cryptography, while Yin et al.'s uses costly threshold signatures. We implemented and evaluated our protocols on microbenchmarks and real-world use cases. We show that our randomized BFT protocol is as efficient as conventional BFT protocols, and our confidential BFT protocol is two to three orders of magnitude faster than Yin et al.'s, which is less secure than ours.
Less Is More: Efficacy of Rapid 3D-T2 SPACE in ED Patients with Acute Atypical Low Back Pain.
Koontz, Nicholas A; Wiggins, Richard H; Mills, Megan K; McLaughlin, Michael S; Pigman, Elaine C; Anzai, Yoshimi; Shah, Lubdha M
2017-08-01
Emergency department (ED) patients with acute low back pain (LBP) may present with ambiguous clinical findings that pose diagnostic challenges to exclude cauda equina syndrome (CES). As a proof of concept, we aimed to determine the efficacy of a rapid lumbar spine (LS) magnetic resonance imaging (MRI) screening protocol consisting of a single 3D-T2 SPACE FS (3D-T2 Sampling Perfection with Application optimized Contrasts using different flip angle Evolution fat saturated) sequence relative to conventional LS MRI to exclude emergently treatable pathologies in this complex patient population. LS MRI protocol including a sagittal 3D-T2 SPACE FS pulse sequence was added to the routine for ED patients presenting with acute atypical LBP over a 12-month period. Imaging findings were categorically scored on the 3D-T2 SPACE FS sequence and separately on the reference standard conventional LS MRI sequences. Patients' symptoms were obtained from review of the electronic medical record. Descriptive test statistics were performed. Of the 206 ED patients who obtained MRI for acute atypical LBP, 118 (43.3 ± 13.5 years of age; 61 female) were included. Specific pathologies detected on reference standard conventional MRI included disc herniation (n = 30), acute fracture (n = 3), synovial cyst (n = 3), epidural hematoma (n = 2), cerebrospinal fluid leak (n = 1), and leptomeningeal metastases (n = 1), and on multiple occasions these pathologies resulted in nerve root impingement (n = 36), severe spinal canal stenosis (n = 13), cord/conus compression (n = 2), and cord signal abnormality (n = 2). The 3D-T2 SPACE FS sequence was an effective screen for fracture (sensitivity [sens] = 100%, specificity [spec] = 100%), cord signal abnormality (sens = 100%, spec = 99%), and severe spinal canal stenosis (sens = 100%, spec = 96%), and identified cord compression not seen on reference standard. Motion artifact was not seen on the 3D-T2 SPACE FS but noted on 8.5% of conventional LS MRI. The 3D-T2 SPACE FS sequence MRI is a rapid, effective screen for emergently actionable pathologies that might be a cause of CES in ED patients presenting with acute atypical LBP. As this abbreviated, highly sensitive sequence requires a fraction of the acquisition time of conventional LS MRI, it has the potential of contributing to increased efficiencies in the radiology department and improved ED throughput. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
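The sensitivity and specificity figures above reduce to 2 x 2 counts against the reference-standard conventional MRI read; a sketch with a hypothetical split of true/false positives and negatives (the 13 stenosis cases and n = 118 come from the abstract, but the exact TN/FP split is invented):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for severe spinal canal stenosis on 3D-T2 SPACE FS
# versus the conventional-MRI reference (118 patients; the TN/FP split is invented):
sens, spec = sens_spec(tp=13, fn=0, tn=101, fp=4)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # 100%, 96%
```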
NASA Technical Reports Server (NTRS)
Clement, Bradley J.; Barrett, Anthony C.
2003-01-01
Interacting agents that interleave planning and execution must reach consensus on their commitments to each other. In domains where agents have varying degrees of interaction and different constraints on communication and computation, agents will require different coordination protocols in order to efficiently reach consensus in real time. We briefly describe a largely unexplored class of real-time, distributed planning problems (inspired by interacting spacecraft missions), new challenges they pose, and a general approach to solving the problems. These problems involve self-interested agents that have infrequent communication but collaborate on joint activities. We describe a Shared Activity Coordination (SHAC) framework that provides a decentralized algorithm for negotiating the scheduling of shared activities in a dynamic environment, a soft, real-time approach to reaching consensus during execution with limited communication, and a foundation for customizing protocols for negotiating planner interactions. We apply SHAC to a realistic simulation of interacting Mars missions and illustrate the simplicity of protocol development.
Suraniti, Emmanuel; Studer, Vincent; Sojic, Neso; Mano, Nicolas
2011-04-01
Immobilization and electrical wiring of enzymes are of particular importance for the elaboration of efficient biosensors and can be cumbersome. Here, we report a fast and easy protocol for enzyme immobilization, and as a proof of concept, we applied it to the immobilization of bilirubin oxidase, a labile enzyme. In the first step, bilirubin oxidase is mixed with a redox hydrogel "wiring" the enzyme reaction centers to electrodes. Then, this adduct is covered by an outer layer of PEGDA made by photoinitiated polymerization of poly(ethylene-glycol) diacrylate and a photocleavable precursor, DAROCUR. This two-step protocol is 18 times faster than the current state-of-the-art protocol and leads to currents 25% higher. In addition, the outer layer of PEGDA acts as a protective layer increasing the lifetime of the electrode by 100% when operating continuously for 2000 s and by 60% when kept in dry state for 24 h. This new protocol is particularly appropriate for labile enzymes that quickly denature.
Conservation of coconut (Cocos nucifera L.) germplasm at sub-zero temperature.
Sisunandar; Sopade, Peter A; Samosir, Yohannes M S; Rival, Alain; Adkins, Steve W
2012-01-01
Protocols are proposed for the low (-20 °C) and ultra-low (-80 °C) temperature storage of coconut (Cocos nucifera L.) embryos. A tissue dehydration step prior to storage, and a rapid warming step upon recovery, optimized the protocol. The thermal properties of water located within embryos were monitored using differential scanning calorimetry (DSC). In the most efficient version of the protocol, embryos were dehydrated under a sterile air flow in a dehydration solution containing glucose (3.33 M) and glycerol (15%) for 16 hours. This protocol decreased the embryo water content from 77 to 29% FW and at the same time reduced the amount of freezable water down to 0.03%. The dehydrated embryos could be stored for up to 3 weeks at -20 °C (12% producing normal plants upon recovery) or 26 weeks at -80 °C (28% producing normal plants upon recovery). These results indicate that it is possible to store coconut germplasm on a medium-term basis using an ultra-deep freezer unit. However, for more efficient, long-term storage, cryopreservation remains the preferred option.
Research of Ad Hoc Networks Access Algorithm
NASA Astrophysics Data System (ADS)
Xiang, Ma
With the continuous development of mobile communication technology, ad hoc access networks have become a hot research topic. Ad hoc network nodes can be used to extend the capacity and multi-hop communication range of a mobile communication system, even for traffic adjacent to a cell, and to improve data rates at the cell edge. When an ad hoc network acts as an access network to the Internet, the gateway discovery protocol is critical for choosing the most appropriate gateway and guaranteeing connectivity between the ad hoc network and IP-based fixed networks. This paper proposes a QoS gateway discovery protocol that uses time delay and route stability as the gateway selection criteria. Based on this gateway discovery protocol, it also proposes a fast handover scheme that decreases handover time and improves handover efficiency.
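A minimal sketch of the selection rule described, preferring gateways reachable over stable routes and, among those, choosing the lowest delay; the stability metric and threshold are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class GatewayRoute:
    gateway_id: str
    delay_ms: float          # measured delay of the gateway discovery reply
    route_lifetime_s: float  # predicted stability of the multi-hop route

def choose_gateway(routes, min_lifetime_s=5.0):
    """Among routes considered stable enough, pick the one with the lowest delay."""
    stable = [r for r in routes if r.route_lifetime_s >= min_lifetime_s]
    candidates = stable or routes          # fall back to all routes if none are stable
    return min(candidates, key=lambda r: r.delay_ms)

routes = [GatewayRoute("GW-1", 42.0, 3.0),
          GatewayRoute("GW-2", 55.0, 12.0),
          GatewayRoute("GW-3", 61.0, 20.0)]
print(choose_gateway(routes).gateway_id)   # GW-2: lowest delay among stable routes
```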
Yu, Bo; Su, Fei; Wang, Limin; Xu, Ke; Zhao, Bo; Xu, Ping
2011-01-01
Sporolactobacillus inulinus CASD is an efficient d-lactic acid producer with high optical purity. Here we report for the first time the draft genome sequence of S. inulinus (2,930,096 bp). The large number of annotated two-component system genes makes it possible to explore the mechanism of extraordinary lactate tolerance of S. inulinus CASD. PMID:21952540
Microplastics in seafood: Benchmark protocol for their extraction and characterization.
Dehaut, Alexandre; Cassone, Anne-Laure; Frère, Laura; Hermabessiere, Ludovic; Himber, Charlotte; Rinnert, Emmanuel; Rivière, Gilles; Lambert, Christophe; Soudant, Philippe; Huvet, Arnaud; Duflos, Guillaume; Paul-Pont, Ika
2016-08-01
Pollution of the oceans by microplastics (<5 mm) represents a major environmental problem. To date, a limited number of studies have investigated the level of contamination of marine organisms collected in situ. For extraction and characterization of microplastics in biological samples, the crucial step is the identification of solvent(s) or chemical(s) that efficiently dissolve organic matter without degrading plastic polymers for their identification in a time- and cost-effective way. Most published papers, as well as OSPAR recommendations for the development of a common monitoring protocol for plastic particles in fish and shellfish at the European level, use protocols containing nitric acid to digest the biological tissues, despite reports of polyamide degradation with this chemical. In the present study, six existing approaches were tested and their effects were compared on up to 15 different plastic polymers, as well as their efficiency in digesting biological matrices. Plastic integrity was evaluated through microscopic inspection, weighing, pyrolysis coupled with gas chromatography and mass spectrometry, and Raman spectrometry before and after digestion. Tissues from mussels, crabs and fish were digested before being filtered on glass fibre filters. Digestion efficiency was evaluated through microscopic inspection of the filters and determination of the relative removal of organic matter content after digestion. Five out of the six tested protocols led to significant degradation of plastic particles and/or insufficient tissue digestion. The protocol using a KOH 10% solution and incubation at 60 °C for a 24 h period led to an efficient digestion of biological tissues with no significant degradation of any tested polymer except cellulose acetate. This protocol appeared to be the best compromise for extraction and later identification of microplastics in biological samples and should be implemented in further monitoring studies to ensure relevance and comparison of environmental and seafood product quality studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi
2015-08-01
Radio Frequency Identification (RFID) based solutions are widely used for providing many healthcare applications, including patient monitoring, object traceability, drug administration systems and telecare medicine information systems (TMIS). In order to reduce malpractices and ensure patient privacy, in 2015, Srivastava et al. proposed a hash based RFID tag authentication protocol in TMIS. Their protocol uses lightweight hash operations and a synchronized secret value shared between the back-end server and the tag, and is more secure and efficient than other related RFID authentication protocols. Unfortunately, in this paper, we demonstrate that Srivastava et al.'s tag authentication protocol has a serious security problem: an adversary may use a stolen/lost reader to connect to the medical back-end server that stores information associated with tagged objects, and this privacy breach means the adversary could maliciously reveal medical data obtained via stolen/lost readers. Therefore, we propose a secure and efficient RFID tag authentication protocol to overcome the security flaws and improve system efficiency. Compared with Srivastava et al.'s protocol, the proposed protocol not only inherits the advantages of Srivastava et al.'s authentication protocol for TMIS but also provides better security with high system efficiency.
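As background, a generic hash-based challenge-response with a synchronized secret (the general style of protocol discussed above) can be sketched as follows; this is an illustration of the idea, not the Srivastava et al. protocol nor the improved protocol proposed here:

```python
import hashlib, secrets

def h(*parts: bytes) -> bytes:
    """Hash helper: SHA-256 over the concatenated, delimited inputs."""
    return hashlib.sha256(b"|".join(parts)).digest()

# Back-end server and tag share a synchronized secret K, updated after each run.
K_server = K_tag = secrets.token_bytes(16)

def authenticate():
    global K_server, K_tag
    nonce_reader = secrets.token_bytes(8)                 # reader -> tag challenge
    nonce_tag = secrets.token_bytes(8)                    # tag's own nonce
    tag_response = h(K_tag, nonce_reader, nonce_tag)      # tag -> server via reader
    expected = h(K_server, nonce_reader, nonce_tag)       # server recomputes locally
    if tag_response != expected:
        return False
    # Both sides roll the shared secret forward so it stays synchronized.
    K_server = h(K_server, nonce_tag)
    K_tag = h(K_tag, nonce_tag)
    return True

print(authenticate())   # True when the tag holds the synchronized secret
```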
Mumford, Leslie; Lam, Rachel; Wright, Virginia; Chau, Tom
2014-08-01
This study applied response efficiency theory to create the Access Technology Delivery Protocol (ATDP), a child and family-centred collaborative approach to the implementation of access technologies. We conducted a descriptive, mixed methods case study to demonstrate the ATDP method with a 12-year-old boy with no reliable means of access to an external device. Evaluations of response efficiency, satisfaction, goal attainment, technology use and participation were made after 8 and 16 weeks of training with a custom smile-based access technology. At the 16 week mark, the new access technology offered better response quality; teacher satisfaction was high; average technology usage was 3-4 times per week for up to 1 h each time; switch sensitivity and specificity reached 78% and 64%, respectively, and participation scores increased by 38%. This case supports further development and testing of the ATDP with additional children with multiple or severe disabilities.
A survey of system architecture requirements for health care-based wireless sensor networks.
Egbogah, Emeka E; Fapojuwo, Abraham O
2011-01-01
Wireless Sensor Networks (WSNs) have emerged as a viable technology for a vast number of applications, including health care applications. To best support these health care applications, WSN technology can be adopted for the design of practical Health Care WSNs (HCWSNs) that support the key system architecture requirements of reliable communication, node mobility support, multicast technology, energy efficiency, and the timely delivery of data. Work in the literature mostly focuses on the physical design of the HCWSNs (e.g., wearable sensors, in vivo embedded sensors, et cetera). However, work towards enhancing the communication layers (i.e., routing, medium access control, et cetera) to improve HCWSN performance is largely lacking. In this paper, the information gleaned from an extensive literature survey is shared in an effort to fortify the knowledge base for the communication aspect of HCWSNs. We highlight the major currently existing prototype HCWSNs and also provide the details of their routing protocol characteristics. We also explore the current state of the art in medium access control (MAC) protocols for WSNs, for the purpose of seeking an energy efficient solution that is robust to mobility and delivers data in a timely fashion. Furthermore, we review a number of reliable transport layer protocols, including a network coding based protocol from the literature, that are potentially suitable for delivering end-to-end reliability of data transmitted in HCWSNs. We identify the advantages and disadvantages of the reviewed MAC, routing, and transport layer protocols as they pertain to the design and implementation of a HCWSN. The findings from this literature survey will serve as a useful foundation for designing a reliable HCWSN and also contribute to the development and evaluation of protocols for improving the performance of future HCWSNs. Open issues that required further investigations are highlighted.
Teacher and Peer Responsivity to Pro-Social Behaviour of High Aggressors in Preschool
ERIC Educational Resources Information Center
McComas, Jennifer J.; Johnson, LeAnne; Symons, Frank J.
2005-01-01
Naturally occurring aggressive and pro-social behaviour among 12 preschool children was examined in relation to teacher and peer responsiveness. A standardized real-time direct observational protocol was used in the context of a repeated measures design to measure the frequency and sequences of aggressive and pro-social behaviour of target…
Qi, Peng; Gimode, Davis; Saha, Dipnarayan; Schröder, Stephan; Chakraborty, Debkanta; Wang, Xuewen; Dida, Mathews M; Malmberg, Russell L; Devos, Katrien M
2018-06-15
Research on orphan crops is often hindered by a lack of genomic resources. With the advent of affordable sequencing technologies, genotyping an entire genome or, for large-genome species, a representative fraction of the genome has become feasible for any crop. Nevertheless, most genotyping-by-sequencing (GBS) methods are geared towards obtaining large numbers of markers at low sequence depth, which excludes their application in heterozygous individuals. Furthermore, bioinformatics pipelines often lack the flexibility to deal with paired-end reads or to be applied in polyploid species. UGbS-Flex combines publicly available software with in-house python and perl scripts to efficiently call SNPs from genotyping-by-sequencing reads irrespective of the species' ploidy level, breeding system and availability of a reference genome. Noteworthy features of the UGbS-Flex pipeline are an ability to use paired-end reads as input, an effective approach to cluster reads across samples with enhanced outputs, and maximization of SNP calling. We demonstrate use of the pipeline for the identification of several thousand high-confidence SNPs with high representation across samples in an F3-derived F2 population in the allotetraploid finger millet. Robust high-density genetic maps were constructed using the time-tested mapping program MAPMAKER which we upgraded to run efficiently and in a semi-automated manner in a Windows Command Prompt Environment. We exploited comparative GBS with one of the diploid ancestors of finger millet to assign linkage groups to subgenomes and demonstrate the presence of chromosomal rearrangements. The paper combines GBS protocol modifications, a novel flexible GBS analysis pipeline, UGbS-Flex, recommendations to maximize SNP identification, updated genetic mapping software, and the first high-density maps of finger millet. The modules used in the UGbS-Flex pipeline and for genetic mapping were applied to finger millet, an allotetraploid selfing species without a reference genome, as a case study. The UGbS-Flex modules, which can be run independently, are easily transferable to species with other breeding systems or ploidy levels.
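A minimal sketch of the kind of post-calling filter implied by "high-confidence SNPs with high representation across samples", keeping SNPs genotyped in enough samples at sufficient depth; the thresholds and data layout are illustrative and this is not the UGbS-Flex code:

```python
def filter_snps(snp_table, min_sample_fraction=0.8, min_depth=6):
    """Keep SNPs genotyped in enough samples at sufficient read depth.

    snp_table: dict mapping SNP id -> list of per-sample read depths
               (None for a missing call)."""
    kept = {}
    for snp_id, depths in snp_table.items():
        called = [d for d in depths if d is not None and d >= min_depth]
        if len(called) / len(depths) >= min_sample_fraction:
            kept[snp_id] = depths
    return kept

snps = {"chr1_1042": [8, 12, None, 9, 7],
        "chr2_5530": [3, None, None, 10, 2]}
print(list(filter_snps(snps)))   # ['chr1_1042']
```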
Gillotin, Sébastien; Guillemot, François
2016-06-20
Chromatin immunoprecipitation followed by deep sequencing (ChIP-Seq) is an important strategy to study gene regulation. When availability of cells is limited, however, it can be useful to focus on specific genes to investigate in depth the role of transcription factors or histone marks. Unfortunately, performing ChIP experiments to study transcription factors' binding to DNA can be difficult when biological material is restricted. This protocol describes a robust method to perform μChIP for over-expressed or endogenous transcription factors using ~100,000 cells per ChIP experiment (Masserdotti et al., 2015). We also describe optimization steps, which we think are critical for this protocol to work and which can be used to further reduce the number of cells.
Methods for the extraction and RNA profiling of exosomes
Zeringer, Emily; Li, Mu; Barta, Tim; Schageman, Jeoffrey; Pedersen, Ketil Winther; Neurauter, Axl; Magdaleno, Susan; Setterquist, Robert; Vlassov, Alexander V
2013-01-01
AIM: To develop protocols for isolation of exosomes and characterization of their RNA content. METHODS: Exosomes were extracted from HeLa cell culture media and human blood serum using the Total exosome isolation (from cell culture media) reagent, and Total exosome isolation (from serum) reagent respectively. Identity and purity of the exosomes were confirmed by Nanosight® analysis, electron microscopy, and Western blots for CD63 marker. Exosomal RNA cargo was recovered with the Total exosome RNA and protein isolation kit. Finally, RNA was profiled using Bioanalyzer and quantitative reverse transcription-polymerase chain reaction (qRT-PCR) methodology. RESULTS: Here we describe a novel approach for robust and scalable isolation of exosomes from cell culture media and serum, with subsequent isolation and analysis of RNA residing within these vesicles. The isolation procedure is completed in a fraction of the time, compared to the current standard protocols utilizing ultracentrifugation, and allows fully intact exosomes to be recovered in higher yields. Exosomes were found to contain a very diverse RNA cargo, primarily short sequences 20-200 nt (such as miRNA and fragments of mRNA); however, longer RNA species were detected as well, including full-length 18S and 28S rRNA. CONCLUSION: We have successfully developed a set of reagents and a workflow allowing fast and efficient extraction of exosomes, followed by isolation of RNA and its analysis by qRT-PCR and other techniques. PMID:25237619
Janjua, M Burhan; Hoffman, Caitlin E; Souweidane, Mark M
2017-11-01
The management of hydrocephalus can be challenging even in expert hands. Due to acute presentation, recurrence, accompanying complications, and the need for urgent diagnosis, a robust management plan is an absolute necessity. We devised a novel, time-efficient surveillance strategy for emergency and clinic follow-up settings that has not previously been described in the literature. We searched all articles addressing management/surveillance protocols for pediatric hydrocephalus using the terms "hydrocephalus follow up" or "surveillance protocol after hydrocephalus treatment". The authors present their own strategy based on extensive experience in hydrocephalus management at a single institution. The need for diagnostic laboratory testing, age- and presentation-based radiological imaging, the significance of the neuro-ophthalmological exam, and when to consider emergent exploration are discussed in detail. Moreover, a definitive triaging strategy is described with the help of flow chart diagrams for clinicians and neurosurgeons in practice. The triage starts with a detailed history, physical exam, necessary labs, and radiological imaging depending on the presentation and the age of the child. A quick head CT scan is helpful after shunt surgery, while a FAST-sequence MRI scan (fsMRI) is important in post-ETV patients. The neuro-ophthalmological exam and the shunt series remain vital in asymptomatic patients during regular follow-up. Copyright © 2017 Elsevier Ltd. All rights reserved.
Olova, Nelly; Krueger, Felix; Andrews, Simon; Oxley, David; Berrens, Rebecca V; Branco, Miguel R; Reik, Wolf
2018-03-15
Whole-genome bisulfite sequencing (WGBS) is becoming an increasingly accessible technique, used widely for both fundamental and disease-oriented research. Library preparation methods benefit from a variety of available kits, polymerases and bisulfite conversion protocols. Although some steps in the procedure, such as PCR amplification, are known to introduce biases, a systematic evaluation of biases in WGBS strategies is missing. We perform a comparative analysis of several commonly used pre- and post-bisulfite WGBS library preparation protocols for their performance and quality of sequencing outputs. Our results show that bisulfite conversion per se is the main trigger of pronounced sequencing biases, and PCR amplification builds on these underlying artefacts. The majority of standard library preparation methods yield a significantly biased sequence output and overestimate global methylation. Importantly, both absolute and relative methylation levels at specific genomic regions vary substantially between methods, with clear implications for DNA methylation studies. We show that amplification-free library preparation is the least biased approach for WGBS. In protocols with amplification, the choice of bisulfite conversion protocol or polymerase can significantly minimize artefacts. To aid with the quality assessment of existing WGBS datasets, we have integrated a bias diagnostic tool in the Bismark package and offer several approaches for consideration during the preparation and analysis of WGBS datasets.
Walker, Andreas; Siemann, Holger; Groten, Svenja; Ross, R Stefan; Scherbaum, Norbert; Timm, Jörg
2015-09-01
People who inject drugs (PWID) are the most important risk group for incident Hepatitis C virus (HCV) infection. In PWID in Europe HCV genotype 3a is highly prevalent. Unfortunately, many of the recently developed directly acting antiviral drugs against HCV (DAAs) are suboptimal for treatment of this genotype. Detection of resistance-associated variants (RAV) in genotype 3a may help to optimize treatment decisions, however, robust protocols for amplification and sequencing of HCV NS5A as an important target for treatment of genotype 3a are currently lacking. The aim of this study was to establish a protocol for sequencing of HCV NS5A in genotype 3a and to determine the frequency of RAVs in treatment-naïve PWID living in Germany. The full NS5A region was amplified and sequenced from 110 HCV genotype 3a infected PWID using an in-house PCR protocol. With the established protocol the complete NS5A region was successfully amplified and sequenced from 110 out of 112 (98.2%) genotype 3a infected PWID. Phylogenetic analysis of sequences from PWID together with unrelated genotype 3a sequences from a public database showed a scattered distribution without geographic clustering. Viral polymorphisms A30K and Y93H known to confer resistance in a GT3a replication model were present in 8 subjects (7.2%). A protocol for amplification of nearly all GT3a samples was successfully established. Substitutions conferring resistance to NS5A inhibitors were detected in a few treatment-naive PWID. Copyright © 2015 Elsevier B.V. All rights reserved.
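Screening translated NS5A sequences for the two substitutions mentioned (A30K and Y93H) is a simple positional check; a sketch using 1-based protein positions and an invented toy sequence:

```python
RAVS = {30: ("A", "K"), 93: ("Y", "H")}   # position: (reference, resistance) residues

def find_ravs(ns5a_protein: str):
    """Report resistance-associated substitutions present in one NS5A protein sequence."""
    hits = []
    for pos, (ref, res) in RAVS.items():
        aa = ns5a_protein[pos - 1]        # convert 1-based protein position to index
        if aa == res:
            hits.append(f"{ref}{pos}{res}")
    return hits

# Invented toy sequence long enough to cover position 93, with K at 30 and H at 93:
toy = list("S" * 100)
toy[29], toy[92] = "K", "H"
print(find_ravs("".join(toy)))   # ['A30K', 'Y93H']
```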
Laboratory procedures to generate viral metagenomes.
Thurber, Rebecca V; Haynes, Matthew; Breitbart, Mya; Wegley, Linda; Rohwer, Forest
2009-01-01
This collection of laboratory protocols describes the steps to collect viruses from various samples with the specific aim of generating viral metagenome sequence libraries (viromes). Viral metagenomics, the study of uncultured viral nucleic acid sequences from different biomes, relies on several concentration, purification, extraction, sequencing and heuristic bioinformatic methods. No single technique can provide an all-inclusive approach, and therefore the protocols presented here will be discussed in terms of hypothetical projects. However, care must be taken to individualize each step depending on the source and type of viral-particles. This protocol is a description of the processes we have successfully used to: (i) concentrate viral particles from various types of samples, (ii) eliminate contaminating cells and free nucleic acids and (iii) extract, amplify and purify viral nucleic acids. Overall, a sample can be processed to isolate viral nucleic acids suitable for high-throughput sequencing in approximately 1 week.
Bartlett, Sofia R; Grebely, Jason; Eltahla, Auda A; Reeves, Jacqueline D; Howe, Anita Y M; Miller, Veronica; Ceccherini-Silberstein, Francesca; Bull, Rowena A; Douglas, Mark W; Dore, Gregory J; Harrington, Patrick; Lloyd, Andrew R; Jacka, Brendan; Matthews, Gail V; Wang, Gary P; Pawlotsky, Jean-Michel; Feld, Jordan J; Schinkel, Janke; Garcia, Federico; Lennerstrand, Johan; Applegate, Tanya L
2017-07-01
The significance of the clinical impact of direct-acting antiviral (DAA) resistance-associated substitutions (RASs) in hepatitis C virus (HCV) on treatment failure is unclear. No standardized methods or guidelines for detection of DAA RASs in HCV exist. To facilitate further evaluations of the impact of DAA RASs in HCV, we conducted a systematic review of RAS sequencing protocols, compiled a comprehensive public library of sequencing primers, and provided expert guidance on the most appropriate methods to screen and identify RASs. The development of standardized RAS sequencing protocols is complicated due to a high genetic variability and the need for genotype- and subtype-specific protocols for multiple regions. We have identified several limitations of the available methods and have highlighted areas requiring further research and development. The development, validation, and sharing of standardized methods for all genotypes and subtypes should be a priority. (Hepatology Communications 2017;1:379-390).
Spitzer Space Telescope Sequencing Operations Software, Strategies, and Lessons Learned
NASA Technical Reports Server (NTRS)
Bliss, David A.
2006-01-01
The Space Infrared Telescope Facility (SIRTF) was launched in August, 2003, and renamed the Spitzer Space Telescope in 2004. Two years of observing the universe in the wavelength range from 3 to 180 microns has yielded enormous scientific discoveries. Since this magnificent observatory has a limited lifetime, maximizing science viewing efficiency (i.e., maximizing time spent executing activities directly related to science observations) was the key operational objective. The strategy employed for maximizing science viewing efficiency was to optimize spacecraft flexibility, adaptability, and use of observation time. The selected approach involved implementation of a multi-engine sequencing architecture coupled with nondeterministic spacecraft and science execution times. This approach, though effective, added much complexity to uplink operations and sequence development. The Jet Propulsion Laboratory (JPL) manages Spitzer's operations. As part of the uplink process, Spitzer's Mission Sequence Team (MST) was tasked with processing observatory inputs from the Spitzer Science Center (SSC) into efficiently integrated, constraint-checked, and modeled review and command products which accommodated the complexity of non-deterministic spacecraft and science event executions without increasing operations costs. The MST developed processes and scripts, and participated in the adaptation of multi-mission core software to enable rapid processing of complex sequences. The MST was also tasked with developing a Downlink Keyword File (DKF) which could instruct Deep Space Network (DSN) stations on how and when to configure themselves to receive Spitzer science data. As MST and uplink operations developed, important lessons were learned that should be applied to future missions, especially those missions which employ command-intensive operations via a multi-engine sequence architecture.
Neuromodulating Attention and Mind-Wandering Processes with a Single Session Real Time EEG.
Gonçalves, Óscar F; Carvalho, Sandra; Mendes, Augusto J; Leite, Jorge; Boggio, Paulo S
2018-06-01
Our minds are continuously alternating between external attention (EA) and mind wandering (MW). An appropriate balance between EA and MW is important for promoting efficient perceptual processing, executive functioning, decision-making, auto-biographical memory, and creativity. There is evidence that EA processes are associated with increased activity in high-frequency EEG bands (e.g., SMR), contrasting with the dominance of low-frequency bands during MW (e.g., Theta). The aim of the present study was to test the effects of two distinct single session real-time EEG (rtEEG) protocols (SMR up-training/Theta down-training-SMR⇑Theta⇓; Theta up-training/SMR down-training-Theta⇑SMR⇓) on EA and MW processes. Thirty healthy volunteers were randomly assigned to one of two rtEEG training protocols (SMR⇑Theta⇓; Theta⇑SMR⇓). Before and after the rtEEG training, participants completed the attention network task (ANT) along with several MW measures. Both training protocols were effective in increasing SMR (SMR⇑Theta⇓) and theta (Theta⇑SMR⇓) amplitudes but not in decreasing the amplitude of down-trained bands. There were no significant effects of the rtEEG training in either EA or MW measures. However, there was a significant positive correlation between post-training SMR increases and the use of deliberate MW (rather than spontaneous) strategies. Additionally, for the Theta⇑SMR⇓ protocol, increase in post-training Theta amplitude was significantly associated with a decreased efficiency in the orientation network.
The "Motor" in Implicit Motor Sequence Learning: A Foot-stepping Serial Reaction Time Task.
Du, Yue; Clark, Jane E
2018-05-03
This protocol describes a modified serial reaction time (SRT) task used to study implicit motor sequence learning. Unlike the classic SRT task that involves finger-pressing movements while sitting, the modified SRT task requires participants to step with both feet while maintaining a standing posture. This stepping task necessitates whole body actions that impose postural challenges. The foot-stepping task complements the classic SRT task in several ways. The foot-stepping SRT task is a better proxy for the daily activities that require ongoing postural control, and thus may help us better understand sequence learning in real-life situations. In addition, response time serves as an indicator of sequence learning in the classic SRT task, but it is unclear whether response time, reaction time (RT) representing mental process, or movement time (MT) reflecting the movement itself, is a key player in motor sequence learning. The foot-stepping SRT task allows researchers to disentangle response time into RT and MT, which may clarify how motor planning and movement execution are involved in sequence learning. Lastly, postural control and cognition are interactively related, but little is known about how postural control interacts with learning motor sequences. With a motion capture system, the movement of the whole body (e.g., the center of mass (COM)) can be recorded. Such measures allow us to reveal the dynamic processes underlying discrete responses measured by RT and MT, and may aid in elucidating the relationship between postural control and the explicit and implicit processes involved in sequence learning. Details of the experimental set-up, procedure, and data processing are described. The representative data are adopted from one of our previous studies. Results are related to response time, RT, and MT, as well as the relationship between the anticipatory postural response and the explicit processes involved in implicit motor sequence learning.
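The decomposition of response time into reaction time (RT) and movement time (MT) described above is a simple difference of event timestamps; a sketch with invented timings for one stepping trial:

```python
def decompose_response(stimulus_onset_ms, movement_onset_ms, target_contact_ms):
    """Split total response time into reaction time (RT) and movement time (MT)."""
    rt = movement_onset_ms - stimulus_onset_ms    # mental processing before the step starts
    mt = target_contact_ms - movement_onset_ms    # execution of the stepping movement
    return rt, mt, rt + mt

# One hypothetical trial: stimulus at 0 ms, foot-off at 420 ms, target contact at 910 ms
rt, mt, response_time = decompose_response(0, 420, 910)
print(f"RT = {rt} ms, MT = {mt} ms, response time = {response_time} ms")
```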
Enguita, Francisco J.; Costa, Marina C.; Fusco-Almeida, Ana Marisa; Mendes-Giannini, Maria José; Leitão, Ana Lúcia
2016-01-01
Fungal invasive infections are an increasing health problem. The intrinsic complexity of pathogenic fungi and the unmet clinical need for new and more effective treatments requires a detailed knowledge of the infection process. During infection, fungal pathogens are able to trigger a specific transcriptional program in their host cells. The detailed knowledge of this transcriptional program will allow for a better understanding of the infection process and consequently will help in the future design of more efficient therapeutic strategies. Simultaneous transcriptomic studies of pathogen and host by high-throughput sequencing (dual RNA-seq) is an unbiased protocol to understand the intricate regulatory networks underlying the infectious process. This protocol is starting to be applied to the study of the interactions between fungal pathogens and their hosts. To date, our knowledge of the molecular basis of infection for fungal pathogens is still very limited, and the putative role of regulatory players such as non-coding RNAs or epigenetic factors remains elusive. The wider application of high-throughput transcriptomics in the near future will help to understand the fungal mechanisms for colonization and survival, as well as to characterize the molecular responses of the host cell against a fungal infection. PMID:29376924
Robust and effective methodologies for cryopreservation and DNA extraction from anaerobic gut fungi.
Solomon, Kevin V; Henske, John K; Theodorou, Michael K; O'Malley, Michelle A
2016-04-01
Cell storage and DNA isolation are essential to developing an expanded suite of microorganisms for biotechnology. However, many features of non-model microbes, such as an anaerobic lifestyle and rigid cell wall, present formidable challenges to creating strain repositories and extracting high quality genomic DNA. Here, we establish accessible, high efficiency, and robust techniques to store lignocellulolytic anaerobic gut fungi long term without specialized equipment. Using glycerol as a cryoprotectant, gut fungal isolates were preserved for a minimum of 23 months at -80 °C. Unlike previously reported approaches, this improved protocol is non-toxic and rapid, with samples surviving twice as long with negligible growth impact. Genomic DNA extraction for these isolates was optimized to yield samples compatible with next generation sequencing platforms (e.g. Illumina, PacBio). Popular DNA isolation kits and precipitation protocols yielded preps that were unsuitable for sequencing due to carbohydrate contaminants from the chitin-rich cell wall and extensive energy reserves of gut fungi. To address this, we identified a proprietary method optimized for hardy plant samples that rapidly yielded DNA fragments in excess of 10 kb with minimal RNA, protein or carbohydrate contamination. Collectively, these techniques serve as fundamental tools to manipulate powerful biomass-degrading gut fungi and improve their accessibility among researchers. Copyright © 2015 Elsevier Ltd. All rights reserved.
James, Dorsha N; Voskresensky, Igor V; Jack, Meg; Cotton, Bryan A
2009-06-01
Pre-hospital airway management represents the intervention most likely to impact outcomes in critically injured patients. As such, airway management issues dominate quality improvement (QI) reviews of aero-medical programs. The purpose of this study was to evaluate current practice patterns of airway management in trauma among U.S. aero-medical service (AMS) programs. The Association of Air Medical Services (AAMS) Resource Guide from 2005 to 2006 was utilized to identify the e-mail addresses of all directors of U.S. aero-medical transport programs. Program directors from 182 U.S. aero-medical programs were asked to participate in an anonymous, web-based survey of emergency airway management protocols and practices. Non-responders to the initial request were contacted a second time by e-mail. 89 programs responded. 98.9% have rapid sequence intubation (RSI) protocols. 90% use succinylcholine, 70% use long-acting neuromuscular blockers (NMB) within their RSI protocol. 77% have protocols for mandatory in-flight sedation but only 13% have similar protocols for maintenance paralytics. 60% administer long-acting NMB immediately after RSI, 13% after confirmation of neurological activity. Given clinical scenarios, however, 97% administer long-acting NMB to patients with scene and in-flight Glasgow Coma Scale (GCS) of 3, even for brief transport times. The majority of AMS programs have well defined RSI and in-flight sedation protocols, while protocols for in-flight NMB are uncommon. Despite this, nearly all programs administer long-acting NMB following RSI, irrespective of GCS or flight time. Given the impact of in-flight NMB on initial assessment, early intervention, and injury severity scoring, a critical appraisal of current AMS airway management practices appears warranted.
GPSit: An automated method for evolutionary analysis of nonculturable ciliated microeukaryotes.
Chen, Xiao; Wang, Yurui; Sheng, Yalan; Warren, Alan; Gao, Shan
2018-05-01
Microeukaryotes are among the most important components of the microbial food web in almost all aquatic and terrestrial ecosystems worldwide. In order to gain a better understanding of their roles and functions in ecosystems, sequencing coupled with phylogenomic analyses of entire genomes or transcriptomes is increasingly used to reconstruct the evolutionary history and classification of these microeukaryotes and thus provide a more robust framework for determining their systematics and diversity. More importantly, phylogenomic research usually requires high levels of hands-on bioinformatics experience. Here, we propose an efficient automated method, "Guided Phylogenomic Search in trees" (GPSit), which starts from predicted protein sequences of newly sequenced species and a well-defined customized orthologous database. Compared with previous protocols, our method streamlines the entire workflow by integrating all essential and other optional operations. In so doing, the manual operation time for reconstructing phylogenetic relationships is reduced from days to several hours, compared to other methods. Furthermore, GPSit supports user-defined parameters in most steps and thus allows users to adapt it to their studies. The effectiveness of GPSit is demonstrated by incorporating available online data and new single-cell data of three nonculturable marine ciliates (Anteholosticha monilata, Deviata sp. and Diophrys scutum) under moderate sequencing coverage (~5×). Our results indicate that the former could reconstruct robust "deep" phylogenetic relationships while the latter reveals the presence of intermediate taxa in shallow relationships. Based on empirical phylogenomic data, we also used GPSit to evaluate the impact of different levels of missing data on two commonly used methods of phylogenetic analyses, maximum likelihood (ML) and Bayesian inference (BI) methods. We found that BI is less sensitive to missing data when fast-evolving sites are removed. © 2018 John Wiley & Sons Ltd.