Canver, Matthew C; Lessard, Samuel; Pinello, Luca; Wu, Yuxuan; Ilboudo, Yann; Stern, Emily N; Needleman, Austen J; Galactéros, Frédéric; Brugnara, Carlo; Kutlar, Abdullah; McKenzie, Colin; Reid, Marvin; Chen, Diane D; Das, Partha Pratim; A Cole, Mitchel; Zeng, Jing; Kurita, Ryo; Nakamura, Yukio; Yuan, Guo-Cheng; Lettre, Guillaume; Bauer, Daniel E; Orkin, Stuart H
2017-04-01
Cas9-mediated, high-throughput, saturating in situ mutagenesis permits fine-mapping of function across genomic segments. Disease- and trait-associated variants identified in genome-wide association studies largely cluster at regulatory loci. Here we demonstrate the use of multiple designer nucleases and variant-aware library design to interrogate trait-associated regulatory DNA at high resolution. We developed a computational tool for the creation of saturating-mutagenesis libraries with single or multiple nucleases with incorporation of variants. We applied this methodology to the HBS1L-MYB intergenic region, which is associated with red-blood-cell traits, including fetal hemoglobin levels. This approach identified putative regulatory elements that control MYB expression. Analysis of genomic copy number highlighted potential false-positive regions, thus emphasizing the importance of off-target analysis in the design of saturating-mutagenesis experiments. Together, these data establish a widely applicable high-throughput and high-resolution methodology to identify minimal functional sequences within large disease- and trait-associated regions.
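As a rough illustration of the kind of computation such a library-design tool performs, the sketch below enumerates candidate SpCas9 guides tiling a target region by scanning both strands for NGG PAMs. It is not the authors' published software; the 20-nt spacer length, the PAM convention, and all function names are assumptions made for illustration.

```python
# Minimal sketch of tiling-guide enumeration for a saturating-mutagenesis screen.
# This is not the authors' published library-design tool; the SpCas9 20-nt spacer,
# the NGG PAM convention, and all names here are illustrative assumptions.

def enumerate_spacers(region, spacer_len=20):
    """Return (pam_index, strand, spacer) for every NGG PAM found in `region`."""
    guides = []
    rc = region.translate(str.maketrans("ACGT", "TGCA"))[::-1]  # reverse complement
    for seq, strand in ((region, "+"), (rc, "-")):
        for i in range(spacer_len, len(seq) - 2):       # i = position of the PAM's N
            if seq[i + 1:i + 3] == "GG":                # N-G-G immediately 3' of the spacer
                guides.append((i, strand, seq[i - spacer_len:i]))
    return guides  # reverse-strand indices are in reverse-complement coordinates

if __name__ == "__main__":
    demo = "ACGTTGCCGGATATCGGTACCGGGTTAACCGGTTGCACGTACGTAGG"
    for pam_i, strand, spacer in enumerate_spacers(demo):
        print(strand, pam_i, spacer)
```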
International business communications via Intelsat K-band transponders
NASA Astrophysics Data System (ADS)
Hagmann, W.; Rhodes, S.; Fang, R.
This paper discusses how the transponder throughput and the required earth station HPA power in the Intelsat Business Services Network vary as a function of coding rate and required fade margin. The results indicate that transponder throughputs of 40 to 50 Mbit/s are achievable. A comparison of time domain simulation results with results based on a straightforward link analysis shows that the link analysis results may be fairly optimistic if the satellite traveling wave tube amplifier (TWTA) is operated near saturation; however, there is good agreement for large backoffs.
Functional mapping of yeast genomes by saturated transposition
Michel, Agnès H; Hatakeyama, Riko; Kimmig, Philipp; Arter, Meret; Peter, Matthias; Matos, Joao; De Virgilio, Claudio; Kornmann, Benoît
2017-01-01
Yeast is a powerful model for systems genetics. We present a versatile, time- and labor-efficient method to functionally explore the Saccharomyces cerevisiae genome using saturated transposon mutagenesis coupled to high-throughput sequencing. SAturated Transposon Analysis in Yeast (SATAY) allows one-step mapping of all genetic loci in which transposons can insert without disrupting essential functions. SATAY is particularly suited to discover loci important for growth under various conditions. SATAY (1) reveals positive and negative genetic interactions in single and multiple mutant strains, (2) can identify drug targets, (3) detects not only essential genes, but also essential protein domains, (4) generates both null and other informative alleles. In a SATAY screen for rapamycin-resistant mutants, we identify Pib2 (PhosphoInositide-Binding 2) as a master regulator of TORC1. We describe two antagonistic TORC1-activating and -inhibiting activities located on opposite ends of Pib2. Thus, SATAY makes it easy to explore the yeast genome at unprecedented resolution and throughput. DOI: http://dx.doi.org/10.7554/eLife.23570.001 PMID:28481201
USDA-ARS's Scientific Manuscript database
Rapid development of highly saturated genetic maps aids molecular breeding, which can accelerate gain per breeding cycle in woody perennial plants such as Rubus idaeus (red raspberry). Recently, robust genotyping methods based on high-throughput sequencing were developed, which provide high marker d...
Beyond the Natural Proteome: Nondegenerate Saturation Mutagenesis-Methodologies and Advantages.
Ferreira Amaral, M M; Frigotto, L; Hine, A V
2017-01-01
Beyond the natural proteome, high-throughput mutagenesis offers the protein engineer an opportunity to "tweak" the wild-type activity of a protein to create a recombinant protein with required attributes. Of the various approaches available, saturation mutagenesis is one of the core techniques employed by protein engineers, and in recent times, nondegenerate saturation mutagenesis is emerging as the approach of choice. This review compares the current methodologies available for conducting nondegenerate saturation mutagenesis with traditional, degenerate saturation and briefly outlines the options available for screening the resulting libraries, to discover a novel protein with the required activity and/or specificity. © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Peng, Kaung-Jay; Wu, Chun-Lung; Lin, Yung-Hsiang; Wang, Hwai-Yung; Cheng, Chih-Hsien; Chi, Yu-Chieh; Lin, Gong-Ru
2018-01-01
Using the evanescent-wave saturation effect of hydrogen-free low-temperature synthesized few-layer graphene covered on the cladding region of a side-polished single-mode fiber, a blue pump/infrared probe-based all-optical switch is demonstrated with specific wavelength-dependent probe modulation efficiency. Under the illumination of a blue laser diode at 405 nm, the few-layer graphene exhibits cross-gain modulation at different wavelengths covering the C- and L-bands. At a probe power of 0.5 mW, the L-band switching throughput power variant of 16 μW results in a probe modulation depth of 3.2%. Blue-shifting the probe wavelength from 1580 to 1520 nm further enlarges the switching throughput power variant to 24 μW and enhances the probe modulation depth to 5%. Increasing the probe power from 0.5 to 1 mW further enlarges the switching throughput power variant from 25 to 58 μW, promoting a probe modulation depth of up to 5.8% at 1520 nm. In contrast, the probe modulation depth degrades from 5.1% to 1.2% as the pumping power reduces from 85 to 24 mW, which is attributed to the saturable absorption of the few-layer graphene-based evanescent-wave absorber. The modulation depth at a wavelength of 1550 nm under a probe power of 1 mW increases from 1.2% to 5.1%, as more carriers can be excited when increasing the blue laser power from 24 to 85 mW, whereas it decreases from 5.1% to 3.3% by increasing the input probe power from 1 to 2 mW, indicating an easier saturated condition at longer wavelength.
Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I
2017-01-01
Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.
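The calibration step described above, converting fluorescence intensities into absolute receptor numbers via beads of known binding capacity, can be summarized with a short sketch. The bead values, the log-log linear fit, and the function names below are illustrative assumptions, not values from the protocol.

```python
# Sketch of the calibration step in quantitative flow cytometry (qFlow):
# fit bead MFI against known antibody-binding capacity (ABC), then convert
# cell MFI into receptors per cell.  All numbers below are illustrative only.
import numpy as np

bead_abc = np.array([4500, 24000, 67000, 240000])   # assumed bead ABC values
bead_mfi = np.array([310, 1650, 4600, 16500])       # assumed measured bead MFIs

# Calibration is commonly fit as a line on a log-log scale.
slope, intercept = np.polyfit(np.log10(bead_mfi), np.log10(bead_abc), 1)

def receptors_per_cell(cell_mfi):
    """Convert a cell's median fluorescence intensity to receptors per cell."""
    return 10 ** (slope * np.log10(cell_mfi) + intercept)

print(round(receptors_per_cell(2500)))   # receptors/cell for an example MFI
```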
High throughput system for magnetic manipulation of cells, polymers, and biomaterials
Spero, Richard Chasen; Vicci, Leandra; Cribb, Jeremy; Bober, David; Swaminathan, Vinay; O’Brien, E. Timothy; Rogers, Stephen L.; Superfine, R.
2008-01-01
In the past decade, high throughput screening (HTS) has changed the way biochemical assays are performed, but manipulation and mechanical measurement of micro- and nanoscale systems have not benefited from this trend. Techniques using microbeads (particles ∼0.1–10 μm) show promise for enabling high throughput mechanical measurements of microscopic systems. We demonstrate instrumentation to magnetically drive microbeads in a biocompatible, multiwell magnetic force system. It is based on commercial HTS standards and is scalable to 96 wells. Cells can be cultured in this magnetic high throughput system (MHTS). The MHTS can apply independently controlled forces to 16 specimen wells. Force calibrations demonstrate forces in excess of 1 nN, predicted force saturation as a function of pole material, and a power-law dependence of F ∼ r^(−2.7±0.1). We employ this system to measure the stiffness of SR2+ Drosophila cells. MHTS technology is a key step toward a high throughput screening system for micro- and nanoscale biophysical experiments. PMID:19044357
Stiffler, Michael A; Subramanian, Subu K; Salinas, Victor H; Ranganathan, Rama
2016-07-03
Site-directed mutagenesis has long been used as a method to interrogate protein structure, function and evolution. Recent advances in massively-parallel sequencing technology have opened up the possibility of assessing the functional or fitness effects of large numbers of mutations simultaneously. Here, we present a protocol for experimentally determining the effects of all possible single amino acid mutations in a protein of interest utilizing high-throughput sequencing technology, using the 263 amino acid antibiotic resistance enzyme TEM-1 β-lactamase as an example. In this approach, a whole-protein saturation mutagenesis library is constructed by site-directed mutagenic PCR, randomizing each position individually to all possible amino acids. The library is then transformed into bacteria, and selected for the ability to confer resistance to β-lactam antibiotics. The fitness effect of each mutation is then determined by deep sequencing of the library before and after selection. Importantly, this protocol introduces methods which maximize sequencing read depth and permit the simultaneous selection of the entire mutation library, by mixing adjacent positions into groups of length accommodated by high-throughput sequencing read length and utilizing orthogonal primers to barcode each group. Representative results using this protocol are provided by assessing the fitness effects of all single amino acid mutations in TEM-1 at a clinically relevant dosage of ampicillin. The method should be easily extendable to other proteins for which a high-throughput selection assay is in place.
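The core of the quantification step, a log enrichment of each mutant's read frequency after versus before selection normalized to wild type, can be sketched as follows. The read counts and mutation names are made up, and the exact normalization in the published protocol may differ.

```python
# Sketch of the fitness calculation typical of deep mutational scanning:
# log2 enrichment of each mutant's read frequency after vs. before selection,
# normalized to wild type.  Counts below are illustrative, not the paper's data.
import math

pre  = {"WT": 120000, "M69I": 950, "E104K": 40}    # reads before selection
post = {"WT": 180000, "M69I": 2100, "E104K": 3}    # reads after selection

def fitness(mut):
    pre_total, post_total = sum(pre.values()), sum(post.values())
    enrich    = (post[mut] / post_total) / (pre[mut] / pre_total)
    enrich_wt = (post["WT"] / post_total) / (pre["WT"] / pre_total)
    return math.log2(enrich / enrich_wt)            # 0 = neutral, < 0 = deleterious

for m in ("M69I", "E104K"):
    print(m, round(fitness(m), 2))
```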
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Poelgeest, F.; Niko, H.; Modwid, A.R.
1991-03-01
Shell Expro and Koninklijke/Shell E and P Laboratorium (KSEPL) have been engaged in a multidisciplinary effort to determine the water flood residual oil saturation (ROS) in two principal reservoirs of the Cormorant oil field in the U.K. sector of the North Sea. Data acquisition included special coring and testing. The study, which involved new reservoir-engineering and petrophysical techniques, was aimed at establishing consistent ROS values. This paper reports that reservoir-engineering work centered on reservoir-condition corefloods in the relative-permeability-at-reservoir-conditions (REPARC) apparatus, in which restoration of representative wettability condition was attempted with the aging technique. Aging results in a consistent reduction of water-wetness of all core samples. The study indicated that ROS values obtained on aged cores at water throughputs of at least 5 PV represented reservoir conditions. The petrophysical part of the study involved ROS estimation from sponge-core analysis and log evaluation.
Study of data I/O performance on distributed disk system in mask data preparation
NASA Astrophysics Data System (ADS)
Ohara, Shuichiro; Odaira, Hiroyuki; Chikanaga, Tomoyuki; Hamaji, Masakazu; Yoshioka, Yasuharu
2010-09-01
Data volumes in Mask Data Preparation (MDP) grow larger every day, while ever-faster data handling is required. An MDP flow typically introduces a Distributed Processing (DP) system to meet this demand, since using hundreds of CPUs is a reasonable solution. However, even as the number of CPUs increases, throughput may saturate because hard-disk I/O and network speeds become bottlenecks. MDP therefore must invest heavily not only in hundreds of CPUs but also in the storage and network hardware needed to sustain throughput. NCS introduces a new distributed processing system called "NDE", a distributed disk system that improves throughput without large investment because it is designed to use multiple conventional hard drives appropriately over the network. In this paper, NCS studies I/O performance with the OASIS® data format on NDE, showing how it contributes to high throughput.
A tunable hole-burning filter for lidar applications
NASA Astrophysics Data System (ADS)
Billmers, R. I.; Davis, J.; Squicciarini, M.
The fundamental physical principles for the development of a 'hole-burning' optical filter based on saturable absorption in dye-doped glasses are outlined. A model was developed to calculate the required pump intensity, throughput, and linewidth for this type of filter. Rhodamine 6G, operating at 532 nm, was found to require a 'warm-up' time of 110 pulses and a pump intensity of 100 kW/cm² per pulse. The linewidth was calculated to be approximately 15 GHz at 77 K with a throughput of at least 25 percent and five orders of magnitude noise suppression. A 'hole-burning' filter offers significant advantages over current filter technology, including tunability over a 10-nm bandwidth, perfect wavelength and bandwidth matching to the transmitting laser in a pulsed lidar system, transform limited response times, and moderately high throughputs (at least 25 percent).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yanguas-Gil, Angel; Elam, Jeffrey W.
2014-05-01
In this work, the authors present analytic models for atomic layer deposition (ALD) in three common experimental configurations: cross-flow, particle coating, and spatial ALD. These models, based on the plug-flow and well-mixed approximations, allow us to determine the minimum dose times and materials utilization for all three configurations. A comparison between the three models shows that throughput and precursor utilization can each be expressed by universal equations, in which the particularity of the experimental system is contained in a single parameter related to the residence time of the precursor in the reactor. For the case of cross-flow reactors, the authors show how simple analytic expressions for the reactor saturation profiles agree well with experimental results. Consequently, the analytic model can be used to extract information about the ALD surface chemistry (e.g., the reaction probability) by comparing the analytic and experimental saturation profiles, providing a useful tool for characterizing new and existing ALD processes. (C) 2014 American Vacuum Society
NASA Astrophysics Data System (ADS)
Yu, Nanyang; Wei, Si; Li, Meiying; Yang, Jingping; Li, Kan; Jin, Ling; Xie, Yuwei; Giesy, John P.; Zhang, Xiaowei; Yu, Hongxia
2016-04-01
Perfluorooctanoic acid (PFOA), a perfluoroalkyl acid, can result in hepatotoxicity and neurobehavioral effects in animals. The metabolome, which serves as a connection among transcriptome, proteome and toxic effects, provides pathway-based insights into effects of PFOA. Since understanding of changes in the metabolic profile during hepatotoxicity and neurotoxicity was still incomplete, a high-throughput targeted metabolomics approach (278 metabolites) was used to investigate effects of exposure to PFOA for 28 d on brain and liver of male Balb/c mice. Results of multivariate statistical analysis indicated that PFOA caused alterations in metabolic pathways in exposed individuals. Pathway analysis suggested that PFOA affected metabolism of amino acids, lipids, carbohydrates and energetics. Ten and 18 metabolites were identified as potential unique biomarkers of exposure to PFOA in brain and liver, respectively. In brain, PFOA affected concentrations of neurotransmitters, including serotonin, dopamine, norepinephrine, and glutamate, which provides novel insights into mechanisms of PFOA-induced neurobehavioral effects. In liver, profiles of lipids revealed involvement of β-oxidation and biosynthesis of saturated and unsaturated fatty acids in PFOA-induced hepatotoxicity, while alterations in arachidonic acid metabolism suggested the potential of PFOA to cause an inflammatory response in liver. These results provide insight into the mechanism and biomarkers for PFOA-induced effects.
Zych, Konrad; Li, Yang; van der Velde, Joeri K; Joosen, Ronny V L; Ligterink, Wilco; Jansen, Ritsert C; Arends, Danny
2015-02-19
Genetic markers and maps are instrumental in quantitative trait locus (QTL) mapping in segregating populations. The resolution of QTL localization depends on the number of informative recombinations in the population and how well they are tagged by markers. Larger populations and denser marker maps are better for detecting and locating QTLs. Marker maps that are initially too sparse can be saturated or derived de novo from high-throughput omics data, (e.g. gene expression, protein or metabolite abundance). If these molecular phenotypes are affected by genetic variation due to a major QTL they will show a clear multimodal distribution. Using this information, phenotypes can be converted into genetic markers. The Pheno2Geno tool uses mixture modeling to select phenotypes and transform them into genetic markers suitable for construction and/or saturation of a genetic map. Pheno2Geno excludes candidate genetic markers that show evidence for multiple possibly epistatically interacting QTL and/or interaction with the environment, in order to provide a set of robust markers for follow-up QTL mapping. We demonstrate the use of Pheno2Geno on gene expression data of 370,000 probes in 148 A. thaliana recombinant inbred lines. Pheno2Geno is able to saturate the existing genetic map, decreasing the average distance between markers from 7.1 cM to 0.89 cM, close to the theoretical limit of 0.68 cM (with 148 individuals we expect a recombination every 100/148=0.68 cM); this pinpointed almost all of the informative recombinations in the population. The Pheno2Geno package makes use of genome-wide molecular profiling and provides a tool for high-throughput de novo map construction and saturation of existing genetic maps. Processing of the showcase dataset takes less than 30 minutes on an average desktop PC. Pheno2Geno improves QTL mapping results at no additional laboratory cost and with minimum computational effort. Its results are formatted for direct use in R/qtl, the leading R package for QTL studies. Pheno2Geno is freely available on CRAN under "GNU GPL v3". The Pheno2Geno package as well as the tutorial can also be found at: http://pheno2geno.nl .
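The central idea, converting a bimodal molecular phenotype into a genetic marker via mixture modeling, is sketched below with a two-component Gaussian mixture. The simulated data, thresholds, and acceptance checks are illustrative assumptions; the actual package adds the multi-QTL and genotype-by-environment filtering described above.

```python
# Sketch of the core Pheno2Geno idea: an expression phenotype that is bimodal
# because of a major QTL is split by a two-component mixture model into
# genotype calls usable as a pseudo-marker.  Data and acceptance criteria here
# are illustrative assumptions, not the package's implementation.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# simulated expression for 148 lines: two genotype classes with distinct means
expr = np.concatenate([rng.normal(5.0, 0.4, 74), rng.normal(7.5, 0.4, 74)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(expr.reshape(-1, 1))
calls = gmm.predict(expr.reshape(-1, 1))            # 0/1 genotype call per line
posterior = gmm.predict_proba(expr.reshape(-1, 1)).max(axis=1)

# keep the marker only if calls are confident and roughly 1:1, as expected
# for a recombinant inbred population
if posterior.min() > 0.8 and 0.4 < calls.mean() < 0.6:
    print("usable marker; allele counts:", np.bincount(calls))
```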
Zhu, Jianping; Tao, Zhengsu; Lv, Chunfeng
2012-01-01
Studies of the IEEE 802.15.4 Carrier Sense Multiple Access with Collision Avoidance (CSMA/CA) scheme have received considerable attention recently, with most of these studies focusing on homogeneous or saturated traffic. Two novel transmission schemes—OSTS/BSTS (One Service a Time Scheme/Bulk Service a Time Scheme)—are proposed in this paper to improve the behaviors of time-critical buffered networks with heterogeneous unsaturated traffic. First, we propose a model which contains two modified semi-Markov chains and a macro-Markov chain combined with the theory of M/G/1/K queues to evaluate the characteristics of these two improved CSMA/CA schemes, in which traffic arrivals and accessing packets are bestowed with non-preemptive priority over each other, instead of prioritization. Then, throughput, packet delay and energy consumption of unsaturated, unacknowledged IEEE 802.15.4 beacon-enabled networks are predicted based on the overall point of view which takes the dependent interactions of different types of nodes into account. Moreover, performance comparisons of these two schemes with other non-priority schemes are also proposed. Analysis and simulation results show that delay and fairness of our schemes are superior to those of other schemes, while throughput and energy efficiency are superior to others in more heterogeneous situations. Comprehensive simulations demonstrate that the analysis results of these models match well with the simulation results. PMID:22666076
Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method
Nuzzo, Genoveffa; Gallo, Carmela; d’Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo
2013-01-01
Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third generation biofuels. The procedure consists of extraction of the biological matrix by modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high throughput screening of algal collection and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which drastically differ for the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790
Single-cell measurement of red blood cell oxygen affinity.
Di Caprio, Giuseppe; Stokes, Chris; Higgins, John M; Schonbrun, Ethan
2015-08-11
Oxygen is transported throughout the body by hemoglobin (Hb) in red blood cells (RBCs). Although the oxygen affinity of blood is well-understood and routinely assessed in patients by pulse oximetry, variability at the single-cell level has not been previously measured. In contrast, single-cell measurements of RBC volume and Hb concentration are taken millions of times per day by clinical hematology analyzers, and they are important factors in determining the health of the hematologic system. To better understand the variability and determinants of oxygen affinity on a cellular level, we have developed a system that quantifies the oxygen saturation, cell volume, and Hb concentration for individual RBCs in high throughput. We find that the variability in single-cell saturation peaks at an oxygen partial pressure of 2.9%, which corresponds to the maximum slope of the oxygen-Hb dissociation curve. In addition, single-cell oxygen affinity is positively correlated with Hb concentration but independent of osmolarity, which suggests variation in the Hb to 2,3-diphosphoglycerate (2-3 DPG) ratio on a cellular level. By quantifying the functional behavior of a cellular population, our system adds a dimension to blood cell analysis and other measurements of single-cell variability.
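The reported peak in saturation variability near a few percent O2 is consistent with the steepest region of a Hill-type oxygen-Hb dissociation curve, which the short sketch below locates numerically. The Hill coefficient and P50 are textbook-style assumptions, not values fitted by the authors.

```python
# Sketch relating the reported ~2.9% O2 variability peak to the steepest point
# of a Hill-type oxygen-Hb dissociation curve.  The Hill coefficient and P50
# below are textbook-style assumptions, not fitted values from this study.
import numpy as np

n, p50 = 2.7, 26.0                       # assumed Hill coefficient, P50 in mmHg

def sat(p):                              # fractional Hb-O2 saturation
    return p**n / (p50**n + p**n)

p = np.linspace(0.1, 100, 10000)         # partial pressure grid, mmHg
slope = np.gradient(sat(p), p)
p_max_slope = p[np.argmax(slope)]

print(f"max slope at {p_max_slope:.1f} mmHg "
      f"≈ {100 * p_max_slope / 760:.1f}% of 1 atm")   # lands in the few-percent range
```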
Du, Yushen; Wu, Nicholas C.; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting
2016-01-01
Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. PMID:27803181
Improvement of Biocatalysts for Industrial and Environmental Purposes by Saturation Mutagenesis
Valetti, Francesca; Gilardi, Gianfranco
2013-01-01
Laboratory evolution techniques are becoming increasingly widespread among protein engineers for the development of novel and designed biocatalysts. The palette of different approaches ranges from complete randomized strategies to rational and structure-guided mutagenesis, with a wide variety of costs, impacts, drawbacks and relevance to biotechnology. A technique that convincingly compromises the extremes of fully randomized vs. rational mutagenesis, with a high benefit/cost ratio, is saturation mutagenesis. Here we will present and discuss this approach in its many facets, also tackling the issue of randomization, statistical evaluation of library completeness and throughput efficiency of screening methods. Successful recent applications covering different classes of enzymes will be presented referring to the literature and to research lines pursued in our group. The focus is put on saturation mutagenesis as a tool for designing novel biocatalysts specifically relevant to production of fine chemicals for improving bulk enzymes for industry and engineering technical enzymes involved in treatment of waste, detoxification and production of clean energy from renewable sources. PMID:24970191
Moret, Sabrina; Scolaro, Marianna; Barp, Laura; Purcaro, Giorgia; Conte, Lanfranco S
2016-04-01
A high throughput, high-sensitivity procedure, involving simultaneous microwave-assisted extraction (MAS) and unsaponifiable extraction, followed by on-line liquid chromatography (LC)-gas chromatography (GC), has been optimised for rapid and efficient extraction and analytical determination of mineral oil saturated hydrocarbons (MOSH) and mineral oil aromatic hydrocarbons (MOAH) in cereal-based products of different composition. MAS has the advantage of eliminating fat before LC-GC analysis, allowing an increase in the amount of sample extract injected, and hence in sensitivity. The proposed method gave practically quantitative recoveries and good repeatability. Among the different cereal-based products analysed (dry semolina and egg pasta, bread, biscuits, and cakes), egg pasta packed in direct contact with recycled paperboard had on average the highest total MOSH level (15.9 mg kg(-1)), followed by cakes (10.4 mg kg(-1)) and bread (7.5 mg kg(-1)). About 50% of the pasta and bread samples and 20% of the biscuits and cake samples had detectable MOAH amounts. The highest concentrations were found in an egg pasta in direct contact with recycled paperboard (3.6 mg kg(-1)) and in a milk bread (3.6 mg kg(-1)). Copyright © 2015 Elsevier Ltd. All rights reserved.
Traffic signal synchronization.
Huang, Ding-wei; Huang, Wei-neng
2003-05-01
The benefits of traffic signal synchronization are examined within the cellular automata approach. The microsimulations of traffic flow are obtained with different settings of signal period T and time delay delta. Both numerical results and analytical approximations are presented. For undersaturated traffic, the green-light wave solutions can be realized. For saturated traffic, the correlation among the traffic signals has no effect on the throughput. For oversaturated traffic, the benefits of synchronization are manifest only when stochastic noise is suppressed.
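A minimal cellular-automaton sketch in the spirit of this approach is given below: Nagel-Schreckenberg-style rules on a ring road with one periodically switching signal, measuring throughput past the signal. The parameters, the single-signal geometry, and the stopping rule are illustrative assumptions rather than the paper's model.

```python
# Minimal single-lane cellular-automaton sketch (Nagel-Schreckenberg-style rules)
# with one periodic traffic signal, measuring throughput past the signal.
# Parameters and geometry are illustrative assumptions, not the paper's model.
import random

L, VMAX, P_SLOW = 200, 5, 0.2           # road cells, max speed, random slowdown
SIGNAL, T, DELTA = 100, 40, 0           # signal cell, signal period, phase offset

def green(t):
    return ((t + DELTA) % T) < T // 2   # green for the first half of each period

def step(cars, t):                      # cars: dict cell -> speed
    new, passed = {}, 0
    occupied = set(cars)
    for x, v0 in cars.items():
        v = min(v0 + 1, VMAX)                               # accelerate
        gap = next(d for d in range(1, L + 1)
                   if (x + d) % L in occupied) - 1          # empty cells ahead
        v = min(v, gap)                                     # don't hit the car ahead
        d_sig = (SIGNAL - x) % L                            # cells to the signal
        if not green(t):
            v = min(v, d_sig)                               # stop at a red light
        if v > 0 and random.random() < P_SLOW:
            v -= 1                                          # random slowdown
        if v > d_sig:
            passed += 1                                     # crossed the signal cell
        new[(x + v) % L] = v
    return new, passed

random.seed(1)
cars = {x: 0 for x in range(0, L, 4)}                       # density 0.25
flow = 0
for t in range(2000):
    cars, passed = step(cars, t)
    flow += passed
print("throughput:", flow / 2000, "vehicles per time step")
```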
Hewitt, Kevin C; Ghassemi Rad, Javad; McGregor, Hanna C; Brouwers, Erin; Sapp, Heidi; Short, Michael A; Fashir, Samia B; Zeng, Haishan; Alwayn, Ian P
2015-10-07
Due to the shortage of healthy donor organs, steatotic livers are commonly used for transplantation, placing patients at higher risk for graft dysfunction and lower survival rates. Raman spectroscopy is a technique which has shown the ability to rapidly detect the vibration state of C-H bonds in triglycerides. The aim of this study is to determine whether conventional Raman spectroscopy can reliably detect and quantify fat in an animal model of liver steatosis. Mice and rats fed methionine- and choline-deficient (MCD) or control diets were sacrificed at one-, two-, three- and four-week time points. A confocal Raman microscope, a commercial Raman (iRaman) fiber optic probe and a highly sensitive Raman fiber optic probe system, the latter utilizing a 785 nm excitation laser, were used to detect changes in the Raman spectra of steatotic mouse livers. Thin layer chromatography was used to assess the triglyceride content of liver specimens, and sections were scored blindly for fat content using histological examination. Principal component analysis (PCA) of Raman spectra was used to extract the principal components responsible for spectroscopic differences with MCD week (time on MCD diet). Confocal Raman microscopy revealed the presence of saturated fats in mouse liver sections. A commercially available handheld Raman spectroscopy probe could not distinguish the presence of fat in the liver, whereas our specially designed, high throughput Raman system could clearly distinguish lobe-specific changes in fat content. In the left lobe in particular, the Raman PC scores exhibited a significant correlation (R² = 0.96) with the gold standard, blinded scoring by histological examination. The specially designed, high throughput Raman system can be used for clinical purposes. Its application to the field of transplantation would enable surgeons to determine the hepatic fat content of the donor's liver in the field prior to proceeding with organ retrieval. Next steps include validating these results in a prospective analysis of human liver transplantation implant biopsies.
Identification of Nanoparticle Prototypes and Archetypes.
Fernandez, Michael; Barnard, Amanda S
2015-12-22
High-throughput (HT) computational characterization of nanomaterials is poised to accelerate novel material breakthroughs. The number of possible nanomaterials is increasing exponentially along with their complexity, and so statistical and information technology will play a fundamental role in rationalizing nanomaterials HT data. We demonstrate that multivariate statistical analysis of heterogeneous ensembles can identify the truly significant nanoparticles and their most relevant properties. Virtual samples of diamond nanoparticles and graphene nanoflakes are characterized using clustering and archetypal analysis, where we find that saturated particles are defined by their geometry, while nonsaturated nanoparticles are defined by their carbon chemistry. At the convex hull of the nanostructure spaces, a combination of complex archetypes can efficiently describe a large number of members of the ensembles, whereas the regular shapes that are typically assumed to be representative can only describe a small set of the most regular morphologies. This approach provides a route toward the characterization of computationally intractable virtual nanomaterial spaces, which can aid nanomaterials discovery in the foreseen big data scenario.
The Integrated Air Transportation System Evaluation Tool
NASA Technical Reports Server (NTRS)
Wingrove, Earl R., III; Hees, Jing; Villani, James A.; Yackovetsky, Robert E. (Technical Monitor)
2002-01-01
Throughout U.S. history, our nation has generally enjoyed exceptional economic growth, driven in part by transportation advancements. Looking forward 25 years, when the national highway and skyway systems are saturated, the nation faces new challenges in creating transportation-driven economic growth and wealth. To meet the national requirement for an improved air traffic management system, NASA developed the goal of tripling throughput over the next 20 years, in all weather conditions while maintaining safety. Analysis of the throughput goal has primarily focused on major airline operations, largely through the hub-and-spoke system. However, many suggested concepts to increase throughput may operate outside the hub-and-spoke system. Examples of such concepts include the Small Aircraft Transportation System, civil tiltrotor, and improved rotorcraft. Proper assessment of the potential contribution of these technologies to the domestic air transportation system requires a modeling capability that includes the country's numerous smaller airports, acting as a fundamental component of the National Airspace System, and the demand for such concepts and technologies. Under this task for NASA, the Logistics Management Institute developed higher-fidelity demand models that capture the interdependence of short-haul air travel with other transportation modes and explicitly consider the costs of commercial air and other transport modes. To accomplish this work, we generated forecasts of the distribution of general aviation (GA) based aircraft and GA itinerant operations at each of nearly 3,000 airports, based on changes in economic conditions and demographic trends. We also built modules that estimate the demand for travel by different modes, particularly auto, commercial air, and GA. We examined GA demand from two perspectives, top-down and bottom-up, both described in detail.
Quantitative image analysis of immunohistochemical stains using a CMYK color model
Pham, Nhu-An; Morrison, Andrew; Schwock, Joerg; Aviel-Ronen, Sarit; Iakovlev, Vladimir; Tsao, Ming-Sound; Ho, James; Hedley, David W
2007-01-01
Background: Computer image analysis techniques have decreased effects of observer biases, and increased the sensitivity and the throughput of immunohistochemistry (IHC) as a tissue-based procedure for the evaluation of diseases. Methods: We adapted a Cyan/Magenta/Yellow/Key (CMYK) model for automated computer image analysis to quantify IHC stains in hematoxylin counterstained histological sections. Results: The spectral characteristics of the chromogens AEC, DAB and NovaRed as well as the counterstain hematoxylin were first determined using CMYK, Red/Green/Blue (RGB), normalized RGB and Hue/Saturation/Lightness (HSL) color models. The contrast of chromogen intensities on a 0–255 scale (24-bit image file) as well as compared to the hematoxylin counterstain was greatest using the Yellow channel of a CMYK color model, suggesting an improved sensitivity for IHC evaluation compared to other color models. An increase in activated STAT3 levels due to growth factor stimulation, quantified using the Yellow channel image analysis, was associated with an increase detected by Western blotting. Two clinical image data sets were used to compare the Yellow channel automated method with observer-dependent methods. First, a quantification of DAB-labeled carbonic anhydrase IX hypoxia marker in 414 sections obtained from 138 biopsies of cervical carcinoma showed strong association between Yellow channel and positive color selection results. Second, a linear relationship was also demonstrated between Yellow intensity and visual scoring for NovaRed-labeled epidermal growth factor receptor in 256 non-small cell lung cancer biopsies. Conclusion: The Yellow channel image analysis method based on a CMYK color model is independent of observer biases for threshold and positive color selection, applicable to different chromogens, tolerant of hematoxylin, sensitive to small changes in IHC intensity and is applicable to simple automation procedures. These characteristics are advantageous for both basic as well as clinical research in an unbiased, reproducible and high throughput evaluation of IHC intensity. PMID:17326824
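The Yellow-channel computation rests on the standard RGB-to-CMYK transform; a short sketch of that conversion is given below. The toy pixel values are invented to contrast a DAB-like brown pixel with a hematoxylin-like blue pixel, and the clipping choices are assumptions, not the published implementation.

```python
# Sketch of the CMYK conversion underlying the Yellow-channel analysis:
# a standard RGB -> CMYK transform, keeping only the Y channel for scoring
# brown/red chromogens.  Pixel values and clipping choices are assumptions.
import numpy as np

def yellow_channel(rgb):
    """rgb: float array in [0, 1], shape (H, W, 3) -> Yellow channel in [0, 1]."""
    k = 1.0 - rgb.max(axis=2)                       # Key (black) component
    denom = np.clip(1.0 - k, 1e-6, None)            # avoid division by zero
    y = (1.0 - rgb[..., 2] - k) / denom             # Y = (1 - B - K) / (1 - K)
    return np.clip(y, 0.0, 1.0)

# toy 1x2 image: a brownish DAB-like pixel vs. a bluish hematoxylin-like pixel
img = np.array([[[0.55, 0.35, 0.15], [0.35, 0.40, 0.65]]])
print(yellow_channel(img))    # the chromogen-like pixel scores much higher
```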
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takamiya, Mari; Discovery Technology Laboratories, Sohyaku, Innovative Research Division, Mitsubishi Tanabe Pharma Corporation, Kawagishi, Toda-shi, Saitama; Sakurai, Masaaki
A high-throughput RapidFire mass spectrometry assay is described for elongation of very long-chain fatty acids family 6 (Elovl6). Elovl6 is a microsomal enzyme that regulates the elongation of C12-16 saturated and monounsaturated fatty acids. Elovl6 may be a new therapeutic target for fat metabolism disorders such as obesity, type 2 diabetes, and nonalcoholic steatohepatitis. To identify new Elovl6 inhibitors, we developed a high-throughput fluorescence screening assay in 1536-well format. However, a number of false positives caused by fluorescent interference have been identified. To pick up the real active compounds among the primary hits from the fluorescence assay, we developed a RapidFire mass spectrometry assay and a conventional radioisotope assay. These assays have the advantage of detecting the main products directly without using fluorescent-labeled substrates. As a result, 276 compounds (30%) of the primary hits (921 compounds) in a fluorescence ultra-high-throughput screening method were identified as common active compounds in these two assays. It is concluded that both methods are very effective to eliminate false positives. Compared with the radioisotope method using an expensive ¹⁴C-labeled substrate, the RapidFire mass spectrometry method using unlabeled substrates is a high-accuracy, high-throughput method. In addition, some of the hit compounds selected from the screening inhibited cellular fatty acid elongation in HEK293 cells expressing Elovl6 transiently. This result suggests that these compounds may be promising lead candidates for therapeutic drugs. Ultra-high-throughput fluorescence screening followed by a RapidFire mass spectrometry assay was a suitable strategy for lead discovery against Elovl6. - Highlights: • A novel assay for elongation of very-long-chain fatty acids 6 (Elovl6) is proposed. • RapidFire mass spectrometry (RF-MS) assay is useful to select real screening hits. • RF-MS assay is proved to be beneficial because of its high-throughput and accuracy. • A combination of fluorescent and RF-MS assays is effective for Elovl6 inhibitors.
Erickson, Heidi S
2012-09-28
The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.
Do, Thanh D.; Comi, Troy J.; Dunham, Sage J. B.; Rubakhin, Stanislav S.; Sweedler, Jonathan V.
2017-01-01
A high-throughput single cell profiling method has been developed for matrix-enhanced secondary ion mass spectrometry (ME-SIMS) to investigate the lipid profiles of neuronal cells. Populations of cells are dispersed onto the substrate, their locations determined using optical microscopy, and the cell locations used to guide the acquisition of SIMS spectra from the cells. Up to 2,000 cells can be assayed in one experiment at a rate of 6 s per cell. Multiple saturated and unsaturated phosphatidylcholines (PCs) and their fragments are detected and verified with tandem mass spectrometry from individual cells when ionic liquids are employed as a matrix. Optically guided single cell profiling with ME-SIMS is suitable for a range of cell sizes, from Aplysia californica neurons larger than 75 μm to 7-μm rat cerebellar neurons. ME-SIMS analysis followed by t-distributed stochastic neighbor embedding of peaks in the lipid molecular mass range (m/z 700–850) distinguishes several cell types from the rat central nervous system, largely based on the relative proportions of the four dominant lipids, PC(32:0), PC(34:1), PC(36:1), and PC(38:5). Furthermore, subpopulations within each cell type are tentatively classified consistent with their endogenous lipid ratios. The results illustrate the efficacy of a new approach to classify single cell populations and subpopulations using SIMS profiling of lipid and metabolite contents. These methods are broadly applicable for high throughput single cell chemical analyses. PMID:28194949
Danhelova, Hana; Hradecky, Jaromir; Prinosilova, Sarka; Cajka, Tomas; Riddellova, Katerina; Vaclavik, Lukas; Hajslova, Jana
2012-07-01
The development and use of a fast method employing a direct analysis in real time (DART) ion source coupled to high-resolution time-of-flight mass spectrometry (TOFMS) for the quantitative analysis of caffeine in various coffee samples has been demonstrated in this study. A simple sample extraction procedure employing hot water was followed by direct, high-throughput (<1 min per run) examination of the extracts spread on a glass rod under optimized conditions of ambient mass spectrometry, without any prior chromatographic separation. For quantification of caffeine using DART-TOFMS, an external calibration was used. Isotopically labeled caffeine was used to compensate for the variations of the ion intensities of caffeine signal. Recoveries of the DART-TOFMS method were 97% for instant coffee at the spiking levels of 20 and 60 mg/g, respectively, while for roasted ground coffee, the obtained values were 106% and 107% at the spiking levels of 10 and 30 mg/g, respectively. The repeatability of the whole analytical procedure (expressed as relative standard deviation, RSD, %) was <5% for all tested spiking levels and matrices. Since the linearity range of the method was relatively narrow (two orders of magnitude), an optimization of sample dilution prior the DART-TOFMS measurement to avoid saturation of the detector was needed.
Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.
Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian
2016-07-05
This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.
On the Achievable Throughput Over TVWS Sensor Networks
Caleffi, Marcello; Cacciapuoti, Angela Sara
2016-01-01
In this letter, we study the throughput achievable by an unlicensed sensor network operating over TV white space spectrum in presence of coexistence interference. Through the letter, we first analytically derive the achievable throughput as a function of the channel ordering. Then, we show that the problem of deriving the maximum expected throughput through exhaustive search is computationally unfeasible. Finally, we derive a computational-efficient algorithm characterized by polynomial-time complexity to compute the channel set maximizing the expected throughput and, stemming from this, we derive a closed-form expression of the maximum expected throughput. Numerical simulations validate the theoretical analysis. PMID:27043565
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Andrew J.; Capo, Rosemary C.; Stewart, Brian W.
2016-09-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS is presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high sample throughput rapid analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hakala, Jacqueline Alexandra
2016-11-22
This technical report presents the details of the Sr column configuration and the high-throughput Sr separation protocol. Data showing the performance of the method as well as the best practices for optimizing Sr isotope analysis by MC-ICP-MS is presented. Lastly, this report offers tools for data handling and data reduction of Sr isotope results from the Thermo Scientific Neptune software to assist in data quality assurance, which help avoid issues of data glut associated with high sample throughput rapid analysis.
High-throughput analysis of yeast replicative aging using a microfluidic system
Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong
2015-01-01
Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cell, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317
Performance analysis of Aloha networks with power capture and near/far effect
NASA Astrophysics Data System (ADS)
McCartin, Joseph T.
1989-06-01
An analysis is presented for the throughput characteristics for several classes of Aloha packet networks. Specifically, the throughput for variable packet length Aloha utilizing multiple power levels to induce receiver capture is derived. The results are extended to an analysis of a selective-repeat ARQ Aloha network. Analytical results are presented which indicate a significant increase in throughput for a variable packet network implementing a random two power level capture scheme. Further research into the area of the near/far effect on Aloha networks is included. Improvements in throughput for mobile radio Aloha networks which are subject to the near/far effect are presented. Tactical Command, Control and Communications (C3) systems of the future will rely on Aloha ground mobile data networks. The incorporation of power capture and the near/far effect into future tactical networks will result in improved system analysis, design, and performance.
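For orientation, an idealized slotted-Aloha model with two power levels and perfect capture illustrates why capture raises throughput; this simplified formula is a sketch and not the paper's variable-packet-length, selective-repeat ARQ analysis.

```latex
% Idealized slotted Aloha with offered load G (packets/slot) and a fraction q
% of packets sent at the higher of two power levels, with perfect capture:
\begin{align*}
S_{\text{no capture}} &= G\,e^{-G}, \\
S_{\text{two-level}}  &= q\,G\,e^{-qG} \;+\; (1-q)\,G\,e^{-G}.
\end{align*}
% A high-power packet succeeds when no other high-power packet is in the slot;
% a low-power packet still needs the slot to itself.  Since e^{-qG} > e^{-G}
% for 0 < q < 1, the two-level throughput exceeds G e^{-G}.
```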
The dynamic three-dimensional organization of the diploid yeast genome
Kim, Seungsoo; Liachko, Ivan; Brickner, Donna G; Cook, Kate; Noble, William S; Brickner, Jason H; Shendure, Jay; Dunham, Maitreya J
2017-01-01
The budding yeast Saccharomyces cerevisiae is a long-standing model for the three-dimensional organization of eukaryotic genomes. However, even in this well-studied model, it is unclear how homolog pairing in diploids or environmental conditions influence overall genome organization. Here, we performed high-throughput chromosome conformation capture on diverged Saccharomyces hybrid diploids to obtain the first global view of chromosome conformation in diploid yeasts. After controlling for the Rabl-like orientation using a polymer model, we observe significant homolog proximity that increases in saturated culture conditions. Surprisingly, we observe a localized increase in homologous interactions between the HAS1-TDA1 alleles specifically under galactose induction and saturated growth. This pairing is accompanied by relocalization to the nuclear periphery and requires Nup2, suggesting a role for nuclear pore complexes. Together, these results reveal that the diploid yeast genome has a dynamic and complex 3D organization. DOI: http://dx.doi.org/10.7554/eLife.23623.001 PMID:28537556
Yong, Kelvin J; Vaid, Tasneem M; Shilling, Patrick J; Wu, Feng-Jie; Williams, Lisa M; Deluigi, Mattia; Plückthun, Andreas; Bathgate, Ross A D; Gooley, Paul R; Scott, Daniel J
2018-04-20
α1A- and α1B-adrenoceptors (α1A-AR and α1B-AR) are closely related G protein-coupled receptors (GPCRs) that modulate the cardiovascular and nervous systems in response to binding epinephrine and norepinephrine. The GPCR gene superfamily is made up of numerous subfamilies that, like α1A-AR and α1B-AR, are activated by the same endogenous agonists but may modulate different physiological processes. A major challenge in GPCR research and drug discovery is determining how compounds interact with receptors at the molecular level, especially to assist in the optimization of drug leads. Nuclear magnetic resonance spectroscopy (NMR) can provide great insight into ligand-binding epitopes, modes, and kinetics. Ideally, ligand-based NMR methods require purified, well-behaved protein samples. The instability of GPCRs upon purification in detergents, however, makes the application of NMR to study ligand binding challenging. Here, stabilized α1A-AR and α1B-AR variants were engineered using Cellular High-throughput Encapsulation, Solubilization, and Screening (CHESS), allowing the analysis of ligand binding with Saturation Transfer Difference NMR (STD NMR). STD NMR was used to map the binding epitopes of epinephrine and A-61603 to both receptors, revealing the molecular determinants for the selectivity of A-61603 for α1A-AR over α1B-AR. The use of stabilized GPCRs for ligand-observed NMR experiments will lead to a deeper understanding of binding processes and assist structure-based drug design.
NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells could complement and correlate biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell sizes and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, limited camera frame rate and thus imaging throughput makes QPM incompatible with high-throughput flow cytometry - a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of > 10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy at >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high throughput label-free cell cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
NASA Astrophysics Data System (ADS)
Zhang, Xuanni; Zhang, Chunmin
2013-01-01
A polarization interference imaging spectrometer based on Savart polariscope was presented. Its optical throughput was analyzed by Jones calculus. The throughput expression was given, and clearly showed that the optical throughput mainly depended on the intensity of incident light, transmissivity, refractive index and the layout of optical system. The simulation and analysis gave the optimum layout in view of both optical throughput and interference fringe visibility, and verified that the layout of our former design was optimum. The simulation showed that a small deviation from the optimum layout influenced interference fringe visibility little for the optimum one, but influenced severely for others, so a small deviation is admissible in the optimum, and this can mitigate the manufacture difficulty. These results pave the way for further research and engineering design.
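To make the Jones-calculus throughput analysis concrete, here is a minimal sketch that propagates a Jones vector through an idealized polarizer, birefringent element, and analyzer; the ideal elements and the swept retardance values are assumptions standing in for the full Savart-polariscope layout, and a real throughput expression would also carry the incident intensity, transmissivity, and refractive-index factors discussed above.

```python
# Minimal Jones-calculus sketch: throughput of polarizer -> retarder (fast axis
# at 45 deg, retardance delta) -> analyzer, a toy stand-in for the paper's
# Savart-polariscope analysis.
import numpy as np

def rot(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def polarizer(theta):
    # ideal linear polarizer with transmission axis at angle theta
    return rot(theta) @ np.array([[1.0, 0.0], [0.0, 0.0]]) @ rot(-theta)

def retarder(delta, theta):
    # linear retarder of retardance delta with fast axis at angle theta
    J = np.diag([np.exp(1j * delta / 2), np.exp(-1j * delta / 2)])
    return rot(theta) @ J @ rot(-theta)

E_in = np.array([1.0, 0.0])                     # unit-intensity, x-polarized input
for delta in np.linspace(0.0, 2.0 * np.pi, 5):
    E_out = polarizer(0.0) @ retarder(delta, np.pi / 4) @ polarizer(0.0) @ E_in
    throughput = (np.abs(E_out) ** 2).sum()     # cos^2(delta/2) interference fringe
    print(f"retardance {delta:4.2f} rad -> normalized throughput {throughput:.3f}")
```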
Quantifying protein-protein interactions in high throughput using protein domain microarrays.
Kaushansky, Alexis; Allen, John E; Gordus, Andrew; Stiffler, Michael A; Karp, Ethan S; Chang, Bryan H; MacBeath, Gavin
2010-04-01
Protein microarrays provide an efficient way to identify and quantify protein-protein interactions in high throughput. One drawback of this technique is that proteins show a broad range of physicochemical properties and are often difficult to produce recombinantly. To circumvent these problems, we have focused on families of protein interaction domains. Here we provide protocols for constructing microarrays of protein interaction domains in individual wells of 96-well microtiter plates, and for quantifying domain-peptide interactions in high throughput using fluorescently labeled synthetic peptides. As specific examples, we will describe the construction of microarrays of virtually every human Src homology 2 (SH2) and phosphotyrosine binding (PTB) domain, as well as microarrays of mouse PDZ domains, all produced recombinantly in Escherichia coli. For domains that mediate high-affinity interactions, such as SH2 and PTB domains, equilibrium dissociation constants (K(D)s) for their peptide ligands can be measured directly on arrays by obtaining saturation binding curves. For weaker binding domains, such as PDZ domains, arrays are best used to identify candidate interactions, which are then retested and quantified by fluorescence polarization. Overall, protein domain microarrays provide the ability to rapidly identify and quantify protein-ligand interactions with minimal sample consumption. Because entire domain families can be interrogated simultaneously, they provide a powerful way to assess binding selectivity on a proteome-wide scale and provide an unbiased perspective on the connectivity of protein-protein interaction networks.
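As a minimal illustration of the saturation-binding measurement mentioned above (not the authors' exact workflow), the sketch below fits a one-site binding model to synthetic array intensities to recover an apparent K_D; the concentrations and fluorescence values are made-up examples.

```python
# Minimal sketch: one-site saturation binding fit to estimate K_D from
# synthetic domain-peptide array intensities.
import numpy as np
from scipy.optimize import curve_fit

def one_site(conc, bmax, kd):
    # fraction of domain bound at a given labeled-peptide concentration
    return bmax * conc / (kd + conc)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])        # peptide, uM
signal = np.array([0.05, 0.13, 0.37, 0.83, 1.47, 1.89, 2.08])  # fluorescence, a.u.

popt, pcov = curve_fit(one_site, conc, signal, p0=[2.0, 0.5])
print(f"Bmax = {popt[0]:.2f} a.u., apparent K_D = {popt[1]:.2f} uM")
```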
Optimal Time Advance In Terminal Area Arrivals: Throughput vs. Fuel Savings
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.; Swenson, Harry N.; Haskell, William B.; Rakas, Jasenka
2011-01-01
The current operational practice in scheduling air traffic arriving at an airport is to adjust flight schedules by delay, i.e. a postponement of an aircraft's arrival at a scheduled location, to manage safely the FAA-mandated separation constraints between aircraft. To meet the observed and forecast growth in traffic demand, however, the practice of time advance (speeding up an aircraft toward a scheduled location) is envisioned for future operations as a practice additional to delay. Time advance has two potential advantages. The first is the capability to minimize, or at least reduce, the excess separation (the distances between pairs of aircraft immediately in-trail) and thereby to increase the throughput of the arriving traffic. The second is to reduce the total traffic delay when the traffic sample is below saturation density. A cost associated with time advance is the fuel expenditure required by an aircraft to speed up. We present an optimal control model of air traffic arriving in a terminal area and solve it using the Pontryagin Maximum Principle. The admissible controls allow time advance, as well as delay, some of the way. The cost function reflects the trade-off between minimizing two competing objectives: excess separation (negatively correlated with throughput) and fuel burn. A number of instances are solved using three different methods to demonstrate consistency of solutions.
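A schematic form of such a trade-off cost (an assumed stand-in, since the paper's exact functional is not reproduced here) is

\[
J(u) \;=\; \int_0^{T} \Big[\, w_{\mathrm{sep}}\,\ell_{\mathrm{sep}}\big(x(t)\big) \;+\; w_{\mathrm{fuel}}\, f\big(u(t)\big) \Big]\, dt ,
\]

where x(t) is the arrival-traffic state, u(t) the per-aircraft speed adjustment (delay or time advance), \ell_{\mathrm{sep}} penalizes excess separation, f models the fuel burn of speed changes, and the weights w_{\mathrm{sep}}, w_{\mathrm{fuel}} encode the trade-off; the Pontryagin Maximum Principle then characterizes optimal controls through the Hamiltonian built from this integrand and the separation constraints.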
Yendrek, Craig R.; Tomaz, Tiago; Montes, Christopher M.; Cao, Youyuan; Morse, Alison M.; Brown, Patrick J.; McIntyre, Lauren M.; Leakey, Andrew D.B.
2017-01-01
High-throughput, noninvasive field phenotyping has revealed genetic variation in crop morphological, developmental, and agronomic traits, but rapid measurements of the underlying physiological and biochemical traits are needed to fully understand genetic variation in plant-environment interactions. This study tested the application of leaf hyperspectral reflectance (λ = 500–2,400 nm) as a high-throughput phenotyping approach for rapid and accurate assessment of leaf photosynthetic and biochemical traits in maize (Zea mays). Leaf traits were measured with standard wet-laboratory and gas-exchange approaches alongside measurements of leaf reflectance. Partial least-squares regression was used to develop a measure of leaf chlorophyll content, nitrogen content, sucrose content, specific leaf area, maximum rate of phosphoenolpyruvate carboxylation, [CO2]-saturated rate of photosynthesis, and leaf oxygen radical absorbance capacity from leaf reflectance spectra. Partial least-squares regression models accurately predicted five out of seven traits and were more accurate than previously used simple spectral indices for leaf chlorophyll, nitrogen content, and specific leaf area. Correlations among leaf traits and statistical inferences about differences among genotypes and treatments were similar for measured and modeled data. The hyperspectral reflectance approach to phenotyping was dramatically faster than traditional measurements, enabling over 1,000 rows to be phenotyped during midday hours over just 2 to 4 d, and offers a nondestructive method to accurately assess physiological and biochemical trait responses to environmental stress. PMID:28049858
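A minimal sketch of the modeling step, assuming purely synthetic spectra and trait values: partial least-squares regression of one leaf trait on band-wise reflectance, with held-out R² as the accuracy measure (parameter choices such as 10 latent components are illustrative).

```python
# Minimal sketch: PLS regression of a leaf trait on hyperspectral reflectance
# (synthetic data standing in for measured spectra and wet-lab trait values).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n_leaves, n_bands = 200, 1901            # reflectance sampled 500-2400 nm at 1 nm
X = rng.normal(size=(n_leaves, n_bands))
coef = rng.normal(size=n_bands) * (rng.random(n_bands) < 0.01)   # few informative bands
y = X @ coef + rng.normal(scale=0.5, size=n_leaves)              # e.g. leaf N content

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = PLSRegression(n_components=10).fit(X_tr, y_tr)
print("held-out R^2:", round(r2_score(y_te, model.predict(X_te).ravel()), 2))
```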
An image analysis toolbox for high-throughput C. elegans assays
Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.
2012-01-01
We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
Identification of functional modules using network topology and high-throughput data.
Ulitsky, Igor; Shamir, Ron
2007-01-26
With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high throughput data.
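The sketch below illustrates the general seek-connected-modules idea with a toy greedy expansion on a similarity-weighted protein network; the greedy rule, the 0.6 threshold, and the example network are assumptions for illustration, not the authors' algorithm.

```python
# Minimal sketch: greedily grow a connected module with high average similarity
# to its members (toy illustration of the connected-subnetwork idea).
import networkx as nx

def grow_module(graph, sim, seed, min_avg_sim=0.6):
    module = {seed}
    improved = True
    while improved:
        improved = False
        # candidate nodes that keep the module connected
        candidates = {n for m in module for n in graph.neighbors(m)} - module
        for cand in candidates:
            scores = [sim.get(frozenset((cand, m)), 0.0)
                      for m in module if graph.has_edge(cand, m)]
            if scores and sum(scores) / len(scores) >= min_avg_sim:
                module.add(cand)
                improved = True
    return module

G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("D", "E")])
similarity = {frozenset(("A", "B")): 0.9, frozenset(("B", "C")): 0.8,
              frozenset(("C", "D")): 0.2, frozenset(("D", "E")): 0.9}
print(grow_module(G, similarity, seed="A"))   # expected: {'A', 'B', 'C'}
```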
Using a portable ion mobility spectrometer to screen dietary supplements for sibutramine.
Dunn, Jamie D; Gryniewicz-Ruzicka, Connie M; Kauffman, John F; Westenberger, Benjamin J; Buhse, Lucinda F
2011-02-20
In response to recent incidents of undeclared sibutramine, an appetite suppressant found in dietary supplements, we developed a method to detect sibutramine using hand-held ion mobility spectrometers with an analysis time of 15 s. Ion mobility spectrometry is a high-throughput and sensitive technique that has been used for illicit drug, explosive, volatile organic compound and chemical warfare detection. We evaluated a hand-held ion mobility spectrometer as a tool for the analysis of supplement extracts containing sibutramine. The overall instrumental limit of detection of five portable ion mobility spectrometers was 2 ng of sibutramine HCl. When sample extractions containing 30 ng/μl or greater of sibutramine were analyzed, saturation of the ionization chamber of the spectrometer occurred and the instrument required more than three cleaning cycles to remove the drug. Hence, supplement samples suspected of containing sibutramine should be prepared at concentrations of 2-20 ng/μl. To obtain this target concentration range for products containing unknown amounts of sibutramine, we provided a simple sample preparation procedure, allowing the U.S. Food and Drug Administration or other agencies to screen products using the portable ion mobility spectrometer. Published by Elsevier B.V.
Performance Analysis of IEEE 802.15.6 CSMA/CA Protocol for WBAN Medical Scenario through DTMC Model.
Kumar, Vivek; Gupta, Bharat
2016-12-01
The newly drafted IEEE 802.15.6 standard for Wireless Body Area Networks (WBAN) targets numerous medical and non-medical applications. This short-range wireless communication standard offers ultra-low power consumption with variable data rates from a few kbps to Mbps in, on, or around the human body. In this paper, the performance analysis of the carrier sense multiple access with collision avoidance (CSMA/CA) scheme based on the IEEE 802.15.6 standard in terms of throughput, reliability, clear channel assessment (CCA) failure probability, packet drop probability, and end-to-end delay is presented. We have developed a discrete-time Markov chain (DTMC) to evaluate the performance of IEEE 802.15.6 CSMA/CA under non-ideal channel conditions with saturated traffic, including node wait time and service time. We also show that as the payload length increases, the CCA failure probability increases, which results in lower node reliability. We have also calculated the end-to-end delay in order to prioritize the node wait time caused by backoff and retransmission. A user-priority (UP) wise DTMC analysis has been performed to show the importance of the standard, especially for medical scenarios.
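As a minimal illustration of the underlying machinery (a toy 3-state chain, not the paper's CSMA/CA model), the sketch below computes the stationary distribution of a discrete-time Markov chain, the quantity from which measures such as throughput and CCA failure probability are typically derived.

```python
# Minimal sketch: stationary distribution of a small discrete-time Markov chain,
# the basic building block of the DTMC performance model described above.
# The 3-state transition matrix is a toy example, not the paper's CSMA/CA chain.
import numpy as np

P = np.array([[0.6, 0.3, 0.1],      # toy states: idle, backoff, transmit
              [0.2, 0.5, 0.3],
              [0.7, 0.1, 0.2]])

# the stationary distribution pi satisfies pi @ P = pi with sum(pi) = 1,
# i.e. it is the left eigenvector of P for eigenvalue 1
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
pi = pi / pi.sum()
print("stationary distribution:", np.round(pi, 3))
```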
High throughput light absorber discovery, Part 1: An algorithm for automated Tauc analysis
Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.
2016-09-23
High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by the automated algorithm for 60 optical spectra.
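A minimal sketch of a Tauc-style estimate for a direct-allowed transition, assuming a synthetic (αhν)² signal and a crude 20-80% window rule in place of the expert-mimicking region selection described above.

```python
# Minimal sketch: Tauc-style direct band-gap estimate from a synthetic
# (alpha*h*nu)^2 signal; the 20-80% window rule is a crude stand-in for the
# expert-mimicking region selection described in the abstract.
import numpy as np

rng = np.random.default_rng(2)
hv = np.linspace(1.5, 3.5, 400)          # photon energy, eV
eg_true = 2.1
# in practice this Tauc signal is computed from measured absorbance spectra
tauc = 5.0 * np.clip(hv - eg_true, 0.0, None) + rng.normal(scale=0.05, size=hv.size)

# fit the quasi-linear rise between 20% and 80% of the maximum signal and
# extrapolate the line to the energy axis
mask = (tauc > 0.2 * tauc.max()) & (tauc < 0.8 * tauc.max())
slope, intercept = np.polyfit(hv[mask], tauc[mask], 1)
print(f"estimated direct band gap: {-intercept / slope:.2f} eV (true value {eg_true} eV)")
```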
Atlanta I-85 HOV-to-HOT conversion : analysis of vehicle and person throughput.
DOT National Transportation Integrated Search
2013-10-01
This report summarizes the vehicle and person throughput analysis for the High Occupancy Vehicle to High Occupancy Toll Lane : conversion in Atlanta, GA, undertaken by the Georgia Institute of Technology research team. The team tracked changes in : o...
Guo, Yabin; Levin, Henry L.
2010-01-01
The biological impact of transposons on the physiology of the host depends greatly on the frequency and position of integration. Previous studies of Tf1, a long terminal repeat retrotransposon in Schizosaccharomyces pombe, showed that integration occurs at the promoters of RNA polymerase II (Pol II) transcribed genes. To determine whether specific promoters are preferred targets of integration, we sequenced large numbers of insertions using high-throughput pyrosequencing. In four independent experiments we identified a total of 73,125 independent integration events. These data provided strong support for the conclusion that Pol II promoters are the targets of Tf1 integration. The size and number of the integration experiments resulted in reproducible measures of integration for each intergenic region and ORF in the S. pombe genome. The reproducibility of the integration activity from experiment to experiment demonstrates that we have saturated the full set of insertion sites that are actively targeted by Tf1. We found Tf1 integration was highly biased in favor of a specific set of Pol II promoters. The overwhelming majority (76%) of the insertions were distributed in intergenic sequences that contained 31% of the promoters of S. pombe. Interestingly, there was no correlation between the amount of integration at these promoters and their level of transcription. Instead, we found Tf1 had a strong preference for promoters that are induced by conditions of stress. This targeting of stress response genes coupled with the ability of Tf1 to regulate the expression of adjacent genes suggests Tf1 may improve the survival of S. pombe when cells are exposed to environmental stress. PMID:20040583
Chen, Danyu; Li, Caiwu; Feng, Lan; Zhang, Zhizhong; Zhang, Heming; Cheng, Guangyang; Li, Desheng; Zhang, Guiquan; Wang, Hongning; Chen, Yanxi; Feng, Mingfu; Wang, Chengdong; Wu, Honglin; Deng, Linhua; Ming, He; Yang, Xin
2018-02-01
A recent study has described the normal vaginal bacterial community in giant pandas, but there is a lack of knowledge of the fungal community residing in the vagina of giant pandas. In order to comprehensively understand the vaginal fungal microbial diversity and abundance in giant pandas, high throughput sequencing was used to analyse the ITS1 region, based on thirteen samples taken from the pandas' vaginas, which were grouped by sampling points and age. The results showed that the most abundant phyla were Basidiomycota (73.37%), followed by Ascomycota (20.04%), Zygomycota (5.23%), Glomeromycota (0.014%) and Chytridiomycota (0.006%). At the genus level, Guehomyces (37.92%) was the most abundant, followed by Cladosporium (9.072%), Trichosporon (6.2%) and Mucor (4.97%). Furthermore, Candida only accounted for a low percentage of the vaginal fungal community. With the saturation of rarefaction curves and fungal diversity indices, the samples from Dujiangyan and Chungking Safari Park (DC group) showed a higher fungal species richness and diversity than those from other living environments. Shannon diversity indices showed a significant difference between the WL (Wolong nature reserve) and DC groups (P < .05). Additionally, higher diversity was found in the ten- to fifteen-year-old animals (Group 2) than in the other age groups. Group 2 and Group 3 displayed significant differences in the diversities of their vaginal fungal communities (P < .05). The data collected in this research will be helpful for further studies aiming to improve the reproductive status of giant pandas. Copyright © 2018 Elsevier Ltd. All rights reserved.
Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S
2015-11-01
High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited, both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.
Jung, Seung-Yong; Notton, Timothy; Fong, Erika; ...
2015-01-07
Particle sorting using acoustofluidics has enormous potential but widespread adoption has been limited by complex device designs and low throughput. Here, we report high-throughput separation of particles and T lymphocytes (600 μL min⁻¹) by altering the net sonic velocity to reposition acoustic pressure nodes in a simple two-channel device. Finally, the approach is generalizable to other microfluidic platforms for rapid, high-throughput analysis.
The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer ...
Du, Yushen; Wu, Nicholas C; Jiang, Lin; Zhang, Tianhao; Gong, Danyang; Shu, Sara; Wu, Ting-Ting; Sun, Ren
2016-11-01
Identification and annotation of functional residues are fundamental questions in protein sequence analysis. Sequence and structure conservation provides valuable information to tackle these questions. It is, however, limited by the incomplete sampling of sequence space in natural evolution. Moreover, proteins often have multiple functions, with overlapping sequences that present challenges to accurate annotation of the exact functions of individual residues by conservation-based methods. Using the influenza A virus PB1 protein as an example, we developed a method to systematically identify and annotate functional residues. We used saturation mutagenesis and high-throughput sequencing to measure the replication capacity of single nucleotide mutations across the entire PB1 protein. After predicting protein stability upon mutations, we identified functional PB1 residues that are essential for viral replication. To further annotate the functional residues important to the canonical or noncanonical functions of viral RNA-dependent RNA polymerase (vRdRp), we performed a homologous-structure analysis with 16 different vRdRp structures. We achieved high sensitivity in annotating the known canonical polymerase functional residues. Moreover, we identified a cluster of noncanonical functional residues located in the loop region of the PB1 β-ribbon. We further demonstrated that these residues were important for PB1 protein nuclear import through the interaction with Ran-binding protein 5. In summary, we developed a systematic and sensitive method to identify and annotate functional residues that are not restrained by sequence conservation. Importantly, this method is generally applicable to other proteins about which homologous-structure information is available. To fully comprehend the diverse functions of a protein, it is essential to understand the functionality of individual residues. Current methods are highly dependent on evolutionary sequence conservation, which is usually limited by sampling size. Sequence conservation-based methods are further confounded by structural constraints and multifunctionality of proteins. Here we present a method that can systematically identify and annotate functional residues of a given protein. We used a high-throughput functional profiling platform to identify essential residues. Coupling it with homologous-structure comparison, we were able to annotate multiple functions of proteins. We demonstrated the method with the PB1 protein of influenza A virus and identified novel functional residues in addition to its canonical function as an RNA-dependent RNA polymerase. Not limited to virology, this method is generally applicable to other proteins that can be functionally selected and about which homologous-structure information is available. Copyright © 2016 Du et al.
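A minimal sketch of the scoring idea behind such profiling, assuming hypothetical mutant names and made-up read counts (not the authors' exact scheme): replication capacity expressed as a wild-type-normalized log2 enrichment of each mutant between the input and passaged libraries.

```python
# Minimal sketch: wild-type-normalized log2 enrichment as a proxy for the
# replication capacity of point mutants. Mutant names and read counts are
# hypothetical; this is not the authors' exact scoring scheme.
import math

counts_input = {"WT": 100000, "K229A": 850, "V43I": 1200, "D445N": 900}
counts_passaged = {"WT": 150000, "K229A": 40, "V43I": 1700, "D445N": 20}

def relative_fitness(mutant):
    total_in, total_out = sum(counts_input.values()), sum(counts_passaged.values())
    freq_in = {k: v / total_in for k, v in counts_input.items()}
    freq_out = {k: v / total_out for k, v in counts_passaged.items()}
    enrichment = math.log2(freq_out[mutant] / freq_in[mutant])
    return enrichment - math.log2(freq_out["WT"] / freq_in["WT"])   # normalize to WT

for mutant in ("K229A", "V43I", "D445N"):
    # strongly negative scores flag residues essential for replication
    print(mutant, round(relative_fitness(mutant), 2))
```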
Athavale, Ajay
2018-01-04
Ajay Athavale (Monsanto) presents "High Throughput Plasmid Sequencing with Illumina and CLC Bio" at the 7th Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting held in June, 2012 in Santa Fe, NM.
Web-based visual analysis for high-throughput genomics
2013-01-01
Background Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
Boozer, Christina; Kim, Gibum; Cong, Shuxin; Guan, Hannwen; Londergan, Timothy
2006-08-01
Surface plasmon resonance (SPR) biosensors have enabled a wide range of applications in which researchers can monitor biomolecular interactions in real time. Owing to the fact that SPR can provide affinity and kinetic data, unique features in applications ranging from protein-peptide interaction analysis to cellular ligation experiments have been demonstrated. Although SPR has historically been limited by its throughput, new methods are emerging that allow for the simultaneous analysis of many thousands of interactions. When coupled with new protein array technologies, high-throughput SPR methods give users new and improved methods to analyze pathways, screen drug candidates and monitor protein-protein interactions.
improved and higher throughput methods for analysis of biomass feedstocks Agronomics-using NIR spectroscopy in-house and external client training. She has also developed improved and high-throughput methods
High Throughput Sequence Analysis for Disease Resistance in Maize
USDA-ARS?s Scientific Manuscript database
Preliminary results of a computational analysis of high throughput sequencing data from Zea mays and the fungus Aspergillus are reported. The Illumina Genome Analyzer was used to sequence RNA samples from two strains of Z. mays (Va35 and Mp313) collected over a time course as well as several specie...
The US EPA's ToxCast™ program seeks to combine advances in high-throughput screening technology with methodologies from statistics and computer science to develop high-throughput decision support tools for assessing chemical hazard and risk. To develop new methods of analysis of...
An Efficient and QoS Supported Multichannel MAC Protocol for Vehicular Ad Hoc Networks
Tan, Guozhen; Yu, Chao
2017-01-01
Vehicular Ad Hoc Networks (VANETs) employ multiple channels to provide a variety of safety and non-safety (transport efficiency and infotainment) applications, based on the IEEE 802.11p and IEEE 1609.4 protocols. Different types of applications require different levels of Quality-of-Service (QoS) support. Recently, transport efficiency and infotainment applications (e.g., electronic map download and Internet access) have received more and more attention, and these applications are expected to become a major market driver in the near future. In this paper, we propose an Efficient and QoS supported Multichannel Medium Access Control (EQM-MAC) protocol for VANETs in a highway environment. The EQM-MAC protocol utilizes the service channel resources for non-safety message transmissions during the whole synchronization interval, and it dynamically adjusts the minimum contention window size for different non-safety services according to the traffic conditions. Theoretical model analysis and extensive simulation results show that the EQM-MAC protocol can support QoS services while ensuring high saturation throughput and low transmission delay for non-safety applications. PMID:28991217
Tu, Anqi; Ma, Qiang; Bai, Hua; Du, Zhenxia
2017-04-15
Triacylglycerols (TAGs), as the major component of milk fat, are significant factors in ensuring the healthy growth of infants. An efficient method for identifying TAGs in human milk (HM) and infant formula (IF) was established using supercritical fluid chromatography (SFC) coupled with quadrupole time-of-flight mass spectrometry (Q-TOF-MS). The results indicated the feasibility of this method with satisfactory recoveries (>80%) and correlation coefficients (r² ≥ 0.993). More than 60 TAGs in HM and 50 TAGs in IF were identified. The profiling results demonstrated that TAGs in HM were greatly affected by lactation stage. Significant differences were found between HM and IF, such as much higher levels of medium-chain TAGs and saturated TAGs in IF, indicating that the formulas developed by foreign manufacturers were not suitable for Chinese babies. This high-throughput method exhibits a huge potential for analysis of milk samples and the result may serve as an important guide for Chinese infants' diets. Copyright © 2016 Elsevier Ltd. All rights reserved.
A novel Laser Ion Mobility Spectrometer
NASA Astrophysics Data System (ADS)
Göbel, J.; Kessler, M.; Langmeier, A.
2009-05-01
IMS is a well-known technology for security-based applications. Its main advantages lie in the simplicity of measurement, along with a fast and sensitive detection method. Contemporary technology often fails due to interfering substances, in conjunction with saturation effects and a low dynamic detection range. High throughput facilities, such as airports, require the analysis of many samples at low detection limits within a very short timeframe. High detection reliability is a requirement for safe and secure operation. In our present work we developed a laser-based ion-mobility sensor which shows several advantages over known IMS sensor technology. The goal of our research was to increase the sensitivity compared to that of 63Ni-based instruments. This was achieved with an optimised geometric drift tube design and a pulsed UV laser system at an efficient intensity. In this intensity range multi-photon ionisation is possible, which leads to higher selectivity in the ion-formation process itself. After high-speed capture of detection samples, a custom-designed pattern recognition software toolbox provides reliable auto-detection capability with a learning algorithm and a graphical user interface.
Using multi-class queuing network to solve performance models of e-business sites.
Zheng, Xiao-ying; Chen, De-ren
2004-01-01
Because e-business serves a variety of customers with different navigational patterns and demands, a multi-class queuing network is a natural performance model for it. Open multi-class queuing network (QN) models are based on the assumption that no service center is saturated as a result of the combined loads of all the classes. Several formulas are used to calculate performance measures, including throughput, residence time, queue length, response time and the average number of requests. The solution technique for closed multi-class QN models is an approximate mean value analysis (MVA) algorithm based on three key equations, because the exact algorithm has prohibitive time and space requirements. Since mixed multi-class QN models include both open and closed classes, the open classes should be eliminated to create a closed multi-class QN so that the closed-model algorithm can be applied. Some corresponding examples are given to show how to apply the algorithms mentioned in this article. These examples indicate that a multi-class QN is a reasonably accurate model of e-business and can be solved efficiently.
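A minimal sketch of approximate MVA for a closed multi-class QN, using a Schweitzer-style arrival approximation; the service demands, populations, and think times are hypothetical example values, not figures from the article.

```python
# Minimal sketch: approximate Mean Value Analysis (Schweitzer-style) for a
# closed multi-class queueing network. Demands, populations and think times
# below are hypothetical example values.
def approx_mva(demands, populations, think_times, tol=1e-8, max_iter=10000):
    classes, stations = len(demands), len(demands[0])
    # initial guess: spread each class's customers evenly over the stations
    Q = [[populations[c] / stations for _ in range(stations)] for c in range(classes)]
    for _ in range(max_iter):
        X, R, Q_new = [], [], []
        for c in range(classes):
            # an arriving class-c customer sees the other classes' full queues
            # plus a scaled-down share of its own class (Schweitzer approximation)
            R_c = [demands[c][k] * (1.0
                                    + sum(Q[j][k] for j in range(classes) if j != c)
                                    + (populations[c] - 1) / populations[c] * Q[c][k])
                   for k in range(stations)]
            X_c = populations[c] / (think_times[c] + sum(R_c))
            X.append(X_c)
            R.append(sum(R_c))
            Q_new.append([X_c * r for r in R_c])
        diff = max(abs(Q_new[c][k] - Q[c][k])
                   for c in range(classes) for k in range(stations))
        Q = Q_new
        if diff < tol:
            break
    return X, R, Q

# two request classes (e.g. "browse" and "buy") over three service centers
demands = [[0.010, 0.030, 0.020],   # class 0 service demand (s) at CPU, disk, app tier
           [0.025, 0.008, 0.040]]   # class 1
X, R, _ = approx_mva(demands, populations=[30, 10], think_times=[5.0, 5.0])
for c, (x, r) in enumerate(zip(X, R)):
    print(f"class {c}: throughput {x:.2f} req/s, response time {1000 * r:.1f} ms")
```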
Robust, high-throughput solution structural analyses by small angle X-ray scattering (SAXS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hura, Greg L.; Menon, Angeli L.; Hammel, Michal
2009-07-20
We present an efficient pipeline enabling high-throughput analysis of protein structure in solution with small angle X-ray scattering (SAXS). Our SAXS pipeline combines automated sample handling of microliter volumes, temperature and anaerobic control, rapid data collection and data analysis, and couples structural analysis with automated archiving. We subjected 50 representative proteins, mostly from Pyrococcus furiosus, to this pipeline and found that 30 were multimeric structures in solution. SAXS analysis allowed us to distinguish aggregated and unfolded proteins, define global structural parameters and oligomeric states for most samples, identify shapes and similar structures for 25 unknown structures, and determine envelopes for 41 proteins. We believe that high-throughput SAXS is an enabling technology that may change the way that structural genomics research is done.
Greil, Stefanie; Rahman, Atikur; Liu, Mingzhao; ...
2017-10-10
Here, we report the fabrication of ultrathin, nanoporous silicon nitride membranes made from templates of regular, nanoscale features in self-assembled block copolymer thin films. The inorganic membranes feature thicknesses less than 50 nm and volume porosities over 30%, with straight-through pores that offer high throughput for gas transport and separation applications. As fabricated, the pores are uniformly around 20 nm in diameter, but they can be controllably and continuously tuned to single-digit nanometer dimensions by atomic layer deposition of conformal coatings. A deviation from expected Knudsen diffusion is revealed for transport characteristics of saturated vapors of organic solvents across the membrane, which becomes more significant for membranes of smaller pores. We attribute this to capillary condensation of saturated vapors within membrane pores, which reduces membrane throughput by over 1 order of magnitude but significantly improves the membrane's selectivity. Between vapors of acetone and ethyl acetate, we measure selectivities as high as 7:1 at ambient pressure and temperature, 4 times more than the Knudsen selectivity.
Ma, Leyuan; Boucher, Jeffrey I; Paulsen, Janet; Matuszewski, Sebastian; Eide, Christopher A; Ou, Jianhong; Eickelberg, Garrett; Press, Richard D; Zhu, Lihua Julie; Druker, Brian J; Branford, Susan; Wolfe, Scot A; Jensen, Jeffrey D; Schiffer, Celia A; Green, Michael R; Bolon, Daniel N
2017-10-31
Developing tools to accurately predict the clinical prevalence of drug-resistant mutations is a key step toward generating more effective therapeutics. Here we describe a high-throughput CRISPR-Cas9-based saturated mutagenesis approach to generate comprehensive libraries of point mutations at a defined genomic location and systematically study their effect on cell growth. As proof of concept, we mutagenized a selected region within the leukemic oncogene BCR-ABL1. Using bulk competitions with a deep-sequencing readout, we analyzed hundreds of mutations under multiple drug conditions and found that the effects of mutations on growth in the presence or absence of drug were critical for predicting clinically relevant resistant mutations, many of which were cancer adaptive in the absence of drug pressure. Using this approach, we identified all clinically isolated BCR-ABL1 mutations and achieved a prediction score that correlated highly with their clinical prevalence. The strategy described here can be broadly applied to a variety of oncogenes to predict patient mutations and evaluate resistance susceptibility in the development of new therapeutics. Published under the PNAS license.
Advanced valve-regulated lead-acid batteries for hybrid vehicle applications
NASA Astrophysics Data System (ADS)
Soria, M. L.; Trinidad, F.; Lacadena, J. M.; Sánchez, A.; Valenciano, J.
Future vehicle applications require the development of reliable, long-life batteries operating under high-rate partial-state-of-charge (HRPSoC) working conditions. Work presented in this paper deals with the study of different design parameters, manufacturing process and charging conditions of spiral wound valve-regulated lead-acid (VRLA) batteries, in order to improve their reliability and cycle life for hybrid vehicle applications. Test results show that both electrolyte saturation and charge conditions have a strong effect on cycle life under HRPSoC operation, presumably because water loss ultimately accelerates battery failure, which is linked to irreversible sulphation in the upper part of the negative electrodes. By adding expanded graphite to the negative active mass formulation, increasing the electrolyte saturation degree (>95%) and controlling overcharge during regenerative braking periods (voltage limitation and occasional boosting) it is possible to achieve up to 220,000 cycles at 2.5% DOD, equivalent to 5500 capacity throughput. These results could make lead acid batteries a strong competitor for HEV applications versus other advanced systems such as Ni-MH or Li-ion batteries.
Soulard, Patricia; McLaughlin, Meg; Stevens, Jessica; Connolly, Brendan; Coli, Rocco; Wang, Leyu; Moore, Jennifer; Kuo, Ming-Shang T; LaMarr, William A; Ozbal, Can C; Bhat, B Ganesh
2008-10-03
Several recent reports suggest that stearoyl-CoA desaturase 1 (SCD1), the rate-limiting enzyme in monounsaturated fatty acid synthesis, plays an important role in regulating lipid homeostasis and lipid oxidation in metabolically active tissues. As several manifestations of type 2 diabetes and related metabolic disorders are associated with alterations in intracellular lipid partitioning, pharmacological manipulation of SCD1 activity might be of benefit in the treatment of these disease states. In an effort to identify small molecule inhibitors of SCD1, we have developed a mass spectrometry based high-throughput screening (HTS) assay using deuterium labeled stearoyl-CoA substrate and induced rat liver microsomes. The methodology developed allows the use of a nonradioactive substrate which avoids interference by the endogenous SCD1 substrate and/or product that exist in the non-purified enzyme source. Throughput of the assay was up to twenty 384-well assay plates per day. The assay was linear with protein concentration and time, and was saturable for stearoyl-CoA substrate (K(m)=10.5 microM). The assay was highly reproducible with an average Z' value=0.6. Conjugated linoleic acid and sterculic acid, known inhibitors of SCD1, exhibited IC(50) values of 0.88 and 0.12 microM, respectively. High-throughput mass spectrometry screening of over 1.7 million compounds in compressed format demonstrated that the enzyme target is druggable. A total of 2515 hits were identified (0.1% hit rate), and 346 were confirmed active (>40% inhibition of total SCD activity at 20 microM; 14% confirmation rate). Of the confirmed hits, 172 had IC(50) values of <10 microM, including 111 <1 microM and 48 <100 nM. A large number of potent drug-like (MW<450) hits representing six different chemical series were identified. The application of mass spectrometry to high-throughput screening permitted the development of a high-quality screening protocol for an otherwise intractable target, SCD1. Further medicinal chemistry and characterization of SCD inhibitors should lead to the development of reagents to treat metabolic disorders.
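As a small worked example of the assay-quality statistic quoted above (the Z' value), here is a sketch computing Z' from synthetic positive- and negative-control readouts.

```python
# Minimal sketch: the Z' screening-window statistic used to judge HTS assay
# quality, Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# The control readouts below are synthetic.
import statistics

pos_controls = [120, 118, 125, 119, 122, 121]   # e.g. fully inhibited reaction
neg_controls = [410, 395, 402, 415, 398, 405]   # e.g. uninhibited (vehicle) reaction

def z_prime(pos, neg):
    spread = 3 * (statistics.stdev(pos) + statistics.stdev(neg))
    return 1 - spread / abs(statistics.mean(pos) - statistics.mean(neg))

print(f"Z' = {z_prime(pos_controls, neg_controls):.2f}")   # > 0.5 indicates a robust assay
```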
An improved high-throughput lipid extraction method for the analysis of human brain lipids.
Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett
2013-03-01
We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.
High-throughput determination of structural phase diagram and constituent phases using GRENDEL
NASA Astrophysics Data System (ADS)
Kusne, A. G.; Keller, D.; Anderson, A.; Zaban, A.; Takeuchi, I.
2015-11-01
Advances in high-throughput materials fabrication and characterization techniques have resulted in faster rates of data collection and rapidly growing volumes of experimental data. To convert this mass of information into actionable knowledge of material process-structure-property relationships requires high-throughput data analysis techniques. This work explores the use of the Graph-based endmember extraction and labeling (GRENDEL) algorithm as a high-throughput method for analyzing structural data from combinatorial libraries, specifically, to determine phase diagrams and constituent phases from both x-ray diffraction and Raman spectral data. The GRENDEL algorithm utilizes a set of physical constraints to optimize results and provides a framework by which additional physics-based constraints can be easily incorporated. GRENDEL also permits the integration of database data as shown by the use of critically evaluated data from the Inorganic Crystal Structure Database in the x-ray diffraction data analysis. Also the Sunburst radial tree map is demonstrated as a tool to visualize material structure-property relationships found through graph based analysis.
A high-throughput label-free nanoparticle analyser.
Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M; Ruoslahti, Erkki; Cleland, Andrew N
2011-05-01
Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.
Using Adverse Outcome Pathway Analysis to Guide Development of High-Throughput Screening Assays for Thyroid-Disruptors
Paul, Katie B; Hedge, Joan M; Rotroff, Daniel M; Crofton, Kevin M; Hornung, Michael W; Simmons, Steven O
Oak Ridge Institute for Science Education Post...
Francis, Jill J; Johnston, Marie; Robertson, Clare; Glidewell, Liz; Entwistle, Vikki; Eccles, Martin P; Grimshaw, Jeremy M
2010-12-01
In interview studies, sample size is often justified by interviewing participants until reaching 'data saturation'. However, there is no agreed method of establishing this. We propose principles for deciding saturation in theory-based interview studies (where conceptual categories are pre-established by existing theory). First, specify a minimum sample size for initial analysis (initial analysis sample). Second, specify how many more interviews will be conducted without new ideas emerging (stopping criterion). We demonstrate these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control), using an initial analysis sample of 10 and a stopping criterion of 3. Study 1 (retrospective analysis of existing data) identified 84 shared beliefs of 14 general medical practitioners about managing patients with sore throat without prescribing antibiotics. The criterion for saturation was achieved for Normative beliefs but not for other beliefs or studywise saturation. In Study 2 (prospective analysis), 17 relatives of people with Paget's disease of the bone reported 44 shared beliefs about undergoing genetic testing. Studywise data saturation was achieved at interview 17. We propose specification of these principles for reporting data saturation in theory-based interview studies. The principles may be adaptable for other types of studies.
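A minimal sketch of the proposed stopping rule, assuming hypothetical interview contents coded as sets of beliefs: analyze the initial sample, then stop once the specified number of consecutive additional interviews contributes no new belief.

```python
# Minimal sketch of the proposed stopping rule; interview contents are
# hypothetical sets of coded beliefs.
def reached_saturation(interviews, initial_sample=10, stopping_criterion=3):
    seen, run = set(), 0
    for i, beliefs in enumerate(interviews, start=1):
        new = set(beliefs) - seen
        seen |= new
        if i <= initial_sample:
            continue                        # build the initial analysis sample first
        run = 0 if new else run + 1         # consecutive extra interviews with nothing new
        if run >= stopping_criterion:
            return True, i, len(seen)
    return False, len(interviews), len(seen)

interviews = [{"b1", "b2"}, {"b2", "b3"}, {"b1"}, {"b4"}, {"b2"}, {"b3"},
              {"b1", "b4"}, {"b2"}, {"b3"}, {"b1"}, {"b4"}, {"b2"}, {"b1"}]
print(reached_saturation(interviews))       # (True, 13, 4): saturated after 13 interviews
```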
Genome-Wide Discovery of Genes Required for Capsule Production by Uropathogenic Escherichia coli.
Goh, Kelvin G K; Phan, Minh-Duy; Forde, Brian M; Chong, Teik Min; Yin, Wai-Fong; Chan, Kok-Gan; Ulett, Glen C; Sweet, Matthew J; Beatson, Scott A; Schembri, Mark A
2017-10-24
Uropathogenic Escherichia coli (UPEC) is a major cause of urinary tract and bloodstream infections and possesses an array of virulence factors for colonization, survival, and persistence. One such factor is the polysaccharide K capsule. Among the different K capsule types, the K1 serotype is strongly associated with UPEC infection. In this study, we completely sequenced the K1 UPEC urosepsis strain PA45B and employed a novel combination of a lytic K1 capsule-specific phage, saturated Tn5 transposon mutagenesis, and high-throughput transposon-directed insertion site sequencing (TraDIS) to identify the complement of genes required for capsule production. Our analysis identified known genes involved in capsule biosynthesis, as well as two additional regulatory genes (mprA and lrhA) that we characterized at the molecular level. Mutation of mprA resulted in protection against K1 phage-mediated killing, a phenotype restored by complementation. We also identified a significantly increased unidirectional Tn5 insertion frequency upstream of the lrhA gene and showed that strong expression of LrhA induced by a constitutive Pcl promoter led to loss of capsule production. Further analysis revealed loss of MprA or overexpression of LrhA affected the transcription of capsule biosynthesis genes in PA45B and increased sensitivity to killing in whole blood. Similar phenotypes were also observed in UPEC strains UTI89 (K1) and CFT073 (K2), demonstrating that the effects were neither strain nor capsule type specific. Overall, this study defined the genome of a UPEC urosepsis isolate and identified and characterized two new regulatory factors that affect UPEC capsule production. IMPORTANCE Urinary tract infections (UTIs) are among the most common bacterial infections in humans and are primarily caused by uropathogenic Escherichia coli (UPEC). Many UPEC strains express a polysaccharide K capsule that provides protection against host innate immune factors and contributes to survival and persistence during infection. The K1 serotype is one example of a polysaccharide capsule type and is strongly associated with UPEC strains that cause UTIs, bloodstream infections, and meningitis. The number of UTIs caused by antibiotic-resistant UPEC is steadily increasing, highlighting the need to better understand factors (e.g., the capsule) that contribute to UPEC pathogenesis. This study describes the original and novel application of lytic capsule-specific phage killing, saturated Tn5 transposon mutagenesis, and high-throughput transposon-directed insertion site sequencing to define the entire complement of genes required for capsule production in UPEC. Our comprehensive approach uncovered new genes involved in the regulation of this key virulence determinant. Copyright © 2017 Goh et al.
CrossCheck: an open-source web tool for high-throughput screen data analysis.
Najafov, Jamil; Najafov, Ayaz
2017-07-19
Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
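A minimal sketch of the cross-referencing idea, with made-up dataset names and an illustrative hit list (this is not CrossCheck's actual implementation): intersect the user's gene symbols with each published dataset and rank the datasets by overlap.

```python
# Minimal sketch of the cross-referencing idea (not CrossCheck's implementation):
# intersect a user's hit list with published datasets and rank by overlap.
# Dataset names and the hit list are illustrative.
user_hits = {"RIPK1", "TBK1", "MAP3K7", "ATG7", "ULK1"}

published = {
    "RNAi screen: necroptosis regulators": {"RIPK1", "RIPK3", "MLKL", "TBK1"},
    "CRISPR screen: autophagy flux": {"ATG7", "ULK1", "BECN1"},
    "Kinase interactome (AP-MS)": {"MAP3K7", "TAB1", "TBK1", "IKBKE"},
}

ranked = sorted(((name, user_hits & genes) for name, genes in published.items()),
                key=lambda item: len(item[1]), reverse=True)
for name, shared in ranked:
    print(f"{name}: {len(shared)} shared hit(s) -> {sorted(shared)}")
```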
A 13-week research-based biochemistry laboratory curriculum.
Lefurgy, Scott T; Mundorff, Emily C
2017-09-01
Here, we present a 13-week research-based biochemistry laboratory curriculum designed to provide the students with the experience of engaging in original research while introducing foundational biochemistry laboratory techniques. The laboratory experience has been developed around the directed evolution of an enzyme chosen by the instructor, with mutations designed by the students. Ideal enzymes for this curriculum are able to be structurally modeled, solubly expressed, and monitored for activity by UV/Vis spectroscopy, and an example curriculum for haloalkane dehalogenase is given. Unique to this curriculum is a successful implementation of saturation mutagenesis and high-throughput screening of enzyme function, along with bioinformatics analysis, homology modeling, structural analysis, protein expression and purification, polyacrylamide gel electrophoresis, UV/Vis spectroscopy, and enzyme kinetics. Each of these techniques is carried out using a novel student-designed mutant library or enzyme variant unique to the lab team and, importantly, not described previously in the literature. Use of a well-established set of protocols promotes student data quality. Publication may result from the original student-generated hypotheses and data, either from the class as a whole or individual students that continue their independent projects upon course completion. © 2017 by The International Union of Biochemistry and Molecular Biology, 45(5):437-448, 2017. © 2017 The International Union of Biochemistry and Molecular Biology.
High-throughput search for new permanent magnet materials.
Goll, D; Loeffler, R; Herbst, J; Karimi, R; Schneider, G
2014-02-12
Currently, the highest-performance Fe-Nd-B magnets show limited cost-effectiveness and lifetime due to their rare-earth (RE) content. The demand for novel hard magnetic phases based on more widely available RE metals, with reduced RE content or, even better, completely free of RE metals is therefore tremendous. The chances are that such materials still exist given the large number of as yet unexplored alloy systems. To discover such phases, an elaborate concept is necessary which can restrict and prioritize the search field while making use of efficient synthesis and analysis methods. It is shown that an efficient synthesis of new phases using heterogeneous non-equilibrium diffusion couples and reaction sintering is possible. Quantitative microstructure analysis of the domain pattern of the hard magnetic phases can be used to estimate the intrinsic magnetic parameters (saturation polarization from the domain contrast, anisotropy constant from the domain width, Curie temperature from the temperature dependence of the domain contrast). The probability of detecting TM-rich phases for a given system is high; the approach therefore enables one to scan through even higher-order component systems with a single sample. The visualization of newly occurring hard magnetic phases via their typical domain structure and the correlation existing between domain structure and intrinsic magnetic properties allows an evaluation of the industrial relevance of these novel phases.
NASA Astrophysics Data System (ADS)
Kassem Jebai, Al; Malrait, François; Martin, Philippe; Rouchon, Pierre
2016-03-01
Sensorless control of permanent-magnet synchronous motors at low velocity remains a challenging task. A now well-established method consists of injecting a high-frequency signal and using the rotor saliency, both geometric and magnetic-saturation induced. This paper proposes a clear and original analysis based on second-order averaging of how to recover the position information from signal injection; this analysis blends well with a general model of magnetic saturation. It also proposes a simple parametric model of the saturated motor, based on an energy function which simply encompasses saturation and cross-saturation effects. Experimental results on a surface-mounted motor and an interior magnet motor illustrate the relevance of the approach.
Development of rapid and sensitive high throughput pharmacologic assays for marine phycotoxins.
Van Dolah, F M; Finley, E L; Haynes, B L; Doucette, G J; Moeller, P D; Ramsdell, J S
1994-01-01
The lack of rapid, high throughput assays is a major obstacle to many aspects of research on marine phycotoxins. Here we describe the application of microplate scintillation technology to develop high throughput assays for several classes of marine phycotoxin based on their differential pharmacologic actions. High throughput "drug discovery" format microplate receptor binding assays developed for brevetoxins/ciguatoxins and for domoic acid are described. Analysis for brevetoxins/ciguatoxins is carried out by binding competition with [3H] PbTx-3 for site 5 on the voltage dependent sodium channel in rat brain synaptosomes. Analysis of domoic acid is based on binding competition with [3H] kainic acid for the kainate/quisqualate glutamate receptor using frog brain synaptosomes. In addition, a high throughput microplate 45Ca flux assay for determination of maitotoxins is described. These microplate assays can be completed within 3 hours, have sensitivities of less than 1 ng, and can analyze dozens of samples simultaneously. The assays have been demonstrated to be useful for assessing algal toxicity and for assay-guided purification of toxins, and are applicable to the detection of biotoxins in seafood.
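Assays of this kind are read out as competition curves, and a common post-processing step is converting a fitted IC50 into an inhibition constant with the Cheng-Prusoff relation. The sketch below shows that conversion; the radioligand concentration, Kd, and IC50 values are illustrative assumptions, not numbers from the assays described above.

```python
def ki_from_ic50(ic50_nM, radioligand_nM, kd_nM):
    """Cheng-Prusoff correction for a competitive radioligand binding assay:
    Ki = IC50 / (1 + [L]/Kd)."""
    return ic50_nM / (1.0 + radioligand_nM / kd_nM)

# Illustrative numbers only; not values from the phycotoxin assays above.
print(ki_from_ic50(ic50_nM=12.0, radioligand_nM=5.0, kd_nM=2.5))  # -> 4.0 nM
```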
High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luo, Wei; Shabbir, Faizan; Gong, Chao
2015-04-13
We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.
Voltage-spike analysis for a free-running parallel inverter
NASA Technical Reports Server (NTRS)
Lee, F. C. Y.; Wilson, T. G.
1974-01-01
Unwanted and sometimes damaging high-amplitude voltage spikes occur during each half cycle in many transistor saturable-core inverters at the moment when the core saturates and the transistors switch. The analysis shows that spikes are an intrinsic characteristic of certain types of inverters even with negligible leakage inductance and purely resistive load. The small but unavoidable after-saturation inductance of the saturable-core transformer plays an essential role in creating these undesired high-voltage spikes. State-plane analysis provides insight into the complex interaction between core and transistors, and shows the circuit parameters upon which the magnitude of these spikes depends.
Hosokawa, Masahito; Nishikawa, Yohei; Kogawa, Masato; Takeyama, Haruko
2017-07-12
Massively parallel single-cell genome sequencing is required to further understand genetic diversities in complex biological systems. Whole genome amplification (WGA) is the first step for single-cell sequencing, but its throughput and accuracy are insufficient in conventional reaction platforms. Here, we introduce single droplet multiple displacement amplification (sd-MDA), a method that enables massively parallel amplification of single cell genomes while maintaining sequence accuracy and specificity. Tens of thousands of single cells are compartmentalized in millions of picoliter droplets and then subjected to lysis and WGA by passive droplet fusion in microfluidic channels. Because single cells are isolated in compartments, their genomes are amplified to saturation without contamination. This enables the high-throughput acquisition of contamination-free and cell specific sequence reads from single cells (21,000 single-cells/h), resulting in enhancement of the sequence data quality compared to conventional methods. This method allowed WGA of both single bacterial cells and human cancer cells. The obtained sequencing coverage rivals those of conventional techniques with superior sequence quality. In addition, we also demonstrate de novo assembly of uncultured soil bacteria and obtain draft genomes from single cell sequencing. This sd-MDA is promising for flexible and scalable use in single-cell sequencing.
Microfluidics for cell-based high throughput screening platforms - A review.
Du, Guansheng; Fang, Qun; den Toonder, Jaap M J
2016-01-15
In the last decades, the basic techniques of microfluidics for the study of cells such as cell culture, cell separation, and cell lysis, have been well developed. Based on cell handling techniques, microfluidics has been widely applied in the field of PCR (Polymerase Chain Reaction), immunoassays, organ-on-chip, stem cell research, and analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows rapid analysis of thousands of chemical, biochemical, genetic or pharmacological tests in parallel. In this review, we summarize the application of microfluidics in cell-based high throughput screening. The screening methods mentioned in this paper include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidic based high throughput screening platform for drug discovery. Copyright © 2015 Elsevier B.V. All rights reserved.
BiQ Analyzer HT: locus-specific analysis of DNA methylation by high-throughput bisulfite sequencing
Lutsik, Pavlo; Feuerbach, Lars; Arand, Julia; Lengauer, Thomas; Walter, Jörn; Bock, Christoph
2011-01-01
Bisulfite sequencing is a widely used method for measuring DNA methylation in eukaryotic genomes. The assay provides single-base pair resolution and, given sufficient sequencing depth, its quantitative accuracy is excellent. High-throughput sequencing of bisulfite-converted DNA can be applied either genome wide or targeted to a defined set of genomic loci (e.g. using locus-specific PCR primers or DNA capture probes). Here, we describe BiQ Analyzer HT (http://biq-analyzer-ht.bioinf.mpi-inf.mpg.de/), a user-friendly software tool that supports locus-specific analysis and visualization of high-throughput bisulfite sequencing data. The software facilitates the shift from time-consuming clonal bisulfite sequencing to the more quantitative and cost-efficient use of high-throughput sequencing for studying locus-specific DNA methylation patterns. In addition, it is useful for locus-specific visualization of genome-wide bisulfite sequencing data. PMID:21565797
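The core quantity such locus-specific tools report for each CpG position is the fraction of reads carrying an unconverted cytosine. The sketch below computes that fraction from per-read base calls; it is a generic illustration of the calculation, not BiQ Analyzer HT's actual implementation, and the read strings are invented.

```python
from collections import Counter

def methylation_levels(read_calls):
    """
    read_calls: one string per read with 'C' (methylated, unconverted) or
    'T' (unmethylated, converted) at each interrogated CpG position; '.' marks
    positions not covered by the read. Returns per-position C / (C + T).
    """
    n_pos = len(read_calls[0])
    levels = []
    for i in range(n_pos):
        counts = Counter(read[i] for read in read_calls)
        covered = counts["C"] + counts["T"]
        levels.append(counts["C"] / covered if covered else float("nan"))
    return levels

reads = ["CCT.", "CTTC", "CCTC", ".CTT"]
print(methylation_levels(reads))  # [1.0, 0.75, 0.0, 0.666...]
```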
USDA-ARS's Scientific Manuscript database
This study demonstrated the application of an automated high-throughput mini-cartridge solid-phase extraction (mini-SPE) cleanup for the rapid low-pressure gas chromatography – tandem mass spectrometry (LPGC-MS/MS) analysis of pesticides and environmental contaminants in QuEChERS extracts of foods. ...
2016-06-01
Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols, by Robert E Jensen, Daniel C DeSchepper, and David P Flanagan; US Army Research Laboratory report TR-7696, June 2016. Only front-matter fragments were recovered for this record (an approval/distribution statement and a list of tables, including single-lap-joint experimental parameters and a survey).
Aihara, Masamune; Yamamoto, Shigeru; Nishioka, Hiroko; Inoue, Yutaro; Hamano, Kimikazu; Oka, Masaaki; Mizukami, Yoichi
2012-06-15
G protein-coupled receptor 30/G protein estrogen receptor-1 (GPR30/GPER-1) is a novel membrane receptor for estrogen whose mRNA is expressed at high levels in estrogen-dependent cells such as breast cancer cell lines. However, mutations in GPR30 related to diseases remain unreported. To detect unknown mutations in the GPR30 open reading frame (ORF) quickly, the experimental conditions for high-resolution melting (HRM) analysis were examined for PCR primers, Taq polymerases, saturation DNA binding dyes, Mg(2+) concentration, and normalized temperatures. Nine known SNPs and 13 artificial point mutations within the GPR30 ORF, as well as single nucleotide variants in DNA extracted from subjects with breast cancers, were tested under the optimal experimental conditions. The combination of Expand High Fidelity(PLUS) and SYTO9 in the presence of 2.0 mM MgCl(2) produced the best separation in melting curves of mutations in all regions of the GPR30 ORF. Under these experimental conditions, the mutations were clearly detected in both heterozygotes and homozygotes. HRM analysis of GPR30 using genomic DNA from subjects with breast cancers showed a novel single nucleotide variant, 111C>T in GPR30 and 4 known SNPs. The experimental conditions determined in this study for HRM analysis are useful for high throughput assays to detect unknown mutations within the GPR30 ORF. Copyright © 2012 Elsevier B.V. All rights reserved.
Independent component analysis decomposition of hospital emergency department throughput measures
NASA Astrophysics Data System (ADS)
He, Qiang; Chu, Henry
2016-05-01
We present a method adapted from medical sensor data analysis, viz. independent component analysis of electroencephalography data, to health system analysis. Timely and effective care in a hospital emergency department is measured by throughput measures such as the median times patients spent before they were admitted as inpatients, before they were sent home, and before they were seen by a healthcare professional. We consider a set of five such measures collected at 3,086 hospitals distributed across the U.S. One model of the performance of an emergency department is that these correlated throughput measures are linear combinations of some underlying sources. The independent component analysis decomposition of the data set can thus be viewed as transforming a set of performance measures collected at a site to a collection of outputs of spatial filters applied to the whole multi-measure data. We compare the independent component sources with the output of the conventional principal component analysis to show that the independent components are more suitable for understanding the data sets through visualizations.
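A sketch of the decomposition described above, using scikit-learn's FastICA alongside PCA on a hospitals-by-measures matrix. The data here are synthetic mixtures standing in for the real throughput measures, and the choice of three latent sources is an arbitrary assumption.

```python
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(0)
# Hypothetical stand-in for the real data: 3086 hospitals x 5 throughput measures,
# generated as noisy linear mixtures of three latent sources.
sources = rng.laplace(size=(3086, 3))
mixing = rng.normal(size=(3, 5))
measures = sources @ mixing + 0.1 * rng.normal(size=(3086, 5))

ica = FastICA(n_components=3, random_state=0)
pca = PCA(n_components=3)
independent_components = ica.fit_transform(measures)  # statistically independent sources
principal_components = pca.fit_transform(measures)    # orthogonal, variance-ranked components
print(independent_components.shape, principal_components.shape)  # (3086, 3) (3086, 3)
```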
Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng
2018-06-04
In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly for high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. As compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity was recorded from cell-retained microwells than from the planar regions among the microwells, and this intensity was correlated with the content of intracellular glucose. The successful monitoring of intracellular glucose at single cells using this PDMS chip will provide an alternative strategy for high-throughput single-cell analysis.
A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers
Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.
2016-01-01
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...
2016-09-14
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.
MIPHENO: Data normalization for high throughput metabolic analysis.
High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...
Gore, Brooklin
2018-02-01
This presentation includes a brief background on High Throughput Computing, correlating gene transcription factors, optical mapping, genotype to phenotype mapping via QTL analysis, and current work on next gen sequencing.
Droplet-based microfluidic analysis and screening of single plant cells.
Yu, Ziyi; Boehm, Christian R; Hibberd, Julian M; Abell, Chris; Haseloff, Jim; Burgess, Steven J; Reyna-Llorens, Ivan
2018-01-01
Droplet-based microfluidics has been used to facilitate high-throughput analysis of individual prokaryote and mammalian cells. However, there is a scarcity of similar workflows applicable to rapid phenotyping of plant systems where phenotyping analyses typically are time-consuming and low-throughput. We report on-chip encapsulation and analysis of protoplasts isolated from the emergent plant model Marchantia polymorpha at processing rates of >100,000 cells per hour. We use our microfluidic system to quantify the stochastic properties of a heat-inducible promoter across a population of transgenic protoplasts to demonstrate its potential for assessing gene expression activity in response to environmental conditions. We further demonstrate on-chip sorting of droplets containing YFP-expressing protoplasts from wild type cells using dielectrophoresis force. This work opens the door to droplet-based microfluidic analysis of plant cells for applications ranging from high-throughput characterisation of DNA parts to single-cell genomics to selection of rare plant phenotypes.
Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y
2014-07-08
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulations in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software helps 3D tumor spheroids become a routine in vitro model for drug screens in industry and academia.
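The software reports a volume derived from the measured major and minor axial lengths. One common convention (an assumption here, not necessarily SpheroidSizer's exact formula) treats the spheroid as a prolate ellipsoid:

```python
import math

def spheroid_volume(major_um, minor_um):
    """
    Approximate a tumor spheroid as a prolate ellipsoid:
    V = (pi/6) * major * minor^2.
    A common convention; SpheroidSizer's exact formula is not reproduced here.
    """
    return math.pi / 6.0 * major_um * minor_um ** 2

print(f"{spheroid_volume(520.0, 480.0):.3e} um^3")  # ~6.27e7 um^3 for illustrative axes
```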
Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays
High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...
Automated image alignment for 2D gel electrophoresis in a high-throughput proteomics pipeline.
Dowsey, Andrew W; Dunn, Michael J; Yang, Guang-Zhong
2008-04-01
The quest for high-throughput proteomics has revealed a number of challenges in recent years. Whilst substantial improvements in automated protein separation with liquid chromatography and mass spectrometry (LC/MS), aka 'shotgun' proteomics, have been achieved, large-scale open initiatives such as the Human Proteome Organization (HUPO) Brain Proteome Project have shown that maximal proteome coverage is only possible when LC/MS is complemented by 2D gel electrophoresis (2-DE) studies. Moreover, both separation methods require automated alignment and differential analysis to relieve the bioinformatics bottleneck and so make high-throughput protein biomarker discovery a reality. The purpose of this article is to describe a fully automatic image alignment framework for the integration of 2-DE into a high-throughput differential expression proteomics pipeline. The proposed method is based on robust automated image normalization (RAIN) to circumvent the drawbacks of traditional approaches. These use symbolic representation at the very early stages of the analysis, which introduces persistent errors due to inaccuracies in modelling and alignment. In RAIN, a third-order volume-invariant B-spline model is incorporated into a multi-resolution schema to correct for geometric and expression inhomogeneity at multiple scales. The normalized images can then be compared directly in the image domain for quantitative differential analysis. Through evaluation against an existing state-of-the-art method on real and synthetically warped 2D gels, the proposed analysis framework demonstrates substantial improvements in matching accuracy and differential sensitivity. High-throughput analysis is established through an accelerated GPGPU (general purpose computation on graphics cards) implementation. Supplementary material, software and images used in the validation are available at http://www.proteomegrid.org/rain/.
Savino, Maria; Seripa, Davide; Gallo, Antonietta P; Garrubba, Maria; D'Onofrio, Grazia; Bizzarro, Alessandra; Paroni, Giulia; Paris, Francesco; Mecocci, Patrizia; Masullo, Carlo; Pilotto, Alberto; Santini, Stefano A
2011-01-01
Recent studies investigating the single cytochrome P450 (CYP) 2D6 allele *2A reported an association with the response to drug treatments. More genetic data can be obtained, however, by high-throughput-based technologies. The aim of this study is the high-throughput analysis of the CYP2D6 polymorphisms to evaluate its effectiveness in the identification of patient responders/non-responders to CYP2D6-metabolized drugs. An attempt to compare our results with those previously obtained with the standard analysis of CYP2D6 allele *2A was also made. Sixty blood samples from patients treated with CYP2D6-metabolized drugs, previously genotyped for the allele CYP2D6*2A, were analyzed for the CYP2D6 polymorphisms with the AutoGenomics INFINITI CYP4502D6-I assay on the AutoGenomics INFINITI analyzer. A higher frequency of mutated alleles in responder than in non-responder patients (75.38 % vs 43.48 %; p = 0.015) was observed. Thus, the presence of a mutated allele of CYP2D6 was associated with a response to CYP2D6-metabolized drugs (OR = 4.044; 1.348-12.154). No difference was observed in the distribution of allele *2A (p = 0.320). The high-throughput genetic analysis of the CYP2D6 polymorphisms better discriminates responders/non-responders than the standard analysis of the CYP2D6 allele *2A. A high-throughput genetic assay of the CYP2D6 may be useful to identify patients with different clinical responses to CYP2D6-metabolized drugs.
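The reported odds ratio can be approximately reconstructed from the two carrier proportions quoted above; the small discrepancy from 4.044 presumably reflects rounding of the percentages or the exact group counts, which are not given here.

```python
def odds_ratio(p_exposed_cases, p_exposed_controls):
    """Odds ratio from the proportion of mutated-allele carriers in each group."""
    odds_cases = p_exposed_cases / (1.0 - p_exposed_cases)
    odds_controls = p_exposed_controls / (1.0 - p_exposed_controls)
    return odds_cases / odds_controls

# Proportions quoted in the abstract: 75.38% of responders vs 43.48% of
# non-responders carried a mutated CYP2D6 allele.
print(round(odds_ratio(0.7538, 0.4348), 2))  # ~3.98, close to the reported OR of 4.044
```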
Forecasting Container Throughput at the Doraleh Port in Djibouti through Time Series Analysis
NASA Astrophysics Data System (ADS)
Mohamed Ismael, Hawa; Vandyck, George Kobina
The Doraleh Container Terminal (DCT) located in Djibouti has been noted as the most technologically advanced container terminal on the African continent. DCT's strategic location at the crossroads of the main shipping lanes connecting Asia, Africa and Europe put it in a unique position to provide important shipping services to vessels plying that route. This paper aims to forecast container throughput through the Doraleh Container Port in Djibouti by Time Series Analysis. A selection of univariate forecasting models has been used, namely Triple Exponential Smoothing Model, Grey Model and Linear Regression Model. By utilizing the above three models and their combination, the forecast of container throughput through the Doraleh port was realized. A comparison of the different forecasting results of the three models, in addition to the combination forecast is then undertaken, based on commonly used evaluation criteria Mean Absolute Deviation (MAD) and Mean Absolute Percentage Error (MAPE). The study found that the Linear Regression forecasting Model was the best prediction method for forecasting the container throughput, since its forecast error was the least. Based on the regression model, a ten (10) year forecast for container throughput at DCT has been made.
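A sketch of the evaluation step described above: fitting the linear regression model to an annual throughput series and scoring it with MAD and MAPE. The throughput figures and years are invented placeholders, not DCT data.

```python
import numpy as np

def mad(actual, forecast):
    return np.mean(np.abs(np.asarray(actual) - np.asarray(forecast)))

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

# Hypothetical annual container throughput (thousand TEU) -- illustrative only.
years = np.arange(2009, 2017)
teu = np.array([300, 340, 395, 430, 480, 520, 575, 610])

slope, intercept = np.polyfit(years, teu, 1)   # linear regression model
fitted = slope * years + intercept
print(f"MAD={mad(teu, fitted):.1f}  MAPE={mape(teu, fitted):.2f}%")
print("2026 forecast:", round(slope * 2026 + intercept))
```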
Jordan, Scott
2018-01-24
Scott Jordan on "Advances in high-throughput speed, low-latency communication for embedded instrumentation" at the 2012 Sequencing, Finishing, Analysis in the Future Meeting held June 5-7, 2012 in Santa Fe, New Mexico.
[Current applications of high-throughput DNA sequencing technology in antibody drug research].
Yu, Xin; Liu, Qi-Gang; Wang, Ming-Rong
2012-03-01
Since the publication in 2005 of a high-throughput DNA sequencing technology based on PCR carried out in oil emulsions, high-throughput DNA sequencing platforms have evolved into a robust technology for sequencing genomes and diverse DNA libraries. Antibody libraries with vast numbers of members currently serve as a foundation for discovering novel antibody drugs, and high-throughput DNA sequencing technology makes it possible to rapidly identify functional antibody variants with desired properties. Herein we present a review of current applications of high-throughput DNA sequencing technology in the analysis of antibody library diversity, sequencing of CDR3 regions, identification of potent antibodies based on sequence frequency, discovery of functional genes, and combination with various display technologies, so as to provide an alternative approach to the discovery and development of antibody drugs.
Pediatric Glioblastoma Therapies Based on Patient-Derived Stem Cell Resources
2014-11-01
genomic DNA and then subjected to Illumina high-throughput sequencing. In this analysis, shRNAs lost in the GSC population represent candidate gene... PRISM 7900 Sequence Detection System (Genomics Resource, FHCRC). Relative transcript abundance was analyzed using the 2^-ΔΔCt method. TRIzol (Invitrogen
Atmospheric pressure atomic layer deposition of Al₂O₃ using trimethyl aluminum and ozone.
Mousa, Moataz Bellah M; Oldham, Christopher J; Parsons, Gregory N
2014-04-08
High throughput spatial atomic layer deposition (ALD) often uses higher reactor pressure than typical batch processes, but the specific effects of pressure on species transport and reaction rates are not fully understood. For aluminum oxide (Al2O3) ALD, water or ozone can be used as oxygen sources, but how reaction pressure influences deposition using ozone has not previously been reported. This work describes the effect of deposition pressure, between ∼2 and 760 Torr, on ALD Al2O3 using TMA and ozone. Similar to reports for pressure dependence during TMA/water ALD, surface reaction saturation studies show self-limiting growth at low and high pressure across a reasonable temperature range. Higher pressure tends to increase the growth per cycle, especially at lower gas velocities and temperatures. However, growth saturation at high pressure requires longer O3 dose times per cycle. Results are consistent with a model of ozone decomposition kinetics versus pressure and temperature. Quartz crystal microbalance (QCM) results confirm the trends in growth rate and indicate that the surface reaction mechanisms for Al2O3 growth using ozone are similar under low and high total pressure, including expected trends in the reaction mechanism at different temperatures.
Quantitative Analysis for Installation Access Planning at Naval Base San Diego
2012-09-01
VPH in one processing lane, then we can assume that if the SECO were to open another sentry processing lane, the total throughput of both lanes would...be 600 VPH. Similarly, if two sentries in tandem can produce a throughput of 500 VPH, then having two lanes with two sentries in tandem each will...produce a total throughput of 1000 VPH. We assume throughout that there is no server idleness and so there is essentially an infinite backlog of
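The lane arithmetic in the excerpt simply scales a per-lane rate by the number of lanes. A tiny sketch of that calculation follows; the 300 VPH single-sentry per-lane rate is inferred from the 600 VPH two-lane total and is an assumption.

```python
def total_throughput_vph(lanes, per_lane_vph):
    """Aggregate vehicle throughput assuming independent, fully utilized lanes."""
    return lanes * per_lane_vph

# Rates implied by the excerpt above (per-lane values are assumptions).
print(total_throughput_vph(lanes=2, per_lane_vph=300))  # 600 VPH, one sentry per lane
print(total_throughput_vph(lanes=2, per_lane_vph=500))  # 1000 VPH, two sentries in tandem per lane
```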
web cellHTS2: a web-application for the analysis of high-throughput screening data.
Pelz, Oliver; Gilsdorf, Moritz; Boutros, Michael
2010-04-12
The analysis of high-throughput screening data sets is an expanding field in bioinformatics. High-throughput screens by RNAi generate large primary data sets which need to be analyzed and annotated to identify relevant phenotypic hits. Large-scale RNAi screens are frequently used to identify novel factors that influence a broad range of cellular processes, including signaling pathway activity, cell proliferation, and host cell infection. Here, we present a web-based application utility for the end-to-end analysis of large cell-based screening experiments by cellHTS2. The software guides the user through the configuration steps that are required for the analysis of single or multi-channel experiments. The web-application provides options for various standardization and normalization methods, annotation of data sets and a comprehensive HTML report of the screening data analysis, including a ranked hit list. Sessions can be saved and restored for later re-analysis. The web frontend for the cellHTS2 R/Bioconductor package interacts with it through an R-server implementation that enables highly parallel analysis of screening data sets. web cellHTS2 further provides a file import and configuration module for common file formats. The implemented web-application facilitates the analysis of high-throughput data sets and provides a user-friendly interface. web cellHTS2 is accessible online at http://web-cellHTS2.dkfz.de. A standalone version as a virtual appliance and source code for platforms supporting Java 1.5.0 can be downloaded from the web cellHTS2 page. web cellHTS2 is freely distributed under GPL.
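cellHTS2 offers several normalization options; as a generic stand-in (not the package's own code, and in Python rather than R), a per-plate robust z-score illustrates the kind of standardization such pipelines apply before hit ranking.

```python
import numpy as np

def robust_zscore(plate_values):
    """
    Per-plate robust z-score: (x - median) / (1.4826 * MAD).
    A generic HTS normalization in the spirit of packages such as cellHTS2,
    not a reimplementation of that package.
    """
    x = np.asarray(plate_values, float)
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    return (x - med) / (1.4826 * mad)

plate = [1020, 980, 1005, 995, 2400, 1010]   # one well (2400) is a putative hit
print(np.round(robust_zscore(plate), 2))
```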
Bifrost: a Modular Python/C++ Framework for Development of High-Throughput Data Analysis Pipelines
NASA Astrophysics Data System (ADS)
Cranmer, Miles; Barsdell, Benjamin R.; Price, Danny C.; Garsden, Hugh; Taylor, Gregory B.; Dowell, Jayce; Schinzel, Frank; Costa, Timothy; Greenhill, Lincoln J.
2017-01-01
Large radio interferometers have data rates that render long-term storage of raw correlator data infeasible, thus motivating development of real-time processing software. For high-throughput applications, processing pipelines are challenging to design and implement. Motivated by science efforts with the Long Wavelength Array, we have developed Bifrost, a novel Python/C++ framework that eases the development of high-throughput data analysis software by packaging algorithms as black box processes in a directed graph. This strategy to modularize code allows astronomers to create parallelism without code adjustment. Bifrost uses CPU/GPU 'circular memory' data buffers that enable ready introduction of arbitrary functions into the processing path for 'streams' of data, and allow pipelines to automatically reconfigure in response to astrophysical transient detection or input of new observing settings. We have deployed and tested Bifrost at the latest Long Wavelength Array station, in Sevilleta National Wildlife Refuge, NM, where it handles throughput exceeding 10 Gbps per CPU core.
High-Throughput Lectin Microarray-Based Analysis of Live Cell Surface Glycosylation
Li, Yu; Tao, Sheng-ce; Zhu, Heng; Schneck, Jonathan P.
2011-01-01
Lectins, plant-derived glycan-binding proteins, have long been used to detect glycans on cell surfaces. However, the techniques used to characterize serum or cells have largely been limited to mass spectrometry, blots, flow cytometry, and immunohistochemistry. While these lectin-based approaches are well established and they can discriminate a limited number of sugar isomers by concurrently using a limited number of lectins, they are not amenable for adaptation to a high-throughput platform. Fortunately, given the commercial availability of lectins with a variety of glycan specificities, lectins can be printed on a glass substrate in a microarray format to profile accessible cell-surface glycans. This method is an inviting alternative for analysis of a broad range of glycans in a high-throughput fashion and has been demonstrated to be a feasible method of identifying binding-accessible cell surface glycosylation on living cells. The current unit presents a lectin-based microarray approach for analyzing cell surface glycosylation in a high-throughput fashion. PMID:21400689
Cai, Jinhai; Okamoto, Mamoru; Atieno, Judith; Sutton, Tim; Li, Yongle; Miklavcic, Stanley J.
2016-01-01
Leaf senescence, an indicator of plant age and ill health, is an important phenotypic trait for the assessment of a plant’s response to stress. Manual inspection of senescence, however, is time consuming, inaccurate and subjective. In this paper we propose an objective evaluation of plant senescence by color image analysis for use in a high throughput plant phenotyping pipeline. As high throughput phenotyping platforms are designed to capture whole-of-plant features, camera lenses and camera settings are inappropriate for the capture of fine detail. Specifically, plant colors in images may not represent true plant colors, leading to errors in senescence estimation. Our algorithm features a color distortion correction and image restoration step prior to a senescence analysis. We apply our algorithm to two time series of images of wheat and chickpea plants to quantify the onset and progression of senescence. We compare our results with senescence scores resulting from manual inspection. We demonstrate that our procedure is able to process images in an automated way for an accurate estimation of plant senescence even from color distorted and blurred images obtained under high throughput conditions. PMID:27348807
De Diego, Nuria; Fürst, Tomáš; Humplík, Jan F; Ugena, Lydia; Podlešáková, Kateřina; Spíchal, Lukáš
2017-01-01
High-throughput plant phenotyping platforms provide new possibilities for automated, fast scoring of several plant growth and development traits, followed over time using non-invasive sensors. Using Arabidopsis as a model offers important advantages for high-throughput screening with the opportunity to extrapolate the results obtained to other crops of commercial interest. In this study we describe the development of a highly reproducible high-throughput Arabidopsis in vitro bioassay established using our OloPhen platform, suitable for analysis of rosette growth in multi-well plates. This method was successfully validated on an example of multivariate analysis of Arabidopsis rosette growth at different salt concentrations and their interaction with varying nutritional composition of the growth medium. Several traits such as changes in the rosette area, relative growth rate, survival rate and homogeneity of the population are scored using fully automated RGB imaging and subsequent image analysis. The assay can be used for fast screening of the biological activity of chemical libraries, phenotypes of transgenic or recombinant inbred lines, or to search for potential quantitative trait loci. It is especially valuable for selecting genotypes or growth conditions that improve plant stress tolerance.
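One of the traits scored above is the relative growth rate of the projected rosette area; a minimal sketch of that standard calculation, with invented areas:

```python
import math

def relative_growth_rate(area_t1, area_t2, days_between):
    """RGR = (ln A2 - ln A1) / (t2 - t1), in day^-1."""
    return (math.log(area_t2) - math.log(area_t1)) / days_between

# Illustrative rosette areas (mm^2) from RGB image analysis, 3 days apart.
print(round(relative_growth_rate(42.0, 71.0, 3.0), 3))  # ~0.175 per day
```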
Overcoming bias and systematic errors in next generation sequencing data.
Taub, Margaret A; Corrada Bravo, Hector; Irizarry, Rafael A
2010-12-10
Considerable time and effort has been spent in developing analysis and quality assessment methods to allow the use of microarrays in a clinical setting. As is the case for microarrays and other high-throughput technologies, data from new high-throughput sequencing technologies are subject to technological and biological biases and systematic errors that can impact downstream analyses. Only when these issues can be readily identified and reliably adjusted for will clinical applications of these new technologies be feasible. Although much work remains to be done in this area, we describe consistently observed biases that should be taken into account when analyzing high-throughput sequencing data. In this article, we review current knowledge about these biases, discuss their impact on analysis results, and propose solutions.
Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan; ...
2016-09-23
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suram, Santosh K.; Newhouse, Paul F.; Zhou, Lan
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. Here, the strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
Throughput and delay analysis of IEEE 802.15.6-based CSMA/CA protocol.
Ullah, Sana; Chen, Min; Kwak, Kyung Sup
2012-12-01
The IEEE 802.15.6 is a new communication standard on Wireless Body Area Network (WBAN) that focuses on a variety of medical, Consumer Electronics (CE) and entertainment applications. In this paper, the throughput and delay performance of the IEEE 802.15.6 is presented. Numerical formulas are derived to determine the maximum throughput and minimum delay limits of the IEEE 802.15.6 for an ideal channel with no transmission errors. These limits are derived for different frequency bands and data rates. Our analysis is validated by extensive simulations using a custom C++ simulator. Based on analytical and simulation results, useful conclusions are derived for network provisioning and packet size optimization for different applications.
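The shape of such an ideal-channel bound is payload bits divided by the total time of one frame exchange. The sketch below is a generic illustration of that calculation, not the paper's derivation; the frame sizes, overheads and interframe spacing are placeholder assumptions, not the IEEE 802.15.6 constants.

```python
def max_throughput_bps(payload_bits, t_data_s, t_ack_s, t_sifs_s):
    """
    Upper bound on throughput for an error-free channel: one payload delivered
    per data-frame + SIFS + ACK + SIFS cycle. Timing values are placeholders,
    NOT the IEEE 802.15.6 constants.
    """
    cycle = t_data_s + t_sifs_s + t_ack_s + t_sifs_s
    return payload_bits / cycle

# Example with assumed numbers: 2040-bit payload at a 971.4 kb/s PHY rate.
phy_rate = 971.4e3
t_data = 2040 / phy_rate + 100e-6      # payload time + assumed header/preamble overhead
t_ack = 9 * 8 / phy_rate + 100e-6      # assumed short ACK frame
print(f"{max_throughput_bps(2040, t_data, t_ack, 75e-6) / 1e3:.1f} kb/s")
```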
Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N
2017-10-01
Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy-method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective and number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot detection-tool. The spot detection-output was exported to Excel, where data analysis was performed. In this article, micrographs and spot detection data are made available to facilitate implementation of the method.
Suram, Santosh K; Newhouse, Paul F; Zhou, Lan; Van Campen, Douglas G; Mehta, Apurva; Gregoire, John M
2016-11-14
Combinatorial materials science strategies have accelerated materials development in a variety of fields, and we extend these strategies to enable structure-property mapping for light absorber materials, particularly in high order composition spaces. High throughput optical spectroscopy and synchrotron X-ray diffraction are combined to identify the optical properties of Bi-V-Fe oxides, leading to the identification of Bi4V1.5Fe0.5O10.5 as a light absorber with direct band gap near 2.7 eV. The strategic combination of experimental and data analysis techniques includes automated Tauc analysis to estimate band gap energies from the high throughput spectroscopy data, providing an automated platform for identifying new optical materials.
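A sketch of the Tauc-analysis step: for a direct-gap absorber, (αhν)² is fit linearly over a chosen photon-energy window and extrapolated to zero. The spectrum below is synthetic with a gap placed at 2.7 eV, and the manual window selection stands in for the authors' automated criterion, which is not reproduced here.

```python
import numpy as np

def direct_gap_tauc(photon_eV, absorbance, fit_window_eV):
    """
    Estimate a direct band gap by Tauc analysis: fit (alpha*h*nu)^2 vs h*nu over
    a chosen linear window and extrapolate to zero. Absorbance is used as a
    proxy for alpha.
    """
    tauc = (absorbance * photon_eV) ** 2
    lo, hi = fit_window_eV
    mask = (photon_eV >= lo) & (photon_eV <= hi)
    slope, intercept = np.polyfit(photon_eV[mask], tauc[mask], 1)
    return -intercept / slope          # x-intercept = estimated band gap (eV)

# Synthetic spectrum with a direct gap at 2.7 eV (illustrative only).
e = np.linspace(2.0, 3.5, 151)
alpha = np.sqrt(np.clip(e - 2.7, 0.0, None)) / e
print(round(direct_gap_tauc(e, alpha, (2.75, 3.2)), 2))  # -> 2.7
```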
Aquifer environment selects for microbial species cohorts in sediment and groundwater
Hug, Laura A; Thomas, Brian C; Brown, Christopher T; Frischkorn, Kyle R; Williams, Kenneth H; Tringe, Susannah G; Banfield, Jillian F
2015-01-01
Little is known about the biogeography or stability of sediment-associated microbial community membership because these environments are biologically complex and generally difficult to sample. High-throughput-sequencing methods provide new opportunities to simultaneously genomically sample and track microbial community members across a large number of sampling sites or times, with higher taxonomic resolution than is associated with 16S ribosomal RNA gene surveys, and without the disadvantages of primer bias and gene copy number uncertainty. We characterized a sediment community at 5 m depth in an aquifer adjacent to the Colorado River and tracked its most abundant 133 organisms across 36 different sediment and groundwater samples. We sampled sites separated by centimeters, meters and tens of meters, collected on seven occasions over 6 years. Analysis of 1.4 terabase pairs of DNA sequence showed that these 133 organisms were more consistently detected in saturated sediments than in samples from the vadose zone, from distant locations or from groundwater filtrates. Abundance profiles across aquifer locations and from different sampling times identified organism cohorts that comprised subsets of the 133 organisms that were consistently associated. The data suggest that cohorts are partly selected for by shared environmental adaptation. PMID:25647349
Uplink Downlink Rate Balancing and Throughput Scaling in FDD Massive MIMO Systems
NASA Astrophysics Data System (ADS)
Bergel, Itsik; Perets, Yona; Shamai, Shlomo
2016-05-01
In this work we extend the concept of uplink-downlink rate balancing to frequency division duplex (FDD) massive MIMO systems. We consider a base station with a large number of antennas serving many single-antenna users. We first show that any unused capacity in the uplink can be traded off for higher throughput in the downlink in a system that uses either dirty paper (DP) coding or linear zero-forcing (ZF) precoding. We then also study the scaling of the system throughput with the number of antennas in cases of linear Beamforming (BF) Precoding, ZF Precoding, and DP coding. We show that the downlink throughput is proportional to the logarithm of the number of antennas. While this logarithmic scaling is lower than the linear scaling of the rate in the uplink, it can still bring significant throughput gains. For example, we demonstrate through analysis and simulation that increasing the number of antennas from 4 to 128 will increase the throughput by more than a factor of 5. We also show that a logarithmic scaling of downlink throughput as a function of the number of receive antennas can be achieved even when the number of transmit antennas only increases logarithmically with the number of receive antennas.
USDA-ARS's Scientific Manuscript database
Contigs with sequence similarities to several nucleorhabdoviruses were identified by high-throughput sequencing analysis from a black currant (Ribes nigrum L.) cultivar. The complete genomic sequence of this new nucleorhabdovirus is 14,432 nucleotides. Its genomic organization is typical of nucleorh...
Multispot single-molecule FRET: High-throughput analysis of freely diffusing molecules
Panzeri, Francesco
2017-01-01
We describe an 8-spot confocal setup for high-throughput smFRET assays and illustrate its performance with two characteristic experiments. First, measurements on a series of freely diffusing doubly-labeled dsDNA samples allow us to demonstrate that data acquired in multiple spots in parallel can be properly corrected and result in measured sample characteristics consistent with those obtained with a standard single-spot setup. We then take advantage of the higher throughput provided by parallel acquisition to address an outstanding question about the kinetics of the initial steps of bacterial RNA transcription. Our real-time kinetic analysis of promoter escape by bacterial RNA polymerase confirms results obtained by a more indirect route, shedding additional light on the initial steps of transcription. Finally, we discuss the advantages of our multispot setup, while pointing out potential limitations of the current single-laser-excitation design, as well as analysis challenges and their solutions. PMID:28419142
Ding, Jiaqi; Chen, Xiaoli; Lin, Jiaji; Zhu, Junling; Li, Zhuyi
2018-01-01
Objective To study the effects of dopamine receptor D2 (DRD2) on the adipogenesis genes in mouse primary mesencephalic neurons. Methods Lentiviral vectors expressing specific shRNA targeting DRD2 were constructed to decrease DRD2 expression in mouse primary mesencephalic neurons. High throughput sequencing (HTS) analysis was used to investigate gene expression changes between the DRD2 knock-down group and the negative control group. Real-time quantitative PCR (qRT-PCR) and Western blot analysis were applied to verify the differentially expressed genes. Fatty acids were measured by a fatty acid detection kit. Results DRD2 expression was effectively down-regulated in mouse primary mesencephalic neurons by lentiviral vectors. HTS revealed adipogenesis genes were significantly up-regulated after DRD2 down-regulation, mainly including delta(14)-sterol reductase, acetyl-coenzyme A synthetase, insulin-induced gene 1 protein and especially stearoyl-coenzyme A desaturase 1 (SCD1, 4-fold upregulated). The qRT-PCR and Western blot analysis verified that SCD1 was upregulated 2.6-fold and 2-fold, respectively, by lentiviral DRD2-shRNA vectors. Moreover, the SCD1-related free fatty acids were significantly increased compared with the negative control group. Conclusion DRD2 in primary mesencephalic neurons had a significant regulatory effect on the adipogenesis genes. The up-regulation of SCD1 can accelerate the conversion of saturated fatty acids to monounsaturated fatty acids and prevent the damage of lipid toxicity to cells.
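The qRT-PCR verification relies on the 2^-ΔΔCt (Livak) calculation. A minimal sketch follows; the Ct values are invented to yield roughly the 2.6-fold change quoted above and do not come from the study.

```python
def fold_change_ddct(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Illustrative Ct values only (e.g., SCD1 vs a housekeeping gene); ddCt of about
# -1.4 corresponds to the ~2.6-fold upregulation quoted above.
print(round(fold_change_ddct(22.1, 18.0, 24.9, 19.4), 2))  # -> 2.64
```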
Picotti, Paola; Clement-Ziza, Mathieu; Lam, Henry; Campbell, David S.; Schmidt, Alexander; Deutsch, Eric W.; Röst, Hannes; Sun, Zhi; Rinner, Oliver; Reiter, Lukas; Shen, Qin; Michaelson, Jacob J.; Frei, Andreas; Alberti, Simon; Kusebauch, Ulrike; Wollscheid, Bernd; Moritz, Robert; Beyer, Andreas; Aebersold, Ruedi
2013-01-01
Complete reference maps or datasets, like the genomic map of an organism, are highly beneficial tools for biological and biomedical research. Attempts to generate such reference datasets for a proteome have so far failed to reach complete proteome coverage, with saturation apparent at approximately two thirds of the proteomes tested, even for the most thoroughly characterized proteomes. Here, we used a strategy based on high-throughput peptide synthesis and mass spectrometry to generate a close to complete reference map (97% of the genome-predicted proteins) of the S. cerevisiae proteome. We generated two versions of this mass spectrometric map, one supporting discovery- (shotgun) and the other hypothesis-driven (targeted) proteomic measurements. The two versions of the map, therefore, constitute a complete set of proteomic assays to support most studies performed with contemporary proteomic technologies. The reference libraries can be browsed via a web-based repository and associated navigation tools. To demonstrate the utility of the reference libraries we applied them to a protein quantitative trait locus (pQTL) analysis, which requires measurement of the same peptides over a large number of samples with high precision. Protein measurements over a set of 78 S. cerevisiae strains revealed a complex relationship between independent genetic loci, impacting on the levels of related proteins. Our results suggest that selective pressure favors the acquisition of sets of polymorphisms that maintain the stoichiometry of protein complexes and pathways. PMID:23334424
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering.
Groen, Nathalie; Guvendiren, Murat; Rabitz, Herschel; Welsh, William J; Kohn, Joachim; de Boer, Jan
2016-04-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. Copyright © 2016. Published by Elsevier Ltd.
Molecular characterization of a novel Luteovirus from peach identified by high-throughput sequencing
USDA-ARS's Scientific Manuscript database
Contigs with sequence homologies to Cherry-associated luteovirus were identified by high-throughput sequencing analysis of two peach accessions undergoing quarantine testing. The complete genomic sequences of the two isolates of this virus are 5,819 and 5,814 nucleotides. Their genome organization i...
High-throughput profiling and analysis of plant responses over time to abiotic stress
USDA-ARS's Scientific Manuscript database
Energy sorghum (Sorghum bicolor (L.) Moench) is a rapidly growing, high-biomass, annual crop prized for abiotic stress tolerance. Measuring genotype-by-environment (G x E) interactions remains a progress bottleneck. High throughput phenotyping within controlled environments has been proposed as a po...
ToxCast Workflow: High-throughput screening assay data processing, analysis and management (SOT)
US EPA’s ToxCast program is generating data in high-throughput screening (HTS) and high-content screening (HCS) assays for thousands of environmental chemicals, for use in developing predictive toxicity models. Currently the ToxCast screening program includes over 1800 unique c...
The CTD2 Center at Emory University used high-throughput protein-protein interaction (PPI) mapping for Hippo signaling pathway profiling to rapidly unveil promising PPIs as potential therapeutic targets and advance functional understanding of signaling circuitry in cells.
Immobilization methods for the rapid total chemical synthesis of proteins on microtiter plates.
Zitterbart, Robert; Krumrey, Michael; Seitz, Oliver
2017-07-01
The chemical synthesis of proteins typically involves the solid-phase peptide synthesis of unprotected peptide fragments that are stitched together in solution by native chemical ligation (NCL). The process is slow, and throughput is limited because of the need for repeated high performance liquid chromatography purification steps after both solid-phase peptide synthesis and NCL. With an aim to provide faster access to functional proteins and to accelerate the functional analysis of synthetic proteins by parallelization, we developed a method for the high performance liquid chromatography-free synthesis of proteins on the surface of microtiter plates. The method relies on solid-phase synthesis of unprotected peptide fragments, immobilization of the C-terminal fragment and on-surface NCL with an unprotected peptide thioester in crude form. Herein, we describe the development of a suitable immobilization chemistry. We compared (i) formation of nickel(II)-oligohistidine complexes, (ii) Cu-based [2 + 3] alkyne-azide cycloaddition and (iii) hydrazone ligation. The comparative study identified the hydrazone ligation as most suitable. The sequence of immobilization via hydrazone ligation, on-surface NCL and radical desulfurization furnished the targeted SH3 domains in near quantitative yield. The synthetic proteins were functional as demonstrated by an on-surface fluorescence-based saturation binding analysis. Copyright © 2017 European Peptide Society and John Wiley & Sons, Ltd.
LRO-LAMP failsafe door-open performance: improving FUV measurements of dayside lunar hydration
NASA Astrophysics Data System (ADS)
Davis, Michael W.; Greathouse, Thomas K.; Kaufmann, David E.; Retherford, Kurt D.; Versteeg, Maarten H.
2017-08-01
The Lunar Reconnaissance Orbiter's (LRO) Lyman Alpha Mapping Project (LAMP) is a lightweight (6.1 kg), low-power (4.5 W), ultraviolet spectrograph based on the Alice instruments aboard the European Space Agency's Rosetta spacecraft and NASA's New Horizons spacecraft. Its primary job is to identify and localize exposed water frost in permanently shadowed regions (PSRs) near the Moon's poles, and to characterize landforms and albedos in PSRs. LRO launched on June 18, 2009 and reached lunar orbit four days later. LAMP operated with its failsafe door closed for its first seven years in flight. The failsafe door was opened in October 2016 to increase light throughput during dayside operations at the expense of no longer having the capacity to take further dark observations and slightly more operational complexity to avoid saturating the instrument. This one-time irreversible operation was approved after extensive review, and was conducted flawlessly. The increased throughput allows measurement of dayside hydration in one orbit, instead of averaging multiple orbits together to reach enough signal-to-noise. The new measurement mode allows greater time resolution of dayside water migration for improved investigations into the source and loss processes on the lunar surface. LAMP performance and optical characteristics after the failsafe door opening are described herein, including the new effective area, wavelength solution, and resolution.
Luu, Van; Jona, Janan; Stanton, Mary K; Peterson, Matthew L; Morrison, Henry G; Nagapudi, Karthik; Tan, Helming
2013-01-30
A 96-well high-throughput cocrystal screening workflow has been developed, consisting of solvent-mediated sonic blending synthesis and on-plate solid/solution stability characterization by XRPD. A strategy of cocrystallization screening in selected blend solvents, including water mixtures, is proposed to not only manipulate the solubility of the cocrystal components but also differentiate the physical stability of the cocrystal products. Caffeine-oxalic acid and theophylline-oxalic acid cocrystals were prepared and evaluated in relation to saturation levels of the cocrystal components and stability of the cocrystal products in anhydrous and hydrous solvents. AMG 517 was screened with a number of coformers, and the solid/solution stability of the resulting cocrystals on the 96-well plate was investigated. A stability trend was observed and confirmed that cocrystals composed of coformers with lower aqueous solubility tended to be more stable in water. Furthermore, cocrystals that could be isolated under hydrous solvent blending conditions exhibited superior physical stability to those that could only be obtained under anhydrous conditions. This integrated HTS workflow provides an efficient, API-sparing route to screen and identify cocrystal candidates with appropriate solubility and solid/solution stability properties. Copyright © 2012 Elsevier B.V. All rights reserved.
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
The development of a general purpose ARM-based processing unit for the ATLAS TileCal sROD
NASA Astrophysics Data System (ADS)
Cox, M. A.; Reed, R.; Mellado, B.
2015-01-01
After Phase-II upgrades in 2022, the data output from the LHC ATLAS Tile Calorimeter will increase significantly. ARM processors are common in mobile devices due to their low cost, low energy consumption and high performance. It is proposed that a cost-effective, high data throughput Processing Unit (PU) can be developed by using several consumer ARM processors in a cluster configuration to allow aggregated processing performance and data throughput while maintaining minimal software design difficulty for the end-user. This PU could be used for a variety of high-level functions on the high-throughput raw data such as spectral analysis and histograms to detect possible issues in the detector at a low level. High-throughput I/O interfaces are not typical in consumer ARM System on Chips but high data throughput capabilities are feasible via the novel use of PCI-Express as the I/O interface to the ARM processors. An overview of the PU is given and the results for performance and throughput testing of four different ARM Cortex System on Chips are presented.
Xu, Chun-Xiu; Yin, Xue-Feng
2011-02-04
A chip-based microfluidic system for high-throughput single-cell analysis is described. The system was integrated with continuous introduction of individual cells, rapid dynamic lysis, capillary electrophoretic (CE) separation and laser-induced fluorescence (LIF) detection. A cross microfluidic chip with one sheath-flow channel located on each side of the sampling channel was designed. The labeled cells were hydrodynamically focused by sheath-flow streams and sequentially introduced into the cross section of the microchip under hydrostatic pressure generated by adjusting liquid levels in the reservoirs. Combined with the electric field applied on the separation channel, the aligned cells were driven into the separation channel and rapidly lysed within 33 ms at the entry of the separation channel by Triton X-100 added in the sheath-flow solution. The maximum rate for introducing individual cells into the separation channel was about 150 cells/min. The introduction of sheath-flow streams also significantly reduced the concentration of phosphate-buffered saline (PBS) injected into the separation channel along with single cells, thus reducing Joule heating during electrophoretic separation. The performance of this microfluidic system was evaluated by analysis of reduced glutathione (GSH) and reactive oxygen species (ROS) in single erythrocytes. A throughput of 38 cells/min was obtained. The proposed method is simple and robust for high-throughput single-cell analysis, allowing for analysis of cell populations of considerable size to generate results with statistical significance. Copyright © 2010 Elsevier B.V. All rights reserved.
An analysis of the development of port operation in Da Nang Port, Vietnam
NASA Astrophysics Data System (ADS)
Nguyen, T. D. H.; Cools, M.
2018-04-01
This paper presents the current operating status of Da Nang Port, Vietnam, for the period 2012-2016. Port operations showed positive changes, reflected in a significant increase in total throughput, especially containerized cargo volumes. Classical decomposition techniques are used to find the trend-cycle and seasonal components of monthly throughput flows. Appropriate predictive models for the different kinds of throughput are proposed. Finally, a development strategy towards containerization, together with investment policies for facilities, equipment, and infrastructure, is suggested based on the predictive results.
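To make the decomposition step concrete, the sketch below applies a classical additive decomposition to a synthetic monthly throughput series; it only illustrates the technique named in the abstract, and the series and parameters are invented rather than taken from the Da Nang data.

```python
# Minimal sketch of classical decomposition of a monthly throughput series into
# trend-cycle and seasonal components. The series below is synthetic, not port data.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

idx = pd.date_range("2012-01", "2016-12", freq="MS")       # monthly observations
rng = np.random.default_rng(0)
trend = np.linspace(100, 180, len(idx))                     # growing volumes
season = 10 * np.sin(2 * np.pi * idx.month / 12)            # annual seasonality
throughput = pd.Series(trend + season + rng.normal(0, 5, len(idx)), index=idx)

result = seasonal_decompose(throughput, model="additive", period=12)
print(result.trend.dropna().head())
print(result.seasonal.head(12))
```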
Mao, Yong; Singh-Varma, Anya; Hoffman, Tyler; Dhall, Sandeep; Danilkovitch, Alla; Kohn, Joachim
2018-01-08
Biofilm, a community of bacteria, is tolerant to antimicrobial agents and ubiquitous in chronic wounds. In a chronic diabetic foot ulcer (DFU) clinical trial, the use of a human cryopreserved viable amniotic membrane (CVAM) resulted in a high rate of wound closure and reduction of wound-related infections. Our previous study demonstrated that CVAM possesses intrinsic antimicrobial activity against a spectrum of wound-associated bacteria under planktonic culture conditions. In this study, we evaluated the effect of CVAM and cryopreserved viable umbilical tissue (CVUT) on biofilm formation of S. aureus and P. aeruginosa, the two most prominent pathogens associated with chronic wounds. First, we showed that, like CVAM, CVUT released antibacterial activity against multiple bacterial pathogens and that devitalization of CVUT reduced its antibacterial activity. Biofilm formation was then measured using a high-throughput method and an ex vivo porcine dermal tissue model. We demonstrate that biofilm formation was significantly reduced in the presence of CVAM- or CVUT-derived conditioned media compared to control assay medium. The formation of P. aeruginosa biofilm on porcine dermal tissues saturated with CVAM-conditioned medium was reduced by 97% compared with biofilm formation on tissues saturated with control medium. The formation of S. aureus biofilm on dermal tissues saturated with CVUT-conditioned medium was reduced by 72% compared with biofilm formation on the control tissues. This study is the first to show that human cryopreserved viable placental tissues release factors that inhibit biofilm formation. Our results provide an explanation for the in vivo observation of their ability to support wound healing.
Microreactor Cells for High-Throughput X-ray Absorption Spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beesley, Angela; Tsapatsaris, Nikolaos; Weiher, Norbert
2007-01-19
High-throughput experimentation has been applied to X-ray absorption spectroscopy as a novel route for increasing research productivity in the catalysis community. Suitable instrumentation has been developed for the rapid determination of the local structure in the metal component of precursors for supported catalysts. An automated analytical workflow was implemented that is much faster than traditional individual spectrum analysis. It allows the generation of structural data in quasi-real time. We describe initial results obtained from the automated high-throughput (HT) data reduction and analysis of a sample library implemented through the 96-well-plate industrial standard. The results show that a fully automated HT-XAS technology based on existing industry standards is feasible and useful for the rapid elucidation of geometric and electronic structure of materials.
Analysis of gene network robustness based on saturated fixed point attractors
2014-01-01
The analysis of gene network robustness to noise and mutation is important for fundamental and practical reasons. Robustness refers to the stability of the equilibrium expression state of a gene network to variations of the initial expression state and network topology. Numerical simulation of these variations is commonly used for the assessment of robustness. Since there exists a great number of possible gene network topologies and initial states, even millions of simulations may be still too small to give reliable results. When the initial and equilibrium expression states are restricted to being saturated (i.e., their elements can only take values 1 or −1 corresponding to maximum activation and maximum repression of genes), an analytical gene network robustness assessment is possible. We present this analytical treatment based on determination of the saturated fixed point attractors for sigmoidal function models. The analysis can determine (a) for a given network, which and how many saturated equilibrium states exist and which and how many saturated initial states converge to each of these saturated equilibrium states and (b) for a given saturated equilibrium state or a given pair of saturated equilibrium and initial states, which and how many gene networks, referred to as viable, share this saturated equilibrium state or the pair of saturated equilibrium and initial states. We also show that the viable networks sharing a given saturated equilibrium state must follow certain patterns. These capabilities of the analytical treatment make it possible to properly define and accurately determine robustness to noise and mutation for gene networks. Previous network research conclusions drawn from performing millions of simulations follow directly from the results of our analytical treatment. Furthermore, the analytical results provide criteria for the identification of model validity and suggest modified models of gene network dynamics. The yeast cell-cycle network is used as an illustration of the practical application of this analytical treatment. PMID:24650364
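As a concrete illustration of the saturated-fixed-point idea, the sketch below enumerates all saturated states (entries ±1) of a small hypothetical 3-gene network and checks which are fixed under a steep-sigmoid update; it is a toy example of the concept, not the authors' analytical treatment, and the interaction matrix is invented.

```python
# Enumerate saturated states (entries +/-1) of a toy 3-gene network and keep those
# that are fixed points under a steep-sigmoid (effectively sign) update rule.
# The interaction matrix W is hypothetical.
import itertools
import numpy as np

W = np.array([[ 1, -1,  0],
              [ 0,  1, -1],
              [-1,  0,  1]])

def update(state, beta=50.0):
    # Steep sigmoid mapped to (-1, 1); at saturation it behaves like sign(W @ state).
    return np.tanh(beta * (W @ state))

saturated_states = [np.array(s) for s in itertools.product([-1, 1], repeat=W.shape[0])]
fixed_points = [s for s in saturated_states
                if np.array_equal(np.sign(update(s)), s)]
print("saturated fixed points:", fixed_points)
```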
Whyte, Robin K; Nelson, Harvey; Roberts, Robin S; Schmidt, Barbara
2017-03-01
It has been reported in the 3 Benefits of Oxygen Saturation Targeting (BOOST-II) trials that changes in oximeter calibration software resulted in clearer separation between the oxygen saturations in the two trial target groups. A revised analysis of the published BOOST-II data does not support this conclusion. Copyright © 2016 Elsevier Inc. All rights reserved.
2013-01-01
Background Rapid development of highly saturated genetic maps aids molecular breeding, which can accelerate gain per breeding cycle in woody perennial plants such as Rubus idaeus (red raspberry). Recently, robust genotyping methods based on high-throughput sequencing were developed, which provide high marker density, but result in some genotype errors and a large number of missing genotype values. Imputation can reduce the number of missing values and can correct genotyping errors, but current methods of imputation require a reference genome and thus are not an option for most species. Results Genotyping by Sequencing (GBS) was used to produce highly saturated maps for a R. idaeus pseudo-testcross progeny. While low coverage and high variance in sequencing resulted in a large number of missing values for some individuals, a novel method of imputation based on maximum likelihood marker ordering from initial marker segregation overcame the challenge of missing values, and made map construction computationally tractable. The two resulting parental maps contained 4521 and 2391 molecular markers spanning 462.7 and 376.6 cM respectively over seven linkage groups. Detection of precise genomic regions with segregation distortion was possible because of map saturation. Microsatellites (SSRs) linked these results to published maps for cross-validation and map comparison. Conclusions GBS together with genome-independent imputation provides a rapid method for genetic map construction in any pseudo-testcross progeny. Our method of imputation estimates the correct genotype call of missing values and corrects genotyping errors that lead to inflated map size and reduced precision in marker placement. Comparison of SSRs to published R. idaeus maps showed that the linkage maps constructed with GBS and our method of imputation were robust, and marker positioning reliable. The high marker density allowed identification of genomic regions with segregation distortion in R. idaeus, which may help to identify deleterious alleles that are the basis of inbreeding depression in the species. PMID:23324311
Schwarz, Patric; Pannes, Klaus Dieter; Nathan, Michel; Reimer, Hans Jorg; Kleespies, Axel; Kuhn, Nicole; Rupp, Anne; Zügel, Nikolaus Peter
2011-10-01
The decision to optimize processes in the operating suite was based on two factors: competition among clinics and a desire to optimize the use of available resources. The aim of the project was to improve operating room (OR) capacity utilization by reducing change and throughput times per patient. The study was conducted at the Centre Hospitalier Emil Mayrisch, a specialized-care clinic (618 beds) in southern Luxembourg. A prospective analysis was performed before and after the implementation of optimized processes. Value stream analysis and design (value stream mapping, VSM) were used as tools. VSM depicts patient throughput and the corresponding information flows, and is furthermore used to identify process waste (e.g. time, human resources, materials). For this purpose, change times per patient (extubation of patient 1 until intubation of patient 2) and throughput times (inward transfer until outward transfer) were measured. VSM, change and throughput times for 48 patient flows (VSM-A(1), actual state = initial situation) served as the starting point. An optimized VSM (VSM-O) was developed interdisciplinarily and evaluated. Prospective analyses of 42 patients without (VSM-A(2)) and 75 patients with (VSM-O) an optimized process in place were conducted. The prospective analysis resulted in a mean change time (mean ± SEM) of 1,507 ± 100 s for VSM-A(2) versus 933 ± 66 s for VSM-O (p < 0.001). The mean throughput time (mean ± SEM) was 151 min (±8) for VSM-A(2) versus 120 min (±10) for VSM-O (p < 0.05). This corresponds to a 23% decrease in waiting time per patient in total. Efficient OR capacity utilization and the optimized use of human resources allowed an additional 1820 interventions to be carried out per year without any increase in human resources. In addition, perioperative patient monitoring was increased up to 100%.
Collaborative Core Research Program for Chemical-Biological Warfare Defense
2015-01-04
Discovery through High-Throughput Screening (HTS) and Fragment-Based Drug Design (FBDD): current pharmaceutical approaches to drug discovery involve structural analysis and docking programs, generally known as fragment-based drug design (FBDD). The main advantage of using these approaches is that …
Prevailing methodologies in the analysis of gene expression data often neglect to incorporate full concentration and time response due to limitations in throughput and sensitivity with traditional microarray approaches. We have developed a high throughput assay suite using primar...
Gas hydrate characterization from a 3D seismic dataset in the deepwater eastern Gulf of Mexico
DOE Office of Scientific and Technical Information (OSTI.GOV)
McConnell, Daniel; Haneberg, William C.
Seismic stratigraphic features are delineated using principal component analysis of the band-limited data at potential gas hydrate sands, and compared and calibrated with spectral decomposition thickness to constrain thickness in the absence of well control. Layers in the abyssal fan sediments are thinner than can be resolved with 50 Hz seismic and thus comprise composite thin-bed reflections. Amplitude-versus-frequency analysis is used to indicate gas and gas hydrate reflections. Synthetic seismic wedge models show that with 50 Hz seismic data, a 40% saturation of a Plio-Pleistocene GoM sand in the hydrate stability zone with no subjacent gas can produce a phase change (negative to positive) with a strong correlation between amplitude and hydrate saturation. The synthetic seismic response is more complicated if the gas hydrate-filled sediments overlie gassy sediments. Hydrate (or gas) saturation in thin beds enhances the amplitude response and can be used to estimate saturation. Gas hydrate saturation from rock physics, amplitude, and frequency analysis is compared to saturation derived from inversion at several interpreted gas hydrate accumulations in the eastern Gulf of Mexico.
Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard
2015-02-09
Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram per liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which in combination with random forests analysis was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8-17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications to future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.
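The random-forests step of the workflow can be sketched as follows; the response matrix, concentrations and hyperparameters below are synthetic stand-ins, not the published sensor data or model.

```python
# Minimal sketch of regressing analyte concentration on chemiresistor array
# responses with a random forest. X and y are synthetic stand-ins for sensor data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 12))              # 200 samples x 12 sensors in the array
y = np.abs(rng.normal(100, 30, size=200))   # benzene concentration in ug/L (synthetic)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"RMSE: {rmse:.1f} ug/L")
```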
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Chisholm, Bret J.; Olson, Daniel R.; Brennan, Michael J.; Molaison, Chris A.
2002-02-01
Design, validation, and implementation of an optical spectroscopic system for high-throughput analysis of combinatorially developed protective organic coatings are reported. Our approach replaces labor-intensive coating evaluation steps with an automated system that rapidly analyzes 8 × 6 arrays of coating elements that are deposited on a plastic substrate. Each coating element of the library is 10 mm in diameter and 2 to 5 micrometers thick. Performance of coatings is evaluated with respect to their resistance to wear abrasion because this parameter is one of the primary considerations in end-use applications. Upon testing, the organic coatings undergo changes that are impossible to quantitatively predict using existing knowledge. Coatings are abraded using industry-accepted abrasion test methods at single- or multiple-abrasion conditions, followed by high-throughput analysis of abrasion-induced light scatter. The developed automated system is optimized for the analysis of diffusively scattered light that corresponds to 0 to 30% haze. System precision of 0.1 to 2.5% relative standard deviation provides capability for the reliable ranking of coating performance. While the system was implemented for high-throughput screening of combinatorially developed organic protective coatings for automotive applications, it can be applied to a variety of other applications where materials ranking can be achieved using optical spectroscopic tools.
Analysis of High-Throughput ELISA Microarray Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Amanda M.; Daly, Don S.; Zangar, Richard C.
Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).
Using ALFA for high throughput, distributed data transmission in the ALICE O2 system
NASA Astrophysics Data System (ADS)
Wegrzynek, A.;
2017-10-01
ALICE (A Large Ion Collider Experiment) is a heavy-ion detector designed to study the physics of strongly interacting matter (the Quark-Gluon Plasma) at the CERN LHC (Large Hadron Collider). ALICE has been successfully collecting physics data in Run 2 since spring 2015. In parallel, preparations for a major upgrade of the computing system, called O2 (Online-Offline), scheduled for the Long Shutdown 2 in 2019-2020, are being made. One of the major requirements of the system is the capacity to transport data between the so-called FLPs (First Level Processors), equipped with readout cards, and the EPNs (Event Processing Nodes), which perform data aggregation, frame building and partial reconstruction. It is foreseen to have 268 FLPs dispatching data to 1500 EPNs with an average output of 20 Gb/s each. Overall, the O2 processing system will operate at terabits per second of throughput while handling millions of concurrent connections. The ALFA framework will standardize and handle software-related tasks such as readout, data transport, frame building, calibration, online reconstruction and more in the upgraded computing system. ALFA supports two data transport libraries: ZeroMQ and nanomsg. This paper discusses the efficiency of ALFA in terms of high-throughput data transport. The tests were performed with multiple FLPs pushing data to multiple EPNs. The transfer was done using push-pull communication patterns and two socket configurations: bind and connect. The set of benchmarks was prepared to obtain the best performance on each hardware setup. The paper presents the measurement process and final results: data throughput combined with computing resource usage as a function of block size. The high number of nodes and connections in the final setup may cause race conditions that can lead to uneven load balancing and poor scalability. The performed tests allow us to validate whether the traffic is distributed evenly over all receivers. They also measure the behaviour of the network in saturation and evaluate scalability from a 1-to-1 to an N-to-M solution.
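The push-pull pattern underlying these throughput tests can be illustrated with a small pyzmq sketch; the endpoint, block size and message count below are illustrative only and are not the O2 or ALFA configuration.

```python
# Toy FLP -> EPN transfer with the ZeroMQ push-pull pattern (FLP binds, EPN connects).
# Endpoint, block size and message count are illustrative only.
import threading
import zmq

BLOCK_SIZE = 1024 * 1024   # 1 MiB payload per message
N_MESSAGES = 100
ENDPOINT = "tcp://127.0.0.1:5555"

def flp():
    push = zmq.Context.instance().socket(zmq.PUSH)
    push.bind(ENDPOINT)
    payload = b"\x00" * BLOCK_SIZE
    for _ in range(N_MESSAGES):
        push.send(payload)
    push.close()

def epn(received):
    pull = zmq.Context.instance().socket(zmq.PULL)
    pull.connect(ENDPOINT)
    for _ in range(N_MESSAGES):
        received.append(len(pull.recv()))
    pull.close()

received = []
t = threading.Thread(target=epn, args=(received,))
t.start()
flp()
t.join()
print(f"received {sum(received) / 2**20:.0f} MiB")
```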
Klukas, Christian; Chen, Dijun; Pape, Jean-Michel
2014-01-01
High-throughput phenotyping is emerging as an important technology to dissect phenotypic components in plants. Efficient image processing and feature extraction are prerequisites to quantify plant growth and performance based on phenotypic traits. Issues include data management, image analysis, and result visualization of large-scale phenotypic data sets. Here, we present Integrated Analysis Platform (IAP), an open-source framework for high-throughput plant phenotyping. IAP provides user-friendly interfaces, and its core functions are highly adaptable. Our system supports image data transfer from different acquisition environments and large-scale image analysis for different plant species based on real-time imaging data obtained from different spectra. Due to the huge amount of data to manage, we utilized a common data structure for efficient storage and organization of both input and result data. We implemented a block-based method for automated image processing to extract a representative list of plant phenotypic traits. We also provide tools for built-in data plotting and result export. For validation of IAP, we performed an example experiment that contains 33 maize (Zea mays ‘Fernandez’) plants, which were grown for 9 weeks in an automated greenhouse with nondestructive imaging. Subsequently, the image data were subjected to automated analysis with the maize pipeline implemented in our system. We found that the computed digital volume and number of leaves correlate with our manually measured data with high accuracy, up to 0.98 and 0.95, respectively. In summary, IAP provides a comprehensive set of functionalities for import/export, management, and automated analysis of high-throughput plant phenotyping data, and its analysis results are highly reliable. PMID:24760818
High throughput protein production screening
Beernink, Peter T [Walnut Creek, CA; Coleman, Matthew A [Oakland, CA; Segelke, Brent W [San Ramon, CA
2009-09-08
Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods, and for detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.
Orchestrating high-throughput genomic analysis with Bioconductor
Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin
2015-01-01
Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503
Oxygen saturation in the dental pulp of permanent teeth: a critical review.
Bruno, Kely Firmino; Barletta, Fernando Branco; Felippe, Wilson Tadeu; Silva, Júlio Almeida; Gonçalves de Alencar, Ana Helena; Estrela, Carlos
2014-08-01
Pulse oximetry is a noninvasive method for assessing vascular health based on oxygen saturation level. The method has recently also been used to assess dental pulp vitality, but a median oxygen saturation level suggestive of normal pulp physiology has not been determined. The objective of this study was to make a critical analysis of the published research to establish the median oxygen saturation for the diagnosis of normal dental pulps in maxillary anterior permanent teeth using pulse oximetry. Studies reporting on the use of pulse oximeters to determine oxygen saturation in dental pulps were retrieved using the MEDLINE, Scientific Electronic Library Online, and Cochrane Central Register of Controlled Trials databases plus a manual search of relevant references cited by selected articles. Different combinations of the terms "oximetry," "oximeter," "pulp," "dental," and "dentistry" were used in the search. Statistical analysis was performed for each group of teeth (central incisors, lateral incisors, and canines) using R statistical software (US EPA ORD NHEERL, Corvallis, OR) and a random-effects model (P < .0001) with an I² of 99%. Of the 295 articles found, only 6 met the inclusion criteria (472 teeth). Of these, the number of articles included in each analysis (according to tooth group) was as follows: all 6 studies (288 teeth) for central incisors at a median oxygen saturation of 87.73%, 3 studies (90 teeth) for lateral incisors at a median oxygen saturation of 87.24%, and 4 studies (94 teeth) for canines at a median oxygen saturation of 87.26%. The median oxygen saturation in normal dental pulps of permanent central incisors, lateral incisors, and canines was higher than 87%. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
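For readers unfamiliar with the random-effects machinery cited above, the sketch below computes a DerSimonian-Laird pooled estimate and I² from hypothetical per-study values; it illustrates the general estimator only and does not reproduce the review's data or its exact model.

```python
# DerSimonian-Laird random-effects pooling with I^2 heterogeneity.
# Study means and standard errors are hypothetical placeholders.
import numpy as np

means = np.array([86.5, 88.1, 87.9, 87.0, 88.4, 86.8])  # per-study oxygen saturation (%)
se = np.array([0.4, 0.6, 0.5, 0.7, 0.5, 0.6])            # per-study standard errors

w = 1.0 / se**2
mu_fixed = np.sum(w * means) / np.sum(w)
Q = np.sum(w * (means - mu_fixed) ** 2)                  # Cochran's Q
df = len(means) - 1
C = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (Q - df) / C)                            # between-study variance
I2 = max(0.0, (Q - df) / Q) * 100                        # heterogeneity (%)

w_re = 1.0 / (se**2 + tau2)
mu_re = np.sum(w_re * means) / np.sum(w_re)
print(f"pooled estimate: {mu_re:.2f}%, I^2 = {I2:.0f}%")
```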
Na, Hong; Laver, John D.; Jeon, Jouhyun; Singh, Fateh; Ancevicius, Kristin; Fan, Yujie; Cao, Wen Xi; Nie, Kun; Yang, Zhenglin; Luo, Hua; Wang, Miranda; Rissland, Olivia; Westwood, J. Timothy; Kim, Philip M.; Smibert, Craig A.; Lipshitz, Howard D.; Sidhu, Sachdev S.
2016-01-01
Post-transcriptional regulation of mRNAs plays an essential role in the control of gene expression. mRNAs are regulated in ribonucleoprotein (RNP) complexes by RNA-binding proteins (RBPs) along with associated protein and noncoding RNA (ncRNA) cofactors. A global understanding of post-transcriptional control in any cell type requires identification of the components of all of its RNP complexes. We have previously shown that these complexes can be purified by immunoprecipitation using anti-RBP synthetic antibodies produced by phage display. To develop the large number of synthetic antibodies required for a global analysis of RNP complex composition, we have established a pipeline that combines (i) a computationally aided strategy for design of antigens located outside of annotated domains, (ii) high-throughput antigen expression and purification in Escherichia coli, and (iii) high-throughput antibody selection and screening. Using this pipeline, we have produced 279 antibodies against 61 different protein components of Drosophila melanogaster RNPs. Together with those produced in our low-throughput efforts, we have a panel of 311 antibodies for 67 RNP complex proteins. Tests of a subset of our antibodies demonstrated that 89% immunoprecipitate their endogenous target from embryo lysate. This panel of antibodies will serve as a resource for global studies of RNP complexes in Drosophila. Furthermore, our high-throughput pipeline permits efficient production of synthetic antibodies against any large set of proteins. PMID:26847261
Model-Based Design of Long-Distance Tracer Transport Experiments in Plants.
Bühler, Jonas; von Lieres, Eric; Huber, Gregor J
2018-01-01
Studies of long-distance transport of tracer isotopes in plants offer a high potential for functional phenotyping, but so far measurement time is a bottleneck because continuous time series of at least 1 h are required to obtain reliable estimates of transport properties. Hence, usual throughput values are between 0.5 and 1 samples h⁻¹. Here, we propose to increase sample throughput by introducing temporal gaps in the data acquisition of each plant sample and measuring multiple plants one after another in a rotating scheme. In contrast to common time series analysis methods, mechanistic tracer transport models allow the analysis of interrupted time series. The uncertainties of the model parameter estimates are used as a measure of how much information was lost compared to complete time series. A case study was set up to systematically investigate different experimental schedules for different throughput scenarios ranging from 1 to 12 samples h⁻¹. Selected designs with only a small number of data points were found to be sufficient for adequate parameter estimation, implying that the presented approach enables a substantial increase of sample throughput. The presented general framework for automated generation and evaluation of experimental schedules allows the determination of a maximal sample throughput and the respective optimal measurement schedule depending on the required statistical reliability of data acquired by future experiments.
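The rotating acquisition scheme described above can be sketched as a simple schedule generator; the plant count, slot length and total duration are illustrative values, not those of the case study.

```python
# Build a rotating measurement schedule: N plants share one detector, so each
# plant's time series has regular temporal gaps. All numbers are illustrative.
N_PLANTS = 6        # target throughput of 6 samples per hour
SLOT_MIN = 10       # minutes per measurement slot
TOTAL_MIN = 120     # total experiment duration in minutes

schedule = [(start, slot % N_PLANTS)            # (start minute, plant id)
            for slot, start in enumerate(range(0, TOTAL_MIN, SLOT_MIN))]

# Interrupted time series for plant 0: measured only every N_PLANTS-th slot.
plant0_times = [start for start, plant in schedule if plant == 0]
print(plant0_times)  # gaps of N_PLANTS * SLOT_MIN minutes between measurements
```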
Embedded Hyperchaotic Generators: A Comparative Analysis
NASA Astrophysics Data System (ADS)
Sadoudi, Said; Tanougast, Camel; Azzaz, Mohamad Salah; Dandache, Abbas
In this paper, we present a comparative analysis of FPGA implementation performance, in terms of throughput and resource cost, of five well-known autonomous continuous hyperchaotic systems. The goal of this analysis is to identify the embedded hyperchaotic generator which leads to designs with small logic area cost, satisfactory throughput rates, low power consumption and the low latency required for embedded applications such as secure digital communications between embedded systems. To implement the four-dimensional (4D) chaotic systems, we use a new structural hardware architecture based on direct VHDL description of the fourth-order Runge-Kutta method (RK-4). The comparative analysis shows that the hyperchaotic Lorenz generator provides attractive performance compared to that of the others. In fact, its hardware implementation requires only 2067 CLB-slices, 36 multipliers and no block RAMs, and achieves a throughput rate of 101.6 Mbps, at the output of the FPGA circuit, at a clock frequency of 25.315 MHz with a low latency of 316 ns. Consequently, this good implementation performance makes the embedded hyperchaotic Lorenz generator the best candidate for embedded communications applications.
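To make the RK-4 step concrete in software terms, the sketch below integrates one 4D hyperchaotic Lorenz-type system from the literature with a classical fourth-order Runge-Kutta scheme; the exact equations and parameters implemented in the paper's VHDL design may differ.

```python
# Classical RK-4 integration of a 4D hyperchaotic Lorenz-type system.
# Equations and parameters follow one common literature variant (illustrative only).
import numpy as np

A, B, C, R = 10.0, 8.0 / 3.0, 28.0, -1.0

def f(s):
    x, y, z, w = s
    return np.array([A * (y - x) + w,
                     C * x - y - x * z,
                     x * y - B * z,
                     -y * z + R * w])

def rk4_step(s, h):
    k1 = f(s)
    k2 = f(s + 0.5 * h * k1)
    k3 = f(s + 0.5 * h * k2)
    k4 = f(s + h * k3)
    return s + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

state = np.array([1.0, 1.0, 1.0, 1.0])
h = 0.001
for _ in range(10000):
    state = rk4_step(state, h)
print(state)   # point on the hyperchaotic attractor after 10 time units
```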
Arrayed water-in-oil droplet bilayers for membrane transport analysis.
Watanabe, R; Soga, N; Hara, M; Noji, H
2016-08-02
The water-in-oil droplet bilayer is a simple and useful lipid bilayer system for membrane transport analysis. The droplet interface bilayer is readily formed by the contact of two water-in-oil droplets enwrapped by a phospholipid monolayer. However, controlling the size of individual femtoliter-volume droplets in a high-throughput manner is difficult, resulting in low sensitivity and throughput in membrane transport analysis. To overcome this drawback, in this study, we developed a novel micro-device in which a large number of droplet interface bilayers (>500) are formed at a time by using femtoliter-sized droplet arrays immobilized on a hydrophobic/hydrophilic substrate. The droplet volume was controllable from 3.5 to 350 fL by changing the hydrophobic/hydrophilic pattern on the device, allowing high-throughput analysis of membrane transport mechanisms including membrane permeability to solutes (e.g., ions or small molecules) with or without the aid of transport proteins. Thus, this novel platform broadens the versatility of water-in-oil droplet bilayers and will pave the way for novel analytical and pharmacological applications such as drug screening.
High-throughput tetrad analysis.
Ludlow, Catherine L; Scott, Adrian C; Cromie, Gareth A; Jeffery, Eric W; Sirr, Amy; May, Patrick; Lin, Jake; Gilbert, Teresa L; Hays, Michelle; Dudley, Aimée M
2013-07-01
Tetrad analysis has been a gold-standard genetic technique for several decades. Unfortunately, the need to manually isolate, disrupt and space tetrads has relegated its application to small-scale studies and limited its integration with high-throughput DNA sequencing technologies. We have developed a rapid, high-throughput method, called barcode-enabled sequencing of tetrads (BEST), that uses (i) a meiosis-specific GFP fusion protein to isolate tetrads by FACS and (ii) molecular barcodes that are read during genotyping to identify spores derived from the same tetrad. Maintaining tetrad information allows accurate inference of missing genetic markers and full genotypes of missing (and presumably nonviable) individuals. An individual researcher was able to isolate over 3,000 yeast tetrads in 3 h, an output equivalent to that of almost 1 month of manual dissection. BEST is transferable to other microorganisms for which meiotic mapping is significantly more laborious.
Lee, M.W.; Collett, T.S.
2009-01-01
During the Indian National Gas Hydrate Program Expedition 01 (NGHP-01), one of the richest marine gas hydrate accumulations was discovered at Site NGHP-01-10 in the Krishna-Godavari Basin. The occurrence of concentrated gas hydrate at this site is primarily controlled by the presence of fractures. Assuming the resistivity of gas hydrate-bearing sediments is isotropic, the conventional Archie analysis using the logging-while-drilling resistivity log yields gas hydrate saturations greater than 50% (as high as ~80%) of the pore space for the depth interval between ~25 and ~160 m below seafloor. On the other hand, gas hydrate saturations estimated from pressure cores from nearby wells were less than ~26% of the pore space. Although intrasite variability may contribute to the difference, the primary cause of the saturation difference is attributed to the anisotropic nature of the reservoir due to gas hydrate in high-angle fractures. Archie's law can be used to estimate gas hydrate saturations in anisotropic reservoirs, with additional information such as elastic velocities to constrain the Archie cementation parameter m and saturation exponent n. Theory indicates that m and n depend on the direction of the measurement relative to fracture orientation, as well as on gas hydrate saturation. By using higher values of m and n in the resistivity analysis for fractured reservoirs, the difference between saturation estimates is significantly reduced, although a sizable difference remains. To better understand the nature of fractured reservoirs, wireline P- and S-wave velocities were also incorporated into the analysis.
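A bare-bones Archie-type saturation estimate looks like the sketch below; the pore-water resistivity, porosity and Archie parameters are illustrative placeholders, not the NGHP-01-10 log values.

```python
# Archie-type estimate: water saturation Sw = (a*Rw / (phi**m * Rt))**(1/n),
# hydrate saturation Sh = 1 - Sw. All input values are illustrative placeholders.
def hydrate_saturation(rt, rw=0.25, phi=0.60, a=1.0, m=2.0, n=2.0):
    sw = (a * rw / (phi**m * rt)) ** (1.0 / n)
    return 1.0 - min(sw, 1.0)

for rt in (2.0, 5.0, 20.0):   # formation resistivity in ohm-m from the log
    print(f"Rt = {rt:5.1f} ohm-m -> Sh = {hydrate_saturation(rt):.2f}")
```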
Sánchez-Sevilla, José F.; Horvath, Aniko; Botella, Miguel A.; Gaston, Amèlia; Folta, Kevin; Kilian, Andrzej; Denoyes, Beatrice; Amaya, Iraida
2015-01-01
Cultivated strawberry (Fragaria × ananassa) is a genetically complex allo-octoploid crop with 28 pairs of chromosomes (2n = 8x = 56) for which a genome sequence is not yet available. The diploid Fragaria vesca is considered the donor species of one of the octoploid sub-genomes, and its available genome sequence can be used as a reference for genomic studies. A large number of strawberry cultivars are stored in ex situ germplasm collections worldwide, but previous studies have addressed the genetic diversity present within only a limited number of these collections. Here, we report the development and application of two platforms based on the implementation of Diversity Array Technology (DArT) markers for high-throughput genotyping in strawberry. The first DArT microarray was used to evaluate the genetic diversity of 62 strawberry cultivars that represent a wide range of variation based on phenotype, geographical and temporal origin and pedigrees. A total of 603 DArT markers were used to evaluate the diversity and structure of the population, and cluster analyses revealed that these markers were highly efficient in classifying the accessions into groups based on historical, geographical and pedigree-based cues. The second DArTseq platform took advantage of the complexity reduction method optimized for strawberry and of developments in next-generation sequencing technologies. The strawberry DArTseq was used to generate a total of 9,386 SNP markers in the previously developed ‘232’ × ‘1392’ mapping population, of which 4,242 high-quality markers were further selected to saturate this map after several filtering steps. The high-throughput platforms developed here for genotyping strawberry will facilitate genome-wide characterizations of large accession sets and complement other available options. PMID:26675207
USDA-ARS?s Scientific Manuscript database
Recent developments in high-throughput sequencing technology have made low-cost sequencing an attractive approach for many genome analysis tasks. Increasing read lengths, improving quality and the production of increasingly larger numbers of usable sequences per instrument-run continue to make whole...
USDA-ARS?s Scientific Manuscript database
The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...
USDA-ARS?s Scientific Manuscript database
Extraction of DNA from tissue samples can be expensive both in time and monetary resources and can often require handling and disposal of hazardous chemicals. We have developed a high throughput protocol for extracting DNA from honey bees that is of a high enough quality and quantity to enable hundr...
Lessons from high-throughput protein crystallization screening: 10 years of practical experience
JR, Luft; EH, Snell; GT, DeTitta
2011-01-01
Introduction X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize the success while minimizing the effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single-, multiuser-laboratory or industrial scale. Expert Opinion High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073
High-throughput screening based on label-free detection of small molecule microarrays
NASA Astrophysics Data System (ADS)
Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong
2017-02-01
Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput preliminary drug screening platform based on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this screening platform, the OI-RD scanner serves as a label-free detection instrument that is able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening through the immobilization of tens of thousands of different compounds on a single phenyl-isocyanate-functionalized glass slide. Using this high-throughput screening platform, we sequentially screened five target proteins (purified target proteins or cell lysates containing the target protein) in a high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by subsequent functional assays. Compared to traditional high-throughput screening assays, the novel platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening analysis, giving it great potential as a complementary screening platform in the field of drug discovery.
HTSstation: A Web Application and Open-Access Libraries for High-Throughput Sequencing Data Analysis
David, Fabrice P. A.; Delafontaine, Julien; Carat, Solenne; Ross, Frederick J.; Lefebvre, Gregory; Jarosz, Yohan; Sinclair, Lucas; Noordermeer, Daan; Rougemont, Jacques; Leleu, Marion
2014-01-01
The HTSstation analysis portal is a suite of simple web forms coupled to modular analysis pipelines for various applications of High-Throughput Sequencing including ChIP-seq, RNA-seq, 4C-seq and re-sequencing. HTSstation offers biologists the possibility to rapidly investigate their HTS data using an intuitive web application with heuristically pre-defined parameters. A number of open-source software components have been implemented and can be used to build, configure and run HTS analysis pipelines reactively. Besides, our programming framework empowers developers with the possibility to design their own workflows and integrate additional third-party software. The HTSstation web application is accessible at http://htsstation.epfl.ch. PMID:24475057
Deciphering the genomic targets of alkylating polyamide conjugates using high-throughput sequencing
Chandran, Anandhakumar; Syed, Junetha; Taylor, Rhys D.; Kashiwazaki, Gengo; Sato, Shinsuke; Hashiya, Kaori; Bando, Toshikazu; Sugiyama, Hiroshi
2016-01-01
Chemically engineered small molecules targeting specific genomic sequences play an important role in drug development research. Pyrrole-imidazole polyamides (PIPs) are a group of molecules that can bind to the DNA minor-groove and can be engineered to target specific sequences. Their biological effects rely primarily on their selective DNA binding. However, the binding mechanism of PIPs at the chromatinized genome level is poorly understood. Herein, we report a method using high-throughput sequencing to identify the DNA-alkylating sites of PIP-indole-seco-CBI conjugates. High-throughput sequencing analysis of conjugate 2 showed highly similar DNA-alkylating sites on synthetic oligos (histone-free DNA) and on human genomes (chromatinized DNA context). To our knowledge, this is the first report identifying alkylation sites across genomic DNA by alkylating PIP conjugates using high-throughput sequencing. PMID:27098039
Madanecki, Piotr; Bałut, Magdalena; Buckley, Patrick G; Ochocka, J Renata; Bartoszewski, Rafał; Crossman, David K; Messiaen, Ludwine M; Piotrowski, Arkadiusz
2018-01-01
High-throughput technologies generate considerable amounts of data, which often require bioinformatic expertise to analyze. Here we present High-Throughput Tabular Data Processor (HTDP), a platform-independent Java program. HTDP works on any character-delimited column data (e.g. BED, GFF, GTF, PSL, WIG, VCF) from multiple text files and supports merging, filtering and converting of data that is produced in the course of high-throughput experiments. HTDP can also utilize itemized sets of conditions from external files for complex or repetitive filtering/merging tasks. The program is intended to aid global, real-time processing of large data sets using a graphical user interface (GUI). Therefore, no prior expertise in programming, regular expressions, or command-line usage is required of the user. Additionally, no a priori assumptions are imposed on the internal file composition. We demonstrate the flexibility and potential of HTDP in real-life research tasks including microarray and massively parallel sequencing, i.e. identification of disease-predisposing variants in next-generation sequencing data as well as comprehensive concurrent analysis of microarray and sequencing results. We also show the utility of HTDP in technical tasks including data merge, reduction and filtering with external criteria files. HTDP was developed to address functionality that is missing or rudimentary in other GUI software for processing character-delimited column data from high-throughput technologies. Flexibility, in terms of input file handling, provides long-term potential functionality in high-throughput analysis pipelines, as the program is not limited by the currently existing applications and data formats. HTDP is available as open-source software (https://github.com/pmadanecki/htdp).
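The kind of column-data filtering HTDP automates through its GUI can be approximated in a few lines of scripted pandas, as sketched below; the file names, columns and thresholds are hypothetical and serve only to illustrate filtering with an external criteria file.

```python
# Filter a BED-like file on a score threshold and on chromosomes listed in an
# external criteria file. File names, columns and thresholds are hypothetical.
import pandas as pd

cols = ["chrom", "start", "end", "name", "score"]
regions = pd.read_csv("regions.bed", sep="\t", header=None, names=cols)
criteria = pd.read_csv("criteria.tsv", sep="\t")       # e.g. a "chrom" column to keep

filtered = regions[(regions["score"] >= 100) &
                   (regions["chrom"].isin(criteria["chrom"]))]
filtered.to_csv("filtered.bed", sep="\t", header=False, index=False)
```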
NASA Astrophysics Data System (ADS)
Zhao, Luanxiao; Yuan, Hemin; Yang, Jingkang; Han, De-hua; Geng, Jianhua; Zhou, Rui; Li, Hui; Yao, Qiuliang
2017-11-01
Conventional seismic analysis in partially saturated rocks normally emphasizes estimating pore fluid content and saturation, typically ignoring the effect of mobility, which determines the ability of fluids to move in the porous rock. Deformation resulting from a seismic wave in heterogeneous partially saturated media can cause pore fluid pressure relaxation at the mesoscopic scale, thereby making the fluid mobility inherently associated with poroelastic reflectivity. For two typical gas-brine reservoir models, with the given rock and fluid properties, the numerical analysis suggests that variations of patchy fluid saturation, fluid compressibility contrast, and acoustic stiffness of the rock frame collectively affect the dependence of seismic reflection on mobility. In particular, the realistic compressibility contrast of fluid patches in shallow and deep reservoir environments plays an important role in determining the reflection sensitivity to mobility. We also use a time-lapse seismic data set from a heavy oil reservoir produced by Steam-Assisted Gravity Drainage to demonstrate that mobility change coupled with patchy saturation possibly leads to seismic spectral energy shifting from the baseline to the monitor survey. Our workflow starts by performing seismic spectral analysis on the targeted reflectivity interface. Then, on the basis of mesoscopic fluid pressure diffusion between patches of steam and heavy oil, poroelastic reflectivity modeling is conducted to understand the shift of the central frequency toward low frequencies after the steam injection. The presented results open the possibility of monitoring mobility change of a partially saturated geological formation from dissipation-related seismic attributes.
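The spectral-analysis step, detecting a shift of the central frequency between baseline and monitor data, can be illustrated with a centroid-frequency calculation on synthetic traces; the wavelets and sampling interval below are invented, not the field data or the authors' exact attribute.

```python
# Centroid (central) frequency of baseline vs monitor traces via the FFT.
# The traces are synthetic Ricker wavelets, not field data.
import numpy as np

dt = 0.002                      # 2 ms sampling interval

def ricker(f0, t):
    arg = (np.pi * f0 * t) ** 2
    return (1 - 2 * arg) * np.exp(-arg)

t = np.arange(-0.128, 0.128, dt)
baseline = ricker(40.0, t)      # higher dominant frequency before steam injection
monitor = ricker(30.0, t)       # energy shifted toward lower frequencies afterwards

def centroid_frequency(trace, dt):
    spec = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(len(trace), dt)
    return np.sum(freqs * spec) / np.sum(spec)

print(f"baseline centroid: {centroid_frequency(baseline, dt):.1f} Hz")
print(f"monitor centroid:  {centroid_frequency(monitor, dt):.1f} Hz")
```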
High-throughput full-length single-cell mRNA-seq of rare cells.
Ooi, Chin Chun; Mantalas, Gary L; Koh, Winston; Neff, Norma F; Fuchigami, Teruaki; Wong, Dawson J; Wilson, Robert J; Park, Seung-Min; Gambhir, Sanjiv S; Quake, Stephen R; Wang, Shan X
2017-01-01
Single-cell characterization techniques, such as mRNA-seq, have been applied to a diverse range of applications in cancer biology, yielding great insight into mechanisms leading to therapy resistance and tumor clonality. While single-cell techniques can yield a wealth of information, a common bottleneck is the lack of throughput, with many current processing methods being limited to the analysis of small volumes of single cell suspensions with cell densities on the order of 10⁷ per mL. In this work, we present a high-throughput full-length mRNA-seq protocol incorporating a magnetic sifter and magnetic nanoparticle-antibody conjugates for rare cell enrichment, and Smart-seq2 chemistry for sequencing. We evaluate the efficiency and quality of this protocol with a simulated circulating tumor cell system, whereby non-small-cell lung cancer cell lines (NCI-H1650 and NCI-H1975) are spiked into whole blood, before being enriched for single-cell mRNA-seq by EpCAM-functionalized magnetic nanoparticles and the magnetic sifter. We obtain high efficiency (> 90%) capture and release of these simulated rare cells via the magnetic sifter, with reproducible transcriptome data. In addition, while mRNA-seq data is typically only used for gene expression analysis of transcriptomic data, we demonstrate the use of full-length mRNA-seq chemistries like Smart-seq2 to facilitate variant analysis of expressed genes. This enables the use of mRNA-seq data for differentiating cells in a heterogeneous population by both their phenotypic and variant profile. In a simulated heterogeneous mixture of circulating tumor cells in whole blood, we utilize this high-throughput protocol to differentiate these heterogeneous cells by both their phenotype (lung cancer versus white blood cells), and mutational profile (H1650 versus H1975 cells), in a single sequencing run. This high-throughput method can help facilitate single-cell analysis of rare cell populations, such as circulating tumor or endothelial cells, with demonstrably high-quality transcriptomic data.
Condor-COPASI: high-throughput computing for biochemical networks
2012-01-01
Background Mathematical modelling has become a standard technique to improve our understanding of complex biological systems. As models become larger and more complex, simulations and analyses require increasing amounts of computational power. Clusters of computers in a high-throughput computing environment can help to provide the resources required for computationally expensive model analysis. However, exploiting such a system can be difficult for users without the necessary expertise. Results We present Condor-COPASI, a server-based software tool that integrates COPASI, a biological pathway simulation tool, with Condor, a high-throughput computing environment. Condor-COPASI provides a web-based interface, which makes it extremely easy for a user to run a number of model simulation and analysis tasks in parallel. Tasks are transparently split into smaller parts, and submitted for execution on a Condor pool. Result output is presented to the user in a number of formats, including tables and interactive graphical displays. Conclusions Condor-COPASI can effectively use a Condor high-throughput computing environment to provide significant gains in performance for a number of model simulation and analysis tasks. Condor-COPASI is free, open source software, released under the Artistic License 2.0, and is suitable for use by any institution with access to a Condor pool. Source code is freely available for download at http://code.google.com/p/condor-copasi/, along with full instructions on deployment and usage. PMID:22834945
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
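As an illustration of the kind of digital trait extraction described above, the following minimal Python sketch (not Image Harvest's actual API; the mask and trait names are hypothetical) measures projected area and bounding height/width from a segmented plant mask.

```python
import numpy as np

def plant_traits(mask):
    """Extract simple digital traits from a boolean plant mask (rows x cols)."""
    rows, cols = np.nonzero(mask)
    area = rows.size                                   # projected plant area (pixels)
    height = rows.max() - rows.min() + 1 if area else 0
    width = cols.max() - cols.min() + 1 if area else 0
    return {"area_px": area, "height_px": height, "width_px": width}

# Hypothetical segmented image: True where plant pixels were detected.
mask = np.zeros((100, 60), dtype=bool)
mask[20:90, 25:35] = True
print(plant_traits(mask))
```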
Boyacı, Ezel; Bojko, Barbara; Reyes-Garcés, Nathaly; Poole, Justen J; Gómez-Ríos, Germán Augusto; Teixeira, Alexandre; Nicol, Beate; Pawliszyn, Janusz
2018-01-18
In vitro high-throughput non-depletive quantitation of chemicals in biofluids is of growing interest in many areas. Some of the challenges facing researchers include the limited volume of biofluids, rapid and high-throughput sampling requirements, and the lack of reliable methods. Coupled to the above, growing interest in the monitoring of kinetics and dynamics of miniaturized biosystems has spurred the demand for development of novel and revolutionary methodologies for analysis of biofluids. The applicability of solid-phase microextraction (SPME) is investigated as a potential technology to fulfill the aforementioned requirements. As analytes with sufficient diversity in their physicochemical features, nicotine, N,N-Diethyl-meta-toluamide, and diclofenac were selected as test compounds for the study. The objective was to develop methodologies that would allow repeated non-depletive sampling from 96-well plates, using 100 µL of sample. Initially, thin film-SPME was investigated. Results revealed substantial depletion and consequent disruption in the system. Therefore, new ultra-thin coated fibers were developed. The applicability of this device to the described sampling scenario was tested by determining the protein binding of the analytes. Results showed good agreement with rapid equilibrium dialysis. The presented method allows high-throughput analysis using small volumes, enabling fast reliable free and total concentration determinations without disruption of system equilibrium.
Molecular mapping and breeding with microsatellite markers.
Lightfoot, David A; Iqbal, Muhammad J
2013-01-01
In genetics databases for crop plant species across the world, there are thousands of mapped loci that underlie quantitative traits, oligogenic traits, and simple traits recognized by association mapping in populations. The number of loci will increase as new phenotypes are measured in more diverse genotypes and genetic maps based on saturating numbers of markers are developed. A period of locus reevaluation will decrease the number of important loci as those underlying mega-environmental effects are recognized. A second wave of reevaluation of loci will follow from developmental series analysis, especially for harvest traits like seed yield and composition. Breeding methods to properly use the accurate maps of QTL are being developed. New methods to map, fine map, and isolate the genes underlying the loci will be critical to future advances in crop biotechnology. Microsatellite markers are the most useful tool for breeders. They are codominant, abundant in all genomes, highly polymorphic so useful in many populations, and both economical and technically easy to use. The selective genotyping approaches, including genotype ranking (indexing) based on partial phenotype data combined with favorable allele data and bulked segregation event (segregant) analysis (BSA), will be increasingly important uses for microsatellites. Examples of the methods for developing and using microsatellites derived from genomic sequences are presented for monogenic, oligogenic, and polygenic traits. Examples of successful mapping, fine mapping, and gene isolation are given. When combined with high-throughput methods for genotyping and a genome sequence, the use of association mapping with microsatellite markers will provide critical advances in the analysis of crop traits.
Kim, Eung-Sam; Ahn, Eun Hyun; Chung, Euiheon; Kim, Deok-Ho
2013-01-01
Nanotechnology-based tools are beginning to emerge as promising platforms for quantitative high-throughput analysis of live cells and tissues. Despite unprecedented progress made over the last decade, a challenge still lies in integrating emerging nanotechnology-based tools into macroscopic biomedical apparatuses for practical purposes in biomedical sciences. In this review, we discuss the recent advances and limitations in the analysis and control of mechanical, biochemical, fluidic, and optical interactions in the interface areas of nanotechnology-based materials and living cells in both in vitro and in vivo settings. PMID:24258011
Opportunistic data locality for end user data analysis
NASA Astrophysics Data System (ADS)
Fischer, M.; Heidecker, C.; Kuehn, E.; Quast, G.; Giffels, M.; Schnepf, M.; Heiss, A.; Petzold, A.
2017-10-01
With the increasing data volume of LHC Run2, user analyses are evolving towards increasing data throughput. This evolution translates to higher requirements for efficiency and scalability of the underlying analysis infrastructure. We approach this issue with a new middleware to optimise data access: a layer of coordinated caches transparently provides data locality for high-throughput analyses. We demonstrated the feasibility of this approach with a prototype used for analyses of the CMS working groups at KIT. In this paper, we present our experience both with the approach in general, and our prototype in specific.
NASA Technical Reports Server (NTRS)
Clare, L. P.; Yan, T.-Y.
1985-01-01
The analysis of the ALOHA random access protocol for communications channels with fading is presented. The protocol is modified to send multiple contiguous copies of a message at each transmission attempt. Both pure and slotted ALOHA channels are considered. A general two state model is used for the channel error process to account for the channel fading memory. It is shown that greater throughput and smaller delay may be achieved using repetitions. The model is applied to the analysis of the delay-throughput performance in a fading mobile communications environment. Numerical results are given for NASA's Mobile Satellite Experiment.
Labanieh, Louai; Nguyen, Thi N.; Zhao, Weian; Kang, Dong-Ku
2016-01-01
We describe the design, fabrication and use of a dual-layered microfluidic device for ultrahigh-throughput droplet trapping, analysis, and recovery using droplet buoyancy. To demonstrate the utility of this device for digital quantification of analytes, we quantify the number of droplets, which contain a β-galactosidase-conjugated bead among more than 100,000 immobilized droplets. In addition, we demonstrate that this device can be used for droplet clustering and real-time analysis by clustering several droplets together into microwells and monitoring diffusion of fluorescein, a product of the enzymatic reaction of β-galactosidase and its fluorogenic substrate FDG, between droplets. PMID:27134760
High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Human Plasma.
Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan
2016-07-01
We present a high-throughput, nontargeted lipidomics approach using liquid chromatography coupled to high-resolution mass spectrometry for quantitative analysis of nonesterified fatty acids. We applied this method to screen a wide range of fatty acids from medium-chain to very long-chain (8 to 24 carbon atoms) in human plasma samples. The method enables us to chromatographically separate branched-chain species from their straight-chain isomers as well as separate biologically important ω-3 and ω-6 polyunsaturated fatty acids. We used 51 fatty acid species to demonstrate the quantitative capability of this method with quantification limits in the nanomolar range; however, this method is not limited only to these fatty acid species. High-throughput sample preparation was developed and carried out on a robotic platform that allows extraction of 96 samples simultaneously within 3 h. This high-throughput platform was used to assess the influence of different types of human plasma collection and preparation on the nonesterified fatty acid profile of healthy donors. Use of the anticoagulants EDTA and heparin has been compared with simple clotting, and only limited changes have been detected in most nonesterified fatty acid concentrations.
Short-read, high-throughput sequencing technology for STR genotyping
Bornman, Daniel M.; Hester, Mark E.; Schuetter, Jared M.; Kasoji, Manjula D.; Minard-Smith, Angela; Barden, Curt A.; Nelson, Scott C.; Godbold, Gene D.; Baker, Christine H.; Yang, Boyu; Walther, Jacquelyn E.; Tornes, Ivan E.; Yan, Pearlly S.; Rodriguez, Benjamin; Bundschuh, Ralf; Dickens, Michael L.; Young, Brian A.; Faith, Seth A.
2013-01-01
DNA-based methods for human identification principally rely upon genotyping of short tandem repeat (STR) loci. Electrophoretic-based techniques for variable-length classification of STRs are universally utilized, but are limited in that they have relatively low throughput and do not yield nucleotide sequence information. High-throughput sequencing technology may provide a more powerful instrument for human identification, but is not currently validated for forensic casework. Here, we present a systematic method to perform high-throughput genotyping analysis of the Combined DNA Index System (CODIS) STR loci using short-read (150 bp) massively parallel sequencing technology. Open source reference alignment tools were optimized to evaluate PCR-amplified STR loci using a custom designed STR genome reference. Evaluation of this approach demonstrated that the 13 CODIS STR loci and amelogenin (AMEL) locus could be accurately called from individual and mixture samples. Sensitivity analysis showed that as few as 18,500 reads, aligned to an in silico referenced genome, were required to genotype an individual (>99% confidence) for the CODIS loci. The power of this technology was further demonstrated by identification of variant alleles containing single nucleotide polymorphisms (SNPs) and the development of quantitative measurements (reads) for resolving mixed samples. PMID:25621315
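To illustrate the core idea of sequence-based STR genotyping, the sketch below counts the longest uninterrupted run of a repeat motif within a read. It is a simplified, hypothetical example (a real pipeline aligns reads to an STR reference and handles partial repeats and sequencing errors), and the read and motif are made up.

```python
import re

def count_repeats(read, motif):
    """Return the longest uninterrupted run of `motif` (in repeat units) within a read."""
    runs = re.findall(f"(?:{motif})+", read)
    return max((len(r) // len(motif) for r in runs), default=0)

# Hypothetical 150 bp-style read spanning a TPOX-like (AATG)n repeat.
read = "TTCCGT" + "AATG" * 9 + "CCTAGA"
print(count_repeats(read, "AATG"))   # -> 9 repeat units
```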
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments limited to low throughput, with only one sample examined at a time. This work reports an infrared-based optical calorimetry method, together with its theoretical foundation, which provides an integrated solution for characterizing the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). The method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In a proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolution of advanced infrared cameras, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
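As a rough illustration of how a phase-change property might be read off a per-sample temperature history from the infrared camera, the sketch below picks the thermal-arrest plateau (minimum heating rate) as a melting-temperature estimate. This is an assumption-laden toy example, not the authors' algorithm, and the temperature curve is synthetic.

```python
import numpy as np

def melting_point_estimate(time_s, temp_c):
    """Estimate the melting temperature as the temperature where the heating
    rate dT/dt is smallest (the thermal-arrest plateau of a phase change)."""
    rate = np.gradient(temp_c, time_s)
    i = np.argmin(rate)
    return temp_c[i], rate[i]

# Hypothetical pixel-averaged temperature history of one sample.
t = np.linspace(0, 120, 241)                        # seconds
T = np.piecewise(t, [t < 40, (t >= 40) & (t < 70), t >= 70],
                 [lambda x: 20 + x,                 # heating ramp
                  lambda x: 60 + 0.02 * (x - 40),   # near-flat melting plateau
                  lambda x: 60.6 + (x - 70)])       # post-melt heating
Tm, plateau_rate = melting_point_estimate(t, T)
print(f"estimated melting temperature ~{Tm:.1f} C (plateau rate {plateau_rate:.3f} C/s)")
```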
High-Throughput RT-PCR for small-molecule screening assays
Bittker, Joshua A.
2012-01-01
Quantitative measurement of the levels of mRNA expression using real-time reverse transcription polymerase chain reaction (RT-PCR) has long been used for analyzing expression differences in tissue or cell lines of interest. This method has been used somewhat less frequently to measure the changes in gene expression due to perturbagens such as small molecules or siRNA. The availability of new instrumentation for liquid handling and real-time PCR analysis as well as the commercial availability of start-to-finish kits for RT-PCR has enabled the use of this method for high-throughput small-molecule screening on a scale comparable to traditional high-throughput screening (HTS) assays. This protocol focuses on the special considerations necessary for using quantitative RT-PCR as a primary small-molecule screening assay, including the different methods available for mRNA isolation and analysis. PMID:23487248
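For readers unfamiliar with how RT-PCR readouts are turned into screening results, a minimal 2^-ΔΔCt fold-change calculation (assuming roughly 100% amplification efficiency; the Ct values are hypothetical) looks like this:

```python
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt method (assumes ~100% PCR efficiency)."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical Ct values for one compound-treated well vs. a vehicle control.
print(fold_change(24.1, 18.0, 26.3, 18.1))   # > 1 means up-regulation by the compound
```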
The promise and challenge of high-throughput sequencing of the antibody repertoire
Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R
2014-01-01
Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474
[Weighted gene co-expression network analysis in biomedicine research].
Liu, Wei; Li, Li; Ye, Hua; Tu, Wei
2017-11-25
High-throughput biological technologies are now widely applied in biology and medicine, allowing scientists to monitor thousands of parameters simultaneously in a specific sample. However, it is still an enormous challenge to mine useful information from high-throughput data. The emergence of network biology provides deeper insights into complex bio-system and reveals the modularity in tissue/cellular networks. Correlation networks are increasingly used in bioinformatics applications. Weighted gene co-expression network analysis (WGCNA) tool can detect clusters of highly correlated genes. Therefore, we systematically reviewed the application of WGCNA in the study of disease diagnosis, pathogenesis and other related fields. First, we introduced principle, workflow, advantages and disadvantages of WGCNA. Second, we presented the application of WGCNA in disease, physiology, drug, evolution and genome annotation. Then, we indicated the application of WGCNA in newly developed high-throughput methods. We hope this review will help to promote the application of WGCNA in biomedicine research.
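A minimal sketch of the WGCNA adjacency construction (unsigned network, a_ij = |cor|^β) is shown below; the soft-threshold β and the random expression matrix are placeholders, and a real analysis would continue with topological-overlap and module detection.

```python
import numpy as np

def wgcna_adjacency(expr, beta=6):
    """Unsigned WGCNA adjacency: a_ij = |cor(x_i, x_j)|^beta.
    `expr` is a samples x genes matrix; larger beta emphasizes strong correlations."""
    corr = np.corrcoef(expr, rowvar=False)     # gene-gene Pearson correlations
    adjacency = np.abs(corr) ** beta
    np.fill_diagonal(adjacency, 0.0)
    connectivity = adjacency.sum(axis=1)       # per-gene network connectivity
    return adjacency, connectivity

rng = np.random.default_rng(0)
expr = rng.normal(size=(30, 5))                # hypothetical 30 samples x 5 genes
adj, k = wgcna_adjacency(expr)
print(np.round(k, 2))
```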
High-throughput sequencing: a failure mode analysis.
Yang, George S; Stott, Jeffery M; Smailus, Duane; Barber, Sarah A; Balasundaram, Miruna; Marra, Marco A; Holt, Robert A
2005-01-04
Basic manufacturing principles are becoming increasingly important in high-throughput sequencing facilities where there is a constant drive to increase quality, increase efficiency, and decrease operating costs. While high-throughput centres report failure rates typically on the order of 10%, the causes of sporadic sequencing failures are seldom analyzed in detail and have not, in the past, been formally reported. Here we report the results of a failure mode analysis of our production sequencing facility based on detailed evaluation of 9,216 ESTs generated from two cDNA libraries. Two categories of failures are described; process-related failures (failures due to equipment or sample handling) and template-related failures (failures that are revealed by close inspection of electropherograms and are likely due to properties of the template DNA sequence itself). Preventative action based on a detailed understanding of failure modes is likely to improve the performance of other production sequencing pipelines.
Spotsizer: High-throughput quantitative analysis of microbial growth.
Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg
2016-10-01
Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
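A stripped-down version of the colony-size measurement idea (threshold, label connected components, count pixels) might look like the following; it is not Spotsizer's implementation, and the plate image is synthetic.

```python
import numpy as np
from scipy import ndimage

def colony_areas(image, threshold):
    """Segment colonies brighter than `threshold` and return their pixel areas."""
    mask = image > threshold
    labels, n = ndimage.label(mask)             # connected-component labelling
    areas = np.bincount(labels.ravel())[1:]     # drop the background label 0
    return areas

# Hypothetical plate image with two bright colonies on a dark background.
img = np.zeros((50, 50))
img[10:15, 10:15] = 1.0
img[30:38, 30:38] = 1.0
print(colony_areas(img, threshold=0.5))         # -> [25 64]
```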
Tien, Jerry F; Fong, Kimberly K; Umbreit, Neil T; Payen, Celia; Zelter, Alex; Asbury, Charles L; Dunham, Maitreya J; Davis, Trisha N
2013-09-01
During mitosis, kinetochores physically link chromosomes to the dynamic ends of spindle microtubules. This linkage depends on the Ndc80 complex, a conserved and essential microtubule-binding component of the kinetochore. As a member of the complex, the Ndc80 protein forms microtubule attachments through a calponin homology domain. Ndc80 is also required for recruiting other components to the kinetochore and responding to mitotic regulatory signals. While the calponin homology domain has been the focus of biochemical and structural characterization, the function of the remainder of Ndc80 is poorly understood. Here, we utilized a new approach that couples high-throughput sequencing to a saturating linker-scanning mutagenesis screen in Saccharomyces cerevisiae. We identified domains in previously uncharacterized regions of Ndc80 that are essential for its function in vivo. We show that a helical hairpin adjacent to the calponin homology domain influences microtubule binding by the complex. Furthermore, a mutation in this hairpin abolishes the ability of the Dam1 complex to strengthen microtubule attachments made by the Ndc80 complex. Finally, we defined a C-terminal segment of Ndc80 required for tetramerization of the Ndc80 complex in vivo. This unbiased mutagenesis approach can be generally applied to genes in S. cerevisiae to identify functional properties and domains.
Huang, Rui; Chen, Hui; Zhong, Chao; Kim, Jae Eung; Zhang, Yi-Heng Percival
2016-09-02
Coenzyme engineering that changes NAD(P) selectivity of redox enzymes is an important tool in metabolic engineering, synthetic biology, and biocatalysis. Here we developed a high throughput screening method to identify mutants of 6-phosphogluconate dehydrogenase (6PGDH) from a thermophilic bacterium Moorella thermoacetica with reversed coenzyme selectivity from NADP(+) to NAD(+). Colonies of a 6PGDH mutant library growing on the agar plates were treated by heat to minimize the background noise, that is, the deactivation of intracellular dehydrogenases, degradation of inherent NAD(P)H, and disruption of cell membrane. The melted agarose solution containing a redox dye tetranitroblue tetrazolium (TNBT), phenazine methosulfate (PMS), NAD(+), and 6-phosphogluconate was carefully poured on colonies, forming a second semi-solid layer. More active 6PGDH mutants were examined via an enzyme-linked TNBT-PMS colorimetric assay. Positive mutants were recovered by direct extraction of plasmid from dead cell colonies followed by plasmid transformation into E. coli TOP10. By utilizing this double-layer screening method, six positive mutants were obtained from two-round saturation mutagenesis. The best mutant 6PGDH A30D/R31I/T32I exhibited a 4,278-fold reversal of coenzyme selectivity from NADP(+) to NAD(+). This screening method could be widely used to detect numerous redox enzymes, particularly for thermophilic ones, which can generate NAD(P)H reacted with the redox dye TNBT.
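A fold reversal of coenzyme selectivity is typically computed from catalytic efficiencies; a small worked example with hypothetical (kcat/Km) values, not the paper's measurements, is shown below.

```python
def selectivity_reversal(kcat_km_nad_wt, kcat_km_nadp_wt,
                         kcat_km_nad_mut, kcat_km_nadp_mut):
    """Fold reversal of coenzyme selectivity, defined here as the ratio of
    (kcat/Km)_NAD / (kcat/Km)_NADP in the mutant over that in the wild type."""
    wt_ratio = kcat_km_nad_wt / kcat_km_nadp_wt
    mut_ratio = kcat_km_nad_mut / kcat_km_nadp_mut
    return mut_ratio / wt_ratio

# Hypothetical catalytic efficiencies (arbitrary units), not the study's data.
print(f"{selectivity_reversal(0.05, 10.0, 4.0, 0.4):.0f}-fold reversal")   # -> 2000-fold
```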
A High-Throughput, Precipitating Colorimetric Sandwich ELISA Microarray for Shiga Toxins
Gehring, Andrew; He, Xiaohua; Fratamico, Pina; Lee, Joseph; Bagi, Lori; Brewster, Jeffrey; Paoli, George; He, Yiping; Xie, Yanping; Skinner, Craig; Barnett, Charlie; Harris, Douglas
2014-01-01
Shiga toxins 1 and 2 (Stx1 and Stx2) from Shiga toxin-producing E. coli (STEC) bacteria were simultaneously detected with a newly developed, high-throughput antibody microarray platform. The proteinaceous toxins were immobilized and sandwiched between biorecognition elements (monoclonal antibodies) and pooled horseradish peroxidase (HRP)-conjugated monoclonal antibodies. Following the reaction of HRP with the precipitating chromogenic substrate (metal enhanced 3,3-diaminobenzidine tetrahydrochloride or DAB), the formation of a colored product was quantitatively measured with an inexpensive flatbed page scanner. The colorimetric ELISA microarray was demonstrated to detect Stx1 and Stx2 at levels as low as ~4.5 ng/mL within ~2 h of total assay time with a narrow linear dynamic range of ~1–2 orders of magnitude and saturation levels well above background. Stx1 and/or Stx2 produced by various strains of STEC were also detected following the treatment of cultured cells with mitomycin C (a toxin-inducing antibiotic) and/or B-PER (a cell-disrupting, protein extraction reagent). Semi-quantitative detection of Shiga toxins was demonstrated to be sporadic among various STEC strains following incubation with mitomycin C; however, further reaction with B-PER generally resulted in the detection of or increased detection of Stx1, relative to Stx2, produced by STECs inoculated into either axenic broth culture or culture broth containing ground beef. PMID:24921195
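Quantitation from such a colorimetric sandwich ELISA usually relies on a four-parameter logistic standard curve; the sketch below fits one with SciPy using hypothetical toxin concentrations and scanner signals (not the paper's data).

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic curve commonly used for sandwich-ELISA standards."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** (-hill))

# Hypothetical Stx standard curve: toxin (ng/mL) vs. scanner signal intensity.
conc = np.array([1, 3, 10, 30, 100, 300], dtype=float)
signal = np.array([0.06, 0.12, 0.35, 0.80, 1.25, 1.45])

popt, _ = curve_fit(four_pl, conc, signal, p0=[0.05, 1.5, 20.0, 1.0])
bottom, top, ec50, hill = popt
print(f"EC50 ~ {ec50:.1f} ng/mL, Hill slope ~ {hill:.2f}")
```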
Mobile element biology – new possibilities with high-throughput sequencing
Xing, Jinchuan; Witherspoon, David J.; Jorde, Lynn B.
2014-01-01
Mobile elements compose more than half of the human genome, but until recently their large-scale detection was time-consuming and challenging. With the development of new high-throughput sequencing technologies, the complete spectrum of mobile element variation in humans can now be identified and analyzed. Thousands of new mobile element insertions have been discovered, yielding new insights into mobile element biology, evolution, and genomic variation. We review several high-throughput methods, with an emphasis on techniques that specifically target mobile element insertions in humans, and we highlight recent applications of these methods in evolutionary studies and in the analysis of somatic alterations in human cancers. PMID:23312846
Advances in high throughput DNA sequence data compression.
Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz
2016-06-01
Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genome and reads compression. Algorithms are categorized as referential or reference free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.
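As a simple example of reference-free compression, the sketch below packs a DNA read into 2 bits per base (a 4x reduction before any entropy coding); it ignores ambiguous bases and quality scores, which real tools must handle.

```python
def pack_2bit(seq):
    """Reference-free 2-bit packing of an A/C/G/T string into a bytes object."""
    code = {"A": 0, "C": 1, "G": 2, "T": 3}
    value = 0
    for base in seq:
        value = (value << 2) | code[base]
    n_bytes = (2 * len(seq) + 7) // 8
    return value.to_bytes(n_bytes, "big")

read = "ACGTACGTACGT"
packed = pack_2bit(read)
print(len(read), "bases ->", len(packed), "bytes")   # 12 bases -> 3 bytes
```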
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we give the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one-million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
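A generic task-parallel pattern with mpi4py conveys the idea of distributing a ligand library across MPI ranks; dock() here is a hypothetical placeholder, not Autodock4's interface.

```python
# Minimal task-parallel sketch with mpi4py (run with: mpiexec -n 4 python dock_mpi.py).
from mpi4py import MPI

def dock(ligand):
    return f"{ligand}: scored"           # placeholder for an actual docking run

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

ligands = [f"compound_{i:06d}" for i in range(1000)] if rank == 0 else None
ligands = comm.bcast(ligands, root=0)

# Static round-robin partition of the library across MPI ranks.
my_results = [dock(lig) for lig in ligands[rank::size]]
all_results = comm.gather(my_results, root=0)

if rank == 0:
    print(sum(len(chunk) for chunk in all_results), "ligands docked")
```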
Fenaille, François; Visani, Piero; Fumeaux, René; Milo, Christian; Guy, Philippe A
2003-04-23
Two headspace techniques based on mass spectrometric (MS) detection, an electronic nose and solid-phase microextraction coupled to gas chromatography-mass spectrometry (SPME-GC/MS), were evaluated for their ability to differentiate various infant formula powders based on changes in their volatiles upon storage. The electronic nose gave unresolved MS fingerprints of the samples' gas phases that were further submitted to principal component analysis (PCA). Such direct MS recording combined with multivariate treatment enabled rapid differentiation of the infant formulas over a 4-week storage test. Although the advantages of the MS-based electronic nose are its ease of use and the meaningful data interpretation obtained at high throughput (100 samples per 24 h), its greatest disadvantage is that the compounds present could not be identified or quantified. For these reasons, an SPME-GC/MS measurement was also investigated. This technique allowed the identification of saturated aldehydes as the main volatiles present in the headspace of infant milk powders. An isotope dilution assay was further developed to quantitate hexanal as a potential indicator of infant milk powder oxidation. Thus, hexanal content was found to vary from roughly 500 to 3500 µg/kg for relatively non-oxidized and oxidized infant formulas, respectively.
Chen, Xun; Stout, Steven; Mueller, Uwe; Boykow, George; Visconti, Richard; Siliphaivanh, Phieng; Spencer, Kerrie; Presland, Jeremy; Kavana, Michael; Basso, Andrea D; McLaren, David G; Myers, Robert W
2017-08-01
We have developed and validated label-free, liquid chromatography-mass spectrometry (LC-MS)-based equilibrium direct and competition binding assays to quantitate small-molecule antagonist binding to recombinant human and mouse BLT1 receptors expressed in HEK 293 cell membranes. Procedurally, these binding assays involve (1) equilibration of the BLT1 receptor and probe ligand, with or without a competitor; (2) vacuum filtration through cationic glass fiber filters to separate receptor-bound from free probe ligand; and (3) LC-MS analysis in selected reaction monitoring mode for bound probe ligand quantitation. Two novel, optimized probe ligands, compounds 1 and 2, were identified by screening 20 unlabeled BLT1 antagonists for direct binding. Saturation direct binding studies confirmed the high affinity, and dissociation studies established the rapid binding kinetics of probe ligands 1 and 2. Competition binding assays were established using both probe ligands, and the affinities of structurally diverse BLT1 antagonists were measured. Both binding assay formats can be executed with high specificity and sensitivity and moderate throughput (96-well plate format) using these approaches. This highly versatile, label-free method for studying ligand binding to membrane-associated receptors should find broad application as an alternative to traditional methods using labeled ligands.
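Affinities from such assays are commonly obtained by fitting a one-site saturation model and converting competition IC50 values with the Cheng-Prusoff equation; the sketch below uses hypothetical LC-MS signals and concentrations, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def saturation(L, bmax, kd):
    """One-site specific binding: B = Bmax * [L] / (Kd + [L])."""
    return bmax * L / (kd + L)

# Hypothetical bound-probe signal (LC-MS peak area) vs. probe concentration (nM).
L = np.array([0.3, 1, 3, 10, 30, 100], dtype=float)
B = np.array([0.9, 2.6, 5.8, 9.4, 11.6, 12.4])

(bmax, kd), _ = curve_fit(saturation, L, B, p0=[13.0, 5.0])

# Cheng-Prusoff conversion of a competition IC50 to Ki at probe concentration [L].
ic50, probe = 40.0, 10.0                     # nM, hypothetical
ki = ic50 / (1.0 + probe / kd)
print(f"Kd ~ {kd:.1f} nM, Ki ~ {ki:.1f} nM")
```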
Wang, Xixian; Ren, Lihui; Su, Yetian; Ji, Yuetong; Liu, Yaoping; Li, Chunyu; Li, Xunrong; Zhang, Yi; Wang, Wei; Hu, Qiang; Han, Danxiang; Xu, Jian; Ma, Bo
2017-11-21
Raman-activated cell sorting (RACS) has attracted increasing interest, yet throughput remains one major factor limiting its broader application. Here we present an integrated Raman-activated droplet sorting (RADS) microfluidic system for functional screening of live cells in a label-free and high-throughput manner, employing the astaxanthin (AXT)-producing industrial microalga Haematococcus pluvialis (H. pluvialis) as a model. Raman microspectroscopy analysis of individual cells is carried out prior to their microdroplet encapsulation, which is then directly coupled to dielectrophoresis (DEP)-based droplet sorting. To validate the system, H. pluvialis cells containing different levels of AXT were mixed and underwent RADS. The AXT-hyperproducing cells were sorted with an accuracy of 98.3%, an enrichment ratio of eightfold, and a throughput of ∼260 cells/min. Of the RADS-sorted cells, 92.7% remained alive and able to proliferate, which is equivalent to the unsorted cells. Thus, RADS achieves a much higher throughput than existing RACS systems, preserves the vitality of cells, and facilitates seamless coupling with downstream manipulations such as single-cell sequencing and cultivation.
Analysis of Container Yard Capacity In North TPK Using ARIMA Method
NASA Astrophysics Data System (ADS)
Sirajuddin; Cut Gebrina Hisbach, M.; Ekawati, Ratna; Ade Irman, SM
2018-03-01
The North container terminal, known as North TPK, is a container terminal located in the Indonesia Port Corporation area that serves domestic container loading and unloading. It has 1,006 ground slots with a total capacity of 5,544 TEUs, and its maximum container throughput is 539,616 TEUs/year. Container throughput at North TPK is increasing year by year: in 2011-2012 it was 165,080 TEUs/year, and by 2015-2016 it had reached 213,147 TEUs/year. To avoid congestion and prevent possible future losses, this paper analyzes container flow and the yard occupancy ratio at North TPK in Tanjung Priok Port. The method used is the autoregressive integrated moving average (ARIMA) model, which forecasts from the history of the series itself without independent variables. The ARIMA results show that in 2016-2017 total container throughput reaches 234,006 TEUs/year with a field effectiveness of 43.4%, and in 2017-2018 it reaches 249,417 TEUs/year with a field effectiveness of 46.2%.
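A minimal ARIMA forecast of annual throughput, checked against the stated maximum capacity, could be set up as below with statsmodels; the throughput series is a placeholder, not the terminal's actual yearly data.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical annual container throughput (TEUs/year); placeholder values only.
throughput = np.array([150_000, 165_000, 178_000, 190_000, 201_000, 213_000], float)

fit = ARIMA(throughput, order=(1, 1, 1)).fit()   # ARIMA(p, d, q)
forecast = fit.forecast(steps=2)                 # next two years

capacity = 539_616                               # stated maximum yard throughput (TEUs/year)
for year_ahead, value in enumerate(forecast, start=1):
    print(f"+{year_ahead} yr: {value:,.0f} TEUs ({value / capacity:.1%} of capacity)")
```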
Accelerating the design of solar thermal fuel materials through high throughput simulations.
Liu, Yun; Grossman, Jeffrey C
2014-12-10
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
Jowhar, Ziad; Gudla, Prabhakar R; Shachar, Sigal; Wangsa, Darawalee; Russ, Jill L; Pegoraro, Gianluca; Ried, Thomas; Raznahan, Armin; Misteli, Tom
2018-06-01
The spatial organization of chromosomes in the nuclear space is an extensively studied field that relies on measurements of structural features and 3D positions of chromosomes with high precision and robustness. However, no tools are currently available to image and analyze chromosome territories in a high-throughput format. Here, we have developed High-throughput Chromosome Territory Mapping (HiCTMap), a method for the robust and rapid analysis of 2D and 3D chromosome territory positioning in mammalian cells. HiCTMap is a high-throughput imaging-based chromosome detection method which enables routine analysis of chromosome structure and nuclear position. Using an optimized FISH staining protocol in a 384-well plate format in conjunction with a bespoke automated image analysis workflow, HiCTMap faithfully detects chromosome territories and their position in 2D and 3D in a large population of cells per experimental condition. We apply this novel technique to visualize chromosomes 18, X, and Y in male and female primary human skin fibroblasts, and show accurate detection of the correct number of chromosomes in the respective genotypes. Given the ability to visualize and quantitatively analyze large numbers of nuclei, we use HiCTMap to measure chromosome territory area and volume with high precision and determine the radial position of chromosome territories using either centroid or equidistant-shell analysis. The HiCTMap protocol is also compatible with RNA FISH as demonstrated by simultaneous labeling of X chromosomes and Xist RNA in female cells. We suggest HiCTMap will be a useful tool for routine precision mapping of chromosome territories in a wide range of cell types and tissues. Published by Elsevier Inc.
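Radial positioning of a territory can be summarized as the normalized distance of its centroid from the nuclear centre; the sketch below is a 2D toy example with synthetic masks, not the HiCTMap workflow.

```python
import numpy as np

def radial_position(territory_xy, nucleus_mask):
    """Normalized radial position of a chromosome-territory centroid:
    0 = nuclear centre, 1 = nuclear periphery."""
    ny, nx = np.nonzero(nucleus_mask)
    centre = np.array([ny.mean(), nx.mean()])
    nuclear_radius = np.max(np.hypot(ny - centre[0], nx - centre[1]))
    territory_centroid = np.mean(territory_xy, axis=0)
    return np.hypot(*(territory_centroid - centre)) / nuclear_radius

# Hypothetical segmentation: a circular nucleus and a small territory off-centre.
yy, xx = np.mgrid[0:100, 0:100]
nucleus = (yy - 50) ** 2 + (xx - 50) ** 2 <= 40 ** 2
territory = np.argwhere((yy - 70) ** 2 + (xx - 50) ** 2 <= 5 ** 2)
print(round(radial_position(territory, nucleus), 2))   # ~0.5 of the nuclear radius
```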
Mallik, Rangan; Yoo, Michelle J.; Briscoe, Chad J.; Hage, David S.
2010-01-01
Human serum albumin (HSA) was explored for use as a stationary phase and ligand in affinity microcolumns for the ultrafast extraction of free drug fractions and the use of this information for the analysis of drug-protein binding. Warfarin, imipramine, and ibuprofen were used as model analytes in this study. It was found that greater than 95% extraction of all these drugs could be achieved in as little as 250 ms on HSA microcolumns. The retained drug fraction was then eluted from the same column under isocratic conditions, giving elution in less than 40 s when working at 4.5 mL/min. The chromatographic behavior of this system gave a good fit with that predicted by computer simulations based on a reversible, saturable model for the binding of an injected drug with immobilized HSA. The free fractions measured by this method were found to be comparable to those determined by ultrafiltration, and equilibrium constants estimated by this approach gave good agreement with literature values. Advantages of this method include its speed and the relatively low cost of microcolumns that contain HSA. The ability of HSA to bind many types of drugs also creates the possibility of using the same affinity microcolumn to study and measure the free fractions for a variety of pharmaceutical agents. These properties make this technique appealing for use in drug binding studies and in the high-throughput screening of new drug candidates. PMID:20227701
Optimizing transformations for automated, high throughput analysis of flow cytometry data.
Finak, Greg; Perez, Juan-Manuel; Weng, Andrew; Gottardo, Raphael
2010-11-04
In a high throughput setting, effective flow cytometry data analysis depends heavily on proper data preprocessing. While usual preprocessing steps of quality assessment, outlier removal, normalization, and gating have received considerable scrutiny from the community, the influence of data transformation on the output of high throughput analysis has been largely overlooked. Flow cytometry measurements can vary over several orders of magnitude, cell populations can have variances that depend on their mean fluorescence intensities, and may exhibit heavily-skewed distributions. Consequently, the choice of data transformation can influence the output of automated gating. An appropriate data transformation aids in data visualization and gating of cell populations across the range of data. Experience shows that the choice of transformation is data specific. Our goal here is to compare the performance of different transformations applied to flow cytometry data in the context of automated gating in a high throughput, fully automated setting. We examine the most common transformations used in flow cytometry, including the generalized hyperbolic arcsine, biexponential, linlog, and generalized Box-Cox, all within the BioConductor flowCore framework that is widely used in high throughput, automated flow cytometry data analysis. All of these transformations have adjustable parameters whose effects upon the data are non-intuitive for most users. By making some modelling assumptions about the transformed data, we develop maximum likelihood criteria to optimize parameter choice for these different transformations. We compare the performance of parameter-optimized and default-parameter (in flowCore) data transformations on real and simulated data by measuring the variation in the locations of cell populations across samples, discovered via automated gating in both the scatter and fluorescence channels. We find that parameter-optimized transformations improve visualization, reduce variability in the location of discovered cell populations across samples, and decrease the misclassification (mis-gating) of individual events when compared to default-parameter counterparts. Our results indicate that the preferred transformation for fluorescence channels is a parameter- optimized biexponential or generalized Box-Cox, in accordance with current best practices. Interestingly, for populations in the scatter channels, we find that the optimized hyperbolic arcsine may be a better choice in a high-throughput setting than current standard practice of no transformation. However, generally speaking, the choice of transformation remains data-dependent. We have implemented our algorithm in the BioConductor package, flowTrans, which is publicly available.
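The effect of a transformation parameter is easy to see with the hyperbolic arcsine, whose cofactor controls how wide the linear region around zero is before log-like compression; the events and cofactors below are hypothetical.

```python
import numpy as np

def arcsinh_transform(x, cofactor=150.0):
    """Hyperbolic-arcsine transform used for cytometry data; the cofactor sets
    the width of the linear region around zero before log-like compression."""
    return np.arcsinh(x / cofactor)

# Hypothetical fluorescence values spanning several orders of magnitude,
# including slightly negative events from compensation.
events = np.array([-50.0, 0.0, 100.0, 1_000.0, 10_000.0, 100_000.0])
for cofactor in (5.0, 150.0, 1_000.0):
    print(cofactor, np.round(arcsinh_transform(events, cofactor), 2))
```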
Chipster: user-friendly analysis software for microarray and other high-throughput data.
Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I
2011-10-14
The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is a user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.
SmartGrain: high-throughput phenotyping software for measuring seed shape through image analysis.
Tanabata, Takanari; Shibaya, Taeko; Hori, Kiyosumi; Ebana, Kaworu; Yano, Masahiro
2012-12-01
Seed shape and size are among the most important agronomic traits because they affect yield and market price. To obtain accurate seed size data, a large number of measurements are needed because there is little difference in size among seeds from one plant. To promote genetic analysis and selection for seed shape in plant breeding, efficient, reliable, high-throughput seed phenotyping methods are required. We developed SmartGrain software for high-throughput measurement of seed shape. This software uses a new image analysis method to reduce the time taken in the preparation of seeds and in image capture. Outlines of seeds are automatically recognized from digital images, and several shape parameters, such as seed length, width, area, and perimeter length, are calculated. To validate the software, we performed a quantitative trait locus (QTL) analysis for rice (Oryza sativa) seed shape using backcrossed inbred lines derived from a cross between japonica cultivars Koshihikari and Nipponbare, which showed small differences in seed shape. SmartGrain removed areas of awns and pedicels automatically, and several QTLs were detected for six shape parameters. The allelic effect of a QTL for seed length detected on chromosome 11 was confirmed in advanced backcross progeny; the cv Nipponbare allele increased seed length and, thus, seed weight. High-throughput measurement with SmartGrain reduced sampling error and made it possible to distinguish between lines with small differences in seed shape. SmartGrain could accurately recognize seed not only of rice but also of several other species, including Arabidopsis (Arabidopsis thaliana). The software is free to researchers.
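Shape parameters like those SmartGrain reports (area, length, width) can be approximated from a binary seed mask via a principal-axis decomposition of the pixel coordinates; the sketch below uses a synthetic elliptical mask and is not the software's algorithm.

```python
import numpy as np

def seed_shape(mask):
    """Basic seed-shape parameters from a boolean seed mask: area, and length/width
    taken as the major/minor spread of the pixel cloud (PCA of coordinates)."""
    coords = np.argwhere(mask).astype(float)
    area = len(coords)
    centred = coords - coords.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centred, rowvar=False))
    width, length = 4 * np.sqrt(eigvals)           # ~full extent along each axis
    return {"area_px": area, "length_px": length, "width_px": width}

# Hypothetical elongated seed mask (an ellipse-like blob).
yy, xx = np.mgrid[0:80, 0:80]
mask = ((yy - 40) / 30.0) ** 2 + ((xx - 40) / 12.0) ** 2 <= 1.0
print({k: round(v, 1) for k, v in seed_shape(mask).items()})
```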
Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment
Yan, Qimin; Yu, Jie; Suram, Santosh K.; ...
2017-03-06
The limited number of known low-band-gap photoelectrocatalytic materials poses a significant challenge for the generation of chemical fuels from sunlight. Here, using high-throughput ab initio theory with experiments in an integrated workflow, we find eight ternary vanadate oxide photoanodes in the target band-gap range (1.2-2.8 eV). Detailed analysis of these vanadate compounds reveals the key role of VO4 structural motifs and electronic band-edge character in efficient photoanodes, initiating a genome for such materials and paving the way for a broadly applicable high-throughput-discovery and materials-by-design feedback loop. Considerably expanding the number of known photoelectrocatalysts for water oxidation, our study establishes ternary metal vanadates as a prolific class of photoanode materials for generation of chemical fuels from sunlight and demonstrates our high-throughput theory-experiment pipeline as a prolific approach to materials discovery.
Development and Validation of an Automated High-Throughput System for Zebrafish In Vivo Screenings
Virto, Juan M.; Holgado, Olaia; Diez, Maria; Izpisua Belmonte, Juan Carlos; Callol-Massot, Carles
2012-01-01
The zebrafish is a vertebrate model compatible with the paradigms of drug discovery. The small size and transparency of zebrafish embryos make them amenable to the automation necessary in high-throughput screenings. We have developed an automated high-throughput platform for in vivo chemical screenings on zebrafish embryos that includes automated methods for embryo dispensation, compound delivery, incubation, imaging, and analysis of the results. At present, two different assays to detect cardiotoxic compounds and angiogenesis inhibitors can be automatically run in the platform, showing the versatility of the system. A validation of these two assays with known positive and negative compounds, as well as a screening for the detection of unknown anti-angiogenic compounds, has been successfully carried out in the system developed. We present a fully automated platform that allows for high-throughput screenings in a vertebrate organism. PMID:22615792
Evaluation of a High Throughput Starch Analysis Optimised for Wood
Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco
2014-01-01
Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples, which have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards of known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%) and suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 21 µg and 40 mg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits, and rhizomes. PMID:24523863
Throughput assurance of wireless body area networks coexistence based on stochastic geometry
Wang, Yinglong; Shu, Minglei; Wu, Shangbin
2017-01-01
Wireless body area networks (WBANs) are expected to influence the traditional medical model by assisting caretakers with health telemonitoring. Within WBANs, the transmit power of the nodes should be as small as possible owing to their limited energy capacity but should be sufficiently large to guarantee the quality of the signal at the receiving nodes. When multiple WBANs coexist in a small area, the communication reliability and overall throughput can be seriously affected due to resource competition and interference. We show that the total network throughput largely depends on the WBANs distribution density (λp), transmit power of their nodes (Pt), and their carrier-sensing threshold (γ). Using stochastic geometry, a joint carrier-sensing threshold and power control strategy is proposed to meet the demand of coexisting WBANs based on the IEEE 802.15.4 standard. Given different network distributions and carrier-sensing thresholds, the proposed strategy derives a minimum transmit power according to varying surrounding environment. We obtain expressions for transmission success probability and throughput adopting this strategy. Using numerical examples, we show that joint carrier-sensing thresholds and transmit power strategy can effectively improve the overall system throughput and reduce interference. Additionally, this paper studies the effects of a guard zone on the throughput using a Matern hard-core point process (HCPP) type II model. Theoretical analysis and simulation results show that the HCPP model can increase the success probability and throughput of networks. PMID:28141841
Mehand, Massinissa Si; Srinivasan, Bala; De Crescenzo, Gregory
2015-01-01
Surface plasmon resonance-based biosensors have been successfully applied to the study of the interactions between macromolecules and small molecular weight compounds. In an effort to increase the throughput of these SPR-based experiments, we have already proposed to inject multiple compounds simultaneously over the same surface. When specifically applied to small molecular weight compounds, such a strategy would however require prior knowledge of the refractive index increment of each compound in order to correctly interpret the recorded signal. An additional experiment is typically required to obtain this information. In this manuscript, we show that through the introduction of an additional global parameter corresponding to the ratio of the saturating signals associated with each molecule, the kinetic parameters could be identified with similar confidence intervals without any other experimentation. PMID:26515024
Rapid 2,2'-bicinchoninic-based xylanase assay compatible with high throughput screening
William R. Kenealy; Thomas W. Jeffries
2003-01-01
High-throughput screening requires simple assays that give reliable quantitative results. A microplate assay was developed for reducing sugar analysis that uses a 2,2'-bicinchoninic-based protein reagent. Endo-1,4-β-D-xylanase activity against oat spelt xylan was detected at activities of 0.002 to 0.011 IU ml⁻¹. The assay is linear for sugar...
How Do Deep Saline Aquifer Microbial Communities Respond to Supercritical CO2 Injection?
NASA Astrophysics Data System (ADS)
Mu, A.; Billman-Jacobe, H.; Boreham, C.; Schacht, U.; Moreau, J. W.
2011-12-01
Carbon Capture and Storage (CCS) is currently seen as a viable strategy for mitigating anthropogenic carbon dioxide pollution. The Cooperative Research Centre for Greenhouse Gas Technologies (CO2CRC) is currently conducting a field experiment in the Otway Basin (Australia) studying residual gas saturation in the water-saturated reservoir of the Paaratte Formation. As part of this study, a suite of pre-CO2-injection water samples was collected from approximately 1400 meters depth (60°C, 13.8 MPa) via an in situ sampling system. The in situ sampling system isolates aquifer water from sources of contamination while maintaining the formation pressure. Whole community DNA was extracted from these samples to investigate the prokaryotic biodiversity of the saline Paaratte aquifer (EC = 1509.6 µS/cm). Bioinformatic analysis of preliminary 16S ribosomal gene data revealed Thermincola, Acinetobacter, Sphingobium, and Dechloromonas amongst the closest related genera to environmental clone sequences obtained from a subset of pre-CO2-injection groundwater samples. Epifluorescent microscopy with 4',6-diamidino-2-phenylindole (DAPI) highlighted an abundance of filamentous cells ranging from 5 to 45 μm in length. Efforts are currently directed towards utilising a high-throughput sequencing approach to capture an exhaustive profile of the microbial diversity of the Paaratte aquifer CO2 injection site, and to better understand the response of in situ microbial populations to the injection of large volumes (e.g. many kilotonnes) of supercritical CO2 (sc-CO2). Sequencing results will be used to direct cultivation efforts towards enrichment of a CO2-tolerant microorganism. Understanding the microbial response to sc-CO2 is an integral aspect of carbon dioxide storage, for which very little information exists in the literature. This study aims to elucidate molecular mechanisms, through genomic and cultivation-based methods, for CO2 tolerance with the prospect of engineering biofilms to enhance trapping of CO2 in saline aquifers.
Hydrogel Droplet Microfluidics for High-Throughput Single Molecule/Cell Analysis.
Zhu, Zhi; Yang, Chaoyong James
2017-01-17
Heterogeneity among individual molecules and cells poses significant challenges to traditional bulk assays, because the assumption of average behavior obscures important biological information about heterogeneity and can result in misleading interpretations. Single molecule/cell analysis has therefore become an important and emerging field in biological and biomedical research, providing insights into heterogeneity within large populations at high resolution. Compared with ensemble bulk methods, single molecule/cell analysis explores information on time trajectories, conformational states, and interactions of individual molecules/cells, all key factors in the study of chemical and biological reaction pathways. Various powerful techniques have been developed for single molecule/cell analysis, including flow cytometry, atomic force microscopy, optical and magnetic tweezers, and single-molecule fluorescence spectroscopy. However, some of these techniques are limited to low throughput because they analyze single molecules/cells one at a time. Flow cytometry is a widely used high-throughput technique for single cell analysis but lacks the ability to study intercellular interactions and to control the local environment. Droplet microfluidics has become attractive for single molecule/cell manipulation because single molecules/cells can be individually encased in monodisperse microdroplets, allowing high-throughput analysis and manipulation with precise control of the local environment. Moreover, hydrogels, cross-linked polymer networks that swell in the presence of water, have been introduced into droplet microfluidic systems as hydrogel droplet microfluidics. By replacing the aqueous phase with a monomer or polymer solution, hydrogel droplets can be generated on microfluidic chips for encapsulation of single molecules/cells according to the Poisson distribution. The sol-gel transition property endows the hydrogel droplets with new functionalities and diversified applications in single molecule/cell analysis. The hydrogel can act as a 3D cell culture matrix that mimics the extracellular environment for long-term single cell culture, which allows further heterogeneity studies of proliferation, drug screening, and metastasis at the single-cell level. The sol-gel transition allows reactions in solution to be performed rapidly and efficiently, with product storage in the gel for flexible downstream manipulation and analysis. More importantly, controllable sol-gel regulation provides a new way to maintain phenotype-genotype linkages in the hydrogel matrix for high-throughput molecular evolution. In this Account, we review hydrogel droplet generation on microfluidics, single molecule/cell encapsulation in hydrogel droplets, and the progress made by our group and others in applying hydrogel droplet microfluidics to single molecule/cell analysis, including single cell culture, single molecule/cell detection, single cell sequencing, and molecular evolution.
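The Poisson loading statistics mentioned above can be illustrated with a few lines of Python; the loading density of 0.1 cells per droplet is an assumed example value, not a figure from this work.

import math

def droplet_occupancy(lam):
    """Fractions of empty, single-cell and multi-cell droplets for mean loading lam."""
    p_empty = math.exp(-lam)
    p_single = lam * math.exp(-lam)      # the usually desired single-occupancy case
    p_multi = 1.0 - p_empty - p_single   # two or more cells per droplet
    return p_empty, p_single, p_multi

p0, p1, pm = droplet_occupancy(0.1)
print(f"empty: {p0:.3f}, single: {p1:.3f}, multiple: {pm:.4f}")

At such dilute loading most droplets are empty and co-encapsulation is rare, which is the usual trade-off accepted to ensure that occupied droplets contain single cells.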
Automation of fluorescent differential display with digital readout.
Meade, Jonathan D; Cho, Yong-Jig; Fisher, Jeffrey S; Walden, Jamie C; Guo, Zhen; Liang, Peng
2006-01-01
Since its invention in 1992, differential display (DD) has become the most commonly used technique for identifying differentially expressed genes because of its many advantages over competing technologies such as DNA microarrays, serial analysis of gene expression (SAGE), and subtractive hybridization. Despite the great impact of the method on biomedical research, there has been a lack of automation of DD technology to increase its throughput and accuracy for systematic gene expression analysis. Most previous DD work has taken a "shot-gun" approach of identifying one gene at a time, with a limited number of polymerase chain reaction (PCR) reactions set up manually, giving DD a low-tech and low-throughput image. We have optimized the DD process with a new platform that incorporates fluorescent digital readout, automated liquid handling, and large-format gels capable of running entire 96-well plates. The resulting streamlined fluorescent DD (FDD) technology offers unprecedented accuracy, sensitivity, and throughput in comprehensive and quantitative analysis of gene expression. These major improvements will allow researchers to find differentially expressed genes of interest, both known and novel, quickly and easily.
Baty, Florent; Klingbiel, Dirk; Zappa, Francesco; Brutsche, Martin
2015-12-01
Alternative splicing is an important component of tumorigenesis. The recent advent of exon array technology enables the detection of alternative splicing at a genome-wide scale. The analysis of high-throughput alternative splicing data is not yet standard, and methodological developments are still needed. We propose a novel statistical approach, Dually Constrained Correspondence Analysis (DCCA), for the detection of splicing changes in exon array data. Using this methodology, we investigated the genome-wide alteration of alternative splicing in patients with non-small cell lung cancer treated with bevacizumab/erlotinib. Splicing candidates reveal a series of genes related to carcinogenesis (SFTPB), cell adhesion (STAB2, PCDH15, HABP2), tumor aggressiveness (ARNTL2), apoptosis, proliferation and differentiation (PDE4D, FLT3, IL1R2), cell invasion (ETV1), as well as tumor growth (OLFM4, FGF14), tumor necrosis (AFF3) or tumor suppression (TUSC3, CSMD1, RHOBTB2, SERPINB5), with indication of known alternative splicing in a majority of genes. DCCA facilitates the identification of putative biologically relevant alternative splicing events in high-throughput exon array data. Copyright © 2015 Elsevier Inc. All rights reserved.
Gas occurrence property in shales of Tuha basin northwest china
NASA Astrophysics Data System (ADS)
Chen, Jinlong; Huang, Zhilong
2017-04-01
Pores in rock under formation conditions must be filled by gas, oil, or water, so the combined volume of water, oil, and gas equals the pore volume in a shale gas reservoir. Gas occurs as free gas, solution gas, and adsorbed gas. Total gas content is obtained by field analysis using an improved lost-gas recovery method. Free gas content is obtained from the gas-filled fraction of the pores, calculated by subtracting the water and oil saturations from the measured pore volume, and converted to gas content at standard conditions using the equation of state; water saturation is obtained from core water content, and oil saturation from extracted hydrocarbons. Solution gas content at standard conditions requires the gas solubility in oil and water. For adsorbed gas, we introduce the Adsorbed Gas Saturation ɛ, derived from published isothermal adsorption volumes versus field-analysis gas contents in many basins, so that adsorbed gas content is obtained from the isothermal adsorption data and ɛ. All of these quantities are related to logging values through regression equations. The gas content from field analysis is 0.92-1.53 m3/t, the evaluated average gas content is 1.33 m3/t, free gas accounts for about 47%, adsorbed gas for 49%, and solution gas for an average of 4%.
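A minimal Python sketch of this bookkeeping is given below. The Langmuir form assumed for the adsorption isotherm, the ideal-gas conversion to standard conditions, and every parameter value are illustrative assumptions, not the regression relations or data used in the study.

def shale_gas_content(porosity, sw, so, rho_rock_t_per_m3, p_mpa, t_k,
                      vl=2.0, pl=3.0, epsilon=0.8, rs_oil=0.05, rs_water=0.002):
    """Return (free, adsorbed, solution) gas content in m3 per tonne of rock."""
    pore_vol_per_t = porosity / rho_rock_t_per_m3          # m3 of pores per tonne
    sg = 1.0 - sw - so                                      # gas-filled pore fraction
    to_std = (p_mpa / 0.101325) * (288.15 / t_k)            # ideal-gas conversion, no Z-factor
    free = pore_vol_per_t * sg * to_std
    adsorbed = epsilon * vl * p_mpa / (pl + p_mpa)          # Langmuir isotherm scaled by epsilon
    solution = pore_vol_per_t * (so * rs_oil + sw * rs_water) * to_std
    return free, adsorbed, solution

free, ads, sol = shale_gas_content(porosity=0.04, sw=0.45, so=0.05,
                                   rho_rock_t_per_m3=2.55, p_mpa=20.0, t_k=345.0)
total = free + ads + sol
print(f"total {total:.2f} m3/t: free {free/total:.0%}, adsorbed {ads/total:.0%}, solution {sol/total:.0%}")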
Nobrega, R Paul; Brown, Michael; Williams, Cody; Sumner, Chris; Estep, Patricia; Caffry, Isabelle; Yu, Yao; Lynaugh, Heather; Burnina, Irina; Lilov, Asparouh; Desroches, Jordan; Bukowski, John; Sun, Tingwan; Belk, Jonathan P; Johnson, Kirt; Xu, Yingda
2017-10-01
The state-of-the-art industrial drug discovery approach is the empirical interrogation of a library of drug candidates against a target molecule. The advantage of high-throughput kinetic measurements over equilibrium assessments is the ability to measure each of the kinetic components of binding affinity. Although high-throughput capabilities have improved with advances in instrument hardware, three bottlenecks in data processing remain: (1) intrinsic molecular properties that lead to poor biophysical quality in vitro are not accounted for in commercially available analysis models, (2) processing data through a user interface is time-consuming and not amenable to parallelized data collection, and (3) a commercial solution that includes historical kinetic data in the analysis of kinetic competition data does not exist. Herein, we describe a generally applicable method for the automated analysis, storage, and retrieval of kinetic binding data. This analysis can deconvolve poor quality data on-the-fly and store and organize historical data in a queryable format for use in future analyses. Such database-centric strategies afford greater insight into the molecular mechanisms of kinetic competition, allowing for the rapid identification of allosteric effectors and the presentation of kinetic competition data in absolute terms of percent bound to antigen on the biosensor.
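As a concrete and deliberately simplified illustration of the per-sensorgram step that such an automated pipeline batches over many sensors, the Python sketch below fits a 1:1 binding model to a simulated association/dissociation trace and reports kon, koff and KD. The model form, analyte concentration and all numerical values are assumptions for illustration, not part of the described system.

import numpy as np
from scipy.optimize import curve_fit

def one_to_one_response(t, kon, koff, rmax, conc=100e-9, t_diss=120.0):
    """Response of a 1:1 interaction: association for t < t_diss, dissociation after."""
    kobs = kon * conc + koff
    r_assoc = rmax * kon * conc / kobs * (1.0 - np.exp(-kobs * np.minimum(t, t_diss)))
    r_end = rmax * kon * conc / kobs * (1.0 - np.exp(-kobs * t_diss))
    r_diss = r_end * np.exp(-koff * np.maximum(t - t_diss, 0.0))
    return np.where(t < t_diss, r_assoc, r_diss)

# Simulated noisy sensorgram standing in for instrument data.
t = np.linspace(0, 300, 601)
rng = np.random.default_rng(0)
data = one_to_one_response(t, kon=1e5, koff=1e-3, rmax=120.0) + rng.normal(0, 1.0, t.size)

popt, _ = curve_fit(one_to_one_response, t, data, p0=[5e4, 5e-3, 100.0])
kon_fit, koff_fit, rmax_fit = popt
print(f"kon={kon_fit:.2e} 1/(M*s), koff={koff_fit:.2e} 1/s, KD={koff_fit/kon_fit:.2e} M")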
Nagasaki, Hideki; Mochizuki, Takako; Kodama, Yuichi; Saruhashi, Satoshi; Morizaki, Shota; Sugawara, Hideaki; Ohyanagi, Hajime; Kurata, Nori; Okubo, Kousaku; Takagi, Toshihisa; Kaminuma, Eli; Nakamura, Yasukazu
2013-01-01
High-performance next-generation sequencing (NGS) technologies are advancing genomics and molecular biological research. However, the immense amount of sequence data requires computational skills and suitable hardware resources that are a challenge to molecular biologists. The DNA Data Bank of Japan (DDBJ) of the National Institute of Genetics (NIG) has initiated a cloud computing-based analytical pipeline, the DDBJ Read Annotation Pipeline (DDBJ Pipeline), for a high-throughput annotation of NGS reads. The DDBJ Pipeline offers a user-friendly graphical web interface and processes massive NGS datasets using decentralized processing by NIG supercomputers currently free of charge. The proposed pipeline consists of two analysis components: basic analysis for reference genome mapping and de novo assembly and subsequent high-level analysis of structural and functional annotations. Users may smoothly switch between the two components in the pipeline, facilitating web-based operations on a supercomputer for high-throughput data analysis. Moreover, public NGS reads of the DDBJ Sequence Read Archive located on the same supercomputer can be imported into the pipeline through the input of only an accession number. This proposed pipeline will facilitate research by utilizing unified analytical workflows applied to the NGS data. The DDBJ Pipeline is accessible at http://p.ddbj.nig.ac.jp/. PMID:23657089
Scafaro, Andrew P; Negrini, A Clarissa A; O'Leary, Brendan; Rashid, F Azzahra Ahmad; Hayes, Lucy; Fan, Yuzhen; Zhang, You; Chochois, Vincent; Badger, Murray R; Millar, A Harvey; Atkin, Owen K
2017-01-01
Mitochondrial respiration in the dark (Rdark) is a critical plant physiological process, and hence a reliable, efficient and high-throughput method of measuring variation in rates of Rdark is essential for agronomic and ecological studies. However, the methods currently used to measure Rdark in plant tissues are typically low throughput. We assessed a high-throughput automated fluorophore system for detecting multiple O2 consumption rates. The fluorophore technique was compared with O2 electrodes, infrared gas analysers (IRGA), and membrane inlet mass spectrometry to determine the accuracy and speed of detecting respiratory fluxes. The high-throughput fluorophore system provided stable measurements of Rdark in detached leaf and root tissues over many hours. High-throughput potential was evident in that the fluorophore system was 10- to 26-fold faster per sample measurement than other conventional methods. The versatility of the technique was evident in its enabling: (1) rapid screening of Rdark in 138 genotypes of wheat; and (2) quantification of rarely assessed whole-plant Rdark through dissection and simultaneous measurements of above- and below-ground organs. Variation in absolute Rdark was observed between techniques, likely due to variation in sample conditions (i.e. liquid vs. gas phase, open vs. closed systems), indicating that comparisons between studies using different measuring apparatus may not be feasible. However, the high-throughput protocol we present provided similar values of Rdark to the most commonly used IRGA instrument currently employed by plant scientists. Together with the greater than tenfold increase in sample processing speed, we conclude that the high-throughput protocol enables reliable, stable and reproducible measurements of Rdark on multiple samples simultaneously, irrespective of plant or tissue type.
Tschiersch, Henning; Junker, Astrid; Meyer, Rhonda C; Altmann, Thomas
2017-01-01
Automated plant phenotyping has been established as a powerful new tool in studying plant growth, development and response to various types of biotic or abiotic stressors. The respective facilities mainly apply non-invasive imaging-based methods, which enable the continuous quantification of the dynamics of plant growth and physiology during developmental progression. However, especially for plants of larger size, integrative, automated and high-throughput measurements of complex physiological parameters, such as photosystem II efficiency determined through kinetic chlorophyll fluorescence analysis, remain a challenge. We present the technical installations and the establishment of experimental procedures that allow the integrated high-throughput imaging of all commonly determined PSII parameters for small and large plants using kinetic chlorophyll fluorescence imaging systems (FluorCam, PSI) integrated into automated phenotyping facilities (Scanalyzer, LemnaTec). Besides determination of the maximum PSII efficiency, we focused on the implementation of high-throughput-amenable protocols recording the PSII operating efficiency (ΦPSII). Using the presented setup, this parameter is shown to be reproducibly measured in differently sized plants despite the corresponding variation in distance between plants and light source that caused small differences in incident light intensity. Values of ΦPSII obtained with the automated chlorophyll fluorescence imaging setup correlated very well with conventionally determined data using a spot-measuring chlorophyll fluorometer. The established high-throughput operating protocols enable the screening of up to 1080 small and 184 large plants per hour, respectively. The application of the implemented high-throughput protocols is demonstrated in screening experiments performed with large Arabidopsis and maize populations assessing natural variation in PSII efficiency. The incorporation of imaging systems suitable for kinetic chlorophyll fluorescence analysis leads to a substantial extension of the feature spectrum to be assessed in the presented high-throughput automated plant phenotyping platforms, thus enabling the simultaneous assessment of plant architectural and biomass-related traits and their relations to physiological features such as PSII operating efficiency. The implemented high-throughput protocols are applicable to a broad spectrum of model and crop plants of different sizes (up to 1.80 m height) and architectures. A deeper understanding of the relation between plant architecture, biomass formation and photosynthetic efficiency has great potential with respect to crop and yield improvement strategies.
A noninvasive, direct real-time PCR method for sex determination in multiple avian species
Brubaker, Jessica L.; Karouna-Renier, Natalie K.; Chen, Yu; Jenko, Kathryn; Sprague, Daniel T.; Henry, Paula F.P.
2011-01-01
Polymerase chain reaction (PCR)-based methods to determine the sex of birds are well established and have seen few modifications since they were first introduced in the 1990s. Although these methods allowed for sex determination in species that were previously difficult to analyse, they were not conducive to high-throughput analysis because of the laboriousness of DNA extraction and gel electrophoresis. We developed a high-throughput real-time PCR-based method for analysis of sex in birds, which uses noninvasive sample collection and avoids DNA extraction and gel electrophoresis.
Creation of a small high-throughput screening facility.
Flak, Tod
2009-01-01
The creation of a high-throughput screening facility within an organization is a difficult task, requiring a substantial investment of time, money, and organizational effort. Major issues to consider include the selection of equipment, the establishment of data analysis methodologies, and the formation of a group having the necessary competencies. If done properly, it is possible to build a screening system in incremental steps, adding new pieces of equipment and data analysis modules as the need grows. Based upon our experience with the creation of a small screening service, we present some guidelines to consider in planning a screening facility.
Shankar, Manoharan; Priyadharshini, Ramachandran; Gunasekaran, Paramasamy
2009-08-01
An image analysis-based method for high-throughput screening of an alpha-amylase mutant library using chromogenic assays was developed. Assays were performed in microplates, and high-resolution images of the assay plates were read using the Virtual Microplate Reader (VMR) script to quantify the concentration of the chromogen. This method is fast and sensitive in quantifying 0.025-0.3 mg starch/ml as well as 0.05-0.75 mg glucose/ml. It was also an effective screening method for improved alpha-amylase activity, with a coefficient of variation of 18%.
Image Harvest: an open-source platform for high-throughput plant image processing and analysis.
Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal
2016-05-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang
2014-03-05
RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity to perform parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.
Fu, Wei; Zhu, Pengyu; Wei, Shuang; Zhixin, Du; Wang, Chenguang; Wu, Xiyang; Li, Feiwu; Zhu, Shuifang
2017-04-01
Among all of the high-throughput detection methods, PCR-based methodologies are regarded as the most cost-efficient and feasible compared with next-generation sequencing or ChIP-based methods. However, PCR-based methods can only achieve multiplex detection up to 15-plex due to limitations imposed by multiplex primer interactions. The detection throughput therefore cannot meet the demands of high-throughput detection, such as SNP or gene expression analysis. In our study, we have developed a new high-throughput PCR-based detection method, multiplex enrichment quantitative PCR (ME-qPCR), which is a combination of qPCR and nested PCR. The GMO content detection results in our study showed that ME-qPCR could achieve high-throughput detection up to 26-plex. Compared to the original qPCR, the Ct values of ME-qPCR were lower for the same group, showing that the sensitivity of ME-qPCR is higher than that of the original qPCR. The absolute limit of detection for ME-qPCR could reach levels as low as a single copy of the plant genome. Moreover, the specificity results showed that no cross-amplification occurred for irrelevant GMO events. After evaluation of all of the parameters, a practical evaluation was performed with different foods. The more stable amplification results, compared to qPCR, showed that ME-qPCR was suitable for GMO detection in foods. In conclusion, ME-qPCR achieved sensitive, high-throughput GMO detection in complex substrates, such as crops or food samples. In the future, ME-qPCR-based GMO content identification may positively impact SNP analysis or multiplex gene expression analysis of food or agricultural samples. Graphical abstract: For the first-step amplification, four primers (A, B, C, and D) are added to the reaction volume, generating four kinds of amplicons. All four amplicons can be regarded as targets of the second-step PCR, which is run in three parallel reactions for the final evaluation. After the second-step amplification, the final amplification curves and melting curves are obtained.
Theoretical and Numerical Investigations on Shallow Tunnelling in Unsaturated Soils
NASA Astrophysics Data System (ADS)
Soranzo, Enrico; Wu, Wei
2013-04-01
Excavation of shallow tunnels with the New Austrian Tunnelling Method (NATM) requires a proper assessment of tunnel face stability, to enable an open-face excavation, and an estimation of the corresponding surface settlements. Soils in a partially saturated condition exhibit a higher cohesion than in the fully saturated state, which can be taken into account when assessing the stability of the tunnel face. For the assessment of the face support pressure, different methods are used in engineering practice, varying from simple empirical and analytical formulations to advanced finite element analysis. Such procedures can be modified to account for the unsaturated state of soils. In this study a method is presented to incorporate the effect of partial saturation in the numerical analysis. The results are then compared with a simple analytical formulation derived from parametric studies. In the numerical analysis, the variation of cohesion and of Young's modulus with saturation can be considered when the water table lies below the tunnel in a soil exhibiting a certain capillary rise, so that the tunnel is driven in a partially saturated layer. The linear elastic model with the Mohr-Coulomb failure criterion can be extended to partially saturated states and calibrated with triaxial tests on unsaturated samples. In order to model both positive and negative pore water pressure (suction), Bishop's effective stress is incorporated into the Mohr-Coulomb failure criterion. The effective stress parameter in Bishop's formulation is related to the degree of saturation as suggested by Fredlund. If a linear suction distribution is assumed, the degree of saturation can be calculated from the Soil Water Characteristic Curve (SWCC). Expressions exist that relate the Young's modulus of unsaturated soils to the net mean stress and the matric suction. The results of the numerical computation can be compared to Vermeer & Ruse's closed-form formula for the limit support pressure of the tunnel face. The expression is derived from parametric studies and predicts stability of the tunnel face when negative values are returned, suggesting that open-face tunnelling can be performed. The formula can be modified to account for the variation of cohesion along the tunnel face. The results obtained from both the numerical analysis and the analytical formulation are in good agreement and show that the stability of the tunnel face can greatly benefit from the enhanced cohesion of partially saturated soils.
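For reference, a minimal sketch of how partial saturation can enter the failure criterion, assuming Bishop's classical form with the effective stress parameter χ taken equal to the degree of saturation Sr (one common Fredlund-type choice; the study's exact calibration may differ):

\[
\sigma' = (\sigma - u_a) + \chi\,(u_a - u_w), \qquad \chi \approx S_r ,
\]
\[
\tau_f = c' + \sigma'\tan\varphi'
       = c' + \left[(\sigma - u_a) + S_r\,(u_a - u_w)\right]\tan\varphi' ,
\]

where σ is the total stress, u_a and u_w are the pore-air and pore-water pressures, and c', φ' are the effective Mohr-Coulomb parameters; the suction term S_r (u_a - u_w) tan φ' plays the role of the apparent cohesion gained in the partially saturated layer.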
Performance Optimization of Priority Assisted CSMA/CA Mechanism of 802.15.6 under Saturation Regime
Shakir, Mustafa; Rehman, Obaid Ur; Rahim, Mudassir; Alrajeh, Nabil; Khan, Zahoor Ali; Khan, Mahmood Ashraf; Niaz, Iftikhar Azim; Javaid, Nadeem
2016-01-01
Due to recent developments in the field of Wireless Sensor Networks (WSNs), Wireless Body Area Networks (WBANs) have become a major area of interest for developers and researchers. The human body exhibits postural mobility, which causes distance variations and time-varying connection status amongst sensors. One of the major requirements of a WBAN is to prolong the network lifetime without compromising other performance measures, i.e., delay, throughput and bandwidth efficiency. Node prioritization is one possible solution to obtain optimum performance in WBANs. The IEEE 802.15.6 CSMA/CA standard splits nodes with different user priorities based on Contention Window (CW) size, with smaller CW sizes assigned to higher priority nodes. This standard helps to reduce delay; however, it is not energy efficient. In this paper, we propose a hybrid node prioritization scheme based on IEEE 802.15.6 CSMA/CA to reduce energy consumption and maximize network lifetime. In this scheme, optimum performance is achieved by node prioritization based on CW size as well as on power within each user priority. Our proposed scheme reduces the average backoff time for channel access due to CW-based prioritization. Additionally, power-based prioritization within a user priority helps to minimize the required number of retransmissions. Furthermore, we compare our scheme with the IEEE 802.15.6 CSMA/CA standard (CW-assisted node prioritization) and power-assisted node prioritization under postural mobility in WBANs. Mathematical expressions are derived to determine an accurate analytical model for throughput, delay, bandwidth efficiency, energy consumption and lifetime for each node prioritization scheme. To validate the analytical model, we performed simulations in the OMNET++/MIXIM framework. Analytical and simulation results show that our proposed hybrid node prioritization scheme outperforms the other node prioritization schemes in terms of average network delay, average throughput, average bandwidth efficiency and network lifetime. PMID:27598167
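The effect of CW-based prioritization on channel-access delay can be illustrated with a short Python sketch. The CW bounds and slot duration below are illustrative parameters rather than values quoted from the standard or the paper; the point is only that the expected initial backoff, roughly half the contention window, shrinks as the window assigned to a priority class shrinks.

def expected_initial_backoff_us(cw_min, slot_us=145.0):
    """Mean initial backoff delay when the counter is drawn uniformly from [1, cw_min]."""
    return slot_us * (1 + cw_min) / 2.0

# Illustrative CW_min values for a low- and a high-priority class (assumed, not quoted).
for label, cw_min in [("low priority", 16), ("high priority", 2)]:
    print(f"{label:13s} CWmin={cw_min:2d} -> mean backoff {expected_initial_backoff_us(cw_min):6.1f} us")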
NASA Astrophysics Data System (ADS)
Wang, J.; Polivka, T. N.; Hyer, E. J.; Peterson, D. A.
2014-12-01
Unlike previous space-borne Earth-observing sensors, the Visible Infrared Imaging Radiometer Suite (VIIRS) employs aggregation to reduce downlink bandwidth requirements and preserve spatial resolution across the swath. To examine the potentially deleterious impacts of aggregation when encountering detector saturation, nearly four months of NOAA's Nightfire product were analyzed, which contains a subset of the hottest observed nighttime pixels. An empirical method for identifying saturation was devised. The 3.69 µm band (M12) was the most frequently-saturating band with 0.15% of the Nightfire pixels at or near the ~359 K hard saturation limit, with possible saturation also occurring in M14, M15, and M16. Artifacts consistent with detector saturation were seen with M12 temperatures as low as 330 K in the scene center. This partial saturation and aggregation influence must be considered when using VIIRS radiances for quantitative characterization of hot emission sources such as fires and gas flaring.
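A toy version of such an empirical screen might look like the Python sketch below, which flags pixels at or near the 359 K hard limit and marks cooler scene-centre pixels as suspect; the 330 K watch threshold, the zone handling and the flag codes are illustrative simplifications, not the authors' criteria.

import numpy as np

M12_HARD_LIMIT_K = 359.0
M12_WATCH_K = 330.0

def flag_saturation(m12_bt_k, scan_zone):
    """Classify brightness temperatures: 2 = at/near hard limit, 1 = suspect in the
    scene-centre aggregation zone, 0 = unflagged."""
    flags = np.zeros(m12_bt_k.shape, dtype=np.uint8)
    flags[m12_bt_k >= M12_HARD_LIMIT_K - 1.0] = 2
    suspect = (m12_bt_k >= M12_WATCH_K) & (scan_zone == "center") & (flags == 0)
    flags[suspect] = 1
    return flags

bt = np.array([310.0, 333.5, 358.9, 345.0])
zone = np.array(["edge", "center", "center", "edge"])
print(flag_saturation(bt, zone))   # -> [0 1 2 0]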
Gupta, Surya; De Puysseleyr, Veronic; Van der Heyden, José; Maddelein, Davy; Lemmens, Irma; Lievens, Sam; Degroeve, Sven; Tavernier, Jan; Martens, Lennart
2017-05-01
Protein-protein interaction (PPI) studies have dramatically expanded our knowledge about cellular behaviour and development in different conditions. A multitude of high-throughput PPI techniques have been developed to achieve proteome-scale coverage for PPI studies, including the microarray-based Mammalian Protein-Protein Interaction Trap (MAPPIT) system. Because such high-throughput techniques typically report thousands of interactions, managing and analysing the large amounts of acquired data is a challenge. We have therefore built the MAPPIT cell microArray Protein Protein Interaction-Data management & Analysis Tool (MAPPI-DAT) as an automated data management and analysis tool for MAPPIT cell microarray experiments. MAPPI-DAT stores the experimental data and metadata in a systematic and structured way, automates data analysis and interpretation, and enables the meta-analysis of MAPPIT cell microarray data across all stored experiments. MAPPI-DAT is developed in Python, using R for data analysis and MySQL as the data management system. MAPPI-DAT is cross-platform and can be run on Microsoft Windows, Linux and OS X/macOS. The source code and a Microsoft Windows executable are freely available under the permissive Apache2 open source license at https://github.com/compomics/MAPPI-DAT. jan.tavernier@vib-ugent.be or lennart.martens@vib-ugent.be. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
Microarray-Based Gene Expression Analysis for Veterinary Pathologists: A Review.
Raddatz, Barbara B; Spitzbarth, Ingo; Matheis, Katja A; Kalkuhl, Arno; Deschl, Ulrich; Baumgärtner, Wolfgang; Ulrich, Reiner
2017-09-01
High-throughput, genome-wide transcriptome analysis is now commonly used in all fields of life science research and is on the cusp of medical and veterinary diagnostic application. Transcriptomic methods such as microarrays and next-generation sequencing generate enormous amounts of data. The pathogenetic expertise acquired from understanding of general pathology provides veterinary pathologists with a profound background, which is essential in translating transcriptomic data into meaningful biological knowledge, thereby leading to a better understanding of underlying disease mechanisms. The scientific literature concerning high-throughput data-mining techniques usually addresses mathematicians or computer scientists as the target audience. In contrast, the present review provides the reader with a clear and systematic basis from a veterinary pathologist's perspective. Therefore, the aims are (1) to introduce the reader to the necessary methodological background; (2) to introduce the sequential steps commonly performed in a microarray analysis including quality control, annotation, normalization, selection of differentially expressed genes, clustering, gene ontology and pathway analysis, analysis of manually selected genes, and biomarker discovery; and (3) to provide references to publically available and user-friendly software suites. In summary, the data analysis methods presented within this review will enable veterinary pathologists to analyze high-throughput transcriptome data obtained from their own experiments, supplemental data that accompany scientific publications, or public repositories in order to obtain a more in-depth insight into underlying disease mechanisms.
ERIC Educational Resources Information Center
Hagström, Linus; Scheja, Max
2014-01-01
The aim of this article is to contribute to the discussion on how examinations can be designed to enhance students' learning and increase throughput in terms of the number of students who sit, and pass, the course examination. The context of the study is a basic level political science course on power analysis, which initially suffered from low…
Custom Super-Resolution Microscope for the Structural Analysis of Nanostructures
2018-05-29
research community. As part of our validation of the new design approach, we performed two-color imaging of pairs of adjacent oligo probes hybridized ... nanostructures and biological targets. Our microscope features a large field of view and custom optics that facilitate 3D imaging and enhanced contrast in ... our imaging throughput by creating two microscopy platforms for high-throughput, super-resolution materials characterization, with the AO set-up being
Digital Microwave System Design Guide.
1984-02-01
traffic analysis is a continuous effort, setting parameters for subsequent stages of expansion after the system design is finished. 2.1.3 Quality of ... operational structure of the user for whom he is providing service. 2.2.3 Quality of Service. In digital communications, the basic performance parameter ... the basic interpretation of system performance is measured in terms of a single parameter, throughput. Throughput can be defined as the number of
Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter
2015-01-01
Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
GlycoExtractor: a web-based interface for high throughput processing of HPLC-glycan data.
Artemenko, Natalia V; Campbell, Matthew P; Rudd, Pauline M
2010-04-05
Recently, an automated high-throughput HPLC platform has been developed that can be used to fully sequence and quantify low concentrations of N-linked sugars released from glycoproteins, supported by an experimental database (GlycoBase) and analytical tools (autoGU). However, commercial packages that support the operation of HPLC instruments and data storage lack platforms for the extraction of large volumes of data. The lack of resources and agreed formats in glycomics is now a major limiting factor that restricts the development of bioinformatic tools and automated workflows for high-throughput HPLC data analysis. GlycoExtractor is a web-based tool that interfaces with a commercial HPLC database/software solution to facilitate the extraction of large volumes of processed glycan profile data (peak number, peak areas, and glucose unit values). The tool allows the user to export a series of sample sets to a set of file formats (XML, JSON, and CSV) rather than a collection of disconnected files. This approach not only reduces the amount of manual refinement required to export data into a suitable format for data analysis but also opens the field to new approaches for high-throughput data interpretation and storage, including biomarker discovery and validation and monitoring of online bioprocessing conditions for next generation biotherapeutics.
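To illustrate the kind of flat, analysis-ready export the tool targets, the short Python sketch below serialises a handful of glycan-profile records (peak number, peak area, glucose-unit value) to JSON and CSV; the field names and values are invented for illustration and do not reflect GlycoExtractor's actual schema.

import csv, json

peaks = [
    {"sample": "IgG_01", "peak": 1, "area": 12.4, "gu": 5.92},
    {"sample": "IgG_01", "peak": 2, "area": 33.1, "gu": 6.78},
]

# One connected export per sample set, rather than a collection of disconnected files.
with open("profiles.json", "w") as fh:
    json.dump(peaks, fh, indent=2)

with open("profiles.csv", "w", newline="") as fh:
    writer = csv.DictWriter(fh, fieldnames=["sample", "peak", "area", "gu"])
    writer.writeheader()
    writer.writerows(peaks)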
High-Throughput Density Measurement Using Magnetic Levitation.
Ge, Shencheng; Wang, Yunzhe; Deshler, Nicolas J; Preston, Daniel J; Whitesides, George M
2018-06-20
This work describes the development of an integrated analytical system that enables high-throughput density measurements of diamagnetic particles (including cells) using magnetic levitation (MagLev), 96-well plates, and a flatbed scanner. MagLev is a simple and useful technique with which to carry out density-based analysis and separation of a broad range of diamagnetic materials with different physical forms (e.g., liquids, solids, gels, pastes, gums, etc.); one major limitation, however, is the capacity to perform high-throughput density measurements. This work addresses this limitation by (i) re-engineering the shape of the magnetic fields so that the MagLev system is compatible with 96-well plates, and (ii) integrating a flatbed scanner (and simple optical components) to carry out imaging of the samples that levitate in the system. The resulting system is compatible with both biological samples (human erythrocytes) and nonbiological samples (simple liquids and solids, such as 3-chlorotoluene, cholesterol crystals, glass beads, copper powder, and polymer beads). The high-throughput capacity of this integrated MagLev system will enable new applications in chemistry (e.g., analysis and separation of materials) and biochemistry (e.g., cellular responses under environmental stresses) in a simple and label-free format on the basis of a universal property of all matter, i.e., density.
A Barcoding Strategy Enabling Higher-Throughput Library Screening by Microscopy.
Chen, Robert; Rishi, Harneet S; Potapov, Vladimir; Yamada, Masaki R; Yeh, Vincent J; Chow, Thomas; Cheung, Celia L; Jones, Austin T; Johnson, Terry D; Keating, Amy E; DeLoache, William C; Dueber, John E
2015-11-20
Dramatic progress has been made in the design and build phases of the design-build-test cycle for engineering cells. However, the test phase usually limits throughput, as many outputs of interest are not amenable to rapid analytical measurements. For example, phenotypes such as motility, morphology, and subcellular localization can be readily measured by microscopy, but analysis of these phenotypes is notoriously slow. To increase throughput, we developed microscopy-readable barcodes (MiCodes) composed of fluorescent proteins targeted to discernible organelles. In this system, a unique barcode can be genetically linked to each library member, making possible the parallel analysis of phenotypes of interest via microscopy. As a first demonstration, we MiCoded a set of synthetic coiled-coil leucine zipper proteins to allow an 8 × 8 matrix to be tested for specific interactions in micrographs consisting of mixed populations of cells. A novel microscopy-readable two-hybrid fluorescence localization assay for probing candidate interactions in the cytosol was also developed using a bait protein targeted to the peroxisome and a prey protein tagged with a fluorescent protein. This work introduces a generalizable, scalable platform for making microscopy amenable to higher-throughput library screening experiments, thereby coupling the power of imaging with the utility of combinatorial search paradigms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swift, T.E.; Marlow, R.E.; Wilhelm, M.H.
1981-11-01
This report describes part of the work done to fulfill a contract awarded to Gruy Federal, Inc., by the Department of Energy (DOE) on February 12, 1979. The work includes pressure-coring and associated logging and testing programs to provide data on in-situ oil saturation, porosity and permeability distribution, and other data needed for resource characterization of fields and reservoirs in which CO2 injection might have a high probability of success. This report details the second such project. Core porosities agreed well with computed log porosities. Core and computed log water saturations agree fairly well from 3692 to 3712 feet, poorly from 3712 to 3820 feet, and in a general way from 4035 to 4107 feet. Computer log analysis techniques incorporating the a, m, and n values obtained from Core Laboratories analysis did not improve the agreement of log- versus core-derived water saturations. However, both core and log analysis indicated that the ninth zone had the highest residual hydrocarbon saturations, and production data confirmed the validity of the oil saturation determinations. Residual oil saturations for the perforated and tested intervals were 259 STB/acre-ft for the interval from 4035 to 4055 feet and 150 STB/acre-ft for the interval from 3692 to 3718 feet. Nine BOPD was produced from the interval 4035 to 4055 feet and no oil was produced from the interval 3692 to 3718 feet, qualitatively confirming the relative oil saturations as calculated. The low oil production in the zone from 4022 to 4055 feet and the lack of production from 3692 to 3718 feet indicated these zones to be at or near residual waterflood conditions as determined by log analysis. This project demonstrates the usefulness of integrating pressure core, log, and production data to realistically evaluate a reservoir for carbon dioxide flooding.
Protocols and programs for high-throughput growth and aging phenotyping in yeast.
Jung, Paul P; Christian, Nils; Kay, Daniel P; Skupin, Alexander; Linster, Carole L
2015-01-01
In microorganisms, and more particularly in yeasts, a standard phenotyping approach consists in the analysis of fitness by growth rate determination in different conditions. One growth assay that combines high throughput with high resolution involves the generation of growth curves from 96-well plate microcultivations in thermostated and shaking plate readers. To push the throughput of this method to the next level, we have adapted it in this study to the use of 384-well plates. The values of the extracted growth parameters (lag time, doubling time and yield of biomass) correlated well between experiments carried out in 384-well plates as compared to 96-well plates or batch cultures, validating the higher-throughput approach for phenotypic screens. The method is not restricted to the use of the budding yeast Saccharomyces cerevisiae, as shown by consistent results for other species selected from the Hemiascomycete class. Furthermore, we used the 384-well plate microcultivations to develop and validate a higher-throughput assay for yeast Chronological Life Span (CLS), a parameter that is still commonly determined by a cumbersome method based on counting "Colony Forming Units". To accelerate analysis of the large datasets generated by the described growth and aging assays, we developed the freely available software tools GATHODE and CATHODE. These tools allow for semi-automatic determination of growth parameters and CLS behavior from typical plate reader output files. The described protocols and programs will increase the time- and cost-efficiency of a number of yeast-based systems genetics experiments as well as various types of screens.
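As a rough illustration of the growth-parameter extraction step (lag time, doubling time, yield of biomass), the Python sketch below derives these quantities from a single plate-reader growth curve using a sliding-window slope on log-transformed OD; the window size, lag threshold and the synthetic logistic curve are assumptions for illustration and do not reproduce the procedures implemented in GATHODE.

import numpy as np

def growth_parameters(t_hours, od, window=5, lag_fraction=0.05):
    log_od = np.log(np.clip(od, 1e-6, None))
    # Maximum specific growth rate: steepest slope of ln(OD) over a sliding window.
    slopes = [np.polyfit(t_hours[i:i + window], log_od[i:i + window], 1)[0]
              for i in range(len(t_hours) - window)]
    mu_max = max(slopes)
    doubling_time = np.log(2) / mu_max
    # Lag time: first time the culture exceeds its starting OD by a small fraction.
    lag_idx = np.argmax(od > od[0] * (1 + lag_fraction))
    lag_time = t_hours[lag_idx]
    biomass_yield = od.max() - od[0]
    return lag_time, doubling_time, biomass_yield

# Synthetic logistic curve standing in for plate-reader data.
t = np.linspace(0, 24, 97)
od = 0.05 + 1.0 / (1 + np.exp(-(t - 10) / 1.5))
print(growth_parameters(t, od))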
A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy
Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian
2016-01-01
Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS) technology. In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high-sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with a comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for the high-resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925
Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander
2015-01-01
Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate the progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have solely focused on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage and share crop root images with metadata and compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks. As such it enables plant scientists to spend more time on science rather than on technology. All stored and computed data are easily accessible to the public and the broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.
Xu, Jiadi; Yadav, Nirbhay N.; Bar-Shir, Amnon; Jones, Craig K.; Chan, Kannie W. Y.; Zhang, Jiangyang; Walczak, P.; McMahon, Michael T.; van Zijl, Peter C. M.
2013-01-01
Purpose: Chemical exchange saturation transfer (CEST) imaging is a new MRI technology allowing the detection of low concentration endogenous cellular proteins and metabolites indirectly through their exchangeable protons. A new technique, variable delay multi-pulse CEST (VDMP-CEST), is proposed to eliminate the need for recording full Z-spectra and performing asymmetry analysis to obtain CEST contrast. Methods: The VDMP-CEST scheme involves acquiring images with two (or more) delays between radiofrequency saturation pulses in pulsed CEST, producing a series of CEST images sensitive to the speed of saturation transfer. Subtracting two images or fitting a time series produces CEST and relayed-nuclear Overhauser enhancement CEST maps without effects of direct water saturation and, when using low radiofrequency power, minimal magnetization transfer contrast interference. Results: When applied to several model systems (bovine serum albumin, crosslinked bovine serum albumin, L-glutamic acid) and in vivo on healthy rat brain, VDMP-CEST showed sensitivity to slow to intermediate range magnetization transfer processes (rate < 100–150 Hz), such as amide proton transfer and relayed nuclear Overhauser enhancement-CEST. Images for these contrasts could be acquired in short scan times by using a single radiofrequency frequency. Conclusions: VDMP-CEST provides an approach to detect the CEST effect by sensitizing saturation experiments to slower exchange processes without interference of direct water saturation and without the need to acquire Z-spectra and perform asymmetry analysis. PMID:23813483
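A minimal numpy sketch of the image-subtraction step is given below: two saturation images acquired with a short and a long inter-pulse delay are normalised by an unsaturated reference S0 and subtracted. The array names, the normalisation choice and the synthetic values are illustrative assumptions, not the published reconstruction recipe.

import numpy as np

def vdmp_cest_map(img_short_delay, img_long_delay, img_s0, eps=1e-6):
    s0 = np.clip(img_s0.astype(float), eps, None)
    z_short = img_short_delay / s0      # normalised saturation images
    z_long = img_long_delay / s0
    return z_short - z_long             # larger where saturation keeps transferring at long delays

rng = np.random.default_rng(1)
s0 = np.full((64, 64), 1000.0)
short_delay = s0 * (0.92 + 0.01 * rng.standard_normal((64, 64)))
long_delay = s0 * (0.88 + 0.01 * rng.standard_normal((64, 64)))
print(vdmp_cest_map(short_delay, long_delay, s0).mean())   # roughly 0.04 mean contrast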
Tipton, Jeremiah D; Tran, John C; Catherman, Adam D; Ahlf, Dorothy R; Durbin, Kenneth R; Lee, Ji Eun; Kellie, John F; Kelleher, Neil L; Hendrickson, Christopher L; Marshall, Alan G
2012-03-06
Current high-throughput top-down proteomic platforms provide routine identification of proteins less than 25 kDa with 4-D separations. This short communication reports the application of technological developments over the past few years that improve protein identification and characterization for masses greater than 25 kDa. Advances in separation science have allowed increased numbers of proteins to be identified, especially by nanoliquid chromatography (nLC) prior to mass spectrometry (MS) analysis. A goal of high-throughput top-down proteomics is to extend the mass range for routine nLC MS analysis up to 80 kDa, because gene sequence analysis predicts that ~70% of human proteins are smaller than 80 kDa. Normally, large proteins greater than 50 kDa are identified and characterized by top-down proteomics through fraction collection and direct infusion at relatively low throughput. Other MS-based techniques also provide top-down protein characterization, albeit at low resolution for intact mass measurement. Here, we present analysis of standard proteins (up to 78 kDa) and whole cell lysate proteins by nanoliquid chromatography electrospray ionization Fourier transform ion cyclotron resonance mass spectrometry (nLC ESI FTICR MS). The separation platform reduced the complexity of the protein matrix so that, at 14.5 T, proteins from whole cell lysate up to 72 kDa are baseline mass resolved on a nano-LC chromatographic time scale. Finally, the results document routine identification of proteins at improved throughput based on accurate mass measurement (less than 10 ppm mass error) of precursor and fragment ions for proteins up to 50 kDa.
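The "less than 10 ppm mass error" acceptance criterion is a simple relative-error calculation; a tiny worked example with invented masses follows.

    def ppm_error(measured_mass: float, theoretical_mass: float) -> float:
        """Relative mass error in parts per million."""
        return (measured_mass - theoretical_mass) / theoretical_mass * 1e6

    # Hypothetical 50 kDa protein observed within 0.3 Da of its theoretical mass.
    print(ppm_error(50000.30, 50000.00))  # 6 ppm, inside a 10 ppm tolerance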
High-sensitivity HLA typing by Saturated Tiling Capture Sequencing (STC-Seq).
Jiao, Yang; Li, Ran; Wu, Chao; Ding, Yibin; Liu, Yanning; Jia, Danmei; Wang, Lifeng; Xu, Xiang; Zhu, Jing; Zheng, Min; Jia, Junling
2018-01-15
Highly polymorphic human leukocyte antigen (HLA) genes are responsible for fine-tuning the adaptive immune system. High-resolution HLA typing is important for the treatment of autoimmune and infectious diseases. Additionally, it is routinely performed for identifying matched donors in transplantation medicine. Although many HLA typing approaches have been developed, the complexity, low efficiency and high cost of current HLA-typing assays limit their application in population-based high-throughput HLA typing for donors, which is required for creating large-scale databases for transplantation and precision medicine. Here, we present a cost-efficient Saturated Tiling Capture Sequencing (STC-Seq) approach to capturing 14 HLA class I and II genes. The highly efficient capture (an approximately 23,000-fold enrichment) of these genes allows for simplified allele calling. Tests on five genes (HLA-A/B/C/DRB1/DQB1) from 31 human samples and 351 datasets using STC-Seq showed results that were 98% consistent with known genotypes at two-field resolution (field 1 and field 2). Additionally, STC can capture genomic DNA fragments longer than 3 kb from HLA loci, making the library compatible with third-generation sequencing. STC-Seq is a highly accurate and cost-efficient method for HLA typing which can be used to facilitate the establishment of population-based HLA databases for precision and transplantation medicine.
Saturation mutagenesis reveals manifold determinants of exon definition.
Ke, Shengdong; Anquetil, Vincent; Zamalloa, Jorge Rojas; Maity, Alisha; Yang, Anthony; Arias, Mauricio A; Kalachikov, Sergey; Russo, James J; Ju, Jingyue; Chasin, Lawrence A
2018-01-01
To illuminate the extent and roles of exonic sequences in the splicing of human RNA transcripts, we conducted saturation mutagenesis of a 51-nt internal exon in a three-exon minigene. All possible single and tandem dinucleotide substitutions were surveyed. Using high-throughput genetics, 5560 minigene molecules were assayed for splicing in human HEK293 cells. Up to 70% of mutations produced substantial (greater than twofold) phenotypes of either increased or decreased splicing. Of all predicted secondary structural elements, only a single 15-nt stem-loop showed a strong correlation with splicing, acting negatively. The in vitro formation of exon-protein complexes between the mutant molecules and proteins associated with spliceosome formation (U2AF35, U2AF65, U1A, and U1-70K) correlated with splicing efficiencies, suggesting exon definition as the step affected by most mutations. The measured relative binding affinities of dozens of human RNA binding protein domains as reported in the CISBP-RNA database were found to correlate either positively or negatively with splicing efficiency, more than could fit on the 51-nt test exon simultaneously. The large number of these functional protein binding correlations points to a dynamic and heterogeneous population of pre-mRNA molecules, each responding to a particular collection of binding proteins. © 2018 Ke et al.; Published by Cold Spring Harbor Laboratory Press.
Accelerating the Design of Solar Thermal Fuel Materials through High Throughput Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Y; Grossman, JC
2014-12-01
Solar thermal fuels (STF) store the energy of sunlight, which can then be released later in the form of heat, offering an emission-free and renewable solution for both solar energy conversion and storage. However, this approach is currently limited by the lack of low-cost materials with high energy density and high stability. In this Letter, we present an ab initio high-throughput computational approach to accelerate the design process and allow for searches over a broad class of materials. The high-throughput screening platform we have developed can run through large numbers of molecules composed of earth-abundant elements and identifies possible metastable structures of a given material. Corresponding isomerization enthalpies associated with the metastable structures are then computed. Using this high-throughput simulation approach, we have discovered molecular structures with high isomerization enthalpies that have the potential to be new candidates for high-energy density STF. We have also discovered physical principles to guide further STF materials design through structural analysis. More broadly, our results illustrate the potential of using high-throughput ab initio simulations to design materials that undergo targeted structural transitions.
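The screening logic described, i.e. enumerate candidate molecules, locate a metastable (photoisomerized) structure, and rank candidates by isomerization enthalpy, can be sketched as follows. The energies and candidate names are invented placeholders standing in for ab initio (DFT) output, which is not reproduced here.

    from dataclasses import dataclass

    @dataclass
    class Candidate:
        name: str
        e_ground: float      # total energy of the ground-state isomer (eV)
        e_metastable: float  # total energy of the metastable isomer (eV)

    def isomerization_enthalpy(c: Candidate) -> float:
        # Energy stored per molecule; higher is better for a solar thermal fuel.
        return c.e_metastable - c.e_ground

    # Hypothetical screening results.
    library = [
        Candidate("azo-derivative-1", -1052.10, -1051.48),
        Candidate("norbornadiene-like-2", -987.40, -986.35),
        Candidate("candidate-3", -1201.22, -1200.98),
    ]

    for c in sorted(library, key=isomerization_enthalpy, reverse=True):
        print(f"{c.name}: dH = {isomerization_enthalpy(c):.2f} eV/molecule")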
NASA Technical Reports Server (NTRS)
Allman, Mark; Ostermann, Shawn; Kruse, Hans
1996-01-01
In several experiments using NASA's Advanced Communications Technology Satellite (ACTS), investigators have reported disappointing throughput using the transmission control protocol/Internet protocol (TCP/IP) protocol suite over 1.536 Mbit/s (T1) satellite circuits. A detailed analysis of file transfer protocol (FTP) file transfers reveals that both the TCP window size and the TCP 'slow start' algorithm contribute to the observed limits in throughput. In this paper we summarize the experimental and theoretical analysis of the throughput limit imposed by TCP on the satellite circuit. We then discuss in detail the implementation of XFTP, a multi-socket FTP client and server. XFTP has been tested using the ACTS system. Finally, we discuss a preliminary set of tests on a link with non-zero bit error rates. XFTP shows promising performance under these conditions, suggesting the possibility that a multi-socket application may be less affected by bit errors than a single, large-window TCP connection.
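The window-size limit the authors identify follows from the familiar bound throughput <= window / RTT. The numbers below (a 64 KB TCP window and a roughly 560 ms geostationary round-trip time) are illustrative assumptions rather than figures taken from the paper.

    T1_BITRATE = 1.536e6        # bit/s, capacity of the T1 satellite circuit
    window_bytes = 64 * 1024    # assumed maximum TCP window
    rtt_s = 0.56                # assumed GEO round-trip time in seconds

    ceiling = window_bytes * 8 / rtt_s   # bit/s achievable by one connection
    print(f"single-connection ceiling: {ceiling / 1e6:.2f} Mbit/s "
          f"of a {T1_BITRATE / 1e6:.3f} Mbit/s circuit")
    # Several parallel sockets, as in XFTP, raise the aggregate ceiling roughly
    # in proportion to the number of connections.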
High-Throughput Assessment of Cellular Mechanical Properties.
Darling, Eric M; Di Carlo, Dino
2015-01-01
Traditionally, cell analysis has focused on using molecular biomarkers for basic research, cell preparation, and clinical diagnostics; however, new microtechnologies are enabling evaluation of the mechanical properties of cells at throughputs that make them amenable to widespread use. We review the current understanding of how the mechanical characteristics of cells relate to underlying molecular and architectural changes, describe how these changes evolve with cell-state and disease processes, and propose promising biomedical applications that will be facilitated by the increased throughput of mechanical testing: from diagnosing cancer and monitoring immune states to preparing cells for regenerative medicine. We provide background about techniques that laid the groundwork for the quantitative understanding of cell mechanics and discuss current efforts to develop robust techniques for rapid analysis that aim to implement mechanophenotyping as a routine tool in biomedicine. Looking forward, we describe additional milestones that will facilitate broad adoption, as well as new directions not only in mechanically assessing cells but also in perturbing them to passively engineer cell state.
Zhou, Jizhong; He, Zhili; Yang, Yunfeng; Deng, Ye; Tringe, Susannah G; Alvarez-Cohen, Lisa
2015-01-27
Understanding the structure, functions, activities and dynamics of microbial communities in natural environments is one of the grand challenges of 21st century science. To address this challenge, over the past decade, numerous technologies have been developed for interrogating microbial communities, of which some are amenable to exploratory work (e.g., high-throughput sequencing and phenotypic screening) and others depend on reference genes or genomes (e.g., phylogenetic and functional gene arrays). Here, we provide a critical review and synthesis of the most commonly applied "open-format" and "closed-format" detection technologies. We discuss their characteristics, advantages, and disadvantages within the context of environmental applications and focus on analysis of complex microbial systems, such as those in soils, in which diversity is high and reference genomes are few. In addition, we discuss crucial issues and considerations associated with applying complementary high-throughput molecular technologies to address important ecological questions. Copyright © 2015 Zhou et al.
Burns, Randal; Roncal, William Gray; Kleissas, Dean; Lillaney, Kunal; Manavalan, Priya; Perlman, Eric; Berger, Daniel R.; Bock, Davi D.; Chung, Kwanghun; Grosenick, Logan; Kasthuri, Narayanan; Weiler, Nicholas C.; Deisseroth, Karl; Kazhdan, Michael; Lichtman, Jeff; Reid, R. Clay; Smith, Stephen J.; Szalay, Alexander S.; Vogelstein, Joshua T.; Vogelstein, R. Jacob
2013-01-01
We describe a scalable database cluster for the spatial analysis and annotation of high-throughput brain imaging data, initially for 3-d electron microscopy image stacks, but for time-series and multi-channel data as well. The system was designed primarily for workloads that build connectomes—neural connectivity maps of the brain—using the parallel execution of computer vision algorithms on high-performance compute clusters. These services and open-science data sets are publicly available at openconnecto.me. The system design inherits much from NoSQL scale-out and data-intensive computing architectures. We distribute data to cluster nodes by partitioning a spatial index. We direct I/O to different systems—reads to parallel disk arrays and writes to solid-state storage—to avoid I/O interference and maximize throughput. All programming interfaces are RESTful Web services, which are simple and stateless, improving scalability and usability. We include a performance evaluation of the production system, highlighting the effectiveness of spatial data organization. PMID:24401992
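A common way to "distribute data to cluster nodes by partitioning a spatial index" is to interleave the bits of the (x, y, z) cuboid coordinates into a Morton (Z-order) key and assign key ranges to nodes. The paper does not spell out its key scheme, so the sketch below is a generic illustration of the idea rather than the openconnecto.me implementation.

    def morton3d(x: int, y: int, z: int, bits: int = 21) -> int:
        """Interleave the bits of x, y, z into a single Z-order key."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (3 * i)
            key |= ((y >> i) & 1) << (3 * i + 1)
            key |= ((z >> i) & 1) << (3 * i + 2)
        return key

    def node_for_cuboid(x: int, y: int, z: int, n_nodes: int, key_bits: int = 63) -> int:
        # Range-partition the key space: each node owns an equal slice of keys,
        # so spatially nearby cuboids tend to land on the same node.
        key = morton3d(x, y, z)
        return min(n_nodes - 1, (key * n_nodes) >> key_bits)

    print(node_for_cuboid(1024, 2048, 64, n_nodes=8))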
High-throughput technology for novel SO2 oxidation catalysts
Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F
2011-01-01
We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of the elements used for prescreening and of those remaining after screening of the first catalyst generations. PMID:27877427
NASA Astrophysics Data System (ADS)
d'Acremont, Quentin; Pernot, Gilles; Rampnoux, Jean-Michel; Furlan, Andrej; Lacroix, David; Ludwig, Alfred; Dilhaire, Stefan
2017-07-01
A High-Throughput Time-Domain ThermoReflectance (HT-TDTR) technique was developed to perform fast thermal conductivity measurements with minimum user actions required. This new setup is based on a heterodyne picosecond thermoreflectance system. The use of two different laser oscillators has been proven to reduce the acquisition time by two orders of magnitude and avoid the experimental artefacts usually induced by moving the elements present in TDTR systems. An amplitude modulation associated to a lock-in detection scheme is included to maintain a high sensitivity to thermal properties. We demonstrate the capabilities of the HT-TDTR setup to perform high-throughput thermal analysis by mapping thermal conductivity and interface resistances of a ternary thin film silicide library FexSiyGe100-x-y (20
Multiplexed mass cytometry profiling of cellular states perturbed by small-molecule regulators
Bodenmiller, Bernd; Zunder, Eli R.; Finck, Rachel; Chen, Tiffany J.; Savig, Erica S.; Bruggner, Robert V.; Simonds, Erin F.; Bendall, Sean C.; Sachs, Karen; Krutzik, Peter O.; Nolan, Garry P.
2013-01-01
The ability to comprehensively explore the impact of bio-active molecules on human samples at the single-cell level can provide great insight for biomedical research. Mass cytometry enables quantitative single-cell analysis with deep dimensionality, but currently lacks high-throughput capability. Here we report a method termed mass-tag cellular barcoding (MCB) that increases mass cytometry throughput by sample multiplexing. 96-well format MCB was used to characterize human peripheral blood mononuclear cell (PBMC) signaling dynamics, cell-to-cell communication, the signaling variability between 8 donors, and to define the impact of 27 inhibitors on this system. For each compound, 14 phosphorylation sites were measured in 14 PBMC types, resulting in 18,816 quantified phosphorylation levels from each multiplexed sample. This high-dimensional systems-level inquiry allowed analysis across cell-type and signaling space, reclassified inhibitors, and revealed off-target effects. MCB enables high-content, high-throughput screening, with potential applications for drug discovery, pre-clinical testing, and mechanistic investigation of human disease. PMID:22902532
NASA Astrophysics Data System (ADS)
Song, Yongchen; Hao, Min; Zhao, Yuechao; Zhang, Liang
2014-12-01
In this study, the dual-chamber pressure decay method and magnetic resonance imaging (MRI) were used to dynamically visualize the gas diffusion process in liquid-saturated porous media, and concentration-distance relationships for gas diffusing into the liquid-saturated porous media at different times were obtained by quantitative analysis of the MR images. A non-iterative finite volume method was successfully applied to calculate the local gas diffusion coefficient in liquid-saturated porous media. The results agreed very well with the conventional pressure decay method, demonstrating that the method is capable of determining the local diffusion coefficient of gas in liquid-saturated porous media at different times during the diffusion process.
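The concentration-distance profiles that the inversion works against are governed by Fick's second law; a minimal explicit finite-volume march for the 1D case (uniform grid, constant D, fixed interface concentration, impermeable bottom) is sketched below purely to make the forward model concrete. It is not the authors' non-iterative algorithm, and the parameter values are illustrative.

    import numpy as np

    def diffuse_1d(c0, D, dx, dt, steps, c_top):
        """Explicit finite-volume update of dc/dt = D * d2c/dx2 on a 1D grid."""
        c = c0.copy()
        r = D * dt / dx**2
        assert r <= 0.5, "explicit scheme unstable for D*dt/dx^2 > 0.5"
        for _ in range(steps):
            c[0] = c_top                                   # gas/liquid interface
            c_new = c.copy()
            c_new[1:-1] += r * (c[2:] - 2 * c[1:-1] + c[:-2])
            c_new[-1] += r * (c[-2] - c[-1])               # zero-flux bottom
            c = c_new
        return c

    # Assumed values: D = 2e-9 m^2/s, 1 mm cells, 50 s steps, ~28 h of diffusion.
    profile = diffuse_1d(np.zeros(100), D=2e-9, dx=1e-3, dt=50.0, steps=2000, c_top=1.0)
    print(profile[:5])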
Barteneva, Natasha S; Vorobjev, Ivan A
2018-01-01
In this paper, we review some of the recent advances in cellular heterogeneity and single-cell analysis methods. In modern research on cellular heterogeneity, there are four major approaches: analysis of pooled samples, single-cell analysis, high-throughput single-cell analysis, and, lately, integrated analysis of a cellular population at the single-cell level. Recently developed high-throughput single-cell genetic analysis methods such as RNA-Seq require a purification step and destruction of the analyzed cell, and often provide only a snapshot of the investigated cell without spatiotemporal context. Correlative analysis of multiparameter morphological, functional, and molecular information is important for differentiating more uniform groups within the spectrum of different cell types. Simplified distributions (histograms and 2D plots) can underrepresent biologically significant subpopulations. Future directions may include the development of nondestructive methods for dissecting molecular events in intact cells, simultaneous correlative cellular analysis of phenotypic and molecular features by hybrid technologies such as imaging flow cytometry, and further progress in supervised and unsupervised statistical analysis algorithms.
Ogawa, Shoujiro; Kittaka, Hiroki; Nakata, Akiho; Komatsu, Kenji; Sugiura, Takahiro; Satoh, Mamoru; Nomura, Fumio; Higashi, Tatsuya
2017-03-20
The plasma/serum concentration of 25-hydroxyvitamin D3 [25(OH)D3] is a diagnostic index for vitamin D deficiency/insufficiency, which is associated with a wide range of diseases, such as rickets, cancer and diabetes. We have reported that derivatization with 4-(4-dimethylaminophenyl)-1,2,4-triazoline-3,5-dione (DAPTAD) works well in the liquid chromatography/electrospray ionization-tandem mass spectrometry (LC/ESI-MS/MS) assay of serum/plasma 25(OH)D3 for enhancing the sensitivity and the separation from a potent interfering metabolite, 3-epi-25-hydroxyvitamin D3 [3-epi-25(OH)D3]. However, enhancing the analysis throughput remains an issue in the LC/ESI-MS/MS assay of 25(OH)D3. The most obvious restriction of the LC/MS/MS throughput is the chromatographic run time. In this study, we developed an enhanced-throughput method for the determination of plasma 25(OH)D3 by LC/ESI-MS/MS combined with derivatization using the triplex (2H0-, 2H3- and 2H6-) DAPTAD isotopologues. After separate derivatization with 1 of the 3 different isotopologues, the 3 samples were combined and injected together into the LC/ESI-MS/MS. Based on the mass differences between the isotopologues, the derivatized 25(OH)D3 in the 3 different samples were quantified within a single run. The developed method tripled the hourly analysis throughput without sacrificing assay performance, i.e., ease of pretreatment of the plasma sample (only deproteinization), limit of quantification (1.0 ng/mL when a 5 μL plasma sample was used), precision (intra-assay RSD ≤ 5.9% and inter-assay RSD ≤ 5.5%), accuracy (98.7-102.2%), matrix effects, and capability of separation from the interfering metabolite, 3-epi-25(OH)D3. The multiplexing of samples by the isotopologue derivatization was applied to the analysis of plasma samples from healthy subjects and the developed method was proven to have satisfactory applicability. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Gukasyan, A. V.; Koshevoy, E. P.; Kosachev, V. S.
2018-05-01
A comparative analysis of alternative models of plastic flow during extrusive transport of oil-bearing materials was conducted; the research was aimed at determining the function describing the throughput capacity of the screw core of the press (extruder). The transition from a one-dimensional to a two-dimensional description significantly improves the mathematical model and allows two-dimensional rheological models to be used to determine the throughput of the screw core.
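For orientation, the classical one-dimensional estimate of single-screw throughput combines a drag-flow term and an opposing pressure-flow term, Q = W*H*Vz/2 - W*H^3/(12*mu)*dP/dz; the two-dimensional treatments compared by the authors refine exactly this kind of expression with channel-shape and rheology corrections. The sketch below encodes the textbook Newtonian form with invented parameter values, not the specific model derived in the paper.

    import math

    def screw_throughput(D, N, H, W, mu, dP_dz, helix_angle_deg=17.65):
        """Textbook Newtonian single-screw channel throughput (m^3/s).

        D: screw diameter (m), N: screw speed (rev/s), H: channel depth (m),
        W: channel width (m), mu: melt viscosity (Pa*s),
        dP_dz: down-channel pressure gradient (Pa/m).
        """
        phi = math.radians(helix_angle_deg)
        v_z = math.pi * D * N * math.cos(phi)       # down-channel barrel velocity
        drag = W * H * v_z / 2.0
        pressure = W * H**3 * dP_dz / (12.0 * mu)
        return drag - pressure

    print(screw_throughput(D=0.10, N=1.0, H=0.005, W=0.08, mu=50.0, dP_dz=2.0e5))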
Kortz, Linda; Helmschrodt, Christin; Ceglarek, Uta
2011-03-01
In the last decade various analytical strategies have been established to enhance separation speed and efficiency in high performance liquid chromatography applications. Chromatographic supports based on monolithic material, small porous particles, and porous layer beads have been developed and commercialized to improve throughput and separation efficiency. This paper provides an overview of current developments in fast chromatography combined with mass spectrometry for the analysis of metabolites and proteins in clinical applications. Advances and limitations of fast chromatography for the combination with mass spectrometry are discussed. Practical aspects of, recent developments in, and the present status of high-throughput analysis of human body fluids for therapeutic drug monitoring, toxicology, clinical metabolomics, and proteomics are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PANDOLFI, RONALD; KUMAR, DINESH; VENKATAKRISHNAN, SINGANALLUR
Xi-CAM aims to provide a community-driven platform for multimodal analysis in synchrotron science. The platform core provides a robust plugin infrastructure for extensibility, allowing continuing development to simply add further functionality. Current modules include tools for characterization with (GI)SAXS, tomography, and XAS. This will continue to serve as a development base as algorithms for multimodal analysis develop. Seamless remote data access, visualization and analysis are key elements of Xi-CAM, and will become critical to synchrotron data infrastructure as expectations for future data volume and acquisition rates rise with continuously increasing throughputs. The highly interactive design elements of Xi-CAM will similarly support a generation of users who depend on immediate data-quality feedback during high-throughput or burst acquisition modes.
High or low oxygen saturation and severe retinopathy of prematurity: a meta-analysis.
Chen, Minghua L; Guo, Lei; Smith, Lois E H; Dammann, Christiane E L; Dammann, Olaf
2010-06-01
Low oxygen saturation appears to decrease the risk of severe retinopathy of prematurity (ROP) in preterm newborns when administered during the first few weeks after birth. High oxygen saturation seems to reduce the risk at later postmenstrual ages (PMAs). However, previous clinical studies are not conclusive individually. Our objective was to perform a systematic review and meta-analysis of the association between the incidence of severe ROP in premature infants and high or low target oxygen saturation measured by pulse oximetry. Studies were identified through PubMed and Embase literature searches through May 2009 by using the terms "retinopathy of prematurity and oxygen" or "retinopathy of prematurity and oxygen therapy." We selected 10 publications addressing the association between severe ROP and target oxygen saturation measured by pulse oximetry. Using a random-effects model we calculated the summary-effect estimate. We visually inspected funnel plots to examine possible publication bias. Low oxygen saturation (70%-96%) in the first several postnatal weeks was associated with a reduced risk of severe ROP (risk ratio [RR]: 0.48 [95% confidence interval (CI): 0.31-0.75]). High oxygen saturation (94%-99%) at ≥32 weeks' PMA was associated with a decreased risk for progression to severe ROP (RR: 0.54 [95% CI: 0.35-0.82]). Among preterm infants with a gestational age of ≤32 weeks, early low and late high oxygen saturation were associated with a reduced risk for severe ROP. We feel that a large randomized clinical trial with long-term developmental follow-up is warranted to confirm this meta-analytic result.
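The pooled risk ratios quoted were obtained with a random-effects model; a compact DerSimonian-Laird pooling of log risk ratios, with invented study-level inputs, is shown below to make the calculation concrete.

    import math

    def pool_random_effects(log_rr, var):
        """DerSimonian-Laird random-effects pooling of log risk ratios."""
        k = len(log_rr)
        w = [1.0 / v for v in var]                        # fixed-effect weights
        fixed = sum(wi * y for wi, y in zip(w, log_rr)) / sum(w)
        q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rr))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (k - 1)) / c)                # between-study variance
        w_re = [1.0 / (v + tau2) for v in var]
        pooled = sum(wi * y for wi, y in zip(w_re, log_rr)) / sum(w_re)
        se = math.sqrt(1.0 / sum(w_re))
        rr = math.exp(pooled)
        ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
        return rr, ci

    # Hypothetical study-level log risk ratios and variances (not the review's data).
    print(pool_random_effects([-0.9, -0.5, -0.7], [0.10, 0.06, 0.15]))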
Wyatt, S K; Barck, K H; Kates, L; Zavala-Solorio, J; Ross, J; Kolumam, G; Sonoda, J; Carano, R A D
2015-11-01
The ability to non-invasively measure body composition in mouse models of obesity and obesity-related disorders is essential for elucidating mechanisms of metabolic regulation and monitoring the effects of novel treatments. These studies aimed to develop a fully automated, high-throughput micro-computed tomography (micro-CT)-based image analysis technique for longitudinal quantitation of adipose, non-adipose and lean tissue as well as bone and demonstrate utility for assessing the effects of two distinct treatments. An initial validation study was performed in diet-induced obesity (DIO) and control mice on a vivaCT 75 micro-CT system. Subsequently, four groups of DIO mice were imaged pre- and post-treatment with an experimental agonistic antibody specific for anti-fibroblast growth factor receptor 1 (anti-FGFR1, R1MAb1), control immunoglobulin G antibody, a known anorectic antiobesity drug (rimonabant, SR141716), or solvent control. The body composition analysis technique was then ported to a faster micro-CT system (CT120) to markedly increase throughput as well as to evaluate the use of micro-CT image intensity for hepatic lipid content in DIO and control mice. Ex vivo chemical analysis and colorimetric analysis of the liver triglycerides were performed as the standard metrics for correlation with body composition and hepatic lipid status, respectively. Micro-CT-based body composition measures correlate with ex vivo chemical analysis metrics and enable distinction between DIO and control mice. R1MAb1 and rimonabant have differing effects on body composition as assessed by micro-CT. High-throughput body composition imaging is possible using a modified CT120 system. Micro-CT also provides a non-invasive assessment of hepatic lipid content. This work describes, validates and demonstrates utility of a fully automated image analysis technique to quantify in vivo micro-CT-derived measures of adipose, non-adipose and lean tissue, as well as bone. These body composition metrics highly correlate with standard ex vivo chemical analysis and enable longitudinal evaluation of body composition and therapeutic efficacy monitoring.
Fujimori, Shigeo; Hirai, Naoya; Ohashi, Hiroyuki; Masuoka, Kazuyo; Nishikimi, Akihiko; Fukui, Yoshinori; Washio, Takanori; Oshikubo, Tomohiro; Yamashita, Tatsuhiro; Miyamoto-Sato, Etsuko
2012-01-01
Next-generation sequencing (NGS) has been applied to various kinds of omics studies, resulting in many biological and medical discoveries. However, high-throughput protein-protein interactome datasets derived from detection by sequencing are scarce, because protein-protein interaction analysis requires many cell manipulations to examine the interactions. The low reliability of the high-throughput data is also a problem. Here, we describe a cell-free display technology combined with NGS that can improve both the coverage and reliability of interactome datasets. The completely cell-free method gives a high-throughput and a large detection space, testing the interactions without using clones. The quantitative information provided by NGS reduces the number of false positives. The method is suitable for the in vitro detection of proteins that interact not only with the bait protein, but also with DNA, RNA and chemical compounds. Thus, it could become a universal approach for exploring the large space of protein sequences and interactome networks. PMID:23056904
Near-common-path interferometer for imaging Fourier-transform spectroscopy in wide-field microscopy
Wadduwage, Dushan N.; Singh, Vijay Raj; Choi, Heejin; Yaqoob, Zahid; Heemskerk, Hans; Matsudaira, Paul; So, Peter T. C.
2017-01-01
Imaging Fourier-transform spectroscopy (IFTS) is a powerful method for biological hyperspectral analysis based on various imaging modalities, such as fluorescence or Raman. Since the measurements are taken in the Fourier space of the spectrum, it can also take advantage of compressed sensing strategies. IFTS has been readily implemented in high-throughput, high-content microscope systems based on wide-field imaging modalities. However, there are limitations in existing wide-field IFTS designs. Non-common-path approaches are less phase-stable. Alternatively, designs based on the common-path Sagnac interferometer are stable, but incompatible with high-throughput imaging. They require exhaustive sequential scanning over large interferometric path delays, making compressive strategic data acquisition impossible. In this paper, we present a novel phase-stable, near-common-path interferometer enabling high-throughput hyperspectral imaging based on strategic data acquisition. Our results suggest that this approach can improve throughput over those of many other wide-field spectral techniques by more than an order of magnitude without compromising phase stability. PMID:29392168
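In any IFTS design the spectrum is recovered by Fourier transforming the recorded interferogram over optical path delay; a minimal per-pixel reconstruction on a synthetic two-line source is sketched below. It illustrates only the transform step, not the strategic or compressive sampling discussed in the paper, and all numbers are invented.

    import numpy as np

    n, d_opd = 512, 0.05e-6                 # samples and OPD step (m)
    opd = np.arange(n) * d_opd
    bin_width = 1.0 / (n * d_opd)           # spectral resolution (1/m)
    lines = np.array([46, 70]) * bin_width  # two emission lines, placed on bins

    # Synthetic interferogram for one pixel: sum of cosines plus a DC pedestal.
    interferogram = sum(0.5 * (1 + np.cos(2 * np.pi * k * opd)) for k in lines)

    spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
    wavenumber = np.fft.rfftfreq(n, d=d_opd)          # cycles per metre
    recovered = np.sort(wavenumber[np.argsort(spectrum)[-2:]])
    print("input lines (1/m):    ", lines)
    print("recovered lines (1/m):", recovered)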
Choudhry, Priya
2016-01-01
Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays. PMID:26848849
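Independent of the specific ImageJ macro or CellProfiler pipeline, the core of automated colony counting is thresholding followed by connected-component labeling with a size filter; the scikit-image sketch below is a generic stand-in for that logic, not the Cell Colony Edge implementation.

    import numpy as np
    from skimage import filters, measure, morphology

    def count_colonies(gray_image, min_area_px=50):
        """Count bright colonies on a darker background in a grayscale image."""
        threshold = filters.threshold_otsu(gray_image)
        mask = gray_image > threshold
        mask = morphology.remove_small_objects(mask, min_size=min_area_px)
        labels = measure.label(mask)
        regions = measure.regionprops(labels)
        return len(regions), [r.area for r in regions]

    # Synthetic example: two bright blobs on a dark field.
    img = np.zeros((200, 200))
    img[30:60, 40:70] = 1.0
    img[120:170, 110:160] = 1.0
    print(count_colonies(img))   # (2, [900, 2500])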
NASA Technical Reports Server (NTRS)
Otterson, D. A.; Seng, G. T.
1984-01-01
A new high-performance liquid chromatographic (HPLC) method for group-type analysis of middistillate fuels is described. It uses a refractive index detector and standards that are prepared by reacting a portion of the fuel sample with sulfuric acid. A complete analysis of a middistillate fuel for saturates and aromatics (including the preparation of the standard) requires about 15 min if standards for several fuels are prepared simultaneously. From model fuel studies, the method was found to be accurate to within 0.4 vol% saturates or aromatics, and provides a precision of ±0.4 vol%. Olefin determinations require an additional 15 min of analysis time. However, this determination is needed only for those fuels displaying a significant olefin response at 200 nm (obtained routinely during the saturates/aromatics analysis procedure). The olefin determination uses the responses of the olefins and the corresponding saturates, as well as the average value of their refractive index sensitivity ratios (1.1). Studies indicated that, although the relative error in the olefin result could reach 10 percent by using this average sensitivity ratio, it was 5 percent for the fuels used in this study. Olefin concentrations as low as 0.1 vol% have been determined using this method.
From cancer genomes to cancer models: bridging the gaps
Baudot, Anaïs; Real, Francisco X.; Izarzugaza, José M. G.; Valencia, Alfonso
2009-01-01
Cancer genome projects are now being expanded in an attempt to provide complete landscapes of the mutations that exist in tumours. Although the importance of cataloguing genome variations is well recognized, there are obvious difficulties in bridging the gaps between high-throughput resequencing information and the molecular mechanisms of cancer evolution. Here, we describe the current status of the high-throughput genomic technologies, and the current limitations of the associated computational analysis and experimental validation of cancer genetic variants. We emphasize how the current cancer-evolution models will be influenced by the high-throughput approaches, in particular through efforts devoted to monitoring tumour progression, and how, in turn, the integration of data and models will be translated into mechanistic knowledge and clinical applications. PMID:19305388
Loeffler 4.0: Diagnostic Metagenomics.
Höper, Dirk; Wylezich, Claudia; Beer, Martin
2017-01-01
A new world of possibilities for "virus discovery" was opened up with high-throughput sequencing becoming available in the last decade. While scientifically metagenomic analysis was established before the start of the era of high-throughput sequencing, the availability of the first second-generation sequencers was the kick-off for diagnosticians to use sequencing for the detection of novel pathogens. Today, diagnostic metagenomics is becoming the standard procedure for the detection and genetic characterization of new viruses or novel virus variants. Here, we provide an overview about technical considerations of high-throughput sequencing-based diagnostic metagenomics together with selected examples of "virus discovery" for animal diseases or zoonoses and metagenomics for food safety or basic veterinary research. © 2017 Elsevier Inc. All rights reserved.
Assaying gene function by growth competition experiment.
Merritt, Joshua; Edwards, Jeremy S
2004-07-01
High-throughput screening and analysis is one of the emerging paradigms in biotechnology. In particular, high-throughput methods are essential in the field of functional genomics because of the vast amount of data generated in recent and ongoing genome sequencing efforts. In this report we discuss integrated functional analysis methodologies which incorporate both a growth competition component and a highly parallel assay used to quantify results of the growth competition. Several applications of the two most widely used technologies in the field, i.e., transposon mutagenesis and deletion strain library growth competition, and individual applications of several developing or less widely reported technologies are presented.
Pair-barcode high-throughput sequencing for large-scale multiplexed sample analysis.
Tu, Jing; Ge, Qinyu; Wang, Shengqin; Wang, Lei; Sun, Beili; Yang, Qi; Bai, Yunfei; Lu, Zuhong
2012-01-25
Multiplexing has become the major limitation of next-generation sequencing (NGS) in its application to low-complexity samples. Physical space segregation allows limited multiplexing, while the existing barcode approach only permits simultaneous analysis of up to several dozen samples. Here we introduce pair-barcode sequencing (PBS), an economic and flexible barcoding technique that permits parallel analysis of large-scale multiplexed samples. In two pilot runs using a SOLiD sequencer (Applied Biosystems Inc.), 32 independent pair-barcoded miRNA libraries were analyzed simultaneously by the combination of 4 unique forward barcodes and 8 unique reverse barcodes. Over 174,000,000 reads were generated and about 64% of them were assigned to both barcodes. After mapping all reads to pre-miRNAs in miRBase, different miRNA expression patterns were captured from the two clinical groups. The strong correlation across different barcode pairs and the high consistency of miRNA expression in two independent runs demonstrate that the PBS approach is valid. By employing the PBS approach in NGS, large-scale multiplexed pooled samples can be analyzed in parallel, so that high-throughput sequencing economically meets the needs of samples with low sequencing-throughput demands.
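The essence of pair-barcode demultiplexing is to read a forward and a reverse barcode from each read and map the combination to one of the 4 x 8 = 32 libraries; a simplified sketch follows. Barcode sequences, barcode length and read layout are invented for illustration and do not correspond to the published design.

    from itertools import product

    # Hypothetical 4 forward and 8 reverse barcodes (the real sequences differ).
    FWD = ["ACGT", "TGCA", "GATC", "CTAG"]
    REV = ["AACC", "GGTT", "ACAC", "TGTG", "CAGT", "GTCA", "TCCG", "CGGA"]
    LIBRARY = {pair: i for i, pair in enumerate(product(FWD, REV))}  # 32 libraries

    def assign_read(read: str):
        """Library index for a read with the forward barcode at the 5' end and
        the reverse barcode at the 3' end; None if either barcode is unknown."""
        return LIBRARY.get((read[:4], read[-4:]))

    read = "ACGT" + "TTGCAGGCTAAGCGT" + "GGTT"   # insert flanked by two barcodes
    print(assign_read(read))                     # library 1 in this toy scheme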
Zhang, Bing; Schmoyer, Denise; Kirov, Stefan; Snoddy, Jay
2004-01-01
Background Microarray and other high-throughput technologies are producing large sets of interesting genes that are difficult to analyze directly. Bioinformatics tools are needed to interpret the functional information in the gene sets. Results We have created a web-based tool for data analysis and data visualization for sets of genes called GOTree Machine (GOTM). This tool was originally intended to analyze sets of co-regulated genes identified from microarray analysis but is adaptable for use with other gene sets from other high-throughput analyses. GOTree Machine generates a GOTree, a tree-like structure to navigate the Gene Ontology Directed Acyclic Graph for input gene sets. This system provides user-friendly data navigation and visualization. Statistical analysis helps users to identify the most important Gene Ontology categories for the input gene sets and suggests biological areas that warrant further study. GOTree Machine is available online. Conclusion GOTree Machine has broad application in functional genomics, proteomics and other high-throughput methods that generate large sets of interesting genes; its primary purpose is to help users sort for interesting patterns in gene sets. PMID:14975175
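The statistical step that flags "the most important Gene Ontology categories" in tools of this kind is typically a hypergeometric (one-sided Fisher) enrichment test of the input gene set against each category; the generic scipy sketch below mirrors that idea and is not GOTM's exact implementation.

    from scipy.stats import hypergeom

    def go_enrichment_p(n_genome, n_in_category, n_input, n_overlap):
        """P(observing >= n_overlap category genes in the input set by chance)."""
        return hypergeom.sf(n_overlap - 1, n_genome, n_in_category, n_input)

    # Toy numbers: 20,000 annotated genes, a GO category of 300 genes,
    # an input list of 150 genes of which 12 fall in the category.
    print(go_enrichment_p(20000, 300, 150, 12))   # small p suggests enrichment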
Optical characterization of ultra-sensitive TES bolometers for SAFARI
NASA Astrophysics Data System (ADS)
Audley, Michael D.; de Lange, Gerhard; Gao, Jian-Rong; Khosropanah, Pourya; Mauskopf, Philip D.; Morozov, Dmitry; Trappe, Neil A.; Doherty, Stephen; Withington, Stafford
2014-07-01
We have characterized the optical response of prototype detectors for SAFARI, the far-infrared imaging spectrometer for the SPICA satellite. SAFARI's three bolometer arrays will image a 2'×2' field of view with spectral information over the wavelength range 34—210 μm. SAFARI requires extremely sensitive detectors (goal NEP ~ 0.2 aW/√Hz), with correspondingly low saturation powers (~5 fW), to take advantage of SPICA's cooled optics. We have constructed an ultra-low background optical test facility containing an internal cold black-body illuminator and have recently added an internal hot black-body source and a light-pipe for external illumination. We illustrate the performance of the test facility with results including spectral-response measurements. Based on an improved understanding of the optical throughput of the test facility we find an optical efficiency of 60% for prototype SAFARI detectors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moretti, Rocco; Chang, Aram; Peltier-Pain, Pauline
2012-03-15
Directed evolution is a valuable technique to improve enzyme activity in the absence of a priori structural knowledge, which can be typically enhanced via structure-guided strategies. In this study, a combination of both whole-gene error-prone polymerase chain reaction and site-saturation mutagenesis enabled the rapid identification of mutations that improved RmlA activity toward non-native substrates. These mutations have been shown to improve activities over 10-fold for several targeted substrates, including non-native pyrimidine- and purine-based NTPs as well as non-native D- and L-sugars (both α- and β-isomers). This study highlights the first broadly applicable high-throughput sugar-1-phosphate nucleotidyltransferase screen and the first proof of concept for the directed evolution of this enzyme class toward the identification of uniquely permissive RmlA variants.
Saturation of energetic-particle-driven geodesic acoustic modes due to wave-particle nonlinearity
NASA Astrophysics Data System (ADS)
Biancalani, A.; Chavdarovski, I.; Qiu, Z.; Bottino, A.; Del Sarto, D.; Ghizzo, A.; Gürcan, Ö.; Morel, P.; Novikau, I.
2017-12-01
The nonlinear dynamics of energetic-particle (EP) driven geodesic acoustic modes (EGAM) is investigated here. A numerical analysis with the global gyrokinetic particle-in-cell code ORB5 is performed, and the results are interpreted with the analytical theory, in close comparison with the theory of the beam-plasma instability. Only axisymmetric modes are considered, with a nonlinear dynamics determined by wave-particle interaction. Quadratic scalings of the saturated electric field with respect to the linear growth rate are found for the case of interest. As a main result, the formula for the saturation level is provided. Near the saturation, we observe a transition from adiabatic to non-adiabatic dynamics, i.e. the frequency chirping rate becomes comparable to the resonant EP bounce frequency. The numerical analysis is performed here with electrostatic simulations with circular flux surfaces, and kinetic effects of the electrons are neglected.
Tsotsou, Georgia Eleni; Cass, Anthony Edward George; Gilardi, Gianfranco
2002-01-01
A rapid method for identifying compounds that are potential substrates for the drug metabolising enzyme cytochrome P450 is described. The strategy is based on the detection of a degradation product of NAD(P)H oxidation during substrate turnover by the enzyme expressed in Escherichia coli cells spontaneously lysed under the experimental conditions. The performance of the method has been tested on two known substrates of the wild-type cytochrome P450 BM3, arachidonic (AA) and lauric (LA) acids, and two substrates with environmental significance, the anionic surfactant sodium dodecyl sulfate (SDS), and the solvent 1,1,2,2-tetrachloroethane (TCE). The minimal background signal given from cells expressing cytochrome P450 BM3 in the absence of added substrate is only 3% of the signal in the presence of saturating substrate. Control experiments have proven that this method is specifically detecting NADPH oxidation by catalytic turnover of P450 BM3. The assay has been adapted to a microtitre plate format and used to screen a series of furazan derivatives as potential substrates. Three derivatives were identified as substrates. The method gave a significant different signal for two isomeric furazan derivatives. All results found on the cell lysate were verified and confirmed with the purified enzyme. This strategy opens the way to automated high throughput screening of NAD(P)H-linked enzymatic activity of molecules of pharmacological and biotechnological interest and libraries of random mutants of NAD(P)H-dependent biocatalysts.
Cryosorption Pumps for a Neutral Beam Injector Test Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dremel, M.; Mack, A.; Day, C.
2006-04-27
We present experience from the manufacturing and operation of a system of two identical cryosorption pumps used in a neutral beam injector test facility for fusion reactors. Calculated and measured heat loads of the cryogenic liquid helium and liquid nitrogen circuits of the cryosorption pumps are discussed. The design calculations concerning the thermo-hydraulics of the helium circuit are compared with experience from the operation of the cryosorption pumps. Both cryopumps are integrated in a test facility of a neutral beam injector that will be used to heat the plasma of a nuclear fusion reactor with a beam of deuterium or hydrogen molecules. The huge gas throughput into the vessel of the test facility places demanding requirements on the cryopumping system. The cryosorption pumps are designed to pump a hydrogen throughput of 20-30 mbar·l/s. To establish a mean pressure of several 10^-5 mbar in the test vessel, a pumping speed of about 350 m3/s per pump is needed. The pressure conditions must be maintained over several hours of pumping without regeneration of the cryopanels, which necessitates a very high pumping capacity. One possibility to fulfill these requirements is the use of charcoal-coated cryopanels to pump the gas loads by adsorption. For cooling of the cryopanels, liquid helium at saturation pressure is used, and therefore a two-phase forced flow in the cryopump system must be controlled.
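The quoted figures are tied together by the steady-state relation p = Q / S between gas throughput, pumping speed and pressure; the short check below, using the numbers from the abstract, shows why two pumps of about 350 m3/s each are required.

    Q = 25.0             # hydrogen throughput, mbar*l/s (middle of the 20-30 range)
    S_per_pump = 350e3   # pumping speed per pump, l/s (350 m3/s)

    for n_pumps in (1, 2):
        p = Q / (n_pumps * S_per_pump)   # steady-state pressure, mbar
        print(f"{n_pumps} pump(s): p = {p:.1e} mbar")
    # With both pumps the vessel sits in the requested several-10^-5 mbar regime.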
Orgovan, Norbert; Peter, Beatrix; Bősze, Szilvia; Ramsden, Jeremy J; Szabó, Bálint; Horvath, Robert
2014-02-07
A novel high-throughput label-free resonant waveguide grating (RWG) imager biosensor, the Epic® BenchTop (BT), was utilized to determine the dependence of cell spreading kinetics on the average surface density (v(RGD)) of integrin ligand RGD-motifs. v(RGD) was tuned over four orders of magnitude by co-adsorbing the biologically inactive PLL-g-PEG and the RGD-functionalized PLL-g-PEG-RGD synthetic copolymers from their mixed solutions onto the sensor surface. Using highly adherent human cervical tumor (HeLa) cells as a model system, cell adhesion kinetic data of unprecedented quality were obtained. Spreading kinetics were fitted with the logistic equation to obtain the spreading rate constant (r) and the maximum biosensor response (Δλmax), which is assumed to be directly proportional to the maximum spread contact area (Amax). r was found to be independent of the surface density of integrin ligands. In contrast, Δλmax increased with increasing RGD surface density until saturation at high densities. Interpreting the latter behavior with a simple kinetic mass action model, a 2D dissociation constant of 1753 ± 243 μm^-2 (corresponding to a 3D dissociation constant of ~30 μM) was obtained for the binding between RGD-specific integrins embedded in the cell membrane and PLL-g-PEG-RGD. All of these results were obtained completely noninvasively without using any labels.
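The spreading curves were fitted with the logistic equation to obtain r and the maximum response; a minimal curve fit of that form on synthetic data is sketched below with scipy. The parameterization and the numbers are assumptions for illustration and may differ from the paper's exact fit.

    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, dl_max, r, t_half):
        """Logistic spreading curve: biosensor wavelength shift versus time."""
        return dl_max / (1.0 + np.exp(-r * (t - t_half)))

    # Synthetic kinetics: time in minutes, response in picometres, plus noise.
    t = np.linspace(0, 120, 61)
    rng = np.random.default_rng(3)
    data = logistic(t, 900.0, 0.08, 45.0) + rng.normal(0, 15, t.size)

    (dl_max, r, t_half), _ = curve_fit(logistic, t, data, p0=[800.0, 0.05, 40.0])
    print(f"dl_max ~ {dl_max:.0f} pm, r ~ {r:.3f} 1/min")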
Huang, Rui; Chen, Hui; Zhong, Chao; ...
2016-09-02
Coenzyme engineering that changes NAD(P) selectivity of redox enzymes is an important tool in metabolic engineering, synthetic biology, and biocatalysis. Here we developed a high-throughput screening method to identify mutants of 6-phosphogluconate dehydrogenase (6PGDH) from a thermophilic bacterium Moorella thermoacetica with reversed coenzyme selectivity from NADP+ to NAD+. Colonies of a 6PGDH mutant library growing on the agar plates were treated by heat to minimize the background noise, that is, the deactivation of intracellular dehydrogenases, degradation of inherent NAD(P)H, and disruption of cell membrane. The melted agarose solution containing a redox dye tetranitroblue tetrazolium (TNBT), phenazine methosulfate (PMS), NAD+, and 6-phosphogluconate was carefully poured on colonies, forming a second semi-solid layer. More active 6PGDH mutants were examined via an enzyme-linked TNBT-PMS colorimetric assay. Positive mutants were recovered by direct extraction of plasmid from dead cell colonies followed by plasmid transformation into E. coli TOP10. By utilizing this double-layer screening method, six positive mutants were obtained from two-round saturation mutagenesis. The best mutant 6PGDH A30D/R31I/T32I exhibited a 4,278-fold reversal of coenzyme selectivity from NADP+ to NAD+. Furthermore, this screening method could be widely used to detect numerous redox enzymes, particularly for thermophilic ones, which can generate NAD(P)H reacted with the redox dye TNBT.
Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N
2017-01-01
Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
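The differential expression step of such a workflow reduces to a per-feature test between groups followed by multiple-testing correction; the toy sketch below uses a t-test with Benjamini-Hochberg adjustment on a simulated expression matrix. Production pipelines would use dedicated moderated or count-based models instead, so treat this purely as an outline of the logic.

    import numpy as np
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    rng = np.random.default_rng(7)
    n_genes, n_per_group = 1000, 6
    group_a = rng.normal(8.0, 1.0, (n_genes, n_per_group))   # log2 expression
    group_b = rng.normal(8.0, 1.0, (n_genes, n_per_group))
    group_b[:50] += 2.0                                      # spike in 50 true hits

    t_stat, p = stats.ttest_ind(group_a, group_b, axis=1)
    reject, q_values, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
    print("features called differential at 5% FDR:", int(reject.sum()))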
Recent advances in quantitative high throughput and high content data analysis.
Moutsatsos, Ioannis K; Parker, Christian N
2016-01-01
High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
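One of the basic per-plate quality-control statistics used when monitoring arrayed screens is the Z'-factor computed from positive and negative controls; a short example with simulated control wells follows as a concrete illustration of the kind of metric such tools report.

    import numpy as np

    def z_prime(pos_controls, neg_controls):
        """Z'-factor: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
        pos, neg = np.asarray(pos_controls), np.asarray(neg_controls)
        return 1.0 - 3.0 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

    rng = np.random.default_rng(11)
    pos = rng.normal(100.0, 6.0, 32)   # e.g. wells treated with a reference inhibitor
    neg = rng.normal(10.0, 5.0, 32)    # e.g. untreated wells
    print(f"Z' = {z_prime(pos, neg):.2f}")   # above ~0.5 is commonly taken as robust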
Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes
2012-01-01
Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution and high-throughput image-processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
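A simplified stand-in for the boundary-determination step described above, assuming a one-dimensional intensity profile along the antero-posterior axis: fit a smoothing spline and report where the smoothed profile crosses half of its maximum (the published pipeline is more involved):

```python
# Sketch: locate an expression-domain boundary along the A-P axis as the position
# where a spline-smoothed 1-D profile crosses half of its maximum value.
import numpy as np
from scipy.interpolate import UnivariateSpline

def boundary_positions(x_percent, intensity, smoothing=0.05):
    # the smoothing factor should be tuned to the noise level of the profile
    spline = UnivariateSpline(x_percent, intensity, s=smoothing)
    fine_x = np.linspace(x_percent.min(), x_percent.max(), 2000)
    profile = spline(fine_x)
    half_max = 0.5 * profile.max()
    crossings = np.where(np.diff(np.sign(profile - half_max)))[0]
    return fine_x[crossings]                      # % egg length at half-max crossings

# Synthetic anterior domain with a posterior boundary near 40% egg length
x = np.linspace(0, 100, 101)
y = 1.0 / (1.0 + np.exp((x - 40.0) / 3.0))
y += 0.02 * np.random.default_rng(1).normal(size=x.size)
print(boundary_positions(x, y))
```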
NASA Astrophysics Data System (ADS)
Lin, W.; Noormets, A.; domec, J.; King, J. S.; Sun, G.; McNulty, S.
2012-12-01
Wood stable isotope ratios (δ13C and δ18O) offer insight to water source and plant water use efficiency (WUE), which in turn provide a glimpse to potential plant responses to changing climate, particularly rainfall patterns. The synthetic pathways of cell wall deposition in wood rings differ in their discrimination ratios between the light and heavy isotopes, and α-cellulose is broadly seen as the best indicator of plant water status due to its local and temporal fixation and to its high abundance within the wood. To use the effects of recent severe droughts on the WUE of loblolly pine (Pinus taeda) throughout Southeastern USA as a harbinger of future changes, an effort has been undertaken to sample the entire range of the species and to sample the isotopic composition in a consistent manner. To be able to accommodate the large number of samples required by this analysis, we have developed a new high-throughput method for α-cellulose extraction, which is the rate-limiting step in such an endeavor. Although an entire family of methods has been developed and performs well, their throughput in a typical research lab setting is limited to 16-75 samples per week with intensive labor input. The resin exclusion step in conifers is particularly time-consuming. We have combined the recent advances of α-cellulose extraction in plant ecology and wood science, including a high-throughput extraction device developed in the Potsdam Dendro Lab and a simple chemical-based resin exclusion method. Transferring the entire extraction process to a multiport-based system allows throughputs of up to several hundred samples in two weeks, while minimizing labor requirements to 2-3 days per batch of samples.
Scaling and automation of a high-throughput single-cell-derived tumor sphere assay chip.
Cheng, Yu-Heng; Chen, Yu-Chih; Brien, Riley; Yoon, Euisik
2016-10-07
Recent research suggests that cancer stem-like cells (CSCs) are the key subpopulation for tumor relapse and metastasis. Due to cancer plasticity in surface antigen and enzymatic activity markers, functional tumorsphere assays are promising alternatives for CSC identification. To reliably quantify rare CSCs (1-5%), thousands of single-cell suspension cultures are required. While microfluidics is a powerful tool in handling single cells, previous works provide limited throughput and lack the automatic data analysis capability required for high-throughput studies. In this study, we present the scaling and automation of high-throughput single-cell-derived tumor sphere assay chips, facilitating the tracking of up to ∼10 000 cells on a chip with ∼76.5% capture rate. The presented cell capture scheme guarantees sampling a representative population from the bulk cells. To analyze thousands of single cells with a variety of fluorescent intensities, a highly adaptable analysis program was developed for cell/sphere counting and size measurement. Using a Pluronic® F108 (poly(ethylene glycol)-block-poly(propylene glycol)-block-poly(ethylene glycol)) coating on polydimethylsiloxane (PDMS), a suspension culture environment was created to test a controversial hypothesis: whether larger or smaller cells are more stem-like, as defined by the capability to form single-cell-derived spheres. Different cell lines showed different correlations between sphere formation rate and initial cell size, suggesting heterogeneity in pathway regulation among breast cancer cell lines. More interestingly, by monitoring hundreds of spheres, we identified heterogeneity in sphere growth dynamics, indicating cellular heterogeneity even within CSCs. These preliminary results highlight the power of unprecedented high throughput and automation in CSC studies.
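A bare-bones illustration of the counting and sizing task handled by the authors' analysis program, assuming a single fluorescence image and simple global thresholding (the published software adapts to a variety of fluorescent intensities):

```python
# Sketch: count spheres and measure their sizes with Otsu thresholding and
# connected-component labeling (scikit-image). A stand-in for a full pipeline.
import numpy as np
from skimage.filters import gaussian, threshold_otsu
from skimage.measure import label, regionprops

def count_and_size(image, min_area_px=50):
    smoothed = gaussian(image, sigma=2)
    mask = smoothed > threshold_otsu(smoothed)
    regions = [r for r in regionprops(label(mask)) if r.area >= min_area_px]
    return len(regions), [r.equivalent_diameter for r in regions]

# Synthetic image with two bright sphere-like blobs
yy, xx = np.mgrid[0:200, 0:200]
img = np.exp(-((yy - 60) ** 2 + (xx - 60) ** 2) / (2 * 8.0 ** 2))
img += np.exp(-((yy - 140) ** 2 + (xx - 150) ** 2) / (2 * 12.0 ** 2))
n_spheres, diameters_px = count_and_size(img)
print(n_spheres, diameters_px)
```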
Suwattanaphim, Suparach; Yodavuhd, Sirisanpang; Puangsa-art, Supalarp
2015-07-01
Oxygen saturation is an important indicator of patient status and is applied worldwide in several situations. Evaluation of the infant's status in the immediate perinatal period usually relies on clinical assessment with Apgar scoring, which has long been used without other objective measurement. Pulse oximetry, the non-invasive measurement of oxygen saturation, may play a role in evaluating oxygen saturation in newborns as they change abruptly from the intrauterine to the extrauterine environment. The aim was to monitor the time that infants born by normal labor or Cesarean section take to achieve the target oxygen saturation (SpO2) and to identify other factors that influence oxygen saturation adaptation. Data were collected for the 553 infants born at Charoenkrung Pracharak Hospital, Bangkok, Thailand, between October 2012 and April 2013. The 204 healthy newborns who met all criteria were studied. Pulse oximetry was recorded in all infants from the second to the tenth minute after birth. They were grouped by several factors such as maternal gravidity, gestational age, mode of delivery, Apgar score, birth weight, and sex. The time interval to achieve the target oxygen saturation (SpO2 ≥ 90%) was collected for analysis. The oxygen saturation of infants increased in the minutes immediately after birth. The median time interval was 6.5 (2-10) minutes to 90% saturation and 7 (2-10) minutes to 95% saturation. Only mode of delivery showed a statistically significant time difference (p < 0.001). A Cox proportional hazards analysis of the Kaplan-Meier curves demonstrated that infants born by cesarean delivery took significantly longer to reach a stable SpO2 ≥ 90% than infants born by vaginal delivery (95% CI = 1.28 to 2.74; p < 0.01). A newly born infant takes a median of 6.5 minutes (range 2-10) after birth to adjust its oxygen saturation to the higher level normal for extrauterine life, a median SpO2 of 90%. Furthermore, mode of delivery makes a significant difference in oxygen saturation status; the cesarean route takes significantly longer than the vaginal route to achieve SpO2 ≥ 90%.
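A minimal sketch of the type of time-to-event analysis reported above (Kaplan-Meier estimates compared with a Cox proportional hazards model), using the lifelines package on an invented miniature data set; the column names are assumptions:

```python
# Sketch: time to reach SpO2 >= 90% compared between delivery modes with
# Kaplan-Meier fits and a Cox proportional hazards model (lifelines).
# The data frame below is invented for illustration only.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

df = pd.DataFrame({
    "minutes_to_spo2_90": [4, 5, 6, 6, 7, 7, 8, 8, 9, 10],
    "reached":            [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],   # 0 = censored at 10 min
    "cesarean":           [0, 0, 0, 0, 1, 0, 1, 1, 1, 1],   # mode of delivery
})

kmf = KaplanMeierFitter()
for mode, sub in df.groupby("cesarean"):
    kmf.fit(sub["minutes_to_spo2_90"], event_observed=sub["reached"],
            label=f"cesarean={mode}")
    print(f"cesarean={mode}: median {kmf.median_survival_time_} min")

cph = CoxPHFitter()
cph.fit(df, duration_col="minutes_to_spo2_90", event_col="reached")
cph.print_summary()
```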
Heinig, Uwe; Scholz, Susanne; Dahm, Pia; Grabowy, Udo; Jennewein, Stefan
2010-08-01
Classical approaches to strain improvement and metabolic engineering rely on rapid qualitative and quantitative analyses of the metabolites of interest. As an analytical tool, mass spectrometry (MS) has proven to be efficient and nearly universally applicable for timely screening of metabolites. Furthermore, gas chromatography (GC)/MS- and liquid chromatography (LC)/MS-based metabolite screens can often be adapted to high-throughput formats. We recently engineered a Saccharomyces cerevisiae strain to produce taxa-4(5),11(12)-diene, the first pathway-committing biosynthetic intermediate for the anticancer drug Taxol, through the heterologous and homologous expression of several genes related to isoprenoid biosynthesis. To date, GC/MS- and LC/MS-based high-throughput methods have been inherently difficult to adapt to the screening of isoprenoid-producing microbial strains due to the need for extensive sample preparation of these often highly lipophilic compounds. In the current work, we examined different approaches to the high-throughput analysis of taxa-4(5),11(12)-diene biosynthesizing yeast strains in a 96-deep-well format. Carbon plasma coating of standard 96-deep-well polypropylene plates allowed us to circumvent the inherent solvent instability of commonly used deep-well plates. In addition, efficient adsorption of the target isoprenoid product by the coated plates allowed rapid and simple qualitative and quantitative analyses of the individual cultures. Copyright 2010 Elsevier Inc. All rights reserved.
Subnuclear foci quantification using high-throughput 3D image cytometry
NASA Astrophysics Data System (ADS)
Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.
2015-07-01
Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are therefore limited to counting a low number of foci per cell (about 5 foci per nucleus) because the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that, while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
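A compact sketch of the extended-maxima idea on a 3-D stack, assuming scikit-image's h-maxima transform followed by connected-component labeling stands in for the full preprocessing and quantification pipeline:

```python
# Sketch: count bright foci in a 3-D stack with an h-maxima (extended maxima)
# transform and connected-component labeling. Preprocessing steps are omitted.
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import h_maxima

def count_foci(stack, h=0.2):
    maxima = h_maxima(stack, h)            # binary mask of maxima of height >= h
    _, n_foci = ndi.label(maxima)          # each connected maximum counts as one focus
    return n_foci

# Synthetic 3-D nucleus with three Gaussian foci
z, y, x = np.mgrid[0:32, 0:64, 0:64]
stack = np.zeros(z.shape, dtype=float)
for cz, cy, cx in [(10, 20, 20), (16, 40, 30), (22, 30, 50)]:
    stack += np.exp(-((z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2) / (2 * 2.0 ** 2))
print(count_foci(stack))                   # -> 3
```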
Broadband ion mobility deconvolution for rapid analysis of complex mixtures.
Pettit, Michael E; Brantley, Matthew R; Donnarumma, Fabrizio; Murray, Kermit K; Solouki, Touradj
2018-05-04
High resolving power ion mobility (IM) allows for accurate characterization of complex mixtures in high-throughput IM mass spectrometry (IM-MS) experiments. We previously demonstrated that pure component IM-MS data can be extracted from IM unresolved post-IM/collision-induced dissociation (CID) MS data using automated ion mobility deconvolution (AIMD) software [Matthew Brantley, Behrooz Zekavat, Brett Harper, Rachel Mason, and Touradj Solouki, J. Am. Soc. Mass Spectrom., 2014, 25, 1810-1819]. In our previous reports, we utilized a quadrupole ion filter for m/z-isolation of IM unresolved monoisotopic species prior to post-IM/CID MS. Here, we utilize a broadband IM-MS deconvolution strategy to remove the m/z-isolation requirement for successful deconvolution of IM unresolved peaks. Broadband data collection has throughput and multiplexing advantages; hence, elimination of the ion isolation step reduces experimental run times and thus expands the applicability of AIMD to high-throughput bottom-up proteomics. We demonstrate broadband IM-MS deconvolution of two separate and unrelated pairs of IM unresolved isomers (viz., a pair of isomeric hexapeptides and a pair of isomeric trisaccharides) in a simulated complex mixture. Moreover, we show that broadband IM-MS deconvolution improves high-throughput bottom-up characterization of a proteolytic digest of rat brain tissue. To our knowledge, this manuscript is the first to report successful deconvolution of pure component IM and MS data from an IM-assisted data-independent analysis (DIA) or HDMSE dataset.
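The deconvolution itself is specific to the cited AIMD work, but the underlying idea of resolving an IM-unresolved arrival-time distribution into overlapping components can be illustrated with a toy least-squares fit of two Gaussians (not the AIMD algorithm):

```python
# Toy illustration only: resolve an arrival-time distribution (ATD) containing two
# unresolved components by fitting a sum of two Gaussians. This is not the AIMD
# algorithm; it merely shows the idea of recovering overlapping IM components.
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(t, a1, t1, w1, a2, t2, w2):
    return (a1 * np.exp(-((t - t1) ** 2) / (2 * w1 ** 2)) +
            a2 * np.exp(-((t - t2) ** 2) / (2 * w2 ** 2)))

t = np.linspace(0, 20, 400)                                  # arrival time, ms
atd = two_gaussians(t, 1.0, 9.0, 0.8, 0.7, 10.5, 0.8)
atd += 0.02 * np.random.default_rng(2).normal(size=t.size)   # detector noise

p0 = [1.0, 8.5, 1.0, 1.0, 11.0, 1.0]                         # rough initial guesses
popt, _ = curve_fit(two_gaussians, t, atd, p0=p0)
print("fitted centroids (ms):", round(popt[1], 2), round(popt[4], 2))
```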
The US EPA ToxCast Program: Moving from Data Generation ...
The U.S. EPA ToxCast program is entering its tenth year. Significant learning and progress have occurred towards collection, analysis, and interpretation of the data. The library of ~1,800 chemicals has been subject to ongoing characterization (e.g., identity, purity, stability) and is unique in its scope, structural diversity, and use scenarios making it ideally suited to investigate the underlying molecular mechanisms of toxicity. The ~700 high-throughput in vitro assay endpoints cover 327 genes and 293 pathways as well as other integrated cellular processes and responses. The integrated analysis of high-throughput screening data has shown that most environmental and industrial chemicals are very non-selective in the biological targets they perturb, while a small subset of chemicals are relatively selective for specific biological targets. The selectivity of a chemical informs interpretation of the screening results while also guiding future mode-of-action or adverse outcome pathway approaches. Coupling the high-throughput in vitro assays with medium-throughput pharmacokinetic assays and reverse dosimetry allows conversion of the potency estimates to an administered dose. Comparison of the administered dose to human exposure provides a risk-based context. The lessons learned from this effort will be presented and discussed towards application to chemical safety decision making and the future of the computational toxicology program at the U.S. EPA. SOT pr
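The reverse-dosimetry step mentioned above is often approximated, in the high-throughput toxicokinetics literature, by dividing an in vitro potency by the steady-state plasma concentration predicted for a unit daily dose; the sketch below uses that simplification with made-up numbers and is not a description of the program's actual models:

```python
# Illustrative reverse dosimetry: convert an in vitro potency (AC50, in uM) into an
# oral equivalent dose (mg/kg/day) using the steady-state plasma concentration (Css)
# predicted for a 1 mg/kg/day exposure. Numbers are hypothetical; real workflows use
# measured hepatic clearance, plasma protein binding, and population variability.
def oral_equivalent_dose(ac50_um, css_um_per_unit_dose):
    return ac50_um / css_um_per_unit_dose          # mg/kg/day

ac50 = 3.0     # uM, hypothetical assay potency
css = 1.5      # uM reached at steady state per 1 mg/kg/day, hypothetical
print(oral_equivalent_dose(ac50, css), "mg/kg/day")
```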
Stepping into the omics era: Opportunities and challenges for biomaterials science and engineering☆
Rabitz, Herschel; Welsh, William J.; Kohn, Joachim; de Boer, Jan
2016-01-01
The research paradigm in biomaterials science and engineering is evolving from using low-throughput and iterative experimental designs towards high-throughput experimental designs for materials optimization and the evaluation of materials properties. Computational science plays an important role in this transition. With the emergence of the omics approach in the biomaterials field, referred to as materiomics, high-throughput approaches hold the promise of tackling the complexity of materials and understanding correlations between material properties and their effects on complex biological systems. The intrinsic complexity of biological systems is an important factor that is often oversimplified when characterizing biological responses to materials and establishing property-activity relationships. Indeed, in vitro tests designed to predict in vivo performance of a given biomaterial are largely lacking as we are not able to capture the biological complexity of whole tissues in an in vitro model. In this opinion paper, we explain how we reached our opinion that converging genomics and materiomics into a new field would enable a significant acceleration of the development of new and improved medical devices. The use of computational modeling to correlate high-throughput gene expression profiling with high throughput combinatorial material design strategies would add power to the analysis of biological effects induced by material properties. We believe that this extra layer of complexity on top of high-throughput material experimentation is necessary to tackle the biological complexity and further advance the biomaterials field. PMID:26876875
Besaratinia, Ahmad; Li, Haiqing; Yoon, Jae-In; Zheng, Albert; Gao, Hanlin; Tommasi, Stella
2012-01-01
Many carcinogens leave a unique mutational fingerprint in the human genome. These mutational fingerprints manifest as specific types of mutations often clustering at certain genomic loci in tumor genomes from carcinogen-exposed individuals. To develop a high-throughput method for detecting the mutational fingerprint of carcinogens, we have devised a cost-, time- and labor-effective strategy, in which the widely used transgenic Big Blue® mouse mutation detection assay is made compatible with the Roche/454 Genome Sequencer FLX Titanium next-generation sequencing technology. As proof of principle, we have used this novel method to establish the mutational fingerprints of three prominent carcinogens with varying mutagenic potencies, including sunlight ultraviolet radiation, 4-aminobiphenyl and secondhand smoke that are known to be strong, moderate and weak mutagens, respectively. For verification purposes, we have compared the mutational fingerprints of these carcinogens obtained by our newly developed method with those obtained by parallel analyses using the conventional low-throughput approach, that is, standard mutation detection assay followed by direct DNA sequencing using a capillary DNA sequencer. We demonstrate that this high-throughput next-generation sequencing-based method is highly specific and sensitive to detect the mutational fingerprints of the tested carcinogens. The method is reproducible, and its accuracy is comparable with that of the currently available low-throughput method. In conclusion, this novel method has the potential to move the field of carcinogenesis forward by allowing high-throughput analysis of mutations induced by endogenous and/or exogenous genotoxic agents. PMID:22735701
Group type analysis of asphalt by column liquid chromatography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, C.; Yang, J.; Xue, Y.
2008-07-01
An improved analysis method for the characterization of asphalt was established. The method is based on a column chromatography technique. The asphalts were quantitatively separated into four groups: saturates, aromatics, resins, and asphaltenes. About 0.1 g of sample was required for each analysis. About 20 mL of n-heptane was used to separate out the saturates first. Then about 35 mL of an n-heptane/dichloromethane (.5, v/v) mixture was used to separate out the aromatics. About 30 mL of a dichloromethane/tetrahydrofuran (1/3, v/v) mixture was used to separate out the resins. The quality of the separation was confirmed by infrared (IR) and ¹H NMR analysis. The model compounds tetracosane for saturates, dibenz(o)anthracene for aromatics, and acetanilide for resins were used for verification. The IR and ¹H NMR analyses of the fractions prepared by column liquid chromatography were in good agreement with those of the pure reagents.
Droplet Microarray Based on Superhydrophobic-Superhydrophilic Patterns for Single Cell Analysis.
Jogia, Gabriella E; Tronser, Tina; Popova, Anna A; Levkin, Pavel A
2016-12-09
Single-cell analysis provides fundamental information on individual cell responses to different environmental cues and is of growing interest in cancer and stem cell research. However, existing methods still face challenges in performing such analysis in a high-throughput manner whilst being cost-effective. Here we established the Droplet Microarray (DMA) as a miniaturized screening platform for high-throughput single-cell analysis. Using the method of limiting dilution and varying cell density and seeding time, we optimized the distribution of single cells on the DMA. We established culturing conditions for single cells in individual droplets on the DMA, obtaining survival of nearly 100% of single cells and a doubling time comparable with that of cells cultured in bulk populations using conventional methods. Our results demonstrate that the DMA is a suitable platform for single-cell analysis, which carries a number of advantages compared with existing technologies, allowing for treatment, staining and spot-to-spot analysis of single cells over time using conventional analysis methods such as microscopy.
NASA Astrophysics Data System (ADS)
LaForce, T.; Ennis-King, J.; Paterson, L.
2013-12-01
Residual CO2 saturation is a critically important parameter in CO2 storage as it can have a large impact on the available secure storage volume and post-injection CO2 migration. A suite of single-well tests to measure residual trapping was conducted at the Otway test site in Victoria, Australia during 2011. One or more of these tests could be conducted at a prospective CO2 storage site before large-scale injection. The test involved injection of 150 tonnes of pure carbon dioxide followed by 454 tonnes of CO2-saturated formation water to drive the carbon dioxide to residual saturation. This work presents a brief overview of the full test sequence, followed by the analysis and interpretation of the tests using noble gas tracers. Prior to CO2 injection, krypton (Kr) and xenon (Xe) tracers were injected and back-produced to characterise the aquifer under single-phase conditions. After the CO2 had been driven to residual saturation, the two tracers were injected and produced again. The noble gases act as non-partitioning aqueous-phase tracers in the undisturbed aquifer and as partitioning tracers in the presence of residual CO2. To estimate residual saturation from the tracer test data, a one-dimensional radial model of the near-well region is used. In the model there are only two independent parameters: the apparent dispersivity of each tracer and the residual CO2 saturation. Independent analysis of the Kr and Xe tracer production curves gives the same estimate of residual saturation to within the accuracy of the method. Furthermore, the residual saturation from the noble gas tracer tests is consistent with other measurements in the sequence of tests.
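For orientation, a standard relation for partitioning-tracer tests (given here as a textbook simplification, not necessarily the exact near-well model used in the study) links the tracer retardation factor to the residual saturation; a small sketch:

```python
# Textbook partitioning-tracer relation: R = 1 + K * S_r / (1 - S_r), where R is the
# retardation of the partitioning tracer relative to single-phase conditions, K its
# CO2/water partition coefficient, and S_r the residual CO2 saturation. Invert for S_r.
def residual_saturation(retardation, partition_coefficient):
    return (retardation - 1.0) / (retardation - 1.0 + partition_coefficient)

# Hypothetical values: a tracer retarded by a factor of 2.2 with K ~ 15
print(residual_saturation(2.2, 15.0))   # ~0.074, i.e. roughly 7% residual saturation
```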
Capacity-Delay Trade-Off in Collaborative Hybrid Ad-Hoc Networks with Coverage Sensing.
Chen, Lingyu; Luo, Wenbin; Liu, Chen; Hong, Xuemin; Shi, Jianghong
2017-01-26
The integration of ad hoc device-to-device (D2D) communications and open-access small cells can result in a networking paradigm called the hybrid ad hoc network, which is particularly promising in delivering delay-tolerant data. The capacity-delay performance of hybrid ad hoc networks has been studied extensively under a popular framework called scaling law analysis. These studies, however, do not take into account aspects of interference accumulation and queueing delay and, therefore, may lead to over-optimistic results. Moreover, focusing on the average measures, existing works fail to give finer-grained insights into the distribution of delays. This paper proposes an alternative analytical framework based on queueing theoretic models and physical interference models. We apply this framework to study the capacity-delay performance of a collaborative cellular D2D network with coverage sensing and two-hop relay. The new framework allows us to fully characterize the delay distribution in the transform domain and pinpoint the impacts of coverage sensing, user and base station densities, transmit power, user mobility and packet size on the capacity-delay trade-off. We show that under the condition of queueing equilibrium, the maximum throughput capacity per device saturates to an upper bound of 0.7239 λb/λu bits/s/Hz, where λb and λu are the densities of base stations and mobile users, respectively.
NASA Astrophysics Data System (ADS)
Guo, Lijin; Zheng, Shixue; Cao, Cougui; Li, Chengfang
2016-09-01
The objective of this study was to investigate how the relationships between bacterial communities and soil organic carbon (SOC) in topsoil (0-5 cm) are affected by tillage practices [conventional intensive tillage (CT) or no-tillage (NT)] and straw-returning methods [crop straw returning (S) or removal (NS)] under a rice-wheat rotation in central China. Soil bacterial communities were determined by high-throughput sequencing technology. After two cycles of annual rice-wheat rotation, NT treatments generally had significantly more bacterial genera and a higher ratio of monounsaturated to saturated fatty acids (MUFA/STFA) than CT treatments, but a decreased gram-positive to gram-negative bacteria ratio (G+/G-). S treatments had significantly more bacterial genera and higher MUFA/STFA, but decreased G+/G-, compared with NS treatments. Multivariate analysis revealed that Gemmatimonas, Rudaea, Sphingomonas, Pseudomonas, Dyella, Burkholderia, Clostridium, Pseudolabrys, Arcicella and Bacillus were correlated with SOC, and that cellulolytic bacteria (Burkholderia, Pseudomonas, Clostridium, Rudaea and Bacillus) and Gemmatimonas explained 55.3% and 12.4% of the variance in SOC, respectively. Structural equation modeling further indicated that tillage and residue managements affected SOC directly and indirectly through these cellulolytic bacteria and Gemmatimonas. Our results suggest that Burkholderia, Pseudomonas, Clostridium, Rudaea, Bacillus and Gemmatimonas help to regulate SOC sequestration in topsoil under tillage and residue systems.
Ion channel drug discovery and research: the automated Nano-Patch-Clamp technology.
Brueggemann, A; George, M; Klau, M; Beckler, M; Steindl, J; Behrends, J C; Fertig, N
2004-01-01
Unlike the genomics revolution, which was largely enabled by a single technological advance (high-throughput sequencing), rapid advancement in proteomics will require a broader effort to increase the throughput of a number of key tools for the functional analysis of different types of proteins. In the case of ion channels, a class of (membrane) proteins of great physiological importance and potential as drug targets, the lack of adequate assay technologies is felt particularly strongly. The available, indirect, high-throughput screening methods for ion channels clearly generate insufficient information. The best technology to study ion channel function and screen for compound interaction is the patch clamp technique, but patch clamping suffers from low throughput, which is not acceptable for drug screening. A first step towards a solution is presented here. The nano patch clamp technology, which is based on a planar, microstructured glass chip, enables automatic whole-cell patch clamp measurements. The Port-a-Patch is an automated electrophysiology workstation, which uses planar patch clamp chips. This approach enables high-quality and high-content ion channel and compound evaluation on a one-cell-at-a-time basis. The presented automation of the patch process and its scalability to an array format are the prerequisites for any higher-throughput electrophysiology instruments.
High-throughput transformation of Saccharomyces cerevisiae using liquid handling robots.
Liu, Guangbo; Lanham, Clayton; Buchan, J Ross; Kaplan, Matthew E
2017-01-01
Saccharomyces cerevisiae (budding yeast) is a powerful eukaryotic model organism ideally suited to high-throughput genetic analyses, which time and again has yielded insights that further our understanding of cell biology processes conserved in humans. Lithium Acetate (LiAc) transformation of yeast with DNA for the purposes of exogenous protein expression (e.g., plasmids) or genome mutation (e.g., gene mutation, deletion, epitope tagging) is a useful and long established method. However, a reliable and optimized high throughput transformation protocol that runs almost no risk of human error has not been described in the literature. Here, we describe such a method that is broadly transferable to most liquid handling high-throughput robotic platforms, which are now commonplace in academic and industry settings. Using our optimized method, we are able to comfortably transform approximately 1200 individual strains per day, allowing complete transformation of typical genomic yeast libraries within 6 days. In addition, use of our protocol for gene knockout purposes also provides a potentially quicker, easier and more cost-effective approach to generating collections of double mutants than the popular and elegant synthetic genetic array methodology. In summary, our methodology will be of significant use to anyone interested in high throughput molecular and/or genetic analysis of yeast.
USDA-ARS?s Scientific Manuscript database
Immunoassays are analytical methods that employ antibodies or molecules derived from antibodies for the essential binding reactions. The choice of immunoassay system for food safety analysis depends on the analyte, the matrix, and the requirements of the analysis (speed, throughput, sensitivity, spe...
Investigation Of In-Line Monitoring Options At H Canyon/HB Line For Plutonium Oxide Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sexton, L.
2015-10-14
H Canyon and HB Line have a production goal of 1 MT per year of plutonium oxide feedstock for the MOX facility by FY17 (AFS-2 mission). In order to meet this goal, steps will need to be taken to improve processing efficiency. One concept for achieving this goal is to implement in-line process monitoring at key measurement points within the facilities. In-line monitoring during operations has the potential to increase throughput and efficiency while reducing costs associated with laboratory sample analysis. In the work reported here, we mapped the plutonium oxide process, identified key measurement points, investigated alternate technologies that could be used for in-line analysis, and initiated a throughput benefit analysis.
NASA Astrophysics Data System (ADS)
Osnos, V. B.; Kuneevsky, V. V.; Larionov, V. M.; Saifullin, E. R.; Gainetdinov, A. V.; Vankov, Yu V.; Larionova, I. V.
2017-01-01
The method of natural thermal convection with heat agent recirculation (NTC HAR) in oil reservoirs is described. The effectiveness of this method for heating oil reservoirs with water saturation values from 0 to 0.5 is analyzed, with the Ashalchinskoye oil field taken as the test case. CMG STARS software was used for the calculations. The dynamics of cumulative production, recovery factor, and specific energy consumption per 1 m3 of crude oil produced when applying the heat exchanger with heat agent are determined for different initial water saturations and presented as graphs.
Suzuki, Kazumichi; Palmer, Matthew B; Sahoo, Narayan; Zhang, Xiaodong; Poenisch, Falk; Mackin, Dennis S; Liu, Amy Y; Wu, Richard; Zhu, X Ronald; Frank, Steven J; Gillin, Michael T; Lee, Andrew K
2016-07-01
To determine the patient throughput and the overall efficiency of the spot scanning system by analyzing treatment time, equipment availability, and maximum daily capacity for the current spot scanning port at Proton Therapy Center Houston and to assess the daily throughput capacity for a hypothetical spot scanning proton therapy center. At their proton therapy center, the authors have been recording in an electronic medical record system all treatment data, including disease site, number of fields, number of fractions, delivered dose, energy, range, number of spots, and number of layers for every treatment field. The authors analyzed delivery system downtimes that had been recorded for every equipment failure and associated incidents. These data were used to evaluate the patient census, patient distribution as a function of the number of fields and total target volume, and equipment clinical availability. The duration of each treatment session from patient walk-in to patient walk-out of the spot scanning treatment room was measured for 64 patients with head and neck, central nervous system, thoracic, and genitourinary cancers. The authors retrieved data for total target volume and the numbers of layers and spots for all fields from treatment plans for a total of 271 patients (including the above 64 patients). A sensitivity analysis of daily throughput capacity was performed by varying seven parameters in a throughput capacity model. The mean monthly equipment clinical availability for the spot scanning port in April 2012-March 2015 was 98.5%. Approximately 1500 patients had received spot scanning proton therapy as of March 2015. The major disease sites treated in September 2012-August 2014 were the genitourinary system (34%), head and neck (30%), central nervous system (21%), and thorax (14%), with other sites accounting for the remaining 1%. Spot scanning beam delivery time increased with total target volume and accounted for approximately 30%-40% of total treatment time for the total target volumes exceeding 200 cm(3), which was the case for more than 80% of the patients in this study. When total treatment time was modeled as a function of the number of fields and total target volume, the model overestimated total treatment time by 12% on average, with a standard deviation of 32%. A sensitivity analysis of throughput capacity for a hypothetical four-room spot scanning proton therapy center identified several priority items for improvements in throughput capacity, including operation time, beam delivery time, and patient immobilization and setup time. The spot scanning port at our proton therapy center has operated at a high performance level and has been used to treat a large number of complex cases. Further improvements in efficiency may be feasible in the areas of facility operation, beam delivery, patient immobilization and setup, and optimization of treatment scheduling.
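A toy version of a daily throughput-capacity calculation of the kind analyzed above, with invented parameters that are simpler than the study's seven-parameter model:

```python
# Toy daily-throughput model for one treatment room: divide the available operating
# day by the average room occupancy per fraction, discounted by equipment availability.
# Parameters are hypothetical and coarser than the seven-parameter model in the study.
def daily_capacity(operating_hours, setup_min, imaging_min, beam_delivery_min,
                   changeover_min, availability=0.985):
    per_fraction_min = setup_min + imaging_min + beam_delivery_min + changeover_min
    return availability * operating_hours * 60.0 / per_fraction_min

print(round(daily_capacity(operating_hours=16, setup_min=12, imaging_min=5,
                           beam_delivery_min=8, changeover_min=5), 1))
# ~31.5 fractions per room per day under these assumed timings
```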
Tome, Jacob M; Ozer, Abdullah; Pagano, John M; Gheba, Dan; Schroth, Gary P; Lis, John T
2014-06-01
RNA-protein interactions play critical roles in gene regulation, but methods to quantitatively analyze these interactions at a large scale are lacking. We have developed a high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay by adapting a high-throughput DNA sequencer to quantify the binding of fluorescently labeled protein to millions of RNAs anchored to sequenced cDNA templates. Using HiTS-RAP, we measured the affinity of mutagenized libraries of GFP-binding and NELF-E-binding aptamers to their respective targets and identified critical regions of interaction. Mutations additively affected the affinity of the NELF-E-binding aptamer, whose interaction depended mainly on a single-stranded RNA motif, but not that of the GFP aptamer, whose interaction depended primarily on secondary structure.
Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping
2016-09-01
There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.
Kokel, David; Rennekamp, Andrew J; Shah, Asmi H; Liebel, Urban; Peterson, Randall T
2012-08-01
For decades, studying the behavioral effects of individual drugs and genetic mutations has been at the heart of efforts to understand and treat nervous system disorders. High-throughput technologies adapted from other disciplines (e.g., high-throughput chemical screening, genomics) are changing the scale of data acquisition in behavioral neuroscience. Massive behavioral datasets are beginning to emerge, particularly from zebrafish labs, where behavioral assays can be performed rapidly and reproducibly in 96-well, high-throughput format. Mining these datasets and making comparisons across different assays are major challenges for the field. Here, we review behavioral barcoding, a process by which complex behavioral assays are reduced to a string of numeric features, facilitating analysis and comparison within and across datasets. Copyright © 2012 Elsevier Ltd. All rights reserved.
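A minimal sketch of the barcoding idea itself: each treatment is reduced to a vector of numeric behavioral features, and treatments are compared by a correlation distance; the feature values below are invented:

```python
# Behavioral barcoding sketch: treatments become numeric feature vectors ("barcodes")
# and are compared by correlation distance. Feature values are invented.
import numpy as np

barcodes = {
    "DMSO":       np.array([0.10, 0.20, 0.10, 0.00, 0.10]),
    "compound_A": np.array([0.90, 0.80, 0.70, 0.10, 0.20]),
    "compound_B": np.array([0.85, 0.75, 0.80, 0.00, 0.15]),
}

def correlation_distance(a, b):
    return 1.0 - np.corrcoef(a, b)[0, 1]

for name, code in barcodes.items():
    d = correlation_distance(code, barcodes["compound_A"])
    print(f"{name} vs compound_A: {d:.3f}")       # similar barcodes -> small distance
```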
Implicit Block ACK Scheme for IEEE 802.11 WLANs
Sthapit, Pranesh; Pyun, Jae-Young
2016-01-01
The throughput of the IEEE 802.11 standard is significantly bounded by the associated Medium Access Control (MAC) overhead. Because of this overhead, an upper limit on throughput exists even in situations where data rates are extremely high. Therefore, an overhead reduction is necessary to achieve higher throughput. The IEEE 802.11e amendment introduced the block ACK mechanism to reduce the number of control messages in the MAC. Although the block ACK scheme greatly reduces overhead, further improvements are possible. In this letter, we propose an implicit block ACK method that further reduces the overhead associated with IEEE 802.11e's block ACK scheme. Mathematical analysis results are presented for both the original protocol and the proposed scheme. A performance improvement of greater than 10% was achieved with the proposed implementation.
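A simplified airtime-efficiency calculation illustrating why trimming acknowledgement overhead raises throughput; the timings are placeholders, not the values from the paper's analysis:

```python
# Simplified MAC efficiency model: throughput ~ rate * T_payload / (T_payload + T_overhead).
# Reducing per-frame acknowledgement overhead (block-ACK-like operation) increases the
# share of airtime carrying data. Timings are placeholders, not the paper's parameters.
def throughput_mbps(phy_rate_mbps, payload_bytes, overhead_us_per_frame):
    t_payload_us = payload_bytes * 8 / phy_rate_mbps
    return phy_rate_mbps * t_payload_us / (t_payload_us + overhead_us_per_frame)

print(throughput_mbps(300, 1500, overhead_us_per_frame=120))  # per-frame-ACK-like overhead
print(throughput_mbps(300, 1500, overhead_us_per_frame=60))   # reduced, block-ACK-like overhead
```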
High-throughput syntheses of iron phosphite open frameworks in ionic liquids
NASA Astrophysics Data System (ADS)
Wang, Zhixiu; Mu, Ying; Wang, Yilin; Bing, Qiming; Su, Tan; Liu, Jingyao
2017-02-01
Three open-framework iron phosphites, FeII5(NH4)2(HPO3)6 (1), FeII2FeIII(NH4)(HPO3)4 (2) and FeIII2(HPO3)3 (3), have been synthesized under ionothermal conditions. How different synthesis parameters, such as gel concentration, synthesis time, reaction temperature and solvent, affect the products has been monitored using high-throughput approaches. Within each type of experiment, relevant products have been investigated. The optimal reaction conditions were obtained from a series of experiments using high-throughput approaches. All the structures were determined by single-crystal X-ray diffraction analysis and further characterized by PXRD, TGA and FTIR analyses. Magnetic studies reveal that the three compounds show interesting magnetic behavior at low temperature.
Saturation Length of Erodible Sediment Beds Subject to Shear Flow
NASA Astrophysics Data System (ADS)
Casler, D. M.; Kahn, B. P.; Furbish, D. J.; Schmeeckle, M. W.
2016-12-01
We examine the initial growth and wavelength selection of sand ripples based on probabilistic formulations of the flux and the Exner equation. Current formulations of this problem as a linear stability analysis appeal to the idea of a saturation length, the lag between the bed stress and the flux, as a key stabilizing influence leading to selection of a finite wavelength. We present two contrasting formulations. The first is based on the Fokker-Planck approximation of the divergence form of the Exner equation, and thus involves particle diffusion associated with variations in particle activity, in addition to the conventionally assumed advective term. The role of a saturation length associated with the particle activity is similar to previous analyses. However, particle diffusion provides an attenuating influence on the growth of small wavelengths, independent of a saturation length, and is thus a sufficient, if not necessary, condition contributing to selection of a finite wavelength. The second formulation is based on the entrainment form of the Exner equation. As a precise, probabilistic formulation of conservation, this form of the Exner equation does not distinguish between advection and diffusion, and, because it directly accounts for all particle motions via a convolution of the distribution of particle hop distances, it pays no attention to the idea of a saturation length. The formulation and resulting description of initial ripple growth and wavelength selection thus inherently subsume the effects embodied in the ideas of advection, diffusion, and a saturation length as used in other formulations. Moreover, the formulation does not distinguish between bed load and suspended load, and is thus broader in application. The analysis reveals that the length scales defined by the distribution of hop distances are more fundamental than the saturation length in determining the initial growth or decay of bedforms. Formulations involving the saturation length coincide with the special case of an exponential distribution of hop distances, where the saturation length is equal to the mean hop distance.
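For orientation, the two ingredients named above can be written schematically as follows; this is a commonly used advection-diffusion (Fokker-Planck) closure of the flux, given here as a sketch rather than the authors' exact formulation:

```latex
% Exner equation (divergence form) with a Fokker-Planck (advection-diffusion)
% approximation of the bed-load flux in terms of the particle activity \gamma:
\begin{align}
  (1-\lambda_p)\,\frac{\partial \eta}{\partial t} &= -\,\frac{\partial q}{\partial x},\\
  q &\approx \bar{u}_p\,\gamma \;-\; \frac{\partial}{\partial x}\!\left(\kappa\,\gamma\right),
\end{align}
% where \eta is bed elevation, \lambda_p the bed porosity, \bar{u}_p a mean particle
% velocity (advection), and \kappa a diffusivity associated with spatial variations
% in the particle activity \gamma.
```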
Kepner, Gordon R
2010-04-13
The numerous natural phenomena that exhibit saturation behavior, e.g., ligand binding and enzyme kinetics, have been approached, to date, via empirical and particular analyses. This paper presents a mechanism-free, and assumption-free, second-order differential equation, designed only to describe a typical relationship between the variables governing these phenomena. It develops a mathematical model for this relation, based solely on the analysis of the typical experimental data plot and its saturation characteristics. Its utility complements the traditional empirical approaches. For the general saturation curve, described in terms of its independent (x) and dependent (y) variables, a second-order differential equation is obtained that applies to any saturation phenomena. It shows that the driving factor for the basic saturation behavior is the probability of the interactive site being free, which is described quantitatively. Solving the equation relates the variables in terms of the two empirical constants common to all these phenomena, the initial slope of the data plot and the limiting value at saturation. A first-order differential equation for the slope emerged that led to the concept of the effective binding rate at the active site and its dependence on the calculable probability the interactive site is free. These results are illustrated using specific cases, including ligand binding and enzyme kinetics. This leads to a revised understanding of how to interpret the empirical constants, in terms of the variables pertinent to the phenomenon under study. The second-order differential equation revealed the basic underlying relations that describe these saturation phenomena, and the basic mathematical properties of the standard experimental data plot. It was shown how to integrate this differential equation, and define the common basic properties of these phenomena. The results regarding the importance of the slope and the new perspectives on the empirical constants governing the behavior of these phenomena led to an alternative perspective on saturation behavior kinetics. Their essential commonality was revealed by this analysis, based on the second-order differential equation.
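One closed form consistent with the two empirical constants discussed above (the initial slope m and the limiting value y_max) is the rectangular hyperbola shown below; it is given for orientation only, since the paper works from the differential equation rather than assuming a functional form:

```latex
% Rectangular-hyperbola saturation curve parameterized by its initial slope m and
% its limiting value y_{\max}; the fraction of free interactive sites is 1 - y/y_{\max}.
\begin{equation}
  y(x) \;=\; \frac{m\,x}{1 + m\,x/y_{\max}},
  \qquad
  \left.\frac{dy}{dx}\right|_{x=0} = m,
  \qquad
  \lim_{x\to\infty} y(x) = y_{\max}.
\end{equation}
```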
Wen Lin; Asko Noormets; John S. King; Ge Sun; Steve McNulty; Jean-Christophe Domec; Lucas Cernusak
2017-01-01
Stable isotope ratios (δ13C and δ18O) of tree-ring α-cellulose are important tools in paleoclimatology, ecology, plant physiology and genetics. The Multiple Sample Isolation System for Solids (MSISS) was a major advance in the tree-ring α-cellulose extraction methods, offering greater throughput and reduced labor input compared to traditional alternatives. However, the...
NASA Astrophysics Data System (ADS)
El Abed, Abdel I.; Taly, Valérie
2013-11-01
We investigate light coupling into highly monodisperse liquid microdroplets, which are produced and manipulated at kHz rates in a microfluidic device. We show that such coupling leads to Whispering gallery mode resonances (WGMs) which are detected and analyzed versus time during the fast displacement of microdroplets into the microfluidic channel. Our results show that droplet-based microfluidics may be applied advantageously in the promising field of high-throughput label-free biosensing.
Nanosurveyor: a framework for real-time data processing
Daurer, Benedikt J.; Krishnan, Hari; Perciano, Talita; ...
2017-01-31
Background: The ever-improving brightness of accelerator-based sources is enabling novel observations and discoveries with faster frame rates, larger fields of view, higher resolution, and higher dimensionality. Results: Here we present an integrated software/algorithmic framework designed to capitalize on high-throughput experiments through efficient kernels and load-balanced workflows that are scalable by design. We describe the streamlined processing pipeline of ptychography data analysis. Conclusions: The pipeline provides throughput, compression, and resolution, as well as rapid feedback to the microscope operators.
Morgan, Martin; Anders, Simon; Lawrence, Michael; Aboyoun, Patrick; Pagès, Hervé; Gentleman, Robert
2009-01-01
Summary: ShortRead is a package for input, quality assessment, manipulation and output of high-throughput sequencing data. ShortRead is provided in the R and Bioconductor environments, allowing ready access to additional facilities for advanced statistical analysis, data transformation, visualization and integration with diverse genomic resources. Availability and Implementation: This package is implemented in R and available at the Bioconductor web site; the package contains a ‘vignette’ outlining typical work flows. Contact: mtmorgan@fhcrc.org PMID:19654119
2014-12-24
[Report front-matter residue; recoverable subject terms: sulfur mustard, cutaneous injury, siRNA, high-throughput screening.]
Microbial Community Shifts Associated with RDX Loss in a Saturated and Well-Drained Surface Soil
2005-03-01
community containing firmicutes (36%), proteobacteria (54%), actinobacteria (8%), and bacteroidetes (1%). The unsaturated soil contained a greater number of...genera (2.5 times that of the saturated soil) within similar phyla (19% firmicutes, 66% proteobacteria, 6% actinobacteria, 2% bacteroidetes, and 7...by the PLFA analysis. The T-RFLP analysis identified firmicutes (36%), proteobacteria (54%), actinobacteria (8%), and bacteroidetes (1%) in the
Automated Analysis of siRNA Screens of Virus Infected Cells Based on Immunofluorescence Microscopy
NASA Astrophysics Data System (ADS)
Matula, Petr; Kumar, Anil; Wörz, Ilka; Harder, Nathalie; Erfle, Holger; Bartenschlager, Ralf; Eils, Roland; Rohr, Karl
We present an image analysis approach as part of a high-throughput microscopy screening system based on cell arrays for the identification of genes involved in Hepatitis C and Dengue virus replication. Our approach comprises: cell nucleus segmentation, quantification of virus replication level in cells, localization of regions with transfected cells, cell classification by infection status, and quality assessment of an experiment. The approach is fully automatic and has been successfully applied to a large number of cell array images from screening experiments. The experimental results show a good agreement with the expected behavior of positive as well as negative controls and encourage the application to screens from further high-throughput experiments.
Analysis of data throughput in communication between PLCs and HMI/SCADA systems
NASA Astrophysics Data System (ADS)
Mikolajek, Martin; Koziorek, Jiri
2016-09-01
This paper focuses on the analysis of data throughput in communication between PLCs and HMI/SCADA systems. The first part of the paper discusses the basic problems of communication between PLC and HMI systems. The next part addresses specific types of PLC-HMI communication requests; for these cases, the paper examines response times and data throughput. The subsequent section contains practical experiments with various data exchanges between a Siemens PLC and an HMI. The communication options described in this article cover the use of an OPC server for visualization software, a custom HMI system, and an in-house application created using .NET technology. The last part of the article presents several communication solutions.
Optical Alignment and Diffraction Analysis for AIRES: An Airborne Infrared Echelle Spectrometer
NASA Technical Reports Server (NTRS)
Haas, Michael R.; Fonda, Mark (Technical Monitor)
2002-01-01
The optical design is presented for a long-slit grating spectrometer known as AIRES (Airborne InfraRed Echelle Spectrometer). The instrument employs two gratings in series: a small order sorter and a large steeply blazed echelle. The optical path includes four pupil and four field stops, including two narrow slits. A detailed diffraction analysis is performed using GLAD by Applied Optics Research to evaluate critical trade-offs between optical throughput, spectral resolution, and system weight and volume. The effects of slit width, slit length, oversizing the second slit relative to the first, on- vs off-axis throughput, and clipping at the pupil stops and other optical elements are discussed.
A Versatile Cell Death Screening Assay Using Dye-Stained Cells and Multivariate Image Analysis.
Collins, Tony J; Ylanko, Jarkko; Geng, Fei; Andrews, David W
2015-11-01
A novel dye-based method for measuring cell death in image-based screens is presented. Unlike conventional high- and medium-throughput cell death assays that measure only one form of cell death accurately, using multivariate analysis of micrographs of cells stained with an inexpensive mix of the red dye nonyl acridine orange and a nuclear stain, it was possible to quantify cell death induced by a variety of different agonists even without a positive control. Surprisingly, using a single known cytotoxic agent as a positive control for training a multivariate classifier allowed accurate quantification of cytotoxicity for mechanistically unrelated compounds, enabling generation of dose-response curves. Comparison with low-throughput biochemical methods suggested that cell death was accurately distinguished from cell stress induced by low concentrations of the bioactive compounds Tunicamycin and Brefeldin A. High-throughput image-based analyses of more than 300 kinase inhibitors correctly identified 11 as cytotoxic, with only 1 false positive. The simplicity and robustness of this dye-based assay make it particularly suited to live-cell screening for toxic compounds.
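A bare-bones sketch of the classification step described above: train a per-cell classifier on a single cytotoxic positive control versus untreated cells, then score a test well by its fraction of cells called dead; the per-cell features here are simulated placeholders:

```python
# Sketch: train a multivariate per-cell classifier (positive control vs. untreated),
# then score an unknown well by the fraction of cells classified as dead.
# Per-cell features (e.g., dye intensity, nuclear morphology) are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
live = rng.normal(loc=[1.0, 1.0, 0.2], scale=0.2, size=(500, 3))   # untreated cells
dead = rng.normal(loc=[0.3, 0.4, 1.0], scale=0.2, size=(500, 3))   # cytotoxic control
X = np.vstack([live, dead])
y = np.r_[np.zeros(500), np.ones(500)]

clf = LogisticRegression(max_iter=1000).fit(X, y)

test_well = rng.normal(loc=[0.6, 0.7, 0.6], scale=0.3, size=(400, 3))  # unknown compound
print(f"estimated dead-cell fraction: {clf.predict(test_well).mean():.2f}")
```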
A Versatile Cell Death Screening Assay Using Dye-Stained Cells and Multivariate Image Analysis
Collins, Tony J.; Ylanko, Jarkko; Geng, Fei
2015-01-01
Abstract A novel dye-based method for measuring cell death in image-based screens is presented. Unlike conventional high- and medium-throughput cell death assays that measure only one form of cell death accurately, using multivariate analysis of micrographs of cells stained with the inexpensive mix, red dye nonyl acridine orange, and a nuclear stain, it was possible to quantify cell death induced by a variety of different agonists even without a positive control. Surprisingly, using a single known cytotoxic agent as a positive control for training a multivariate classifier allowed accurate quantification of cytotoxicity for mechanistically unrelated compounds enabling generation of dose–response curves. Comparison with low throughput biochemical methods suggested that cell death was accurately distinguished from cell stress induced by low concentrations of the bioactive compounds Tunicamycin and Brefeldin A. High-throughput image-based format analyses of more than 300 kinase inhibitors correctly identified 11 as cytotoxic with only 1 false positive. The simplicity and robustness of this dye-based assay makes it particularly suited to live cell screening for toxic compounds. PMID:26422066
Fast multiclonal clusterization of V(D)J recombinations from high-throughput sequencing.
Giraud, Mathieu; Salson, Mikaël; Duez, Marc; Villenet, Céline; Quief, Sabine; Caillault, Aurélie; Grardel, Nathalie; Roumier, Christophe; Preudhomme, Claude; Figeac, Martin
2014-05-28
V(D)J recombinations in lymphocytes are essential for immunological diversity. They are also useful markers of pathologies. In leukemia, they are used to quantify the minimal residual disease during patient follow-up. However, the full breadth of lymphocyte diversity is not fully understood. We propose new algorithms that process high-throughput sequencing (HTS) data to extract unnamed V(D)J junctions and gather them into clones for quantification. This analysis is based on a seed heuristic and is fast and scalable because, in the first phase, no alignment is performed against germline database sequences. The algorithms were applied to TRγ HTS data from a patient with acute lymphoblastic leukemia, and also to data simulating hypermutations. Our methods identified the main clone, as well as additional clones that were not identified with standard protocols. The proposed algorithms provide new insight into the analysis of high-throughput sequencing data for leukemia, and also into the quantitative assessment of any immunological profile. The methods described here are implemented in a C++ open-source program called Vidjil.
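A toy illustration (not the Vidjil algorithm) of the general idea of gathering reads into clones by an exact junction "window" rather than by alignment to germline databases; the window position here is naively taken from the read centre, whereas the real method locates it with a seed heuristic:

```python
# Toy window-based clonal grouping: reads sharing an identical fixed-length "window"
# are counted as the same clone. Window localization is naive (read centre) here;
# real tools locate the junction window with a seed heuristic before grouping.
from collections import Counter

def clone_counts(reads, window_len=20):
    clones = Counter()
    for read in reads:
        mid = len(read) // 2
        window = read[mid - window_len // 2: mid + window_len // 2]
        if len(window) == window_len:
            clones[window] += 1
    return clones

reads = ["ACGT" * 15, "ACGT" * 15, ("ACGT" * 7) + "TTTTGGGG" + ("ACGT" * 7)]
print(clone_counts(reads).most_common())   # two reads fall into the same clone
```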
Traceless Immobilization of Analytes for High-Throughput Experiments with SAMDI Mass Spectrometry.
Helal, Kazi Y; Alamgir, Azmain; Berns, Eric J; Mrksich, Milan
2018-06-21
Label-free assays, and particularly those based on the combination of mass spectrometry with surface chemistries, enable high-throughput experiments on a broad range of reactions. However, these methods can still require the incorporation of functional groups that allow immobilization of reactants and products to surfaces prior to analysis. In this paper, we report a traceless method for attaching molecules to a self-assembled monolayer for matrix-assisted laser desorption and ionization (SAMDI) mass spectrometry. This method uses monolayers that are functionalized with a 3-trifluoromethyl-3-phenyl-diazirine group that liberates nitrogen when irradiated and gives a carbene that inserts into a wide range of bonds to covalently immobilize molecules. Analysis of the monolayer with SAMDI then reveals peaks for each of the adducts formed from molecules in the sample. The method is applied to characterize a P450 drug-metabolizing enzyme and to monitor a Suzuki-Miyaura coupling reaction, cases where modification of the substrates with a functional group would alter their activities. This method will be important for high-throughput experiments in many areas, including reaction discovery and optimization.
Rioualen, Claire; Da Costa, Quentin; Chetrit, Bernard; Charafe-Jauffret, Emmanuelle; Ginestier, Christophe
2017-01-01
High-throughput RNAi screenings (HTS) allow quantification of the impact of the deletion of each gene on any particular function, from virus-host interactions to cell differentiation. However, comparatively few functional analysis tools have been developed specifically for RNAi analyses. HTS-Net, a network-based analysis program, was developed to identify gene regulatory modules impacted in high-throughput screenings by integrating transcription factor-target gene interaction data (regulome) and protein-protein interaction networks (interactome) on top of screening z-scores. HTS-Net produces exhaustive HTML reports for results navigation and exploration. HTS-Net is a new pipeline for RNA interference screening analyses that outperforms simple gene ranking by z-score, re-prioritizing genes and placing them in their biological context, as shown by the three studies that we reanalyzed. Formatted input data for the three studied datasets, source code and a web site for testing the system are available from the companion web site at http://htsnet.marseille.inserm.fr/. We also compared our program with existing algorithms (CARD and HotNet2). PMID:28949986
Use of high-throughput mass spectrometry to elucidate host pathogen interactions in Salmonella
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodland, Karin D.; Adkins, Joshua N.; Ansong, Charles
Capabilities in mass spectrometry are evolving rapidly, with recent improvements in sensitivity, data analysis, and, most important from the standpoint of this review, much higher throughput, allowing analysis of many samples in a single day. This short review describes how these improvements in mass spectrometry can be used to dissect host-pathogen interactions using Salmonella as a model system. This approach enabled direct identification of the majority of annotated Salmonella proteins, quantitation of expression changes under various in vitro growth conditions, and new insights into virulence and expression of Salmonella proteins within host cells. One of the most significant findings is that a very high percentage of all annotated genes (>20%) in Salmonella are regulated post-transcriptionally. In addition, new and unexpected interactions have been identified for several Salmonella virulence regulators that involve protein-protein interactions, suggesting additional functions of these regulators in coordinating virulence expression. Overall, high-throughput mass spectrometry provides a new view of pathogen-host interactions, emphasizing the protein products and defining how protein interactions determine the outcome of infection.
Lee, Unseok; Chang, Sungyul; Putra, Gian Anantrio; Kim, Hyoungseok; Kim, Dong Hwan
2018-01-01
A high-throughput plant phenotyping system automatically observes and grows many plant samples. Many plant sample images are acquired by the system to determine the characteristics of the plants (populations). Stable image acquisition and processing is very important to accurately determine the characteristics. However, hardware for acquiring plant images rapidly and stably, while minimizing plant stress, is lacking. Moreover, most software cannot adequately handle large-scale plant imaging. To address these problems, we developed a new, automated, high-throughput plant phenotyping system using simple and robust hardware, and an automated plant-imaging-analysis pipeline consisting of machine-learning-based plant segmentation. Our hardware acquires images reliably and quickly and minimizes plant stress. Furthermore, the images are processed automatically. In particular, large-scale plant-image datasets can be segmented precisely using a classifier developed using a superpixel-based machine-learning algorithm (Random Forest), and variations in plant parameters (such as area) over time can be assessed using the segmented images. We performed comparative evaluations to identify an appropriate learning algorithm for our proposed system, and tested three robust learning algorithms. We developed not only an automatic analysis pipeline but also a convenient means of plant-growth analysis that provides a learning data interface and visualization of plant growth trends. Thus, our system allows end-users such as plant biologists to analyze plant growth via large-scale plant image data easily.
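A minimal sketch of superpixel-based plant segmentation with a Random Forest, in the spirit of the pipeline described; the SLIC parameters and mean-RGB features are assumptions, not the authors' exact choices.

```python
# Sketch: superpixel-based plant/background segmentation with a Random
# Forest classifier, then pixel-level mask recovery for growth measurements.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

def superpixel_features(image: np.ndarray, n_segments: int = 400):
    """Return SLIC superpixel labels and one mean-RGB feature row per superpixel."""
    labels = slic(image, n_segments=n_segments, start_label=0)
    feats = np.array([image[labels == i].mean(axis=0)
                      for i in range(labels.max() + 1)])
    return labels, feats

def segment_plant(image: np.ndarray, clf: RandomForestClassifier) -> np.ndarray:
    """Classify each superpixel as plant (1) or background (0) and broadcast
    the decision back to pixels; projected plant area is simply mask.sum()."""
    labels, feats = superpixel_features(image)
    is_plant = clf.predict(feats)          # one prediction per superpixel
    return is_plant[labels].astype(bool)   # pixel-level mask

# Training uses a few hand-annotated images:
# clf = RandomForestClassifier(n_estimators=200).fit(train_feats, train_labels)
```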
MetaUniDec: High-Throughput Deconvolution of Native Mass Spectra
NASA Astrophysics Data System (ADS)
Reid, Deseree J.; Diesing, Jessica M.; Miller, Matthew A.; Perry, Scott M.; Wales, Jessica A.; Montfort, William R.; Marty, Michael T.
2018-04-01
The expansion of native mass spectrometry (MS) methods for both academic and industrial applications has created a substantial need for analysis of large native MS datasets. Existing software tools are poorly suited for high-throughput deconvolution of native electrospray mass spectra from intact proteins and protein complexes. The UniDec Bayesian deconvolution algorithm is uniquely well suited for high-throughput analysis due to its speed and robustness but was previously tailored towards individual spectra. Here, we optimized UniDec for deconvolution, analysis, and visualization of large data sets. This new module, MetaUniDec, centers around the hierarchical data format 5 (HDF5) for storing datasets, which significantly improves speed, portability, and file size. It also includes code optimizations to improve speed and a new graphical user interface for visualization, interaction, and analysis of data. To demonstrate the utility of MetaUniDec, we applied the software to analyze automated collision voltage ramps with a small bacterial heme protein and large lipoprotein nanodiscs. Upon increasing collisional activation, bacterial heme-nitric oxide/oxygen binding (H-NOX) protein shows a discrete loss of bound heme, and nanodiscs show a continuous loss of lipids and charge. By using MetaUniDec to track changes in peak area or mass as a function of collision voltage, we explore the energetic profile of collisional activation in an ultra-high mass range Orbitrap mass spectrometer.
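A minimal sketch of the HDF5-centric idea, storing one collision-voltage ramp per file and tracking a peak's integrated area versus voltage; the group and dataset names are invented for illustration and do not follow MetaUniDec's actual file layout.

```python
# Sketch: keep a collision-voltage ramp of spectra in a single HDF5 file and
# integrate a peak's area at each voltage. Group/dataset names are assumed.
import h5py
import numpy as np

def write_ramp(path, voltages, spectra):
    """spectra: list of (mz, intensity) arrays, one per collision voltage."""
    with h5py.File(path, "w") as f:
        for v, (mz, inten) in zip(voltages, spectra):
            g = f.create_group(f"spectrum_{v:04.0f}V")
            g.attrs["collision_voltage"] = v
            g.create_dataset("mz", data=mz, compression="gzip")
            g.create_dataset("intensity", data=inten, compression="gzip")

def peak_area_vs_voltage(path, mz_lo, mz_hi):
    """Integrate intensity in [mz_lo, mz_hi] for every stored spectrum."""
    out = []
    with h5py.File(path, "r") as f:
        for name in sorted(f):
            g = f[name]
            mz, inten = g["mz"][:], g["intensity"][:]
            sel = (mz >= mz_lo) & (mz <= mz_hi)
            out.append((float(g.attrs["collision_voltage"]),
                        float(np.trapz(inten[sel], mz[sel]))))
    return sorted(out)
```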
Experimental Investigation of Mechanical Properties of Black Shales after CO2-Water-Rock Interaction
Lyu, Qiao; Ranjith, Pathegama Gamage; Long, Xinping; Ji, Bin
2016-01-01
The effects of CO2-water-rock interactions on the mechanical properties of shale are essential for estimating the possibility of sequestrating CO2 in shale reservoirs. In this study, uniaxial compressive strength (UCS) tests together with an acoustic emission (AE) system and SEM and EDS analysis were performed to investigate the mechanical properties and microstructural changes of black shales with different saturation times (10 days, 20 days and 30 days) in water dissoluted with gaseous/super-critical CO2. According to the experimental results, the values of UCS, Young’s modulus and brittleness index decrease gradually with increasing saturation time in water with gaseous/super-critical CO2. Compared to samples without saturation, 30-day saturation causes reductions of 56.43% in UCS and 54.21% in Young’s modulus for gaseous saturated samples, and 66.05% in UCS and 56.32% in Young’s modulus for super-critical saturated samples, respectively. The brittleness index also decreases drastically from 84.3% for samples without saturation to 50.9% for samples saturated in water with gaseous CO2, to 47.9% for samples saturated in water with super-critical carbon dioxide (SC-CO2). SC-CO2 causes a greater reduction of shale’s mechanical properties. The crack propagation results obtained from the AE system show that longer saturation time produces higher peak cumulative AE energy. SEM images show that many pores occur when shale samples are saturated in water with gaseous/super-critical CO2. The EDS results show that CO2-water-rock interactions increase the percentages of C and Fe and decrease the percentages of Al and K on the surface of saturated samples when compared to samples without saturation. PMID:28773784
Paintdakhi, Ahmad; Parry, Bradley; Campos, Manuel; Irnov, Irnov; Elf, Johan; Surovtsev, Ivan; Jacobs-Wagner, Christine
2016-01-01
Summary With the realization that bacteria display phenotypic variability among cells and exhibit complex subcellular organization critical for cellular function and behavior, microscopy has re-emerged as a primary tool in bacterial research during the last decade. However, the bottleneck in today’s single-cell studies is quantitative image analysis of cells and fluorescent signals. Here, we address current limitations through the development of Oufti, a stand-alone, open-source software package for automated measurements of microbial cells and fluorescence signals from microscopy images. Oufti provides computational solutions for tracking touching cells in confluent samples, handles various cell morphologies, offers algorithms for quantitative analysis of both diffraction and non-diffraction-limited fluorescence signals, and is scalable for high-throughput analysis of massive datasets, all with subpixel precision. All functionalities are integrated in a single package. The graphical user interface, which includes interactive modules for segmentation, image analysis, and post-processing analysis, makes the software broadly accessible to users irrespective of their computational skills. PMID:26538279
Improved Data Analysis Tools for the Thermal Emission Spectrometer
NASA Astrophysics Data System (ADS)
Rodriguez, K.; Laura, J.; Fergason, R.; Bogle, R.
2017-06-01
We plan to stand up three different database systems to test a new datastore for MGS TES data, enabling more accessible tools that support high-throughput analysis of this high-dimensionality hyperspectral data set.
Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo
2008-01-23
To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.
High-throughput protein analysis integrating bioinformatics and experimental assays
del Val, Coral; Mehrle, Alexander; Falkenhahn, Mechthild; Seiler, Markus; Glatting, Karl-Heinz; Poustka, Annemarie; Suhai, Sandor; Wiemann, Stefan
2004-01-01
The wealth of transcript information that has been made publicly available in recent years requires the development of high-throughput functional genomics and proteomics approaches for its analysis. Such approaches need suitable data integration procedures and a high level of automation in order to gain maximum benefit from the results generated. We have designed an automatic pipeline to analyse annotated open reading frames (ORFs) stemming from full-length cDNAs produced mainly by the German cDNA Consortium. The ORFs are cloned into expression vectors for use in large-scale assays such as the determination of subcellular protein localization or kinase reaction specificity. Additionally, all identified ORFs undergo exhaustive bioinformatic analysis such as similarity searches, protein domain architecture determination and prediction of physicochemical characteristics and secondary structure, using a wide variety of bioinformatic methods in combination with the most up-to-date public databases (e.g. PRINTS, BLOCKS, INTERPRO, PROSITE, SWISSPROT). Data from experimental results and from the bioinformatic analysis are integrated and stored in a relational database (MS SQL-Server), which makes it possible for researchers to find answers to biological questions easily, thereby speeding up the selection of targets for further analysis. The designed pipeline constitutes a new automatic approach to obtaining and administrating relevant biological data from high-throughput investigations of cDNAs in order to systematically identify and characterize novel genes, as well as to comprehensively describe the function of the encoded proteins. PMID:14762202
Zhao, Rui; Ding, Shi-Jian; Shen, Yufeng; Camp, David G.; Livesay, Eric A.; Udseth, Harold; Smith, Richard D.
2009-01-01
We report on the development and characterization of automated metal-free multiple-column nanoLC instrumentation for sensitive, high-throughput analysis of phosphopeptides by mass spectrometry. The system implements a multiple-column capillary LC fluidic design developed for high-throughput analysis of peptides (Anal. Chem. 2001, 73, 3011–3021), incorporating modifications to achieve broad and sensitive analysis of phosphopeptides. The integrated nanoLC columns (50 µm i.d. × 30 cm containing 5 µm C18 particles) and the on-line solid phase extraction columns (150 µm i.d. × 4 cm containing 5 µm C18 particles) were connected to automatic switching valves with non-metal chromatographic accessories and other modifications to avoid the exposure of the analyte to any metal surfaces during handling, separation, and electrospray ionization. The nanoLC developed provided a separation peak capacity of ∼250 for phosphopeptides (and ∼400 for normal peptides). A detection limit of 0.4 fmol was obtained when a linear ion trap tandem mass spectrometer (Finnigan LTQ) was coupled to a 50-µm i.d. column of the nanoLC. The separation power and sensitivity provided by the nanoLC-LTQ enabled identification of ∼4600 phosphopeptide candidates from ∼60 µg of COS-7 cell tryptic digest followed by IMAC enrichment and ∼520 tyrosine phosphopeptides from ∼2 mg of human T cell digests followed by phosphotyrosine peptide immunoprecipitation. PMID:19217835
Papanastasiou, Giorgos; Williams, Michelle C; Kershaw, Lucy E; Dweck, Marc R; Alam, Shirjel; Mirsadraee, Saeed; Connell, Martin; Gray, Calum; MacGillivray, Tom; Newby, David E; Semple, Scott Ik
2015-02-17
Mathematical modeling of cardiovascular magnetic resonance perfusion data allows absolute quantification of myocardial blood flow. Saturation of the left ventricular signal during standard contrast administration can compromise the input function used when applying these models. This saturation effect is evident during application of standard Fermi models in single bolus perfusion data. Dual bolus injection protocols have been suggested to eliminate saturation but are much less practical in the clinical setting. The distributed parameter model can also be used for absolute quantification but has not been applied in patients with coronary artery disease. We assessed whether distributed parameter modeling might be less dependent on arterial input function saturation than Fermi modeling in healthy volunteers. We validated the accuracy of each model in detecting reduced myocardial blood flow in stenotic vessels versus gold-standard invasive methods. Eight healthy subjects were scanned using a dual bolus cardiac perfusion protocol at 3T. We performed both single and dual bolus analysis of these data using the distributed parameter and Fermi models. For the dual bolus analysis, a scaled pre-bolus arterial input function was used. In single bolus analysis, the arterial input function was extracted from the main bolus. We also performed analysis using both models of single bolus data obtained from five patients with coronary artery disease, and findings were compared against independent invasive coronary angiography and fractional flow reserve. Statistical significance was defined as a two-sided P value < 0.05. Fermi models overestimated myocardial blood flow in healthy volunteers due to arterial input function saturation in single bolus analysis compared to dual bolus analysis (P < 0.05). No difference in distributed parameter-derived myocardial blood flow was observed in these volunteers between single and dual bolus analysis. In patients, distributed parameter modeling was able to detect reduced myocardial blood flow at stress (<2.5 mL/min/mL of tissue) in all 12 stenotic vessels compared to only 9 for Fermi modeling. Comparison of single bolus versus dual bolus values suggests that distributed parameter modeling is less dependent on arterial input function saturation than Fermi modeling. Distributed parameter modeling showed excellent accuracy in detecting reduced myocardial blood flow in all stenotic vessels.
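A minimal sketch of a Fermi-function deconvolution fit of a myocardial tissue curve against the arterial input function (AIF), one common parameterization for estimating myocardial blood flow; the exact model variants, delay handling, and units used in the study are not reproduced.

```python
# Sketch: fit tissue(t) = (AIF * Fermi)(t) by nonlinear least squares and
# read off MBF as the impulse response at t = 0 (scaling/units assumed).
import numpy as np
from scipy.optimize import curve_fit

def fermi_response(t, A, k, tau0):
    """Fermi impulse response; its value at t = 0 approximates MBF."""
    return A / (1.0 + np.exp(k * (t - tau0)))

def make_model(t, aif, dt):
    """Build a model function: discrete convolution of the AIF with the Fermi kernel."""
    def model(t_eval, A, k, tau0):
        r = fermi_response(t, A, k, tau0)
        conv = np.convolve(aif, r)[: len(t)] * dt
        return np.interp(t_eval, t, conv)
    return model

def fit_mbf(t, aif, tissue):
    """t: time points (s), aif/tissue: concentration curves on the same grid."""
    dt = t[1] - t[0]
    popt, _ = curve_fit(make_model(t, aif, dt), t, tissue,
                        p0=(1.0, 0.5, 5.0), maxfev=20000)
    A, k, tau0 = popt
    return fermi_response(0.0, A, k, tau0)  # MBF estimate, up to calibration
```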
Zhang, Xiao-Chao; Wei, Zhen-Wei; Gong, Xiao-Yun; Si, Xing-Yu; Zhao, Yao-Yao; Yang, Cheng-Dui; Zhang, Si-Chun; Zhang, Xin-Rong
2016-04-29
Integrating droplet-based microfluidics with mass spectrometry is essential to high-throughput and multiple analysis of single cells. Nevertheless, matrix effects such as the interference of culture medium and intracellular components influence the sensitivity and the accuracy of results in single-cell analysis. To resolve this problem, we developed a method that integrated droplet-based microextraction with single-cell mass spectrometry. Specific extraction solvent was used to selectively obtain intracellular components of interest and remove interference of other components. Using this method, UDP-Glc-NAc, GSH, GSSG, AMP, ADP and ATP were successfully detected in single MCF-7 cells. We also applied the method to study the change of unicellular metabolites in the biological process of dysfunctional oxidative phosphorylation. The method could not only realize matrix-free, selective and sensitive detection of metabolites in single cells, but also have the capability for reliable and high-throughput single-cell analysis.
Analysis of Illumina Microbial Assemblies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clum, Alicia; Foster, Brian; Froula, Jeff
2010-05-28
Since the emergence of second-generation sequencing technologies, evaluating different sequencing approaches and their assembly strategies for different types of genomes has become an important undertaking. Next-generation sequencing technologies dramatically increase sequence throughput while decreasing cost, making them an attractive tool for whole-genome shotgun sequencing. To compare different approaches for de novo whole-genome assembly, appropriate tools and a solid understanding of both the quantity and quality of the underlying sequence data are crucial. Here, we performed an in-depth analysis of short-read Illumina sequence assembly strategies for bacterial and archaeal genomes. Different types of Illumina libraries as well as different trim parameters and assemblers were evaluated. Results of the comparative analysis and sequencing platforms will be presented. The goal of this analysis is to develop a cost-effective approach for the increased throughput of the generation of high-quality microbial genomes.
Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data
2014-01-01
Background High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high dimensional experimental data poses significant challenges to transcriptomics and metabolomics data analysis methods, which may lead to spurious instead of biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge in the analysis. This strategy is used to reduce the solution space and/or to focus the analysis on biological meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We combine the reviewed methods in three groups based on the underlying mathematical model: exploratory methods, supervised methods and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated and how it modifies the mathematical properties of the underlying methods. PMID:25033193
High count-rate study of two TES x-ray microcalorimeters with different transition temperatures
NASA Astrophysics Data System (ADS)
Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.; Sadleir, John E.; Smith, Stephen J.; Wassell, Edward J.
2017-10-01
We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures (Tc). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate where the thermal conductance of the microcalorimeter is dependent upon the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different Tc values had very different thermal decay times, approximately one order of magnitude different. In our earlier report, we achieved energy resolutions of 1.6 and 2.3 eV at 6 keV from lower and higher Tc devices, respectively, using a standard analysis method based on optimal filtering in the low flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution for a fixed throughput, we have developed an analysis scheme based on the so-called event grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays of 1025 Hz per pixel with the higher Tc (faster) device, and 5.8 eV FWHM with 97% throughput with the lower Tc (slower) device at 722 Hz.
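A minimal sketch of the standard frequency-domain optimal (matched) filter used to estimate pulse amplitude in the low-flux limit; the pulse template and noise power spectrum are assumed to come from calibration data, and the event-grade handling of piled-up pulses is not shown.

```python
# Sketch: frequency-domain optimal filter amplitude estimate for one pulse
# record, given an average pulse template and a measured noise spectrum.
import numpy as np

def optimal_filter_amplitude(record, template, noise_psd):
    """record, template: time-domain arrays of equal length;
    noise_psd: noise power spectral density sampled at the rFFT bins."""
    D = np.fft.rfft(record)
    S = np.fft.rfft(template)
    num = np.sum((np.conj(S) * D).real[1:] / noise_psd[1:])   # skip the DC bin
    den = np.sum((np.abs(S) ** 2)[1:] / noise_psd[1:])
    return num / den   # amplitude in units of the template's amplitude
```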
Nemes, Peter; Hoover, William J; Keire, David A
2013-08-06
Sensors with high chemical specificity and enhanced sample throughput are vital to screening food products and medical devices for chemical or biochemical contaminants that may pose a threat to public health. For example, the rapid detection of oversulfated chondroitin sulfate (OSCS) in heparin could prevent recurrence of the heparin adulteration that caused hundreds of severe adverse events including deaths worldwide in 2007-2008. Here, rapid pyrolysis is integrated with direct analysis in real time (DART) mass spectrometry to rapidly screen major glycosaminoglycans, including heparin, chondroitin sulfate A, dermatan sulfate, and OSCS. The results demonstrate that, compared to traditional liquid chromatography-based analyses, pyrolysis mass spectrometry achieved at least 250-fold higher sample throughput and was compatible with samples volume-limited to about 300 nL. Pyrolysis yielded an abundance of fragment ions (e.g., 150 different m/z species), many of which were specific to the parent compound. Using multivariate and statistical data analysis models, these data enabled facile differentiation of the glycosaminoglycans with high throughput. After method development was completed, authentically contaminated samples obtained during the heparin crisis by the FDA were analyzed in a blinded manner for OSCS contamination. The lower limits of differentiation and detection were 0.1% (w/w) OSCS in heparin and 100 ng/μL (20 ng) OSCS in water, respectively. For quantitative purposes, the linear dynamic range spanned approximately 3 orders of magnitude. Moreover, this chemical readout was successfully employed to find clues in the manufacturing history of the heparin samples that can be used for surveillance purposes. The presented technology and data analysis protocols are anticipated to be readily adaptable to other chemical and biochemical agents and volume-limited samples.
Hulsman, Marc; Hulshof, Frits; Unadkat, Hemant; Papenburg, Bernke J; Stamatialis, Dimitrios F; Truckenmüller, Roman; van Blitterswijk, Clemens; de Boer, Jan; Reinders, Marcel J T
2015-03-01
Surface topographies of materials considerably impact cellular behavior as they have been shown to affect cell growth, provide cell guidance, and even induce cell differentiation. Consequently, for successful application in tissue engineering, the contact interface of biomaterials needs to be optimized to induce the required cell behavior. However, a rational design of biomaterial surfaces is severely hampered because knowledge is lacking on the underlying biological mechanisms. Therefore, we previously developed a high-throughput screening device (TopoChip) that measures cell responses to large libraries of parameterized topographical material surfaces. Here, we introduce a computational analysis of high-throughput materiome data to capture the relationship between the surface topographies of materials and cellular morphology. We apply robust statistical techniques to find surface topographies that best promote a certain specified cellular response. By augmenting surface screening with data-driven modeling, we determine which properties of the surface topographies influence the morphological properties of the cells. With this information, we build models that predict the cellular response to surface topographies that have not yet been measured. We analyze cellular morphology on 2176 surfaces, and find that the surface topography significantly affects various cellular properties, including the roundness and size of the nucleus, as well as the perimeter and orientation of the cells. Our learned models capture and accurately predict these relationships and reveal a spectrum of topographies that induce various levels of cellular morphologies. Taken together, this novel approach of high-throughput screening of materials and subsequent analysis opens up possibilities for a rational design of biomaterial surfaces. Copyright © 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Zhao, Meng-Meng; Du, Shan-Shan; Li, Qiu-Hong; Chen, Tao; Qiu, Hui; Wu, Qin; Chen, Shan-Shan; Zhou, Ying; Zhang, Yuan; Hu, Yang; Su, Yi-Liang; Shen, Li; Zhang, Fen; Weng, Dong; Li, Hui-Ping
2017-02-01
This study aims to use high-throughput 16S rRNA gene sequencing to examine the bacterial profile of lymph node biopsy samples of patients with sarcoidosis and to further verify the association between Propionibacterium acnes (P. acnes) and sarcoidosis. A total of 36 mediastinal lymph node biopsy specimens were collected from 17 cases of sarcoidosis, 8 of tuberculosis (TB group), and 11 of non-infectious lung diseases (control group). The V4 region of the bacterial 16S rRNA gene in the specimens was amplified and sequenced using the high-throughput sequencing platform MiSeq, and a bacterial profile was established. The data analysis software QIIME and Metastats were used to compare bacterial relative abundance in the three patient groups. Overall, 545 genera were identified; 38 showed significantly lower and 29 had significantly higher relative abundance in the sarcoidosis group than in the TB and control groups (P < 0.01). P. acnes 16S rRNA was found exclusively in all 17 samples of the sarcoidosis group, whereas it was not detected in the TB and control groups. The relative abundance of P. acnes in the sarcoidosis group (0.16% ± 0.11%) was significantly higher than that in the TB (Metastats analysis: P = 0.0010, q = 0.0044) and control groups (Metastats analysis: P = 0.0010, q = 0.0038). The relative abundance of P. granulosum was only 0.0022% ± 0.0044% in the sarcoidosis group. P. granulosum 16S rRNA was not detected in the other two groups. High-throughput 16S rRNA gene sequencing appears to be a useful tool to investigate the bacterial profile of sarcoidosis specimens. The results suggest that P. acnes may be involved in sarcoidosis development.
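A minimal sketch of the downstream comparison step, computing per-sample relative abundance of a genus and testing for a group difference; the study itself used QIIME and Metastats, so the Mann-Whitney U test and the table layout here are stand-ins.

```python
# Sketch: relative abundance per sample from a genus-by-sample count table,
# then a nonparametric two-group comparison (stand-in for Metastats).
import pandas as pd
from scipy.stats import mannwhitneyu

def relative_abundance(counts: pd.DataFrame) -> pd.DataFrame:
    """counts: rows = genera, columns = samples (raw read counts)."""
    return counts.div(counts.sum(axis=0), axis=1)

def compare_genus(counts, genus, group_a_samples, group_b_samples):
    """Compare one genus's relative abundance between two sample groups."""
    ra = relative_abundance(counts).loc[genus]
    a, b = ra[group_a_samples], ra[group_b_samples]
    stat, p = mannwhitneyu(a, b, alternative="two-sided")
    return {"mean_a": float(a.mean()), "mean_b": float(b.mean()), "p": float(p)}

# e.g. compare_genus(otu_counts, "Propionibacterium", sarcoid_ids, control_ids)
```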
Nedbal, Jakub; Visitkul, Viput; Ortiz-Zapater, Elena; Weitsman, Gregory; Chana, Prabhjoat; Matthews, Daniel R; Ng, Tony; Ameer-Beg, Simon M
2015-01-01
Sensing ion or ligand concentrations, physico-chemical conditions, and molecular dimerization or conformation change is possible by assays involving fluorescent lifetime imaging. The inherent low throughput of imaging impedes rigorous statistical data analysis on large cell numbers. We address this limitation by developing a fluorescence lifetime-measuring flow cytometer for fast fluorescence lifetime quantification in living or fixed cell populations. The instrument combines a time-correlated single photon counting epifluorescent microscope with microfluidics cell-handling system. The associated computer software performs burst integrated fluorescence lifetime analysis to assign fluorescence lifetime, intensity, and burst duration to each passing cell. The maximum safe throughput of the instrument reaches 3,000 particles per minute. Living cells expressing spectroscopic rulers of varying peptide lengths were distinguishable by Förster resonant energy transfer measured by donor fluorescence lifetime. An epidermal growth factor (EGF)-stimulation assay demonstrated the technique's capacity to selectively quantify EGF receptor phosphorylation in cells, which was impossible by measuring sensitized emission on a standard flow cytometer. Dual-color fluorescence lifetime detection and cell-specific chemical environment sensing were exemplified using di-4-ANEPPDHQ, a lipophilic environmentally sensitive dye that exhibits changes in its fluorescence lifetime as a function of membrane lipid order. To our knowledge, this instrument opens new applications in flow cytometry which were unavailable due to technological limitations of previously reported fluorescent lifetime flow cytometers. The presented technique is sensitive to lifetimes of most popular fluorophores in the 0.5–5 ns range including fluorescent proteins and is capable of detecting multi-exponential fluorescence lifetime decays. This instrument vastly enhances the throughput of experiments involving fluorescence lifetime measurements, thereby providing statistically significant quantitative data for analysis of large cell populations. © 2014 International Society for Advancement of Cytometry PMID:25523156
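A minimal sketch of a burst-integrated lifetime estimate: for a mono-exponential decay with a measurement window much longer than the lifetime, the maximum-likelihood lifetime of the pooled photons in one cell's burst is approximately their mean arrival time after excitation (IRF and window-truncation corrections, which the instrument's software would handle, are ignored here).

```python
# Sketch: pool TCSPC photon micro-times per cell burst and estimate a
# mono-exponential lifetime as the mean arrival time (simplified estimator).
import numpy as np

def burst_lifetime(arrival_times_ns: np.ndarray) -> float:
    """arrival_times_ns: micro-times of all photons in one cell's burst (ns)."""
    return float(np.mean(arrival_times_ns))

def per_cell_lifetimes(microtimes_ns: np.ndarray, burst_ids: np.ndarray) -> dict:
    """Group photons by burst (cell) and return (lifetime, photon count) per cell."""
    out = {}
    for b in np.unique(burst_ids):
        t = microtimes_ns[burst_ids == b]
        out[int(b)] = (burst_lifetime(t), int(t.size))
    return out
```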
Simulation of water-table aquifers using specified saturated thickness
Sheets, Rodney A.; Hill, Mary C.; Haitjema, Henk M.; Provost, Alden M.; Masterson, John P.
2014-01-01
Simulating groundwater flow in a water-table (unconfined) aquifer can be difficult because the saturated thickness available for flow depends on model-calculated hydraulic heads. It is often possible to realize substantial time savings and still obtain accurate head and flow solutions by specifying an approximate saturated thickness a priori, thus linearizing this aspect of the model. This specified-thickness approximation often relies on the use of the “confined” option in numerical models, which has led to confusion and criticism of the method. This article reviews the theoretical basis for the specified-thickness approximation, derives an error analysis for relatively ideal problems, and illustrates the utility of the approximation with a complex test problem. In the transient version of our complex test problem, the specified-thickness approximation produced maximum errors in computed drawdown of about 4% of initial aquifer saturated thickness even when maximum drawdowns were nearly 20% of initial saturated thickness. In the final steady-state version, the approximation produced maximum errors in computed drawdown of about 20% of initial aquifer saturated thickness (mean errors of about 5%) when maximum drawdowns were about 35% of initial saturated thickness. In early phases of model development, such as during initial model calibration efforts, the specified-thickness approximation can be a very effective tool to facilitate convergence. The reduced execution time and increased stability obtained through the approximation can be especially useful when many model runs are required, such as during inverse model calibration, sensitivity and uncertainty analyses, multimodel analysis, and development of optimal resource management scenarios.
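A worked example (not from the article) for 1-D steady Dupuit flow between two fixed heads, comparing the unconfined discharge with the specified-thickness (confined) approximation; using the initial thickness overestimates flow, while using the average of the two heads reproduces the Dupuit discharge exactly.

```python
# Worked example: unconfined (Dupuit) discharge versus the specified
# saturated thickness approximation for 1-D steady flow between fixed heads.
K, L = 10.0, 1000.0          # hydraulic conductivity (m/d), flow length (m)
h1, h2 = 50.0, 40.0          # heads = saturated thicknesses at the two ends (m)

q_unconfined = K * (h1**2 - h2**2) / (2 * L)          # Dupuit discharge per unit width
q_fixed_initial = K * h1 * (h1 - h2) / L              # thickness fixed at h1
q_fixed_average = K * 0.5 * (h1 + h2) * (h1 - h2) / L # thickness fixed at the mean head

for name, q in [("unconfined", q_unconfined),
                ("b = h1", q_fixed_initial),
                ("b = (h1+h2)/2", q_fixed_average)]:
    print(f"{name:15s} q = {q:6.3f} m^2/d  error = {100*(q/q_unconfined - 1):+5.1f}%")
# Here the drawdown is 20% of the initial thickness; fixing b at h1 gives
# about +11% error, while the mean-thickness choice is exact for this case.
```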
NASA Astrophysics Data System (ADS)
Liu, S.; Pan, B.
2015-12-01
The logging evaluation of tuffaceous sandstone reservoirs is a long-standing difficulty. Experiments show that tuff and shale have different logging responses. Because the tuff content affects the computation of shale content and other reservoir parameters, it reduces the accuracy of saturation evaluation; the effect of tuff on the calculation of saturation therefore cannot be ignored. This study takes the tuffaceous sandstone reservoirs in the X depression of the Hailar-Tamtsag basin as an example, and an electrical conduction model of tuffaceous sandstone reservoirs is established. A method combining the bacterial foraging algorithm with particle swarm optimization is used for the first time to calculate the content of reservoir components from well logs, and the calculated tuff and shale contents agree with thin-section analysis. A cation exchange capacity (CEC) experiment shows that tuff is conductive, and the conversion relationship between CEC and resistivity proposed by Toshinobu Iton has been improved. From rock-electrical experiments under simulated reservoir conditions, the rock-electrical parameters (a, b, m and n) are determined. The improved relationship between CEC and resistivity and the rock-electrical parameters are used in the calculation of saturation, giving the saturation equation of the tuffaceous reservoirs (Formula 1). A comparison between irreducible water saturation and the calculated saturation shows that the saturation equation using CEC data and the rock-electrical parameters performs better in oil layers than Archie's formula.
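For context only: the classical Archie relation that the study's CEC-corrected equation (Formula 1) extends, written with the same style of rock-electrical parameters; the paper's modified equation itself is not reproduced, and the numerical values are illustrative.

```python
# Sketch: classical Archie water saturation as the clean-sand baseline.
def archie_sw(rt, rw, phi, a=1.0, m=2.0, n=2.0):
    """Water saturation from true resistivity rt (ohm-m), formation-water
    resistivity rw (ohm-m), porosity phi (fraction), and parameters a, m, n."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

# Illustrative values: Sw = ((1.0 * 0.05) / (0.12**2 * 20))**0.5 ≈ 0.42
print(archie_sw(rt=20.0, rw=0.05, phi=0.12))
```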
A flow-free droplet-based device for high throughput polymorphic crystallization.
Yang, Shih-Mo; Zhang, Dapeng; Chen, Wang; Chen, Shih-Chi
2015-06-21
Crystallization is one of the most crucial steps in the process of pharmaceutical formulation. In recent years, emulsion-based platforms have been developed and broadly adopted to generate high-quality products. However, these conventional approaches such as stirring are still limited in several aspects, e.g., unstable crystallization conditions and broad size distribution; besides, only simple crystal forms can be produced. In this paper, we present a new flow-free droplet-based formation process for producing highly controlled crystallization with two examples: (1) NaCl crystallization reveals the ability to package saturated solution into nanoliter droplets, and (2) glycine crystallization demonstrates the ability to produce polymorphic crystallization forms by controlling the droplet size and temperature. In our process, the saturated solution automatically fills the microwell array powered by degassed bulk PDMS. A critical oil covering step is then introduced to isolate the saturated solution and control the water dissolution rate. Utilizing surface tension, the solution is uniformly packaged in the form of thousands of isolated droplets at the bottom of each microwell of 50-300 μm diameter. After water dissolution, individual crystal structures are automatically formed inside the microwell array. This approach facilitates the study of different glycine growth processes: α-form generated inside the droplets and γ-form generated at the edge of the droplets. With precise temperature control over nanoliter-sized droplets, the growth of ellipsoidal crystalline agglomerates of glycine was achieved for the first time. Optical and SEM images illustrate that the ellipsoidal agglomerates consist of 2-5 μm glycine clusters with inner spiral structures of ~35 μm screw pitch. Lastly, the size distribution of spherical crystalline agglomerates (SAs) produced from microwells of different sizes was measured to have a coefficient of variation (CV) of less than 5%, showing crystal sizes can be precisely controlled by microwell sizes with high uniformity. This new method can be used to reliably fabricate monodispersed crystals for pharmaceutical applications.
Relating oxygen partial pressure, saturation and content: the haemoglobin-oxygen dissociation curve.
Collins, Julie-Ann; Rudenski, Aram; Gibson, John; Howard, Luke; O'Driscoll, Ronan
2015-09-01
The delivery of oxygen by arterial blood to the tissues of the body has a number of critical determinants including blood oxygen concentration (content), saturation (SO2) and partial pressure, haemoglobin concentration and cardiac output, including its distribution. The haemoglobin-oxygen dissociation curve, a graphical representation of the relationship between oxygen saturation and oxygen partial pressure, helps us to understand some of the principles underpinning this process. Historically this curve was derived from very limited data based on blood samples from small numbers of healthy subjects which were manipulated in vitro and ultimately determined by equations such as those described by Severinghaus in 1979. In a study of 3524 clinical specimens, we found that this equation estimated the SO2 in blood from patients with normal pH and SO2 >70% with remarkable accuracy and, to our knowledge, this is the first large-scale validation of this equation using clinical samples. Oxygen saturation by pulse oximetry (SpO2) is nowadays the standard clinical method for assessing arterial oxygen saturation, providing a convenient, pain-free means of continuously assessing oxygenation, provided the interpreting clinician is aware of important limitations. The use of pulse oximetry reduces the need for arterial blood gas analysis (SaO2) as many patients who are not at risk of hypercapnic respiratory failure or metabolic acidosis and have acceptable SpO2 do not necessarily require blood gas analysis. While arterial sampling remains the gold-standard method of assessing ventilation and oxygenation, in those patients in whom blood gas analysis is indicated, arterialised capillary samples also have a valuable role in patient care. The clinical role of venous blood gases, however, remains less well defined.
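The commonly quoted form of the Severinghaus (1979) equation relating oxygen partial pressure to haemoglobin oxygen saturation, i.e. the curve validated here against clinical specimens; inputs are in mmHg and the expression assumes normal pH and temperature.

```python
# Severinghaus (1979) approximation of the haemoglobin-oxygen dissociation curve.
def severinghaus_so2(po2_mmhg: float) -> float:
    """Estimated SO2 (%) from PO2 (mmHg) at normal pH and temperature."""
    return 100.0 / (23400.0 / (po2_mmhg**3 + 150.0 * po2_mmhg) + 1.0)

# P50 region, mixed venous, mild hypoxaemia, normal arterial:
for po2 in (27, 40, 60, 100):
    print(po2, round(severinghaus_so2(po2), 1))   # ~50, 75, 91, 98 %
```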
Determination of diagnostic standards on saturated soil extracts for cut roses grown in greenhouses.
Franco-Hermida, John Jairo; Quintero, María Fernanda; Cabrera, Raúl Iskander; Guzman, José Miguel
2017-01-01
This work comprises the theoretical determination and validation of diagnostic standards for the analysis of saturated soil extracts for cut rose flower crops (Rosa spp.) growing in the Bogota Plateau, Colombia. The data included 684 plant tissue analyses and 684 corresponding analyses of saturated soil extracts, all collected between January 2009 and June 2013. The tissue and soil samples were selected from 13 rose farms, and from cultivars grafted on the 'Natal Briar' rootstock. These concurrent samples of soil and plant tissues represented 251 production units (locations) of approximately 10,000 m2 distributed across the study area. The standards were conceived as a tool to improve the nutritional balance in the leaf tissue of rose plants and thereby define the norms for expressing optimum productive potential relative to nutritional conditions in the soil. To this end, previously determined diagnostic standards for rose leaf tissues were employed to obtain rates of foliar nutritional balance at each analyzed location and as criteria for determining the diagnostic norms for saturated soil extracts. Applying this methodology to foliar analysis showed higher, significant correlations for the diagnostic indices. A similar behavior was observed for the analysis of saturated soil extracts, making it a powerful tool for integrated nutritional diagnosis. Leaf analyses identify the nutrients most limiting for high yield, and analyses of saturated soil extracts make it possible to correct the fertigation formulations applied to soils or substrates. Recommendations are proposed to improve the balance in the soil-plant system and thereby increase the likelihood of higher yields. The main recommendations to increase and improve rose crop flower yields would be: continuously check pH values of SSE, reduce the amounts of P, Fe, Zn and Cu in fertigation solutions and carefully analyze the situation of Mn in the soil-plant system.
A rapid enzymatic assay for high-throughput screening of adenosine-producing strains
Dong, Huina; Zu, Xin; Zheng, Ping; Zhang, Dawei
2015-01-01
Adenosine is a major local regulator of tissue function and is industrially useful as a precursor for the production of medicinal nucleoside substances. High-throughput screening of adenosine overproducers is important for industrial microorganism breeding. An enzymatic assay for adenosine was developed by combining adenosine deaminase (ADA) with the indophenol method. ADA catalyzes the cleavage of adenosine to inosine and NH3; the latter can be accurately determined by the indophenol method. The assay system was optimized to deliver good performance and could tolerate the addition of inorganic salts and many nutrient components to the assay mixtures. Adenosine could be accurately determined by this assay using 96-well microplates. Spike-and-recovery tests showed that this assay can accurately and reproducibly determine increases in adenosine in fermentation broth without any pretreatment to remove proteins and potentially interfering low-molecular-weight molecules. This assay was also applied to high-throughput screening for high adenosine-producing strains. The high selectivity and accuracy of the ADA assay provide rapid, high-throughput analysis of adenosine in large numbers of samples. PMID:25580842
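A minimal sketch of the quantification step, assuming a linear standard curve of plate-reader absorbance versus adenosine concentration and a spike-and-recovery check; all numbers are illustrative.

```python
# Sketch: calibrate the ADA/indophenol plate assay with a linear standard
# curve, convert absorbances to concentrations, and compute spike recovery.
import numpy as np

def fit_standard_curve(conc, absorbance):
    """Least-squares line A = slope*C + intercept over the adenosine standards."""
    slope, intercept = np.polyfit(conc, absorbance, 1)
    return slope, intercept

def to_concentration(absorbance, slope, intercept):
    """Convert measured absorbances back to adenosine concentration."""
    return (np.asarray(absorbance) - intercept) / slope

def spike_recovery(measured_spiked, measured_unspiked, spike_added):
    """Percent recovery of a known adenosine spike added to fermentation broth."""
    return 100.0 * (measured_spiked - measured_unspiked) / spike_added

# Illustrative numbers only: standards 0-2 mM read in a 96-well plate.
slope, intercept = fit_standard_curve([0.0, 0.5, 1.0, 2.0], [0.05, 0.31, 0.57, 1.09])
print(to_concentration(0.70, slope, intercept))   # unknown well, ~1.25 mM
print(spike_recovery(1.48, 0.49, 1.0))            # ~99% recovery
```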
Logares, Ramiro; Haverkamp, Thomas H A; Kumar, Surendra; Lanzén, Anders; Nederbragt, Alexander J; Quince, Christopher; Kauserud, Håvard
2012-10-01
The incursion of High-Throughput Sequencing (HTS) in environmental microbiology brings unique opportunities and challenges. HTS now allows a high-resolution exploration of the vast taxonomic and metabolic diversity present in the microbial world, which can provide exceptional insight into global ecosystem functioning, ecological processes and evolution. This exploration also has economic potential, as we will have access to the evolutionary innovation present in microbial metabolisms, which could be used for biotechnological development. HTS is also challenging the research community, and the current bottleneck is on the data analysis side. At the moment, researchers face a sequence data deluge, with sequencing throughput advancing faster than the computer power needed for data analysis. However, new tools and approaches are being developed constantly and the whole process could be depicted as a fast co-evolution between sequencing technology, informatics and microbiologists. In this work, we examine the most popular and recently commercialized HTS platforms as well as bioinformatics methods for data handling and analysis used in microbial metagenomics. This non-exhaustive review is intended to serve as a broad state-of-the-art guide to researchers expanding into this rapidly evolving field. Copyright © 2012 Elsevier B.V. All rights reserved.
Analysis and Testing of Mobile Wireless Networks
NASA Technical Reports Server (NTRS)
Alena, Richard; Evenson, Darin; Rundquist, Victor; Clancy, Daniel (Technical Monitor)
2002-01-01
Wireless networks are being used to connect mobile computing elements in more applications as the technology matures. There are now many products (such as 802.11 and 802.11b) which run in the ISM frequency band and comply with wireless network standards. They are being used increasingly to link mobile intranets into wired networks. Standard methods of analyzing and testing their performance and compatibility are needed to determine the limits of the technology. This paper presents analytical and experimental methods of determining network throughput, range and coverage, and interference sources. Both radio frequency (RF) domain and network domain analysis have been applied to determine wireless network throughput and range in the outdoor environment. Comparison of field test data taken under optimal conditions with performance predicted from RF analysis yielded quantitative results applicable to future designs. Layering multiple wireless network segments can increase performance. Wireless network components can be set to different radio frequency-hopping sequences or spreading functions, allowing more than one segment to coexist. Therefore, we ran multiple 802.11-compliant systems concurrently in the same geographical area to determine interference effects and scalability. The results can be used to design more robust networks which have multiple layers of wireless data communication paths and provide increased overall throughput.
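A simple free-space RF link estimate of the kind used to predict 802.11 range (not taken from the paper), comparing received power against an assumed receiver sensitivity; the transmit power, antenna gains, and sensitivity threshold are all illustrative.

```python
# Sketch: free-space path loss and link margin versus distance for a
# 2.4 GHz ISM-band link; all parameter values are assumptions.
import math

def fspl_db(distance_m: float, freq_hz: float = 2.437e9) -> float:
    """Free-space path loss in dB."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def rx_power_dbm(tx_dbm=15, tx_gain_dbi=2, rx_gain_dbi=2, distance_m=100):
    """Received power under the Friis free-space model."""
    return tx_dbm + tx_gain_dbi + rx_gain_dbi - fspl_db(distance_m)

sensitivity_dbm = -82   # assumed receiver sensitivity at the highest data rate
for d in (50, 100, 300, 1000):
    p = rx_power_dbm(distance_m=d)
    print(f"{d:5d} m  Pr = {p:6.1f} dBm  margin = {p - sensitivity_dbm:+5.1f} dB")
```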
Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela
2014-09-25
Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied by <15%. Stability during storage at different temperatures was confirmed for three weeks. The limits of detection and quantification for each biomarker varied from 0.3 to 6.3 μmol/l and from 1.0 to 20.9 μmol/l, respectively. Analyses of urine specimens from affected patients revealed abnormal results. Targeted biomarkers in urine were detected in the first weeks of life. This rapid, simple and robust liquid chromatography/tandem mass spectrometry methodology is an efficient tool applicable to urine screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.
Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras
2016-04-01
There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a <4 h magnetic bead-based process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid (<3 min) separation to accommodate the high-throughput processing of the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. © 2015 Society for Laboratory Automation and Screening.
Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E
2014-01-01
Over the past decade, rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic, next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
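A minimal sketch of the general idea of efficiency-corrected, input-quantity-normalized expression relative to a universal reference cDNA; the paper's exact formulas, including the absolute-quantification variant, are not copied here.

```python
# Sketch: express a sample's transcript level relative to a universal
# reference cDNA run with the same assay, corrected for amplification
# efficiency and divided by the number of cells put into the reaction.
def relative_expression(cq_sample: float, cq_reference: float,
                        n_cells: float, efficiency: float = 2.0) -> float:
    """efficiency: amplification factor per cycle (2.0 = 100% efficient);
    cq_reference: Cq of the universal reference cDNA for the same assay."""
    fold_vs_reference = efficiency ** (cq_reference - cq_sample)
    return fold_vs_reference / n_cells        # expression per input cell

# Example: Cq 24.0 vs reference Cq 26.0 at 95% efficiency, 10,000 cells input.
print(relative_expression(24.0, 26.0, 1e4, efficiency=1.95))
```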
Słomka, Marcin; Sobalska-Kwapis, Marta; Wachulec, Monika; Bartosz, Grzegorz; Strapagiel, Dominik
2017-11-03
High resolution melting (HRM) is a convenient method for gene scanning as well as genotyping of individual and multiple single nucleotide polymorphisms (SNPs). This rapid, simple, closed-tube, homogenous, and cost-efficient approach has the capacity for high specificity and sensitivity, while allowing easy transition to high-throughput scale. In this paper, we provide examples from our laboratory practice of some problematic issues which can affect the performance and data analysis of HRM results, especially with regard to reference curve-based targeted genotyping. We present those examples in order of the typical experimental workflow, and discuss the crucial significance of the respective experimental errors and limitations for the quality and analysis of results. The experimental details which have a decisive impact on correct execution of a HRM genotyping experiment include type and quality of DNA source material, reproducibility of isolation method and template DNA preparation, primer and amplicon design, automation-derived preparation and pipetting inconsistencies, as well as physical limitations in melting curve distinction for alternative variants and careful selection of samples for validation by sequencing. We provide a case-by-case analysis and discussion of actual problems we encountered and solutions that should be taken into account by researchers newly attempting HRM genotyping, especially in a high-throughput setup.
Rapid analysis of colipase gene variants by multicapillary electrophoresis.
Jaczó, Zsuzsanna; Pál, Eszter; Dénes, Réka; Somogyi, Anikó; Sasvári-Székely, Mária; Guttman, András; Rónai, Zsolt
2015-06-01
Despite the fact that the Human Genome Project was completed more than a decade ago, identification of the genetic background of polygenic diseases is still challenging. Several somewhat different approaches are available to investigate inheritable factors of complex phenotypes; all of them, however, require efficient, high-throughput techniques for SNP genotyping. In this paper, we report a robust and reliable multiplex PCR-RFLP assay for genotype and haplotype analysis of six SNPs (rs41270082, rs3748051, rs142027015, rs3748048, rs73404011, and rs72925892) of the colipase (CLPS) gene. A multicapillary (12 capillaries) electrophoresis unit was used for high-throughput and sensitive analysis of the digestion fragments. A Microsoft Excel-based spreadsheet was designed for the flexible visualization and evaluation of the electrophoretic separations, which is readily adaptable to any kind of electrophoresis application. Haplotype analysis of the two loci localized in close proximity to each other was carried out by a molecular method, and extended haplotypes including all five SNPs in the 5' upstream region were calculated. The techniques were applied in a case-control association study of type 2 diabetes mellitus. Although single-marker analysis did not reveal any significant association, the rare GGCCG haplotype of the five 5' upstream region SNPs was about three times more frequent among patients than in the healthy control population. Our results demonstrated the applicability of multicapillary CGE in large-scale, high-throughput SNP analysis, and suggested that CLPS gene polymorphisms might be considered a genetic risk factor for type 2 diabetes mellitus. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-throughput microfluidic single-cell digital polymerase chain reaction.
White, A K; Heyries, K A; Doolin, C; Vaninsberghe, M; Hansen, C L
2013-08-06
Here we present an integrated microfluidic device for the high-throughput digital polymerase chain reaction (dPCR) analysis of single cells. This device allows for the parallel processing of single cells and executes all steps of analysis, including cell capture, washing, lysis, reverse transcription, and dPCR analysis. The cDNA from each single cell is distributed into a dedicated dPCR array consisting of 1020 chambers, each having a volume of 25 pL, using surface-tension-based sample partitioning. The high density of this dPCR format (118,900 chambers/cm²) allows the analysis of 200 single cells per run, for a total of 204,000 PCR reactions using a device footprint of 10 cm². Experiments using RNA dilutions show this device achieves shot-noise-limited performance in quantifying single molecules, with a dynamic range of 10⁴. We performed over 1200 single-cell measurements, demonstrating the use of this platform in the absolute quantification of both high- and low-abundance mRNA transcripts, as well as micro-RNAs that are not easily measured using alternative hybridization methods. We further apply the specificity and sensitivity of single-cell dPCR to performing measurements of RNA editing events in single cells. High-throughput dPCR provides a new tool in the arsenal of single-cell analysis methods, with a unique combination of speed, precision, sensitivity, and specificity. We anticipate this approach will enable new studies where high-performance single-cell measurements are essential, including the analysis of transcriptional noise, allelic imbalance, and RNA processing.
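For orientation, the sketch below shows the standard Poisson correction used in digital PCR to turn the fraction of positive chambers into an absolute molecule count, using chamber numbers and volumes matching the 1020-chamber, 25 pL array described above; it is an illustration of the principle, not the authors' analysis software.

```python
import math

# Hedged sketch of the Poisson correction for digital PCR. Chamber count and volume follow
# the array described in the abstract; the positive-chamber count is an arbitrary example.

def dpcr_quantify(positive_chambers, total_chambers=1020, chamber_volume_pl=25.0):
    p = positive_chambers / total_chambers
    if p >= 1.0:
        raise ValueError("all chambers positive: above the dynamic range")
    lam = -math.log(1.0 - p)                           # mean molecules per chamber
    total_molecules = lam * total_chambers             # molecules loaded into the array
    per_microliter = lam / (chamber_volume_pl * 1e-6)  # concentration in molecules/uL
    return total_molecules, per_microliter

print(dpcr_quantify(300))  # e.g. 300 of 1020 chambers positive
```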
High-throughput gender identification of penguin species using melting curve analysis.
Tseng, Chao-Neng; Chang, Yung-Ting; Chiu, Hui-Tzu; Chou, Yii-Cheng; Huang, Hurng-Wern; Cheng, Chien-Chung; Liao, Ming-Hui; Chang, Hsueh-Wei
2014-04-03
Most species of penguins are sexually monomorphic, and it is therefore difficult to visually identify their genders for monitoring population stability in terms of sex ratio analysis. In this study, we evaluated the suitability of melting curve analysis (MCA) for high-throughput gender identification of penguins. A preliminary test indicated that the Griffiths P2/P8 primers were not suitable for MCA. Based on sequence alignment of the Chromo-Helicase-DNA binding protein (CHD)-W and CHD-Z genes from four species of penguins (Pygoscelis papua, Aptenodytes patagonicus, Spheniscus magellanicus, and Eudyptes chrysocome), we redesigned forward primers for the CHD-W/CHD-Z-common region (PGU-ZW2) and the CHD-W-specific region (PGU-W2) to be used in combination with the reverse Griffiths P2 primer. When tested with P. papua samples, PCR using the P2/PGU-ZW2 and P2/PGU-W2 primer sets generated two amplicons of 148 and 356 bp, respectively, which were easily resolved in 1.5% agarose gels. MCA indicated that the melting temperature (Tm) values for the P2/PGU-ZW2 and P2/PGU-W2 amplicons of P. papua samples were 79.75°C-80.5°C and 81.0°C-81.5°C, respectively. Females displayed both the ZW-common and W-specific Tm peaks, whereas males were positive only for the ZW-common peak. Taken together, our redesigned primers coupled with MCA allow precise high-throughput gender identification for P. papua, and potentially for other penguin species such as A. patagonicus, S. magellanicus, and E. chrysocome as well.
NASA Astrophysics Data System (ADS)
Park, Chanho; Nguyen, Phung K. T.; Nam, Myung Jin; Kim, Jongwook
2013-04-01
Monitoring CO2 migration and storage in geological formations is important not only for the stability of geological sequestration of CO2 but also for efficient management of CO2 injection. In particular, geophysical methods can make in situ observations of CO2 to assess potential leakage, to improve reservoir description, and to monitor the development of geologic discontinuities (i.e., faults, cracks, joints, etc.). Geophysical monitoring can be based on wireline logging for well-scale monitoring (high resolution, narrow area of investigation) or on surface surveys for basin-scale monitoring (low resolution, wide area of investigation). Crosswell tomography, meanwhile, can provide reservoir-scale monitoring that bridges the resolution gap between well logs and surface measurements. This study focuses on reservoir-scale monitoring based on crosswell seismic tomography, aiming to describe details of reservoir structure and to monitor the migration of reservoir fluids (water and CO2). For the monitoring, we first make a sensitivity analysis of crosswell seismic tomography data with respect to CO2 saturation. For the sensitivity analysis, rock physics models (RPMs) are constructed by calculating the density and the P- and S-wave velocities of a virtual CO2 injection reservoir. Because the seismic velocity of the reservoir changes appreciably with CO2 saturation only when the saturation is below about 20%, and is insensitive to further changes above that level, the sensitivity analysis is mainly made for CO2 saturations of less than 20%. For precise simulation of seismic tomography responses for the constructed RPMs, we developed a time-domain 2D elastic modeling code based on the finite-difference method with a staggered grid, employing a convolutional perfectly matched layer boundary condition. We further compare the sensitivities of seismic tomography and surface measurements for the RPMs to analyze the resolution difference between them. Moreover, assuming a reservoir situation similar to the CO2 storage site in Nagaoka, Japan, we generate time-lapse tomographic data sets for the corresponding CO2 injection process and make a preliminary interpretation of the data sets.
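To illustrate how a rock physics model can relate P-wave velocity to CO2 saturation, the sketch below applies Gassmann fluid substitution with a Reuss-averaged pore fluid; all moduli, densities, and the porosity are generic placeholder values, not those used in this study.

```python
import math

# Hedged sketch: Gassmann fluid substitution for a CO2/brine-saturated rock.
# Every numerical parameter below is an illustrative assumption.

def gassmann_vp(s_co2, k_dry=3.0e9, mu=2.5e9, k_min=37.0e9, phi=0.25,
                k_water=2.25e9, k_co2=0.05e9,
                rho_min=2650.0, rho_water=1000.0, rho_co2=700.0):
    # Reuss (uniform) average of the pore-fluid bulk modulus
    k_fluid = 1.0 / (s_co2 / k_co2 + (1.0 - s_co2) / k_water)
    # Gassmann equation for the saturated-rock bulk modulus
    k_sat = k_dry + (1.0 - k_dry / k_min) ** 2 / (
        phi / k_fluid + (1.0 - phi) / k_min - k_dry / k_min ** 2)
    rho = (1.0 - phi) * rho_min + phi * (s_co2 * rho_co2 + (1.0 - s_co2) * rho_water)
    return math.sqrt((k_sat + 4.0 * mu / 3.0) / rho)  # P-wave velocity in m/s

# Velocity drops steeply below roughly 20% CO2 saturation and flattens above it,
# which is why the sensitivity analysis concentrates on that range.
for s in (0.0, 0.05, 0.1, 0.2, 0.4, 0.8):
    print(s, round(gassmann_vp(s), 1))
```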
QA/QC requirements for physical properties sampling and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Innis, B.E.
1993-07-21
This report presents results of an assessment of the available information concerning US Environmental Protection Agency (EPA) quality assurance/quality control (QA/QC) requirements and guidance applicable to sampling, handling, and analyzing physical parameter samples at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) investigation sites. Geotechnical testing laboratories measure the following physical properties of soil and sediment samples collected during CERCLA remedial investigations (RI) at the Hanford Site: moisture content, grain size by sieve, grain size by hydrometer, specific gravity, bulk density/porosity, saturated hydraulic conductivity, moisture retention, unsaturated hydraulic conductivity, and permeability of rocks by flowing air. Geotechnical testing laboratories also measure the following chemical parameters of soil and sediment samples collected during Hanford Site CERCLA RI: calcium carbonate and saturated column leach testing. Physical parameter data are used for (1) characterization of vadose and saturated zone geology and hydrogeology, (2) selection of monitoring well screen sizes, (3) support of modeling and analysis of the vadose and saturated zones, and (4) engineering design. The objectives of this report are to determine the QA/QC levels accepted in EPA Region 10 for the sampling, handling, and analysis of soil samples for physical parameters during CERCLA RI.
A high-throughput core sampling device for the evaluation of maize stalk composition
2012-01-01
Background A major challenge in the identification and development of superior feedstocks for the production of second generation biofuels is the rapid assessment of biomass composition in a large number of samples. Currently, highly accurate and precise robotic analysis systems are available for the evaluation of biomass composition, on a large number of samples, with a variety of pretreatments. However, the lack of an inexpensive and high-throughput process for large scale sampling of biomass resources is still an important limiting factor. Our goal was to develop a simple mechanical maize stalk core sampling device that can be utilized to collect uniform samples of a dimension compatible with robotic processing and analysis, while allowing the collection of hundreds to thousands of samples per day. Results We have developed a core sampling device (CSD) to collect maize stalk samples compatible with robotic processing and analysis. The CSD facilitates the collection of thousands of uniform tissue cores consistent with high-throughput analysis required for breeding, genetics, and production studies. With a single CSD operated by one person with minimal training, more than 1,000 biomass samples were obtained in an eight-hour period. One of the main advantages of using cores is the high level of homogeneity of the samples obtained and the minimal opportunity for sample contamination. In addition, the samples obtained with the CSD can be placed directly into a bath of ice, dry ice, or liquid nitrogen maintaining the composition of the biomass sample for relatively long periods of time. Conclusions The CSD has been demonstrated to successfully produce homogeneous stalk core samples in a repeatable manner with a throughput substantially superior to the currently available sampling methods. Given the variety of maize developmental stages and the diversity of stalk diameter evaluated, it is expected that the CSD will have utility for other bioenergy crops as well. PMID:22548834
Noise Reduction in High-Throughput Gene Perturbation Screens
USDA-ARS?s Scientific Manuscript database
Motivation: Accurate interpretation of perturbation screens is essential for a successful functional investigation. However, the screened phenotypes are often distorted by noise, and their analysis requires specialized statistical analysis tools. The number and scope of statistical methods available...
Translational bioinformatics in the cloud: an affordable alternative
2010-01-01
With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
Zhu, Xudong; Arman, Bessembayev; Chu, Ju; Wang, Yonghong; Zhuang, Yingping
2017-05-01
To develop an efficient, cost-effective screening process to improve production of glucoamylase in Aspergillus niger. The cultivation of A. niger was achieved with well-dispersed morphology in 48-deep-well microtiter plates, which increased sample throughput compared to traditional flask cultivation. There was a close negative correlation between glucoamylase activity and the pH of the fermentation broth. A novel high-throughput analysis method using Methyl Orange was developed. When compared to the conventional analysis method using 4-nitrophenyl α-D-glucopyranoside as substrate, a correlation coefficient of 0.96 was obtained by statistical analysis. Using this novel screening method, we acquired a strain with an activity of 2.2 × 10³ U ml⁻¹, a 70% higher yield of glucoamylase than its parent strain.
Computational methods for evaluation of cell-based data assessment--Bioconductor.
Le Meur, Nolwenn
2013-02-01
Recent advances in miniaturization and automation of technologies have enabled high-throughput screening with cell-based assays, bringing along new challenges in data analysis. Automation, standardization, and reproducibility have become requirements for quality research. The Bioconductor community has worked in that direction, proposing several R packages to handle high-throughput data, including flow cytometry (FCM) experiments. Altogether, these packages cover the main steps of an FCM analysis workflow, that is, data management, quality assessment, normalization, outlier detection, automated gating, cluster labeling, and feature extraction. Additionally, the open-source philosophy of R and Bioconductor, which offers room for new development, continuously drives research and improvement of these analysis methods, especially in the field of clustering and data mining. This review presents the principal FCM packages currently available in R and Bioconductor, their advantages and their limits. Copyright © 2012 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, Yupeng; Chang, Kyunghi
In this paper, we analyze the coexistence issues of M-WiMAX TDD and WCDMA FDD systems. Smart antenna techniques are applied to mitigate the performance loss induced by adjacent channel interference (ACI) in the scenarios where performance is heavily degraded. In addition, an ACI model is proposed to capture the effect of transmit beamforming at the M-WiMAX base station. Furthermore, an MCS-based throughput analysis is proposed to jointly consider the effects of ACI, the system packet error rate requirement, and the available modulation and coding schemes, which is not possible with a conventional Shannon-equation-based analysis. From the results, we find that the proposed MCS-based analysis method is well suited to analyzing the theoretical system throughput in a practical manner.
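A minimal sketch of what an MCS-based throughput estimate can look like is given below: the achievable rate is that of the highest modulation-and-coding scheme whose SINR threshold (chosen to meet the packet error rate requirement) is still satisfied once ACI is included in the SINR. The MCS table, thresholds, and bandwidth are illustrative assumptions, not the parameters used in the paper.

```python
# Hedged sketch of an MCS-based throughput computation. The SINR passed in is assumed to
# already include noise and adjacent channel interference; thresholds and efficiencies
# below are placeholder values.

MCS_TABLE = [  # (required SINR in dB, spectral efficiency in bit/s/Hz)
    (5.0, 1.0),    # e.g. QPSK 1/2
    (8.0, 1.5),    # e.g. QPSK 3/4
    (11.0, 2.0),   # e.g. 16QAM 1/2
    (15.0, 3.0),   # e.g. 16QAM 3/4
    (19.0, 4.5),   # e.g. 64QAM 3/4
]

def mcs_throughput(sinr_db, bandwidth_hz=10e6):
    """Return the throughput (bit/s) of the best MCS supported at this SINR, or 0 in outage."""
    rate = 0.0
    for threshold_db, efficiency in MCS_TABLE:
        if sinr_db >= threshold_db:
            rate = efficiency * bandwidth_hz
    return rate

# Example: without ACI the SINR is 20 dB (64QAM 3/4 usable); with ACI it falls to 16 dB (16QAM 3/4).
print(mcs_throughput(20.0), mcs_throughput(16.0))
```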
Mercurio, Meagan D; Dambergs, Robert G; Herderich, Markus J; Smith, Paul A
2007-06-13
The methyl cellulose precipitable (MCP) tannin assay and a modified version of the Somers and Evans color assay were adapted to high-throughput (HTP) analysis. To improve efficiency of the MCP tannin assay, a miniaturized 1 mL format and a HTP format using 96 well plates were developed. The Somers color assay was modified to allow the standardization of pH and ethanol concentrations of wine samples in a simple one-step dilution with a buffer solution, thus removing inconsistencies between wine matrices prior to analysis and allowing for its adaptation to a HTP format. Validation studies showed that all new formats were efficient, and results were reproducible and analogous to the original formats.
Oxygen targeting in preterm infants using the Masimo SET Radical pulse oximeter
Johnston, Ewen D; Boyle, Breidge; Juszczak, Ed; King, Andy; Brocklehurst, Peter; Stenson, Ben J
2011-01-01
Background A pretrial clinical improvement project for the BOOST-II UK trial of oxygen saturation targeting revealed an artefact affecting saturation profiles obtained from the Masimo SET Radical pulse oximeter. Methods Saturation was recorded every 10 s for up to 2 weeks in 176 oxygen dependent preterm infants in 35 UK and Irish neonatal units between August 2006 and April 2009 using Masimo SET Radical pulse oximeters. Frequency distributions of % time at each saturation were plotted. An artefact affecting the saturation distribution was found to be attributable to the oximeter's internal calibration algorithm. Revised software was installed and the saturation distributions obtained were compared with four other current oximeters in paired studies. Results There was a reduction in the frequency of saturation values of 87–90%. Values above 87% were elevated by up to 2%, giving a relative excess of higher values. The software revision eliminated this, improving the distribution of saturation values. In paired comparisons with four current commercially available oximeters, Masimo oximeters with the revised software returned similar saturation distributions. Conclusions A characteristic of the software algorithm reduces the frequency of saturations of 87–90% and increases the frequency of higher values returned by the Masimo SET Radical pulse oximeter. This effect, which remains within the recommended standards for accuracy, is removed by installing revised software (board firmware V4.8 or higher). Because this observation is likely to influence oxygen targeting, it should be considered in the analysis of the oxygen trial results to maximise their generalisability. PMID:21378398
Oxygen targeting in preterm infants using the Masimo SET Radical pulse oximeter.
Johnston, Ewen D; Boyle, Breidge; Juszczak, Ed; King, Andy; Brocklehurst, Peter; Stenson, Ben J
2011-11-01
A pretrial clinical improvement project for the BOOST-II UK trial of oxygen saturation targeting revealed an artefact affecting saturation profiles obtained from the Masimo SET Radical pulse oximeter. Saturation was recorded every 10 s for up to 2 weeks in 176 oxygen dependent preterm infants in 35 UK and Irish neonatal units between August 2006 and April 2009 using Masimo SET Radical pulse oximeters. Frequency distributions of % time at each saturation were plotted. An artefact affecting the saturation distribution was found to be attributable to the oximeter's internal calibration algorithm. Revised software was installed and the saturation distributions obtained were compared with four other current oximeters in paired studies. There was a reduction in the frequency of saturation values of 87-90%. Values above 87% were elevated by up to 2%, giving a relative excess of higher values. The software revision eliminated this, improving the distribution of saturation values. In paired comparisons with four current commercially available oximeters, Masimo oximeters with the revised software returned similar saturation distributions. A characteristic of the software algorithm reduces the frequency of saturations of 87-90% and increases the frequency of higher values returned by the Masimo SET Radical pulse oximeter. This effect, which remains within the recommended standards for accuracy, is removed by installing revised software (board firmware V4.8 or higher). Because this observation is likely to influence oxygen targeting, it should be considered in the analysis of the oxygen trial results to maximise their generalisability.
Gochicoa-Rangel, Laura; Pérez-Padilla, José Rogelio; Rodríguez-Moreno, Luis; Montero-Matamoros, Arturo; Ojeda-Luna, Nancy; Martínez-Carbajal, Gema; Hernández-Raygoza, Roberto; Ruiz-Pedraza, Dolores; Fernández-Plata, María Rosario; Torre-Bouscoulet, Luis
2015-01-01
Altitude above sea level and body mass index are well-recognized determinants of oxygen saturation in adult populations; however, the contribution of these factors to oxygen saturation in children is less clear. To explore the contribution of altitude above sea level and body mass index to oxygen saturation in children, we conducted a multi-center, cross-sectional study in nine cities in Mexico. Parents signed informed consent forms and completed a health status questionnaire. Height, weight, and pulse oximetry were recorded. We studied 2,200 subjects (52% girls) aged 8.7 ± 3.0 years. Mean body mass index, z-body mass index, and oxygen saturation were 18.1 ± 3.6 kg·m⁻², 0.58 ± 1.3, and 95.5 ± 2.4%, respectively. By multiple regression analysis, altitude proved to be the main predictor of oxygen saturation, with non-significant contributions of age, gender, and body mass index. According to quantile regression, the median oxygen saturation was estimated as 98.7% minus 1.7% per km of altitude above sea level, and the fifth percentile as 97.4% minus 2.7% per km of altitude. Altitude was the main determinant of oxygen saturation, which on average decreased by 1.7% per km of elevation from 98.7% at sea level. In contrast to findings in adults, this study in children found no association between oxygen saturation and obesity or age.
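The reported quantile-regression estimates reduce to two simple linear formulas; a small sketch using the coefficients quoted in the abstract is shown below (the clipping at 100% and the example altitudes are added assumptions).

```python
# Hedged sketch: median and fifth-percentile oxygen saturation versus altitude, using the
# coefficients reported above. Clipping at 100% and the example altitudes are assumptions.

def spo2_median(altitude_km):
    return min(100.0, 98.7 - 1.7 * altitude_km)

def spo2_fifth_percentile(altitude_km):
    return min(100.0, 97.4 - 2.7 * altitude_km)

for km in (0.0, 1.0, 2.24):  # sea level, 1000 m, and roughly the altitude of Mexico City
    print(km, round(spo2_median(km), 1), round(spo2_fifth_percentile(km), 1))
```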
A high-throughput assay for DNA topoisomerases and other enzymes, based on DNA triplex formation.
Burrell, Matthew R; Burton, Nicolas P; Maxwell, Anthony
2010-01-01
We have developed a rapid, high-throughput assay for measuring the catalytic activity (DNA supercoiling or relaxation) of topoisomerase enzymes that is also capable of monitoring the activity of other enzymes that alter the topology of DNA. The assay utilises intermolecular triplex formation to resolve supercoiled and relaxed forms of DNA, the principle being the greater efficiency of a negatively supercoiled plasmid to form an intermolecular triplex with an immobilised oligonucleotide than the relaxed form. The assay provides a number of advantages over the standard gel-based methods, including greater speed of analysis, reduced sample handling, better quantitation and improved reliability and accuracy of output data. The assay is performed in microtitre plates and can be adapted to high-throughput screening of libraries of potential inhibitors of topoisomerases including bacterial DNA gyrase.
Accounting Artifacts in High-Throughput Toxicity Assays.
Hsieh, Jui-Hua
2016-01-01
Compound activity identification is the primary goal in high-throughput screening (HTS) assays. However, assay artifacts, both systematic (e.g., compound auto-fluorescence) and nonsystematic (e.g., noise), complicate activity interpretation. In addition to the traditional potency parameter, the half-maximal effect concentration (EC50), further activity parameters (e.g., the point of departure, POD) can be derived from HTS data for activity profiling. A data analysis pipeline has been developed to handle the artifacts and to provide compound activity characterization with either binary or continuous metrics. This chapter outlines the steps in the pipeline using the Tox21 glucocorticoid receptor (GR) β-lactamase assays as examples, including the formats for identifying either agonists or antagonists as well as the counter-screen assays for identifying artifacts. The steps can be applied to other lower-throughput assays with concentration-response data.
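As an illustration of the two activity parameters mentioned above, the sketch below fits a Hill curve to concentration-response data to obtain an EC50 and derives a crude point of departure (POD) as the lowest concentration whose response exceeds a baseline-derived noise band; the data, cutoff rule, and parameter choices are assumptions, not the Tox21 pipeline itself.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: EC50 from a Hill-equation fit and a simple POD from a noise band.
# All data and the cutoff rule are illustrative assumptions.

def hill(conc, bottom, top, ec50, slope):
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # uM
resp = np.array([1.0, 2.0, 4.0, 10.0, 30.0, 60.0, 85.0, 92.0])  # % activity

params, _ = curve_fit(hill, conc, resp, p0=[0.0, 100.0, 1.0, 1.0], maxfev=10000)
ec50 = params[2]

noise_band = resp[:2].mean() + 3.0 * resp[:2].std()   # crude baseline-derived cutoff
pod = conc[resp > noise_band].min()                   # lowest clearly active concentration

print(f"EC50 ~ {ec50:.2f} uM, POD ~ {pod:.2f} uM")
```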
Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won
2015-08-21
We report a new and simple design for a fully automated dual-online ultra-high pressure liquid chromatography (sDO-UHPLC) system. The system employs only two nano-volume switching valves (a two-position four-port valve and a two-position ten-port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features, including a high duty cycle, back-flushing sample injection for fast and narrow-zone sample injection, online desalting, high separation resolution, and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high-throughput deep proteome profiling experiments but also in high-throughput MRM experiments.
Study on Optimum Design of Multi-Pole Interior Permanent Magnet Motor with Concentrated Windings
NASA Astrophysics Data System (ADS)
Kano, Yoshiaki; Kosaka, Takashi; Matsui, Nobuyuki
Interior Permanent Magnet Synchronous Motors (IPMSMs) have found many applications because of their high power density and high efficiency. The existence of a complex magnetic circuit, however, makes the design of this machine quite complicated. Although FEM is commonly used in IPMSM design, one of its disadvantages is long CPU time. This paper presents a simple non-linear magnetic analysis for a multi-pole IPMSM as a preliminary design tool to be used before FEM. The proposed analysis consists of a geometric-flux-tube-based equivalent-magnetic-circuit model. The model includes saturable permeances that take into account the local magnetic saturation in the core. As a result, the proposed analysis is capable of calculating the flux distribution and the torque characteristics in the presence of magnetic saturation. The effectiveness of the proposed analysis is verified by comparison with FEM in terms of analytical accuracy and computation time for two IPMSMs with different specifications. After verification, an optimum design based on the proposed analysis is examined, in which motor volume is minimized while satisfying the maximum torque required by the target applications.
Electronic transport in disordered chains with saturable nonlinearity
NASA Astrophysics Data System (ADS)
dos Santos, J. L. L.; Nguyen, Ba Phi; de Moura, F. A. B. F.
2015-10-01
In this work we numerically study the dynamics of an initially localized wave packet in one-dimensional disordered chains with saturable nonlinearity. Using the generalized discrete nonlinear Schrödinger equation, we calculate two physical quantities as a function of time: the participation number and the mean square displacement from the excitation site. From detailed numerical analysis, we find that the saturable nonlinearity can promote sub-diffusive spreading of the wave packet over long times, even in the presence of diagonal disorder. In addition, we investigate the effect of the saturable nonlinearity at early times of the electronic evolution, showing the possibility of mobile breather-like modes.
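For reference, the two spreading measures used in this work can be computed from any normalized wave packet as in the sketch below; the Gaussian packet is only a placeholder and does not come from integrating the saturable discrete nonlinear Schrödinger equation.

```python
import numpy as np

# Hedged sketch: participation number and mean square displacement of a wave packet on a
# 1D lattice. The packet itself is an illustrative placeholder.

def participation_number(psi):
    prob = np.abs(psi) ** 2
    prob /= prob.sum()
    return 1.0 / np.sum(prob ** 2)   # ~1 for a localized state, ~N for a uniform one

def mean_square_displacement(psi, n0):
    prob = np.abs(psi) ** 2
    prob /= prob.sum()
    sites = np.arange(len(psi))
    return np.sum((sites - n0) ** 2 * prob)

N, n0 = 501, 250
psi = np.exp(-0.5 * ((np.arange(N) - n0) / 20.0) ** 2)  # Gaussian packet, width ~20 sites
print(participation_number(psi), mean_square_displacement(psi, n0))
```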
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hader, J.; Moloney, J. V.; College of Optical Sciences, University of Arizona, Tucson, Arizona 85721
2016-02-07
Fully microscopic many-body calculations are used to study the influence of strong sub-picosecond pulses on the carrier distributions and corresponding optical response in saturable absorbers used for mode-locking: semiconductor (quantum well) saturable absorber mirrors (SESAMs) and single-layer graphene based saturable absorber mirrors (GSAMs). Unlike in GSAMs, the saturation fluence and recovery time in SESAMs show a strong spectral dependence. While the saturation fluence in the SESAM is minimal at the excitonic bandgap, the optimal recovery time and least pulse distortion due to group delay dispersion are found for excitation higher in the first subband. For excitation near the SESAM bandgap, the saturation fluence is about one tenth of that in the GSAM. At energies above the bandgap, the fluences in both systems become similar. A strong dependence of the saturation fluence on the pulse width in both systems is caused by carrier relaxation during the pulse. The recovery time in graphene is found to be about two to four times faster than that in the SESAMs. The occurrence of negative differential transmission in graphene is shown to be caused by dopant related carriers. In SESAMs, a negative differential transmission is found when exciting below the excitonic resonance where excitation induced dephasing leads to an enhancement of the absorption. Comparisons of the simulation data to the experiment show a very good quantitative agreement.
Alterman, Julia F; Coles, Andrew H; Hall, Lauren M; Aronin, Neil; Khvorova, Anastasia; Didiot, Marie-Cécile
2017-08-20
Primary neurons represent an ideal cellular system for the identification of therapeutic oligonucleotides for the treatment of neurodegenerative diseases. However, due to the sensitive nature of primary cells, the transfection of small interfering RNAs (siRNA) using classical methods is laborious and often shows low efficiency. Recent progress in oligonucleotide chemistry has enabled the development of stabilized and hydrophobically modified small interfering RNAs (hsiRNAs). This new class of oligonucleotide therapeutics shows extremely efficient self-delivery properties and supports potent and durable effects in vitro and in vivo. We have developed a high-throughput in vitro assay to identify and test hsiRNAs in primary neuronal cultures. To simply, rapidly, and accurately quantify the mRNA silencing of hundreds of hsiRNAs, we use the QuantiGene 2.0 quantitative gene expression assay. This high-throughput, 96-well plate-based assay can quantify mRNA levels directly from sample lysate. Here, we describe a method to prepare short-term cultures of mouse primary cortical neurons in a 96-well plate format for high-throughput testing of oligonucleotide therapeutics. This method supports the testing of hsiRNA libraries and the identification of potential therapeutics within just two weeks. We detail methodologies of our high-throughput assay workflow from primary neuron preparation to data analysis. This method can help identify oligonucleotide therapeutics for treatment of various neurological diseases.
Kalb, Daniel M; Fencl, Frank A; Woods, Travis A; Swanson, August; Maestas, Gian C; Juárez, Jaime J; Edwards, Bruce S; Shreve, Andrew P; Graves, Steven W
2017-09-19
Flow cytometry provides highly sensitive multiparameter analysis of cells and particles but has been largely limited to the use of a single focused sample stream. This limits the analytical rate to ∼50K particles/s and the volumetric rate to ∼250 μL/min. Despite the analytical prowess of flow cytometry, there are applications where these rates are insufficient, such as rare cell analysis in high cellular backgrounds (e.g., circulating tumor cells and fetal cells in maternal blood), detection of cells/particles in large dilute samples (e.g., water quality, urine analysis), or high-throughput screening applications. Here we report a highly parallel acoustic flow cytometer that uses an acoustic standing wave to focus particles into 16 parallel analysis points across a 2.3 mm wide optical flow cell. A line-focused laser and wide-field collection optics are used to excite and collect the fluorescence emission of these parallel streams onto a high-speed camera for analysis. With this instrument format and fluorescent microsphere standards, we obtain analysis rates of 100K/s and flow rates of 10 mL/min, while maintaining optical performance comparable to that of a commercial flow cytometer. The results with our initial prototype instrument demonstrate that the integration of key parallelizable components, including the line-focused laser, particle focusing using multinode acoustic standing waves, and a spatially arrayed detector, can increase analytical and volumetric throughputs by orders of magnitude in a compact, simple, and cost-effective platform. Such instruments will be of great value to applications in need of high-throughput yet sensitive flow cytometry analysis.
A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data
Morton, Elizabeth; Lamitina, Todd
2010-01-01
Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218
Why we need a centralized repository for isotopic data
USDA-ARS?s Scientific Manuscript database
Stable isotopes encode the origin and integrate the history of matter; thus, their analysis offers tremendous potential to address questions across diverse scientific disciplines. Indeed, the broad applicability of stable isotopes, coupled with advancements in high-throughput analysis, have created ...
NASA Technical Reports Server (NTRS)
Yim, John T.; Soulas, George C.; Shastry, Rohit; Choi, Maria; Mackey, Jonathan A.; Sarver-Verhey, Timothy R.
2017-01-01
The service life assessment for NASA's Evolutionary Xenon Thruster is updated to incorporate the results from the successful, voluntarily early completion of the 51,184-hour long-duration test, which demonstrated 918 kg of total xenon throughput. The results of the numerous post-test investigations, including destructive interrogations, have been assessed against all of the critical known and suspected failure mechanisms to update the life and throughput expectations for each major component. Analysis results for two of the most acute failure mechanisms, namely pit-and-groove erosion and aperture enlargement of the accelerator grid, are not updated in this work but will be published at a future time after analysis completion.
Pathway analyses and understanding disease associations
Liu, Yu; Chance, Mark R
2013-01-01
High throughput technologies have been applied to investigate the underlying mechanisms of complex diseases, identify disease-associations and help to improve treatment. However it is challenging to derive biological insight from conventional single gene based analysis of “omics” data from high throughput experiments due to sample and patient heterogeneity. To address these challenges, many novel pathway and network based approaches were developed to integrate various “omics” data, such as gene expression, copy number alteration, Genome Wide Association Studies, and interaction data. This review will cover recent methodological developments in pathway analysis for the detection of dysregulated interactions and disease-associated subnetworks, prioritization of candidate disease genes, and disease classifications. For each application, we will also discuss the associated challenges and potential future directions. PMID:24319650
Solid optical ring interferometer for high-throughput feedback-free spectral analysis and filtering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Petrak, B.; Peiris, M.; Muller, A., E-mail: mullera@usf.edu
2015-02-15
We describe a simple and inexpensive optical ring interferometer for use in high-resolution spectral analysis and filtering. It consists of a solid cuboid, reflection-coated on two opposite sides, in which constructive interference occurs for waves in a rhombic trajectory. Due to its monolithic design, the interferometer's resonance frequencies are insensitive to environmental disturbances over time. Additional advantages are its simplicity of alignment, high throughput, and feedback-free operation. If desired, it can be stabilized with a secondary laser without disturbance of the primary signal. We illustrate the use of the interferometer for the measurement of the spectral Mollow triplet from a quantum dot and characterize its long-term stability for filtering applications.
A microfluidic cell culture array with various oxygen tensions.
Peng, Chien-Chung; Liao, Wei-Hao; Chen, Ying-Hua; Wu, Chueh-Yu; Tung, Yi-Chung
2013-08-21
Oxygen tension plays an important role in regulating various cellular functions in both normal physiology and disease states. Therefore, drug testing using conventional in vitro cell models under normoxia often possesses limited prediction capability. A traditional method of setting an oxygen tension in a liquid medium is by saturating it with a gas mixture at the desired level of oxygen, which requires bulky gas cylinders, sophisticated control, and tedious interconnections. Moreover, only a single oxygen tension can be tested at the same time. In this paper, we develop a microfluidic cell culture array platform capable of performing cell culture and drug testing under various oxygen tensions simultaneously. The device is fabricated using an elastomeric material, polydimethylsiloxane (PDMS), and the well-developed multi-layer soft lithography (MSL) technique. The prototype device has 4 × 4 wells, arranged in the same dimensions as a conventional 96-well plate, for cell culture. The oxygen tensions are controlled by spatially confined oxygen scavenging chemical reactions underneath the wells using microfluidics. The platform takes advantage of microfluidic phenomena while exhibiting the combinatorial diversities achieved by microarrays. Importantly, the platform is compatible with existing cell incubators and high-throughput instruments (liquid handling systems and plate readers) for cost-effective setup and straightforward operation. Utilizing the developed platform, we successfully perform drug testing using an anti-cancer drug, tirapazamine (TPZ), on the adenocarcinomic human alveolar basal epithelial cell line (A549) under three oxygen tensions ranging from 1.4% to normoxia. The developed platform is promising to provide a more meaningful in vitro cell model for various biomedical applications while maintaining the desired high-throughput capabilities.
Scale-down of vinegar production into microtiter plates using a custom-made lid.
Schlepütz, Tino; Büchs, Jochen
2014-04-01
As an important food preservative and condiment, vinegar is widely produced in industry by submerged acetic acid bacteria cultures. Although vinegar production is established on the large scale, suitable microscale cultivation methods, e.g. using microtiter plates, have so far been missing to enable high-throughput cultivation and to optimize fermentation conditions. In order to minimize evaporation losses of ethanol and acetic acid in a 48-well microtiter plate during vinegar production, a new custom-made lid was developed. A diffusion model was used to calculate the dimensions of a hole in the lid that guarantees a suitable oxygen supply and level of ventilation. A reference fermentation was conducted in a 9-L bioreactor to enable the calculation of the proper cultivation conditions in the microtiter plate. The minimum dissolved oxygen tensions in the microtiter plate were between 7.5% and 23% of air saturation and in the same range as in the 9-L bioreactor. Evaporation losses of ethanol and acetic acid were less than 5% after 47 h and considerably reduced compared to those of microtiter plate fermentations with a conventional gas-permeable seal. Furthermore, cultivation times in the microtiter plate, at about 40 h, were as long as in the 9-L bioreactor. In conclusion, microtiter plate cultivations with the new custom-made lid provide a platform for high-throughput studies of vinegar production, with results comparable to those in the 9-L bioreactor. Copyright © 2013 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
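As a rough illustration of the kind of quasi-steady diffusion estimate that can be used to size such a lid hole, the sketch below applies Fick's first law to oxygen transport through a cylindrical opening; the diffusivity, concentrations, and dimensions are illustrative assumptions, not the values derived in the paper.

```python
import math

# Hedged sketch: quasi-steady oxygen flux through a cylindrical hole in a lid (Fick's first law).
# All parameter values are illustrative assumptions.

D_O2_AIR = 2.0e-5      # m^2/s, approximate oxygen diffusivity in air near room temperature
C_O2_AMBIENT = 8.6     # mol/m^3, approximate oxygen concentration in ambient air (21%, 1 atm, 25 C)

def oxygen_flux_mol_per_s(hole_diameter_m, lid_thickness_m, c_headspace_mol_m3):
    area = math.pi * (hole_diameter_m / 2.0) ** 2
    return D_O2_AIR * area * (C_O2_AMBIENT - c_headspace_mol_m3) / lid_thickness_m

# Example: a 1 mm hole through a 3 mm lid, with the well headspace at half of ambient oxygen.
print(oxygen_flux_mol_per_s(1e-3, 3e-3, 0.5 * C_O2_AMBIENT))
```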
Peng, Feng; Bian, Jing; Peng, Pai; Xiao, Huan; Ren, Jun-Li; Xu, Feng; Sun, Run-Cang
2012-04-25
Delignified Arundo donax was sequentially extracted with DMSO, saturated barium hydroxide, and 1.0 M aqueous NaOH solution. The yields of the soluble fractions were 10.2, 6.7, and 10.0% (w/w), respectively, of the dry Arundo donax material. The DMSO-, Ba(OH)₂- and NaOH-soluble hemicellulosic fractions were further fractionated into two subfractions each by gradient ammonium sulfate precipitation at 50% and 80% saturation, respectively. Monosaccharide, molecular weight, FT-IR, and 1D (¹H and ¹³C) and 2D (HSQC) NMR analyses revealed differences in structural characteristics and physicochemical properties among the subfractions. The subfractions precipitated at 50% ammonium sulfate saturation had lower arabinose/xylose and glucuronic acid/xylose ratios but higher molecular weights than the subfractions precipitated at 80% saturation. FT-IR and NMR analysis revealed that the highly acetylated DMSO-soluble hemicellulosic subfraction (H(D50)) could be precipitated at the relatively lower ammonium sulfate saturation of 50%, and thus the gradient ammonium sulfate precipitation technique could discriminate between acetylated and non-acetylated hemicelluloses. It was found that the DMSO-soluble subfraction H(D50), precipitated at 50% ammonium sulfate saturation, mainly consisted of poorly substituted O-acetyl arabino-4-O-methylglucurono xylan, with terminal arabinose units linked at position 3 of xylose, 4-O-methylglucuronic acid residues linked at position 2 of the xylan backbone, and acetyl groups (degree of acetylation, 37%) linked at position 2 or 3. The DMSO-soluble subfraction H(D80), precipitated at 80% ammonium sulfate saturation, was mainly composed of highly substituted arabino-4-O-methylglucurono xylan and β-D-glucan.
2011-08-01
further chemical analysis of the cells. While in our proof-of-concept demonstration, we showed high-throughput screening of budding yeast and ... of 8.0 mW/cm2 through the transparency mask for 90 seconds. The wafer was baked again at 95°C for 4 minutes then developed in SU-8 developer ... sonicated in isopropanol for 5 minutes, sonicated in deionized H2O for 5 minutes, and baked at 65°C for at least 30 minutes. Holes were punched
Annotare--a tool for annotating high-throughput biomedical investigations and resulting data.
Shankar, Ravi; Parkinson, Helen; Burdett, Tony; Hastings, Emma; Liu, Junmin; Miller, Michael; Srinivasa, Rashmi; White, Joseph; Brazma, Alvis; Sherlock, Gavin; Stoeckert, Christian J; Ball, Catherine A
2010-10-01
Computational methods in molecular biology will increasingly depend on standards-based annotations that describe biological experiments in an unambiguous manner. Annotare is a software tool that enables biologists to easily annotate their high-throughput experiments, biomaterials and data in a standards-compliant way that facilitates meaningful search and analysis. Annotare is available from http://code.google.com/p/annotare/ under the terms of the open-source MIT License (http://www.opensource.org/licenses/mit-license.php). It has been tested on both Mac and Windows.
Extended length microchannels for high density high throughput electrophoresis systems
Davidson, James C.; Balch, Joseph W.
2000-01-01
High throughput electrophoresis systems which provide extended well-to-read distances on smaller substrates, thus compacting the overall systems. The electrophoresis systems utilize a high density array of microchannels for electrophoresis analysis with extended read lengths. The microchannel geometry can be used individually or in conjunction to increase the effective length of a separation channel while minimally impacting the packing density of channels. One embodiment uses sinusoidal microchannels, while another embodiment uses plural microchannels interconnected by a via. The extended channel systems can be applied to virtually any type of channel confined chromatography.
Study of Surface Wave Propagation in Fluid-Saturated Porous Solids.
NASA Astrophysics Data System (ADS)
Azcuaga, Valery Francisco Godinez
1995-01-01
This study addresses surface wave propagation phenomena in fluid-saturated porous solids. The analytical method for calculation of surface wave velocities (Feng and Johnson, JASA, 74, 906, 1983) is extended to the case of a porous solid saturated with a wetting fluid in contact with a non-wetting fluid, in order to study a material combination suitable for experimental investigation. The analytical method is further extended to the case of a non-wetting fluid/wetting-fluid-saturated porous solid interface with an arbitrary finite surface stiffness. These extensions of the analytical method allow the theoretical study of surface wave propagation phenomena during the saturation process. A modification to the 2-D space-time reflection Green's function (Feng and Johnson, JASA, 74, 915, 1983) is introduced in order to simulate the behavior of surface wave signals detected during the experimental investigation of surface wave propagation on fluid-saturated porous solids (Nagy, Appl. Phys. Lett., 60, 2735, 1992). This modification, together with the introduction of an excess attenuation for the Rayleigh surface mode, makes it possible to explain the apparent velocity changes observed in the surface wave signals during saturation. Experimental results concerning the propagation of surface waves on an alcohol-saturated porous glass are presented. These experiments were performed at frequencies of 500 and 800 kHz and show the simultaneous propagation of the two surface modes predicted by the extended analytical method. Finally, an analysis of the displacements associated with the different surface modes is presented. This analysis reveals that it is possible to favor the generation of the Rayleigh surface mode or of the slow surface mode simply by changing the type of transducer used to generate the surface waves. Calculations show that a shear transducer couples more energy into the Rayleigh mode, whereas a longitudinal transducer couples more energy into the slow surface mode. Experimental results obtained with the modified experimental system show qualitative agreement with the theoretical predictions.
Soureti, Anastasia; Hurling, Robert; van Mechelen, Willem; Cobain, Mark; ChinAPaw, Mai
2012-05-01
The present study aimed to advance our understanding of health-related theory, specifically the alleged intention-behavior gap, in an obese population. It examined the mediating effect of planning on the intention-behavior relationship and the moderated mediation effects of age, self-efficacy, and intention within this relationship. The study was conducted over a five-week period, and complete data from 571 obese participants were analyzed. The moderated mediation hypothesis was tested using multiple-regression analysis. To test our theoretical model, intention (Week 2), action self-efficacy (Week 2), maintenance self-efficacy (Week 5), planning (Week 5), and saturated-fat intake (Weeks 1 and 5) were measured by self-report. As hypothesized, planning mediated the intention-behavior relationship for both perceived saturated-fat intake (two-item scale) and percentage saturated-fat intake (measured by a food frequency questionnaire). Age, self-efficacy, and intention acted as moderators in this mediation. Specifically, younger individuals, those with stronger intentions, and those with higher levels of maintenance self-efficacy showed greater reductions in perceived saturated-fat intake at higher levels of planning. For successful behavior change, knowledge of its mediators and moderators is needed. Future interventions targeting planning to change saturated-fat intake should be guided by people's intentions, age, and self-efficacy levels.
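For readers unfamiliar with the statistical setup, the sketch below shows a bare-bones product-of-coefficients mediation test (planning mediating the intention-behavior link) on simulated data; the variable names, effect sizes, and the omission of the moderators are all assumptions made for illustration, not the study's actual model.

```python
import numpy as np
import statsmodels.api as sm

# Hedged sketch of a simple mediation test via the product of coefficients (a*b).
# Data are simulated placeholders; moderators (age, self-efficacy) are omitted here.

rng = np.random.default_rng(0)
n = 571
intention = rng.normal(size=n)
planning = 0.5 * intention + rng.normal(size=n)                        # a path
fat_change = -0.4 * planning - 0.1 * intention + rng.normal(size=n)    # b and c' paths

a = sm.OLS(planning, sm.add_constant(intention)).fit().params[1]
outcome_model = sm.OLS(fat_change, sm.add_constant(np.column_stack([intention, planning]))).fit()
c_prime, b = outcome_model.params[1], outcome_model.params[2]

print("indirect effect (a*b):", round(a * b, 3), " direct effect (c'):", round(c_prime, 3))
```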
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suzuki, Kazumichi, E-mail: kazumichisuzuki@gmail.c
Purpose: To determine the patient throughput and the overall efficiency of the spot scanning system by analyzing treatment time, equipment availability, and maximum daily capacity for the current spot scanning port at Proton Therapy Center Houston and to assess the daily throughput capacity for a hypothetical spot scanning proton therapy center. Methods: At their proton therapy center, the authors have been recording in an electronic medical record system all treatment data, including disease site, number of fields, number of fractions, delivered dose, energy, range, number of spots, and number of layers for every treatment field. The authors analyzed delivery system downtimes that had been recorded for every equipment failure and associated incidents. These data were used to evaluate the patient census, patient distribution as a function of the number of fields and total target volume, and equipment clinical availability. The duration of each treatment session from patient walk-in to patient walk-out of the spot scanning treatment room was measured for 64 patients with head and neck, central nervous system, thoracic, and genitourinary cancers. The authors retrieved data for total target volume and the numbers of layers and spots for all fields from treatment plans for a total of 271 patients (including the above 64 patients). A sensitivity analysis of daily throughput capacity was performed by varying seven parameters in a throughput capacity model. Results: The mean monthly equipment clinical availability for the spot scanning port in April 2012–March 2015 was 98.5%. Approximately 1500 patients had received spot scanning proton therapy as of March 2015. The major disease sites treated in September 2012–August 2014 were the genitourinary system (34%), head and neck (30%), central nervous system (21%), and thorax (14%), with other sites accounting for the remaining 1%. Spot scanning beam delivery time increased with total target volume and accounted for approximately 30%–40% of total treatment time for the total target volumes exceeding 200 cm³, which was the case for more than 80% of the patients in this study. When total treatment time was modeled as a function of the number of fields and total target volume, the model overestimated total treatment time by 12% on average, with a standard deviation of 32%. A sensitivity analysis of throughput capacity for a hypothetical four-room spot scanning proton therapy center identified several priority items for improvements in throughput capacity, including operation time, beam delivery time, and patient immobilization and setup time. Conclusions: The spot scanning port at our proton therapy center has operated at a high performance level and has been used to treat a large number of complex cases. Further improvements in efficiency may be feasible in the areas of facility operation, beam delivery, patient immobilization and setup, and optimization of treatment scheduling.
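A throughput capacity model of the kind varied in such a sensitivity analysis can be as simple as the sketch below, where per-room capacity is the daily operating window divided by the average in-room time; all parameter values and the breakdown of the session time are illustrative assumptions, not the seven-parameter model used in the study.

```python
# Hedged sketch of a daily throughput capacity model for a multi-room scanning facility.
# Every parameter value is an illustrative assumption.

def daily_capacity(rooms=4, operation_hours=16.0, availability=0.985,
                   setup_min=10.0, imaging_min=5.0, beam_delivery_min=8.0, turnover_min=3.0):
    session_min = setup_min + imaging_min + beam_delivery_min + turnover_min
    per_room = (operation_hours * 60.0 * availability) / session_min
    return rooms * per_room

base = daily_capacity()
faster_beam = daily_capacity(beam_delivery_min=6.0)  # e.g. faster spot delivery
print(round(base, 1), round(faster_beam, 1),
      round(100.0 * (faster_beam / base - 1.0), 1), "% gain")
```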
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davenport, Karen
Karen Davenport of Los Alamos National Laboratory discusses a high-throughput next generation genome finishing pipeline on June 3, 2010 at the "Sequencing, Finishing, Analysis in the Future" meeting in Santa Fe, NM.
Khoo, Bee Luan; Warkiani, Majid Ebrahimi; Tan, Daniel Shao-Weng; Bhagat, Ali Asgar S; Irwin, Darryl; Lau, Dawn Pingxi; Lim, Alvin S T; Lim, Kiat Hon; Krisna, Sai Sakktee; Lim, Wan-Teck; Yap, Yoon Sim; Lee, Soo Chin; Soo, Ross A; Han, Jongyoon; Lim, Chwee Teck
2014-01-01
Circulating tumor cells (CTCs) are cancer cells that can be isolated via liquid biopsy from blood and can be phenotypically and genetically characterized to provide critical information for guiding cancer treatment. Current analysis of CTCs is hindered by the throughput, selectivity, and specificity of the devices or assays used in CTC detection and isolation. Here, we enriched and characterized putative CTCs from blood samples of patients with both advanced-stage metastatic breast and lung cancers using a novel multiplexed spiral microfluidic chip. This system detected putative CTCs with high sensitivity (100%, n = 56) (breast cancer samples: 12-1275 CTCs/ml; lung cancer samples: 10-1535 CTCs/ml) rapidly from clinically relevant blood volumes (7.5 ml in under 5 min). Blood samples were completely separated into plasma, CTC, and PBMC components, and each fraction was characterized by immunophenotyping (pan-cytokeratin/CD45, CD44/CD24, EpCAM), fluorescence in-situ hybridization (FISH) (EML4-ALK), or targeted somatic mutation analysis. We used an ultra-sensitive mass spectrometry based system to highlight the presence of an EGFR-activating mutation in both isolated CTCs and plasma cell-free DNA (cf-DNA), and demonstrate concordance with the original tumor-biopsy samples. We have clinically validated our multiplexed microfluidic chip for the ultra-high-throughput, low-cost, and label-free enrichment of CTCs. Retrieved cells were unlabeled and viable, enabling potential propagation and real-time downstream analysis using next generation sequencing (NGS) or proteomic analysis.
High-throughput microcoil NMR of compound libraries using zero-dispersion segmented flow analysis.
Kautz, Roger A; Goetzinger, Wolfgang K; Karger, Barry L
2005-01-01
An automated system for loading samples into a microcoil NMR probe has been developed using segmented flow analysis. This approach doubled the throughput of the published direct-injection and flow-injection methods, improved sample utilization 3-fold, and was applicable to high-field NMR facilities with long transfer lines between the sample handler and the NMR magnet. Sample volumes of 2 microL (10-30 mM, approximately 10 microg) were drawn from a 96-well microtiter plate by a sample handler and then pumped to a 0.5-microL microcoil NMR probe as a queue of closely spaced "plugs" separated by an immiscible fluorocarbon fluid. Individual sample plugs were detected by their NMR signal and automatically positioned for stopped-flow data acquisition. The sample in the NMR coil could be changed within 35 s by advancing the queue. The fluorocarbon liquid wetted the wall of the Teflon transfer line, preventing the DMSO samples from contacting the capillary wall and thus reducing sample losses to below 5% after passage through the 3-m transfer line. With a wash plug of solvent between samples, sample-to-sample carryover was <1%. Significantly, the samples did not disperse into the carrier liquid during loading or during acquisitions of several days for trace analysis. For automated high-throughput analysis using a 16-second acquisition time, spectra were recorded at a rate of 1.5 min/sample and total deuterated solvent consumption was <0.5 mL (1 US dollar) per 96-well plate.
High-performance single cell genetic analysis using microfluidic emulsion generator arrays.
Zeng, Yong; Novak, Richard; Shuga, Joe; Smith, Martyn T; Mathies, Richard A
2010-04-15
High-throughput genetic and phenotypic analysis at the single cell level is critical to advance our understanding of the molecular mechanisms underlying cellular function and dysfunction. Here we describe a high-performance single cell genetic analysis (SCGA) technique that combines high-throughput microfluidic emulsion generation with single cell multiplex polymerase chain reaction (PCR). Microfabricated emulsion generator array (MEGA) devices containing 4, 32, and 96 channels are developed to confer a flexible capability of generating up to 3.4 × 10⁶ nanoliter-volume droplets per hour. Hybrid glass-polydimethylsiloxane diaphragm micropumps integrated into the MEGA chips afford uniform droplet formation, controlled generation frequency, and effective transportation and encapsulation of primer functionalized microbeads and cells. A multiplex single cell PCR method is developed to detect and quantify both wild type and mutant/pathogenic cells. In this method, microbeads functionalized with multiple forward primers targeting specific genes from different cell types are used for solid-phase PCR in droplets. Following PCR, the droplets are lysed and the beads are pooled and rapidly analyzed by multicolor flow cytometry. Using Escherichia coli bacterial cells as a model, we show that this technique enables digital detection of pathogenic E. coli O157 cells in a high background of normal K12 cells, with a detection limit on the order of 1/10⁵. This result demonstrates that multiplex SCGA is a promising tool for high-throughput quantitative digital analysis of genetic variation in complex populations.
Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A
2017-08-01
Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of <12%. Using isobaric tags for relative and absolute quantitation (iTRAQ) 4-plex, >4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.
Li, Shihong; Chang, Eric Y.; Bae, Won C.; Chung, Christine B.; Hua, Yanqing; Zhou, Yi; Du, Jiang
2014-01-01
Purpose: The purpose of this study was to investigate the effect of excitation, fat saturation, long T2 saturation, and adiabatic inversion pulses on ultrashort echo time (UTE) imaging with bicomponent analysis of bound and free water in cortical bone for potential applications in osteoporosis. Methods: Six bovine cortical bones and six human tibial midshaft samples were harvested for this study. Each bone sample was imaged with eight sequences using 2D UTE imaging at 3T with half and hard excitation pulses, without and with fat saturation, long T2 saturation, and adiabatic inversion recovery (IR) preparation pulses. Single- and bicomponent signal models were utilized to calculate the T2*s and/or relative fractions of short and long T2*s. Results: For all bone samples UTE T2* signal decay showed bicomponent behavior. A higher short T2* fraction was observed on UTE images with hard pulse excitation compared with half pulse excitation (75.6% vs 68.8% in bovine bone, 79.9% vs 73.2% in human bone). Fat saturation pulses slightly reduced the short T2* fraction relative to regular UTE sequences (5.0% and 2.0% reduction, respectively, with half and hard excitation pulses for bovine bone, 6.3% and 8.2% reduction, respectively, with half and hard excitation pulses for human bone). Long T2 saturation pulses significantly reduced the long T2* fraction relative to regular UTE sequence (18.9% and 17.2% reduction, respectively, with half and hard excitation pulses for bovine bone, 26.4% and 27.7% reduction, respectively, with half and hard excitation pulses for human bone). With IR-UTE preparation the long T2* components were significantly reduced relative to regular UTE sequence (75.3% and 66.4% reduction, respectively, with half and hard excitation pulses for bovine bone, 87.7% and 90.3% reduction, respectively, with half and hard excitation pulses for human bone). Conclusions: Bound and free water T2*s and relative fractions can be assessed using UTE bicomponent analysis. Long T2* components are affected more by long T2 saturation and IR pulses, and short T2* components are affected more by fat saturation pulses. PMID:24506644
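The bicomponent analysis described above amounts to fitting a two-exponential decay to the UTE signal as a function of echo time. The following is a minimal sketch of such a fit on synthetic data, assuming illustrative echo times and starting values; it is not the authors' processing code.

```python
import numpy as np
from scipy.optimize import curve_fit

# Bicomponent (two-exponential) T2* model: short (bound water) + long (free water).
def bicomp(te, a_short, t2s_short, a_long, t2s_long):
    return a_short * np.exp(-te / t2s_short) + a_long * np.exp(-te / t2s_long)

# Synthetic echo times (ms) and signal, purely illustrative.
te = np.linspace(0.01, 20.0, 40)
signal = bicomp(te, 0.75, 0.35, 0.25, 5.0) + np.random.normal(0, 0.005, te.size)

p0 = (0.5, 0.3, 0.5, 4.0)  # rough initial guesses
popt, _ = curve_fit(bicomp, te, signal, p0=p0, maxfev=10000)
a_s, t2_s, a_l, t2_l = popt

short_fraction = a_s / (a_s + a_l)
print(f"Short T2* = {t2_s:.2f} ms, long T2* = {t2_l:.2f} ms, short fraction = {short_fraction:.1%}")
```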
DOE Office of Scientific and Technical Information (OSTI.GOV)
Athavale, Ajay
Ajay Athavale (Monsanto) presents "High Throughput Plasmid Sequencing with Illumina and CLC Bio" at the 7th Annual Sequencing, Finishing, Analysis in the Future (SFAF) Meeting held in June, 2012 in Santa Fe, NM.
NASA Astrophysics Data System (ADS)
Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.
2017-06-01
Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly that which enables the coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitates that the analytical performance of technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly-interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.
Shirotani, Keiro; Futakawa, Satoshi; Nara, Kiyomitsu; Hoshi, Kyoka; Saito, Toshie; Tohyama, Yuriko; Kitazume, Shinobu; Yuasa, Tatsuhiko; Miyajima, Masakazu; Arai, Hajime; Kuno, Atsushi; Narimatsu, Hisashi; Hashimoto, Yasuhiro
2011-01-01
We have established high-throughput lectin-antibody ELISAs to measure different glycans on transferrin (Tf) in cerebrospinal fluid (CSF) using lectins and an anti-transferrin antibody (TfAb). Lectin blot and precipitation analysis of CSF revealed that PVL (Psathyrella velutina lectin) bound unique N-acetylglucosamine-terminated N-glycans on “CSF-type” Tf whereas SSA (Sambucus sieboldiana agglutinin) bound α2,6-N-acetylneuraminic acid-terminated N-glycans on “serum-type” Tf. PVL-TfAb ELISA of 0.5 μL CSF samples detected “CSF-type” Tf but not “serum-type” Tf whereas SSA-TfAb ELISA detected “serum-type” Tf but not “CSF-type” Tf, demonstrating the specificity of the lectin-TfAb ELISAs. In idiopathic normal pressure hydrocephalus (iNPH), a senile dementia associated with ventriculomegaly, amounts of the SSA-reactive Tf were significantly higher than in non-iNPH patients, indicating that Tf glycan analysis by the high-throughput lectin-TfAb ELISAs could become a practical diagnostic tool for iNPH. The lectin-antibody ELISAs of CSF proteins might be useful for diagnosis of other neurological diseases. PMID:21876827
Laurens, L M L; Wolfrum, E J
2013-12-18
One of the challenges associated with microalgal biomass characterization and the comparison of microalgal strains and conversion processes is the rapid determination of the composition of algae. We have developed and applied a high-throughput screening technology based on near-infrared (NIR) spectroscopy for the rapid and accurate determination of algal biomass composition. We show that NIR spectroscopy can accurately predict the full composition using multivariate linear regression analysis of varying lipid, protein, and carbohydrate content of algal biomass samples from three strains. We also demonstrate a high quality of predictions of an independent validation set. A high-throughput 96-well configuration for spectroscopy gives equally good prediction relative to a ring-cup configuration, and thus, spectra can be obtained from as little as 10-20 mg of material. We found that lipids exhibit a dominant, distinct, and unique fingerprint in the NIR spectrum that allows for the use of single and multiple linear regression of respective wavelengths for the prediction of the biomass lipid content. This is not the case for carbohydrate and protein content, and thus, the use of multivariate statistical modeling approaches remains necessary.
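The multivariate calibration described above can be sketched, under simplifying assumptions, as an ordinary least-squares fit of reference lipid values against a spectra matrix; the synthetic example below is illustrative only, and real chemometric workflows typically use cross-validated PLS or related models.

```python
import numpy as np

# Minimal multivariate linear calibration sketch: predict lipid content from NIR
# spectra with ordinary least squares. Synthetic data for illustration only.

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 20

spectra = rng.normal(size=(n_samples, n_wavelengths))          # absorbance matrix
true_coefs = np.zeros(n_wavelengths)
true_coefs[5:8] = 0.8                                          # a "lipid-sensitive" band
lipid = spectra @ true_coefs + rng.normal(0, 0.1, n_samples)   # reference lipid values

X_train, X_test = spectra[:45], spectra[45:]
y_train, y_test = lipid[:45], lipid[45:]

coefs, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)
pred = X_test @ coefs
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(f"Held-out RMSE (same units as the reference values): {rmse:.3f}")
```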
Daily, Neil J.; Du, Zhong-Wei
2017-01-01
Electrophysiology of excitable cells, including muscle cells and neurons, has been measured by making direct contact with a single cell using a micropipette electrode. To increase the assay throughput, optical devices such as microscopes and microplate readers have been used to analyze electrophysiology of multiple cells. We have established a high-throughput (HTP) analysis of action potentials (APs) in highly enriched motor neurons and cardiomyocytes (CMs) that are differentiated from human induced pluripotent stem cells (iPSCs). A multichannel electric field stimulation (EFS) device made it possible to electrically stimulate cells and measure dynamic changes in APs of excitable cells ultra-rapidly (>100 data points per second) by imaging entire 96-well plates. We found that the activities of both neurons and CMs and their response to EFS and chemicals are readily discerned by our fluorescence imaging-based HTP phenotyping assay. The latest generation of calcium (Ca2+) indicator dyes, FLIPR Calcium 6 and Cal-520, with the HTP device enables physiological analysis of human iPSC-derived samples highlighting its potential application for understanding disease mechanisms and discovering new therapeutic treatments. PMID:28525289
Computational Approaches to Phenotyping
Lussier, Yves A.; Liu, Yang
2007-01-01
The recent completion of the Human Genome Project has made possible a high-throughput “systems approach” for accelerating the elucidation of molecular underpinnings of human diseases, and subsequent derivation of molecular-based strategies to more effectively prevent, diagnose, and treat these diseases. Although altered phenotypes are among the most reliable manifestations of altered gene functions, research using systematic analysis of phenotype relationships to study human biology is still in its infancy. This article focuses on the emerging field of high-throughput phenotyping (HTP) phenomics research, which aims to capitalize on novel high-throughput computation and informatics technology developments to derive genomewide molecular networks of genotype–phenotype associations, or “phenomic associations.” The HTP phenomics research field faces the challenge of technological research and development to generate novel tools in computation and informatics that will allow researchers to amass, access, integrate, organize, and manage phenotypic databases across species and enable genomewide analysis to associate phenotypic information with genomic data at different scales of biology. Key state-of-the-art technological advancements critical for HTP phenomics research are covered in this review. In particular, we highlight the power of computational approaches to conduct large-scale phenomics studies. PMID:17202287
Cytopathological image analysis using deep-learning networks in microfluidic microscopy.
Gopakumar, G; Hari Babu, K; Mishra, Deepak; Gorthi, Sai Siva; Sai Subrahmanyam, Gorthi R K
2017-01-01
Cytopathologic testing is one of the most critical steps in the diagnosis of diseases, including cancer. However, the task is laborious and demands skill. Associated high cost and low throughput drew considerable interest in automating the testing process. Several neural network architectures were designed to provide human expertise to machines. In this paper, we explore and propose the feasibility of using deep-learning networks for cytopathologic analysis by performing the classification of three important unlabeled, unstained leukemia cell lines (K562, MOLT, and HL60). The cell images used in the classification are captured using a low-cost, high-throughput cell imaging technique: microfluidics-based imaging flow cytometry. We demonstrate that without any conventional fine segmentation followed by explicit feature extraction, the proposed deep-learning algorithms effectively classify the coarsely localized cell lines. We show that the designed deep belief network as well as the deeply pretrained convolutional neural network outperform the conventionally used decision systems and are important in the medical domain, where the availability of labeled data is limited for training. We hope that our work enables the development of a clinically significant high-throughput microfluidic microscopy-based tool for disease screening/triaging, especially in resource-limited settings.
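For orientation, a small convolutional classifier of the general kind discussed above might be sketched as follows; the input size, depth and training settings are assumptions for illustration and do not reproduce the published architectures.

```python
import tensorflow as tf
from tensorflow.keras import layers

# Small illustrative CNN for 3-class cell-line classification (e.g. K562/MOLT/HL60).
# Input size, depth and training details are assumptions, not the published models.
model = tf.keras.Sequential([
    layers.Input(shape=(64, 64, 1)),              # grayscale cell crops
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(3, activation="softmax"),        # three cell-line classes
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=10, validation_split=0.1)
```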
A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.
Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham
2017-08-01
Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
Aryee, Martin J.; Jaffe, Andrew E.; Corrada-Bravo, Hector; Ladd-Acosta, Christine; Feinberg, Andrew P.; Hansen, Kasper D.; Irizarry, Rafael A.
2014-01-01
Motivation: The recently released Infinium HumanMethylation450 array (the ‘450k’ array) provides a high-throughput assay to quantify DNA methylation (DNAm) at ∼450 000 loci across a range of genomic features. Although less comprehensive than high-throughput sequencing-based techniques, this product is more cost-effective and promises to be the most widely used DNAm high-throughput measurement technology over the next several years. Results: Here we describe a suite of computational tools that incorporate state-of-the-art statistical techniques for the analysis of DNAm data. The software is structured to easily adapt to future versions of the technology. We include methods for preprocessing, quality assessment and detection of differentially methylated regions from the kilobase to the megabase scale. We show how our software provides a powerful and flexible development platform for future methods. We also illustrate how our methods empower the technology to make discoveries previously thought to be possible only with sequencing-based methods. Availability and implementation: http://bioconductor.org/packages/release/bioc/html/minfi.html. Contact: khansen@jhsph.edu; rafa@jimmy.harvard.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24478339
DnaSAM: Software to perform neutrality testing for large datasets with complex null models.
Eckert, Andrew J; Liechty, John D; Tearse, Brandon R; Pande, Barnaly; Neale, David B
2010-05-01
Patterns of DNA sequence polymorphisms can be used to understand the processes of demography and adaptation within natural populations. High-throughput generation of DNA sequence data has historically been the bottleneck with respect to data processing and experimental inference. Advances in marker technologies have largely solved this problem. Currently, the limiting step is computational, with most molecular population genetic software allowing a gene-by-gene analysis through a graphical user interface. An easy-to-use analysis program that allows both high-throughput processing of multiple sequence alignments along with the flexibility to simulate data under complex demographic scenarios is currently lacking. We introduce a new program, named DnaSAM, which allows high-throughput estimation of DNA sequence diversity and neutrality statistics from experimental data along with the ability to test those statistics via Monte Carlo coalescent simulations. These simulations are conducted using the ms program, which is able to incorporate several genetic parameters (e.g. recombination) and demographic scenarios (e.g. population bottlenecks). The output is a set of diversity and neutrality statistics with associated probability values under a user-specified null model that are stored in an easy-to-manipulate text file. © 2009 Blackwell Publishing Ltd.
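For orientation, two of the basic diversity statistics such software reports, nucleotide diversity (π) and Watterson's θ, can be computed directly from an alignment; the toy sketch below is illustrative and is not DnaSAM code.

```python
from itertools import combinations

# Illustrative computation of nucleotide diversity (pi) and Watterson's theta
# from a small alignment. Not DnaSAM itself; no handling of gaps or missing data.

seqs = [
    "ATGCATGCAT",
    "ATGCATGGAT",
    "ATGTATGCAT",
    "ATGCATGCAA",
]
n, L = len(seqs), len(seqs[0])

# pi: mean pairwise differences per site
pair_diffs = [sum(a != b for a, b in zip(s1, s2)) for s1, s2 in combinations(seqs, 2)]
pi = sum(pair_diffs) / len(pair_diffs) / L

# Watterson's theta: segregating sites scaled by the harmonic number a_n
S = sum(len(set(col)) > 1 for col in zip(*seqs))
a_n = sum(1.0 / i for i in range(1, n))
theta_w = S / a_n / L

print(f"pi = {pi:.4f}, Watterson's theta = {theta_w:.4f}")
```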
Dotsey, Emmanuel Y.; Gorlani, Andrea; Ingale, Sampat; Achenbach, Chad J.; Forthal, Donald N.; Felgner, Philip L.; Gach, Johannes S.
2015-01-01
In recent years, high throughput discovery of human recombinant monoclonal antibodies (mAbs) has been applied to greatly advance our understanding of the specificity, and functional activity of antibodies against HIV. Thousands of antibodies have been generated and screened in functional neutralization assays, and antibodies associated with cross-strain neutralization and passive protection in primates, have been identified. To facilitate this type of discovery, a high throughput-screening tool is needed to accurately classify mAbs, and their antigen targets. In this study, we analyzed and evaluated a prototype microarray chip comprised of the HIV-1 recombinant proteins gp140, gp120, gp41, and several membrane proximal external region peptides. The protein microarray analysis of 11 HIV-1 envelope-specific mAbs revealed diverse binding affinities and specificities across clades. Half maximal effective concentrations, generated by our chip analysis, correlated significantly (P<0.0001) with concentrations from ELISA binding measurements. Polyclonal immune responses in plasma samples from HIV-1 infected subjects exhibited different binding patterns, and reactivity against printed proteins. Examining the totality of the specificity of the humoral response in this way reveals the exquisite diversity, and specificity of the humoral response to HIV. PMID:25938510
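Half-maximal effective concentrations of the kind reported above are commonly obtained by fitting a four-parameter logistic curve to a dilution series; the generic sketch below uses synthetic data and is not the study's analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

# Generic four-parameter logistic (4PL) fit to estimate an EC50 from a dilution
# series. Data are synthetic; this is not the microarray analysis pipeline itself.
def four_pl(conc, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** hill)

conc = np.logspace(-3, 2, 10)                       # antibody concentration (ug/mL)
signal = four_pl(conc, 0.05, 1.8, 0.5, 1.2)
signal += np.random.normal(0, 0.03, conc.size)      # measurement noise

p0 = (0.0, 2.0, 1.0, 1.0)                           # rough starting values
popt, _ = curve_fit(four_pl, conc, signal, p0=p0, maxfev=10000)
print(f"Estimated EC50: {popt[2]:.2f} ug/mL")
```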
Automatic Segmentation of High-Throughput RNAi Fluorescent Cellular Images
Yan, Pingkum; Zhou, Xiaobo; Shah, Mubarak; Wong, Stephen T. C.
2010-01-01
High-throughput genome-wide RNA interference (RNAi) screening is emerging as an essential tool to assist biologists in understanding complex cellular processes. The large number of images produced in each study make manual analysis intractable; hence, automatic cellular image analysis becomes an urgent need, where segmentation is the first and one of the most important steps. In this paper, a fully automatic method for segmentation of cells from genome-wide RNAi screening images is proposed. Nuclei are first extracted from the DNA channel by using a modified watershed algorithm. Cells are then extracted by modeling the interaction between them as well as combining both gradient and region information in the Actin and Rac channels. A new energy functional is formulated based on a novel interaction model for segmenting tightly clustered cells with significant intensity variance and specific phenotypes. The energy functional is minimized by using a multiphase level set method, which leads to a highly effective cell segmentation method. Promising experimental results demonstrate that automatic segmentation of high-throughput genome-wide multichannel screening can be achieved by using the proposed method, which may also be extended to other multichannel image segmentation problems. PMID:18270043
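A simplified sketch of the nuclei-extraction step alone (distance transform plus watershed) is given below, assuming scikit-image and SciPy; the published method additionally uses a modified watershed and a multiphase level-set model for the cell bodies, which is not reproduced here.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.feature import peak_local_max

# Simplified nuclei extraction from a DNA channel: threshold, distance transform,
# then marker-based watershed. Illustration only, not the published pipeline.
def segment_nuclei(dna_channel, threshold):
    mask = dna_channel > threshold
    distance = ndi.distance_transform_edt(mask)
    coords = peak_local_max(distance, min_distance=5, labels=mask)
    peaks = np.zeros(distance.shape, dtype=bool)
    peaks[tuple(coords.T)] = True
    markers, _ = ndi.label(peaks)
    return watershed(-distance, markers, mask=mask)

# Tiny synthetic example: two blurred "nuclei".
img = np.zeros((60, 60))
img[15, 15] = img[40, 45] = 1.0
img = ndi.gaussian_filter(img, sigma=6)
labels = segment_nuclei(img, img.max() * 0.3)
print(labels.max(), "nuclei found")
```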
Xia, Li C; Ai, Dongmei; Cram, Jacob A; Liang, Xiaoyi; Fuhrman, Jed A; Sun, Fengzhu
2015-09-21
Local trend (i.e. shape) analysis of time series data reveals co-changing patterns in dynamics of biological systems. However, slow permutation procedures to evaluate the statistical significance of local trend scores have limited its applications to high-throughput time series data analysis, e.g., data from next-generation sequencing-based studies. By extending the theories for the tail probability of the range of sum of Markovian random variables, we propose formulae for approximating the statistical significance of local trend scores. Using simulations and real data, we show that the approximate p-value is close to that obtained using a large number of permutations (starting at time points >20 with no delay and >30 with delay of at most three time steps) in that the non-zero decimals of the p-values obtained by the approximation and the permutations are mostly the same when the approximate p-value is less than 0.05. In addition, the approximate p-value is slightly larger than that based on permutations making hypothesis testing based on the approximate p-value conservative. The approximation enables efficient calculation of p-values for pairwise local trend analysis, making large scale all-versus-all comparisons possible. We also propose a hybrid approach by integrating the approximation and permutations to obtain accurate p-values for significantly associated pairs. We further demonstrate its use with the analysis of the Plymouth Marine Laboratory (PML) microbial community time series from high-throughput sequencing data and found interesting organism co-occurrence dynamic patterns. The software tool is integrated into the eLSA software package that now provides accelerated local trend and similarity analysis pipelines for time series data. The package is freely available from the eLSA website: http://bitbucket.org/charade/elsa.
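In the no-delay case, a local trend score can be computed by converting each series to its change directions and taking the maximum-subarray score of their elementwise product, with a permutation test for significance; the sketch below illustrates this idea and is not the eLSA implementation.

```python
import random

# Illustrative local trend (shape) score for the no-delay case: convert each
# series to change directions (+1/0/-1), then take the maximum-subarray score
# of their elementwise product. Not the eLSA implementation.

def trend(series):
    return [(x2 > x1) - (x2 < x1) for x1, x2 in zip(series, series[1:])]

def local_trend_score(x, y):
    prod = [a * b for a, b in zip(trend(x), trend(y))]
    best = cur = 0
    for v in prod:                    # Kadane-style maximum subarray
        cur = max(0, cur + v)
        best = max(best, cur)
    return best

def permutation_pvalue(x, y, n_perm=1000, seed=0):
    rng = random.Random(seed)
    observed = local_trend_score(x, y)
    y_perm, hits = list(y), 0
    for _ in range(n_perm):
        rng.shuffle(y_perm)
        if local_trend_score(x, y_perm) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

x = [1, 2, 3, 2, 1, 2, 3, 4, 3, 2, 3, 4]
y = [0, 1, 2, 1, 0, 1, 2, 3, 2, 1, 2, 3]
print(local_trend_score(x, y), permutation_pvalue(x, y))
```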
Protein-RNA specificity by high-throughput principal component analysis of NMR spectra.
Collins, Katherine M; Oregioni, Alain; Robertson, Laura E; Kelly, Geoff; Ramos, Andres
2015-03-31
Defining the RNA target selectivity of the proteins regulating mRNA metabolism is a key issue in RNA biology. Here we present a novel use of principal component analysis (PCA) to extract the RNA sequence preference of RNA binding proteins. We show that PCA can be used to compare the changes in the nuclear magnetic resonance (NMR) spectrum of a protein upon binding a set of quasi-degenerate RNAs and define the nucleobase specificity. We couple this application of PCA to an automated NMR spectra recording and processing protocol and obtain an unbiased and high-throughput NMR method for the analysis of nucleobase preference in protein-RNA interactions. We test the method on the RNA binding domains of three important regulators of RNA metabolism. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
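For intuition, PCA of a set of spectra reduces to a singular value decomposition of the mean-centred spectra matrix; the generic sketch below uses synthetic spectra and is not the authors' automated acquisition and processing protocol.

```python
import numpy as np

# Generic PCA of a spectra matrix (rows = spectra recorded with different RNAs,
# columns = spectral points). Synthetic data; not the authors' pipeline.

rng = np.random.default_rng(1)
n_spectra, n_points = 12, 500
spectra = rng.normal(0, 0.01, size=(n_spectra, n_points))
spectra[:6, 100:110] += 0.5        # pretend one RNA group perturbs one peak region

centered = spectra - spectra.mean(axis=0)
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

scores = U * s                     # projection of each spectrum on the PCs
explained = s**2 / np.sum(s**2)
print("PC1 scores:", np.round(scores[:, 0], 2))
print(f"Variance explained by PC1: {explained[0]:.0%}")
```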
Hadoop and friends - first experience at CERN with a new platform for high throughput analysis steps
NASA Astrophysics Data System (ADS)
Duellmann, D.; Surdy, K.; Menichetti, L.; Toebbicke, R.
2017-10-01
The statistical analysis of infrastructure metrics comes with several specific challenges, including the fairly large volume of unstructured metrics from a large set of independent data sources. Hadoop and Spark provide an ideal environment in particular for the first steps of skimming rapidly through hundreds of TB of low relevance data to find and extract the much smaller data volume that is relevant for statistical analysis and modelling. This presentation will describe the new Hadoop service at CERN and the use of several of its components for high throughput data aggregation and ad-hoc pattern searches. We will describe the hardware setup used, the service structure with a small set of decoupled clusters and the first experience with co-hosting different applications and performing software upgrades. We will further detail the common infrastructure used for data extraction and preparation from continuous monitoring and database input sources.
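A skimming-and-aggregation step of the kind described above might look like the following PySpark sketch; the input path and column names (timestamp, host, metric, value) are hypothetical placeholders, not the actual monitoring schema.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Sketch of the "skimming" step: filter a large volume of monitoring records down
# to the relevant subset and aggregate it. Paths and column names are hypothetical.

spark = SparkSession.builder.appName("metrics-skim").getOrCreate()

metrics = spark.read.json("hdfs:///monitoring/raw/2017/*/*.json")

relevant = (metrics
            .filter(F.col("metric") == "disk_io_wait")
            .filter(F.col("value") > 0.5))

hourly = (relevant
          .withColumn("hour", F.date_trunc("hour", F.col("timestamp").cast("timestamp")))
          .groupBy("host", "hour")
          .agg(F.avg("value").alias("avg_io_wait"),
               F.count(F.lit(1)).alias("n_samples")))

hourly.write.mode("overwrite").parquet("hdfs:///monitoring/derived/io_wait_hourly")
```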
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gentry, T.; Schadt, C.; Zhou, J.
Microarray technology has the unparalleled potential to simultaneously determine the dynamics and/or activities of most, if not all, of the microbial populations in complex environments such as soils and sediments. Researchers have developed several types of arrays that characterize the microbial populations in these samples based on their phylogenetic relatedness or functional genomic content. Several recent studies have used these microarrays to investigate ecological issues; however, most have only analyzed a limited number of samples with relatively few experiments utilizing the full high-throughput potential of microarray analysis. This is due in part to the unique analytical challenges that these samples present with regard to sensitivity, specificity, quantitation, and data analysis. This review discusses specific applications of microarrays to microbial ecology research along with some of the latest studies addressing the difficulties encountered during analysis of complex microbial communities within environmental samples. With continued development, microarray technology may ultimately achieve its potential for comprehensive, high-throughput characterization of microbial populations in near real-time.
High-Throughput Single-Cell RNA Sequencing and Data Analysis.
Sagar; Herman, Josip Stefan; Pospisilik, John Andrew; Grün, Dominic
2018-01-01
Understanding biological systems at a single cell resolution may reveal several novel insights which remain masked by the conventional population-based techniques providing an average readout of the behavior of cells. Single-cell transcriptome sequencing holds the potential to identify novel cell types and characterize the cellular composition of any organ or tissue in health and disease. Here, we describe a customized high-throughput protocol for single-cell RNA-sequencing (scRNA-seq) combining flow cytometry and a nanoliter-scale robotic system. Since scRNA-seq requires amplification of a low amount of endogenous cellular RNA, leading to substantial technical noise in the dataset, downstream data filtering and analysis require special care. Therefore, we also briefly describe in-house state-of-the-art data analysis algorithms developed to identify cellular subpopulations including rare cell types as well as to derive lineage trees by ordering the identified subpopulations of cells along the inferred differentiation trajectories.
Xi-cam: a versatile interface for data visualization and analysis
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...
2018-05-31
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
Evaluation of residual oil saturation after waterflood in a carbonate reservoir
DOE Office of Scientific and Technical Information (OSTI.GOV)
Verma, M.K.; Boucherit, M.; Bouvier, L.
Four different approaches, including special core analysis (SCAL), log-inject-log, thermal-decay-time (TDT) logs, and material balance, were used to narrow the range of residual oil saturation (ROS) after waterflood, Sorw, in a carbonate reservoir in Qatar to between 23% and 27%. An equation was developed that relates Sorw with connate-water saturation, Swi, and porosity. This paper presents the results of Sorw determinations with four different techniques: core waterflood followed by centrifuging, log-inject-log, TDT logging, and material balance.
Observer-based adaptive backstepping control for fractional order systems with input saturation.
Sheng, Dian; Wei, Yiheng; Cheng, Songsong; Wang, Yong
2017-07-03
An observer-based fractional order anti-saturation adaptive backstepping control scheme is proposed for incommensurate fractional order systems with input saturation and partially measurable states in this paper. On the basis of stability analysis, a novel state observer is established first, since the only information available is the system output. In order to compensate the saturation, a series of virtual signals are generated via the construction of a fractional order auxiliary system. Afterwards, the controller design is carried out in accordance with the adaptive backstepping control method by introduction of the indirect Lyapunov method. Finally, simulation examples are presented to highlight the effectiveness of the proposed control scheme. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Upscaling soil saturated hydraulic conductivity from pore throat characteristics
NASA Astrophysics Data System (ADS)
Ghanbarian, Behzad; Hunt, Allen G.; Skaggs, Todd H.; Jarvis, Nicholas
2017-06-01
Upscaling and/or estimating saturated hydraulic conductivity Ksat at the core scale from microscopic/macroscopic soil characteristics has been actively under investigation in the hydrology and soil physics communities for several decades. Numerous models have been developed based on different approaches, such as the bundle of capillary tubes model, pedotransfer functions, etc. In this study, we apply concepts from critical path analysis, an upscaling technique first developed in the physics literature, to estimate saturated hydraulic conductivity at the core scale from microscopic pore throat characteristics reflected in capillary pressure data. With this new model, we find Ksat estimations to be within a factor of 3 of the average measured saturated hydraulic conductivities reported by Rawls et al. (1982) for the eleven USDA soil texture classes.
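One widely cited critical-path-style relation linking a critical pore dimension to permeability, and hence to Ksat, is the Katz-Thompson estimate; it is reproduced below purely as a generic illustration of the approach, not as the specific model developed in this study.

```latex
% Katz--Thompson style estimate (generic illustration only, not this study's model):
% k   : permeability
% d_c : critical pore-throat diameter inferred from capillary pressure data
% F   : electrical formation factor
% K_sat is obtained from k via the fluid density \rho, gravity g and viscosity \mu.
k \;\approx\; \frac{d_c^{2}}{226\,F},
\qquad
K_{\mathrm{sat}} \;=\; \frac{\rho g}{\mu}\,k
```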
Payne, Philip R O; Kwok, Alan; Dhaval, Rakesh; Borlawsky, Tara B
2009-03-01
The conduct of large-scale translational studies presents significant challenges related to the storage, management and analysis of integrative data sets. Ideally, the application of methodologies such as conceptual knowledge discovery in databases (CKDD) provides a means for moving beyond intuitive hypothesis discovery and testing in such data sets, and towards the high-throughput generation and evaluation of knowledge-anchored relationships between complex bio-molecular and phenotypic variables. However, the induction of such high-throughput hypotheses is non-trivial, and requires correspondingly high-throughput validation methodologies. In this manuscript, we describe an evaluation of the efficacy of a natural language processing-based approach to validating such hypotheses. As part of this evaluation, we will examine a phenomenon that we have labeled as "Conceptual Dissonance" in which conceptual knowledge derived from two or more sources of comparable scope and granularity cannot be readily integrated or compared using conventional methods and automated tools.
Arend, Daniel; Lange, Matthias; Pape, Jean-Michel; Weigelt-Fischer, Kathleen; Arana-Ceballos, Fernando; Mücke, Ingo; Klukas, Christian; Altmann, Thomas; Scholz, Uwe; Junker, Astrid
2016-01-01
With the implementation of novel automated, high throughput methods and facilities in the last years, plant phenomics has developed into a highly interdisciplinary research domain integrating biology, engineering and bioinformatics. Here we present a dataset of a non-invasive high throughput plant phenotyping experiment, which uses image- and image-analysis-based approaches to monitor the growth and development of 484 Arabidopsis thaliana plants (thale cress). The result is a comprehensive dataset of images and extracted phenotypical features. Such datasets require detailed documentation, standardized description of experimental metadata as well as sustainable data storage and publication in order to ensure the reproducibility of experiments, data reuse and comparability among the scientific community. Therefore, the dataset presented here has been annotated using the standardized ISA-Tab format and considering the recently published recommendations for the semantical description of plant phenotyping experiments. PMID:27529152
High Throughput Biological Analysis Using Multi-bit Magnetic Digital Planar Tags
NASA Astrophysics Data System (ADS)
Hong, B.; Jeong, J.-R.; Llandro, J.; Hayward, T. J.; Ionescu, A.; Trypiniotis, T.; Mitrelias, T.; Kopper, K. P.; Steinmuller, S. J.; Bland, J. A. C.
2008-06-01
We report a new magnetic labelling technology for high-throughput biomolecular identification and DNA sequencing. Planar multi-bit magnetic tags have been designed and fabricated, which comprise a magnetic barcode formed by an ensemble of micron-sized thin film Ni80Fe20 bars encapsulated in SU8. We show that by using a globally applied magnetic field and magneto-optical Kerr microscopy the magnetic elements in the multi-bit magnetic tags can be addressed individually and encoded/decoded remotely. The critical steps needed to show the feasibility of this technology are demonstrated, including fabrication, flow transport, remote writing and reading, and successful functionalization of the tags as verified by fluorescence detection. This approach is ideal for encoding information on tags in microfluidic flow or suspension, for such applications as labelling of chemical precursors during drug synthesis and combinatorial library-based high-throughput multiplexed bioassays.
Combinatorial and high-throughput screening of materials libraries: review of state of the art.
Potyrailo, Radislav; Rajan, Krishna; Stoewe, Klaus; Takeuchi, Ichiro; Chisholm, Bret; Lam, Hubert
2011-11-14
Rational materials design based on prior knowledge is attractive because it promises to avoid time-consuming synthesis and testing of numerous materials candidates. However, as the complexity of materials increases, the scope for rational materials design becomes progressively limited. As a result of this complexity, combinatorial and high-throughput (CHT) experimentation in materials science has been recognized as a new scientific approach to generate new knowledge. This review demonstrates the broad applicability of CHT experimentation technologies in discovery and optimization of new materials. We discuss general principles of CHT materials screening, followed by the detailed discussion of high-throughput materials characterization approaches, advances in data analysis/mining, and new materials developments facilitated by CHT experimentation. We critically analyze results of materials development in the areas most impacted by the CHT approaches, such as catalysis, electronic and functional materials, polymer-based industrial coatings, sensing materials, and biomaterials.
Gardner, J. Mark F.; Bell, Andrew S.; Parkinson, Tanya; Bickle, Quentin
2016-01-01
An estimated 600 million people are affected by the helminth disease schistosomiasis caused by parasites of the genus Schistosoma. There is currently only one drug recommended for treating schistosomiasis, praziquantel (PZQ), which is effective against adult worms but not against the juvenile stage. In an attempt to identify improved drugs for treating the disease, we have carried out high throughput screening of a number of small molecule libraries with the aim of identifying lead compounds with balanced activity against all life stages of Schistosoma. A total of almost 300,000 compounds were screened using a high throughput assay based on motility of worm larvae and image analysis of assay plates. Hits were screened against juvenile and adult worms to identify broadly active compounds and against a mammalian cell line to assess cytotoxicity. A number of compounds were identified as promising leads for further chemical optimization. PMID:27128493
High throughput ion-channel pharmacology: planar-array-based voltage clamp.
Kiss, Laszlo; Bennett, Paul B; Uebele, Victor N; Koblan, Kenneth S; Kane, Stefanie A; Neagle, Brad; Schroeder, Kirk
2003-02-01
Technological advances often drive major breakthroughs in biology. Examples include PCR, automated DNA sequencing, confocal/single photon microscopy, AFM, and voltage/patch-clamp methods. The patch-clamp method, first described nearly 30 years ago, was a major technical achievement that permitted voltage-clamp analysis (membrane potential control) of ion channels in most cells and revealed a role for channels in unimagined areas. Because of the high information content, voltage clamp is the best way to study ion-channel function; however, throughput is too low for drug screening. Here we describe a novel breakthrough planar-array-based HT patch-clamp technology developed by Essen Instruments capable of voltage-clamping thousands of cells per day. This technology provides greater than two orders of magnitude increase in throughput compared with the traditional voltage-clamp techniques. We have applied this method to study the hERG K(+) channel and to determine the pharmacological profile of QT prolonging drugs.
Kizaki, Seiichiro; Chandran, Anandhakumar; Sugiyama, Hiroshi
2016-03-02
Tet (ten-eleven translocation) family proteins have the ability to oxidize 5-methylcytosine (mC) to 5-hydroxymethylcytosine (hmC), 5-formylcytosine (fC), and 5-carboxycytosine (caC). However, the oxidation reaction of Tet is not understood completely. Evaluation of genomic-level epigenetic changes by Tet protein requires unbiased identification of the highly selective oxidation sites. In this study, we used high-throughput sequencing to investigate the sequence specificity of mC oxidation by Tet1. A 6.6×10(4) -member mC-containing random DNA-sequence library was constructed. The library was subjected to Tet-reactive pulldown followed by high-throughput sequencing. Analysis of the obtained sequence data identified the Tet1-reactive sequences. We identified mCpG as a highly reactive sequence of Tet1 protein. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
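The sequence-context analysis described above ultimately reduces to tallying which base follows the fixed mC position across the recovered reads; the toy sketch below uses made-up reads and an assumed mC position to illustrate the tally.

```python
from collections import Counter

# Toy sketch of the sequence-context tally behind the mCpG finding: count which
# base follows the fixed mC position in the recovered reads. Reads and the mC
# position below are made up for illustration.

mc_pos = 4                     # 0-based index of the methylated cytosine (assumed)
reads = [
    "TTAACGTTGA",
    "GCCTCGATAA",
    "ATGACATTCG",
    "CCGTCGGGTA",
    "AAATCGCCTT",
]

context_counts = Counter(read[mc_pos + 1] for read in reads if len(read) > mc_pos + 1)
total = sum(context_counts.values())
for base, n in context_counts.most_common():
    print(f"mC followed by {base}: {n}/{total} ({n/total:.0%})")
```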
Information-based management mode based on value network analysis for livestock enterprises
NASA Astrophysics Data System (ADS)
Liu, Haoqi; Lee, Changhoon; Han, Mingming; Su, Zhongbin; Padigala, Varshinee Anu; Shen, Weizheng
2018-01-01
With the development of computer and IT technologies, enterprise management has gradually become information-based. Moreover, due to poor technical competence and non-uniform management, most breeding enterprises show a lack of organisation in data collection and management, and low levels of efficiency result in increasing production costs. This paper adopts 'struts2' to construct an information-based management system for standardised and normalised management of the production process in beef cattle breeding enterprises. We present a radio-frequency identification system that addresses multiple-tag anti-collision via a dynamic grouping ALOHA algorithm. The algorithm builds on the existing ALOHA algorithm with an improved dynamic grouping scheme characterised by a high throughput rate, and can reach a throughput 42% higher than that of the general ALOHA algorithm. As the number of tags changes, the system throughput remains relatively stable.
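For context, the classical framed-slotted ALOHA result is that slot utilisation peaks near 36.8% when the frame size matches the number of unidentified tags, which is what dynamic grouping schemes try to track. The sketch below simulates a generic dynamic frame-sizing strategy; it illustrates the idea only and is not the algorithm proposed in the paper.

```python
import random

# Generic framed-slotted ALOHA simulation with dynamic frame sizing (frame size
# reset to the number of remaining tags each round). Illustration of the idea
# only; not the dynamic grouping algorithm proposed in the paper.

def identify_all_tags(n_tags, seed=0):
    rng = random.Random(seed)
    remaining, total_slots, identified = n_tags, 0, 0
    while remaining > 0:
        frame = max(remaining, 1)                   # frame size ~ remaining tags
        slots = [0] * frame
        for _ in range(remaining):
            slots[rng.randrange(frame)] += 1        # each tag picks a random slot
        singles = sum(1 for s in slots if s == 1)   # collision-free slots = reads
        identified += singles
        remaining -= singles
        total_slots += frame
    return identified / total_slots                 # overall slot utilisation

print(f"Slot utilisation for 200 tags: {identify_all_tags(200):.1%}")
```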
AmpliVar: mutation detection in high-throughput sequence from amplicon-based libraries.
Hsu, Arthur L; Kondrashova, Olga; Lunke, Sebastian; Love, Clare J; Meldrum, Cliff; Marquis-Nicholson, Renate; Corboy, Greg; Pham, Kym; Wakefield, Matthew; Waring, Paul M; Taylor, Graham R
2015-04-01
Conventional means of identifying variants in high-throughput sequencing align each read against a reference sequence, and then call variants at each position. Here, we demonstrate an orthogonal means of identifying sequence variation by grouping the reads as amplicons prior to any alignment. We used AmpliVar to make key-value hashes of sequence reads and group reads as individual amplicons using a table of flanking sequences. Low-abundance reads were removed according to a selectable threshold, and reads above this threshold were aligned as groups, rather than as individual reads, permitting the use of sensitive alignment tools. We show that this approach is more sensitive, more specific, and more computationally efficient than comparable methods for the analysis of amplicon-based high-throughput sequencing data. The method can be extended to enable alignment-free confirmation of variants seen in hybridization capture target-enrichment data. © 2015 WILEY PERIODICALS, INC.
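The alignment-free grouping step amounts to hashing each read by its flanking sequences and discarding low-abundance keys before any alignment; the simplified sketch below uses toy reads and flanks and is not the AmpliVar implementation.

```python
from collections import Counter, defaultdict

# Simplified sketch of alignment-free amplicon grouping: assign each read to an
# amplicon by its flanking sequences, hash identical inserts, and drop keys below
# an abundance threshold. Toy data; not the AmpliVar implementation.

flanks = {"ampliconA": ("ACGTAC", "TTGCAA")}     # (left flank, right flank)
reads = ["ACGTACGGGTTTTGCAA"] * 8 + ["ACGTACGGCTTTTGCAA"] * 2 + ["ACGTACGGATTTTGCAA"]
min_count = 2                                    # selectable abundance threshold

groups = defaultdict(Counter)
for read in reads:
    for name, (left, right) in flanks.items():
        if read.startswith(left) and read.endswith(right):
            insert = read[len(left):-len(right)]
            groups[name][insert] += 1            # key-value hash of read variants

for name, counts in groups.items():
    kept = {seq: n for seq, n in counts.items() if n >= min_count}
    print(name, kept)                            # only these would be aligned as groups
```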
Bahrami-Samani, Emad; Vo, Dat T.; de Araujo, Patricia Rosa; Vogel, Christine; Smith, Andrew D.; Penalva, Luiz O. F.; Uren, Philip J.
2014-01-01
Co- and post-transcriptional regulation of gene expression is complex and multi-faceted, spanning the complete RNA lifecycle from genesis to decay. High-throughput profiling of the constituent events and processes is achieved through a range of technologies that continue to expand and evolve. Fully leveraging the resulting data is non-trivial, and requires the use of computational methods and tools carefully crafted for specific data sources and often intended to probe particular biological processes. Drawing upon databases of information pre-compiled by other researchers can further elevate analyses. Within this review, we describe the major co- and post-transcriptional events in the RNA lifecycle that are amenable to high-throughput profiling. We place specific emphasis on the analysis of the resulting data, in particular the computational tools and resources available, as well as looking towards future challenges that remain to be addressed. PMID:25515586
Quigley, Lisa; O'Sullivan, Orla; Beresford, Tom P; Ross, R Paul; Fitzgerald, Gerald F; Cotter, Paul D
2012-08-01
Here, high-throughput sequencing was employed to reveal the highly diverse bacterial populations present in 62 Irish artisanal cheeses and, in some cases, associated cheese rinds. Using this approach, we revealed the presence of several genera not previously associated with cheese, including Faecalibacterium, Prevotella, and Helcococcus and, for the first time, detected the presence of Arthrobacter and Brachybacterium in goats' milk cheese. Our analysis confirmed many previously observed patterns, such as the dominance of typical cheese bacteria, the fact that the microbiota of raw and pasteurized milk cheeses differ, and that the level of cheese maturation has a significant influence on Lactobacillus populations. It was also noted that cheeses containing adjunct ingredients had lower proportions of Lactococcus species. It is thus apparent that high-throughput sequencing-based investigations can provide valuable insights into the microbial populations of artisanal foods.
Li, Zhoufang; Liu, Guangjie; Tong, Yin; Zhang, Meng; Xu, Ying; Qin, Li; Wang, Zhanhui; Chen, Xiaoping; He, Jiankui
2015-01-01
Profiling immune repertoires by high throughput sequencing enhances our understanding of immune system complexity and immune-related diseases in humans. Previously, cloning and Sanger sequencing identified limited numbers of T cell receptor (TCR) nucleotide sequences in rhesus monkeys, thus their full immune repertoire is unknown. We applied multiplex PCR and Illumina high throughput sequencing to study the TCRβ of rhesus monkeys. We identified 1.26 million TCRβ sequences corresponding to 643,570 unique TCRβ sequences and 270,557 unique complementarity-determining region 3 (CDR3) gene sequences. Precise measurements of CDR3 length distribution, CDR3 amino acid distribution, length distribution of N nucleotide of junctional region, and TCRV and TCRJ gene usage preferences were performed. A comprehensive profile of rhesus monkey immune repertoire might aid human infectious disease studies using rhesus monkeys. PMID:25961410
Ching, Travers; Zhu, Xun; Garmire, Lana X
2018-04-01
Artificial neural networks (ANN) are computing architectures with many interconnections of simple neural-inspired computing elements, and have been applied to biomedical fields such as imaging analysis and diagnosis. We have developed a new ANN framework called Cox-nnet to predict patient prognosis from high throughput transcriptomics data. In 10 TCGA RNA-Seq data sets, Cox-nnet achieves the same or better predictive accuracy compared to other methods, including Cox proportional hazards regression (with LASSO, ridge, and minimax concave penalty), Random Forests Survival and CoxBoost. Cox-nnet also reveals richer biological information, at both the pathway and gene levels. The outputs from the hidden layer node provide an alternative approach for survival-sensitive dimension reduction. In summary, we have developed a new method for accurate and efficient prognosis prediction on high throughput data, with functional biological insights. The source code is freely available at https://github.com/lanagarmire/cox-nnet.
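The quantity shared by Cox-nnet and the penalised Cox models it is compared against is the Cox partial likelihood; a small NumPy sketch of the negative partial log-likelihood (Breslow-style, no tie handling, illustrative only and not the Cox-nnet source) is given below.

```python
import numpy as np

# Negative Cox partial log-likelihood for a vector of risk scores (the quantity
# a Cox-type model, neural or linear, is trained to minimise). Breslow-style,
# no tie handling; illustrative only, not the Cox-nnet source code.

def neg_cox_partial_loglik(risk, time, event):
    """risk: model scores; time: follow-up times; event: 1=event, 0=censored."""
    order = np.argsort(-time)                   # sort by descending follow-up time
    risk, event = risk[order], event[order]
    log_cumsum = np.logaddexp.accumulate(risk)  # log of sum(exp(risk)) over the risk set
    return -np.sum((risk - log_cumsum)[event.astype(bool)])

rng = np.random.default_rng(0)
time = rng.exponential(5, size=50)
event = rng.integers(0, 2, size=50)
risk = rng.normal(size=50)
print(f"Negative partial log-likelihood: {neg_cox_partial_loglik(risk, time, event):.2f}")
```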
Troggio, Michela; Surbanovski, Nada; Bianco, Luca; Moretto, Marco; Giongo, Lara; Banchi, Elisa; Viola, Roberto; Fernández, Felicdad Fernández; Costa, Fabrizio; Velasco, Riccardo; Cestaro, Alessandro; Sargent, Daniel James
2013-01-01
High throughput arrays for the simultaneous genotyping of thousands of single-nucleotide polymorphisms (SNPs) have made the rapid genetic characterisation of plant genomes and the development of saturated linkage maps a realistic prospect for many plant species of agronomic importance. However, the correct calling of SNP genotypes in divergent polyploid genomes using array technology can be problematic due to paralogy, and to divergence in probe sequences causing changes in probe binding efficiencies. An Illumina Infinium II whole-genome genotyping array was recently developed for the cultivated apple and used to develop a molecular linkage map for an apple rootstock progeny (M432), but a large proportion of segregating SNPs were not mapped in the progeny, due to unexpected genotype clustering patterns. To investigate the causes of this unexpected clustering we performed BLAST analysis of all probe sequences against the 'Golden Delicious' genome sequence and discovered evidence for paralogous annealing sites and probe sequence divergence for a high proportion of probes contained on the array. Following visual re-evaluation of the genotyping data generated for 8,788 SNPs for the M432 progeny using the array, we manually re-scored genotypes at 818 loci and mapped a further 797 markers to the M432 linkage map. The newly mapped markers included the majority of those that could not be mapped previously, as well as loci that were previously scored as monomorphic, but which segregated due to divergence leading to heterozygosity in probe annealing sites. An evaluation of the 8,788 probes in a diverse collection of Malus germplasm showed that more than half the probes returned genotype clustering patterns that were difficult or impossible to interpret reliably, highlighting implications for the use of the array in genome-wide association studies.
Ma, G J; Song, Q J; Markell, S G; Qi, L L
2018-07-01
A novel rust resistance gene, R15, derived from the cultivated sunflower HA-R8 was assigned to linkage group 8 of the sunflower genome using a genotyping-by-sequencing approach. SNP markers closely linked to R15 were identified, facilitating marker-assisted selection of resistance genes. The rust virulence gene is co-evolving with the resistance gene in sunflower, leading to the emergence of new physiologic pathotypes. This presents a continuous threat to the sunflower crop and necessitates the development of resistant sunflower hybrids that provide more efficient, durable, and environmentally friendly host plant resistance. The inbred line HA-R8 carries a gene conferring resistance to all known races of the rust pathogen in North America and can be used as a broad-spectrum resistance resource. Based on phenotypic assessments of 140 F2 individuals derived from a cross of HA 89 with HA-R8, rust resistance in the population was found to be conferred by a single dominant gene (R15) originating from HA-R8. Genotypic analysis with the currently available SSR markers failed to find any association between rust resistance and any of the markers. Therefore, we used genotyping-by-sequencing (GBS) analysis to achieve better genomic coverage. The GBS data showed that R15 was located at the top end of linkage group (LG) 8. Saturation with 71 previously mapped SNP markers selected within this region further showed that it was located in a resistance gene cluster on LG8, and mapped to a 1.0-cM region between three co-segregating SNP markers, SFW01920, SFW00128, and SFW05824, as well as the NSA_008457 SNP marker. These closely linked markers will facilitate marker-assisted selection and breeding in sunflower.
Conservation of water for washing beef heads at harvest.
DeOtte, R E; Spivey, K S; Galloway, H O; Lawrence, T E
2010-03-01
The objective of this research was to develop methods to conserve the water necessary to cleanse beef heads prior to USDA-FSIS inspection. This was to be accomplished by establishing a baseline for the minimum amount of water necessary to adequately wash a head and by applying image analysis to provide an objective measure of head cleanliness. Twenty-one beef heads were manually washed during the harvest process. An average of 18.75 L (SD 2.49) and a maximum of 23.88 L were required to cleanse the heads to USDA-FSIS standards. Digital images were captured before and after manual washing and then evaluated for percentage red saturation using commercially available image analysis software. A decaying exponential curve fitted to these data indicated that as wash water increased beyond 20 L, the impact on red saturation decreased. At 4 sigma above the mean of 18.75 L, red saturation is 16.0 percent, at which logistic regression analysis indicates 99.994 percent of heads would be accepted for inspection, or fewer than 1 head in 15,000 would be rejected. Reducing to 3 sigma would increase red saturation to 27.6 percent, for which 99.730 percent of heads would likely be accepted (fewer than 1 in 370 rejected). Copyright 2009 Elsevier Ltd. All rights reserved.
Lee, M.W.; Collett, T.S.
2011-01-01
In 2006, the U.S. Geological Survey (USGS) completed detailed analysis and interpretation of available 2-D and 3-D seismic data and proposed a viable method for identifying sub-permafrost gas hydrate prospects within the gas hydrate stability zone in the Milne Point area of northern Alaska. To validate the predictions of the USGS and to acquire critical reservoir data needed to develop a long-term production testing program, a well was drilled at the Mount Elbert prospect in February 2007. Numerous well log data and cores were acquired to estimate in-situ gas hydrate saturations and reservoir properties. Gas hydrate saturations were estimated from various well logs such as nuclear magnetic resonance (NMR), P- and S-wave velocity, and electrical resistivity logs along with pore-water salinity. Gas hydrate saturations from the NMR log agree well with those estimated from P- and S-wave velocity data. Because of the low salinity of the connate water and the low formation temperature, the resistivity of connate water is comparable to that of shale. Therefore, the effect of clay should be accounted for to accurately estimate gas hydrate saturations from the resistivity data. Two highly gas hydrate-saturated intervals are identified: an upper ~43 ft zone with an average gas hydrate saturation of 54% and a lower ~53 ft zone with an average gas hydrate saturation of 50%; both zones reach a maximum of about 75% saturation. © 2009.
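Resistivity-based saturation estimates of this kind are commonly built on Archie's relation. The sketch below shows that generic calculation; the parameter values (a, m, n, connate-water resistivity, porosity) are illustrative assumptions, not those used in the Mount Elbert analysis.

```python
# Hedged sketch: Archie-style water saturation from formation resistivity, with
# hydrate saturation taken as Sh = 1 - Sw. All parameter values are assumed.
import numpy as np

def hydrate_saturation_archie(Rt, Rw, phi, a=1.0, m=2.0, n=2.0):
    """Sw = (a*Rw / (phi**m * Rt))**(1/n); returns Sh = 1 - Sw, clipped to [0, 1]."""
    Sw = (a * Rw / (phi**m * Rt)) ** (1.0 / n)
    return 1.0 - np.clip(Sw, 0.0, 1.0)

Rt = np.array([3.0, 30.0, 80.0])   # formation resistivity, ohm-m (assumed log readings)
print("hydrate saturation:", hydrate_saturation_archie(Rt, Rw=0.5, phi=0.35).round(2))
```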
Buckner, Diana; Wilson, Suzanne; Kurk, Sandra; Hardy, Michele; Miessner, Nicole; Jutila, Mark A
2006-09-01
Innate immune system stimulants (innate adjuvants) offer complementary approaches to vaccines and antimicrobial compounds to increase host resistance to infection. The authors established fetal bovine intestinal epithelial cell (BIEC) cultures to screen natural product and synthetic compound libraries for novel mucosal adjuvants. They showed that BIECs from fetal intestine maintained an in vivo phenotype as reflected in cytokeratin expression, expression of antigens restricted to intestinal enterocytes, and induced interleukin-8 (IL-8) production. BIECs could be infected by and support replication of bovine rotavirus. A semi-high-throughput enzyme-linked immunosorbent assay-based assay that measured IL-8 production by BIECs was established and used to screen commercially available natural compounds for novel adjuvant activity. Five novel hits were identified, demonstrating the utility of the assay for selecting and screening new epithelial cell adjuvants. Although the identified compounds had not previously been shown to induce IL-8 production in epithelial cells, other known functions for 3 of the 5 were consistent with this activity. Statistical analysis of the throughput data demonstrated that the assay is adaptable to a high-throughput format for screening both synthetic and natural product derived compound libraries.
High-throughput bioinformatics with the Cyrille2 pipeline system
Fiers, Mark WEJ; van der Burgt, Ate; Datema, Erwin; de Groot, Joost CW; van Ham, Roeland CHJ
2008-01-01
Background: Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and integrated with existing knowledge through the use of diverse sets of software tools, models and databases. The analyses are often interdependent and chained together to form complex workflows or pipelines. Given the volume of the data used and the multitude of computational resources available, specialized pipeline software is required to make high-throughput analysis of large-scale omics datasets feasible. Results: We have developed a generic pipeline system called Cyrille2. The system is modular in design and consists of three functionally distinct parts: 1) a web-based graphical user interface (GUI) that enables a pipeline operator to manage the system; 2) the Scheduler, which forms the functional core of the system, tracks what data enter the system and determines what jobs must be scheduled for execution; and 3) the Executor, which searches for scheduled jobs and executes them on a compute cluster. Conclusion: The Cyrille2 system is an extensible, modular system that implements the stated requirements. Cyrille2 enables easy creation and execution of high-throughput, flexible bioinformatics pipelines. PMID:18269742
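The scheduler/executor split described above can be illustrated with a minimal dependency-driven loop. The sketch below is not Cyrille2 code; the pipeline step names are hypothetical, and a real executor would submit jobs to a compute cluster rather than run them inline.

```python
# Minimal illustration of a scheduler (derives runnable jobs from declared dependencies)
# and an executor (runs whatever the scheduler marks ready). Requires Python 3.9+.
from graphlib import TopologicalSorter

# Hypothetical pipeline: job name -> set of jobs it depends on
pipeline = {
    "qc":       set(),
    "assemble": {"qc"},
    "annotate": {"assemble"},
    "blast":    {"assemble"},
    "report":   {"annotate", "blast"},
}

def executor(job):
    print(f"running {job}")  # placeholder for cluster submission

scheduler = TopologicalSorter(pipeline)
scheduler.prepare()
while scheduler.is_active():
    for job in scheduler.get_ready():   # jobs whose inputs are now available
        executor(job)
        scheduler.done(job)
```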
Akeroyd, Michiel; Olsthoorn, Maurien; Gerritsma, Jort; Gutker-Vermaas, Diana; Ekkelkamp, Laurens; van Rij, Tjeerd; Klaassen, Paul; Plugge, Wim; Smit, Ed; Strupat, Kerstin; Wenzel, Thibaut; van Tilborg, Marcel; van der Hoeven, Rob
2013-03-10
In the discovery of new enzymes, genomic and cDNA expression libraries containing thousands of differential clones are generated to obtain biodiversity. These libraries need to be screened for the activity of interest. Removing so-called empty and redundant clones significantly reduces the size of these expression libraries and therefore speeds up new enzyme discovery. Here, we present a sensitive, generic workflow, based on mass spectrometry techniques, for high-throughput screening of successful microbial protein over-expression in microtiter plates containing a complex matrix. MALDI-LTQ-Orbitrap screening followed by principal component analysis and peptide mass fingerprinting was developed to obtain a throughput of ∼12,000 samples per week. Alternatively, a UHPLC-MS(2) approach including MS(2) protein identification was developed for microorganisms with a complex protein secretome, with a throughput of ∼2,000 samples per week. TCA-induced protein precipitation, enhanced by the addition of bovine serum albumin, is used for protein purification prior to MS detection. We show that this generic workflow can effectively reduce large expression libraries from fungi and bacteria to their minimal size by detecting successful protein over-expression using MS. Copyright © 2012 Elsevier B.V. All rights reserved.
Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen
2015-04-15
High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and reduces sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well format to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput scales with experiments per microplate, these processes are limited in the cost and time they can save. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. An HTS batch isotherm process is described that utilizes this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher-quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
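Isotherm-parameter screening with bootstrap confidence bounds can be illustrated on synthetic data. The sketch below assumes a Langmuir isotherm and made-up concentration values; it is not the authors' workflow, only the general idea of fitting parameters and bootstrapping their uncertainty.

```python
# Sketch: fit Langmuir isotherm parameters to synthetic batch data and bootstrap 95% CIs.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)

def langmuir(c_eq, q_max, K):
    """Bound protein per resin volume as a function of equilibrium concentration."""
    return q_max * c_eq / (K + c_eq)

c_eq = np.linspace(0.05, 5.0, 12)                                  # mg/mL, assumed
q_obs = langmuir(c_eq, 60.0, 0.8) + rng.normal(0, 1.5, c_eq.size)  # synthetic measurements

popt, _ = curve_fit(langmuir, c_eq, q_obs, p0=(50.0, 1.0), maxfev=10000)

boot = []
for _ in range(500):                      # nonparametric bootstrap of (q_max, K)
    idx = rng.integers(0, c_eq.size, c_eq.size)
    try:
        p, _ = curve_fit(langmuir, c_eq[idx], q_obs[idx], p0=popt, maxfev=10000)
        boot.append(p)
    except RuntimeError:
        continue                          # skip resamples that fail to converge
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print("q_max, K estimates:", popt.round(2), "95% CI lower:", lo.round(2), "upper:", hi.round(2))
```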
Wang, Yonggang; Hui, Cong; Liu, Chong; Xu, Chao
2016-04-01
The contribution of this paper is proposing a new entropy extraction mechanism based on sampling phase jitter in ring oscillators to make a high throughput true random number generator in a field programmable gate array (FPGA) practical. Starting from experimental observation and analysis of the entropy source in FPGA, a multi-phase sampling method is exploited to harvest the clock jitter with a maximum entropy and fast sampling speed. This parametrized design is implemented in a Xilinx Artix-7 FPGA, where the carry chains in the FPGA are explored to realize the precise phase shifting. The generator circuit is simple and resource-saving, so that multiple generation channels can run in parallel to scale the output throughput for specific applications. The prototype integrates 64 circuit units in the FPGA to provide a total output throughput of 7.68 Gbps, which meets the requirement of current high-speed quantum key distribution systems. The randomness evaluation, as well as its robustness to ambient temperature, confirms that the new method in a purely digital fashion can provide high-speed high-quality random bit sequences for a variety of embedded applications.
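As a rough illustration of the throughput arithmetic and of a basic statistical check on generated bits, consider the sketch below. The simulated bit stream is only a stand-in for sampled jitter, and real TRNG validation relies on dedicated suites such as NIST SP 800-22 or AIS-31 rather than a single entropy estimate.

```python
# Sketch: per-channel throughput from the reported aggregate rate, plus a simple
# Shannon-entropy check on a (simulated) bit stream packed into bytes.
import numpy as np

channels = 64
per_channel_bps = 7.68e9 / channels          # 7.68 Gbps total -> 120 Mbps per unit
print(f"per-channel throughput: {per_channel_bps/1e6:.0f} Mbps")

rng = np.random.default_rng(2)
bits = rng.integers(0, 2, size=1_000_000, dtype=np.uint8)   # stand-in for sampled jitter bits
byte_values = np.packbits(bits)
p = np.bincount(byte_values, minlength=256) / byte_values.size
H = -np.sum(p[p > 0] * np.log2(p[p > 0]))    # Shannon entropy per byte (ideal: 8 bits)
print(f"estimated entropy: {H:.3f} bits/byte")
```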
THE RABIT: A RAPID AUTOMATED BIODOSIMETRY TOOL FOR RADIOLOGICAL TRIAGE
Garty, Guy; Chen, Youhua; Salerno, Alessio; Turner, Helen; Zhang, Jian; Lyulko, Oleksandra; Bertucci, Antonella; Xu, Yanping; Wang, Hongliang; Simaan, Nabil; Randers-Pehrson, Gerhard; Yao, Y. Lawrence; Amundson, Sally A.; Brenner, David J.
2010-01-01
In response to the recognized need for high-throughput biodosimetry methods for use after large-scale radiological events, a logical approach is complete automation of standard biodosimetric assays that are currently performed manually. We describe progress to date on the RABIT (Rapid Automated BIodosimetry Tool), designed to score micronuclei or γ-H2AX fluorescence in lymphocytes derived from a single drop of blood from a fingerstick. The RABIT system is designed to be completely automated, from the input of the capillary blood sample into the machine to the output of a dose estimate. Improvements in throughput are achieved through use of a single drop of blood, optimization of the biological protocols for in-situ analysis in multi-well plates, implementation of robotic plate and liquid handling, and new developments in high-speed imaging. Automating well-established bioassays represents a promising approach to high-throughput radiation biodosimetry, both because high throughputs can be achieved and because the time to deployment is potentially much shorter than for a new biological assay. Here we describe the development of each of the individual modules of the RABIT system and show preliminary data from key modules. System integration is ongoing and will be followed by calibration and validation. PMID:20065685
High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila.
Chiaraviglio, Lucius; Kirby, James E
2015-12-01
Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
Pathway analysis of high-throughput biological data within a Bayesian network framework.
Isci, Senol; Ozturk, Cengizhan; Jones, Jon; Otu, Hasan H
2011-06-15
Most current approaches to high-throughput biological data (HTBD) analysis either perform individual gene/protein analysis or perform gene/protein set enrichment analysis for a list of biologically relevant molecules. Bayesian Networks (BNs) capture linear and non-linear interactions, handle stochastic events accounting for noise, and focus on local interactions, which can be related to causal inference. Here, we describe for the first time an algorithm that models biological pathways as BNs and identifies the pathways that best explain given HTBD by scoring the fitness of each network. The proposed method takes into account the connectivity and relatedness between nodes of the pathway by factoring pathway topology into its model. Our simulations using synthetic data demonstrated the robustness of our approach. We tested the proposed method, Bayesian Pathway Analysis (BPA), on human microarray data regarding renal cell carcinoma (RCC) and compared our results with gene set enrichment analysis. BPA was able to find broader and more specific pathways related to RCC. The accompanying BPA software (BPAS) package is freely available for academic use at http://bumil.boun.edu.tr/bpa.
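To make the network-scoring idea concrete, the sketch below scores two candidate pathway topologies by the Gaussian log-likelihood of each node given its parents. This is a deliberately simplified stand-in for illustration, not the BPA scoring algorithm, and the genes and data are synthetic.

```python
# Simplified pathway scoring: sum of per-node Gaussian log-likelihoods, where each node
# is regressed on its parents in the candidate topology.
import numpy as np

rng = np.random.default_rng(3)

def score_pathway(data, pathway):
    """data: dict gene -> (n,) expression vector; pathway: dict gene -> list of parent genes."""
    total = 0.0
    n = len(next(iter(data.values())))
    for gene, parents in pathway.items():
        y = data[gene]
        X = (np.column_stack([data[p] for p in parents] + [np.ones(n)])
             if parents else np.ones((n, 1)))
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = max((y - X @ beta).var(), 1e-12)          # MLE residual variance
        total += -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return total

# Synthetic data where A regulates B and B regulates C
A = rng.normal(size=200)
B = 0.8 * A + rng.normal(scale=0.5, size=200)
C = -0.6 * B + rng.normal(scale=0.5, size=200)
data = {"A": A, "B": B, "C": C}

true_pathway  = {"A": [], "B": ["A"], "C": ["B"]}
wrong_pathway = {"A": ["C"], "B": [], "C": []}
print("true topology score: ", round(score_pathway(data, true_pathway), 1))
print("wrong topology score:", round(score_pathway(data, wrong_pathway), 1))
```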
High-throughput SNP-genotyping analysis of the relationships among Ponto-Caspian sturgeon species
Rastorguev, Sergey M; Nedoluzhko, Artem V; Mazur, Alexander M; Gruzdeva, Natalia M; Volkov, Alexander A; Barmintseva, Anna E; Mugue, Nikolai S; Prokhortchouk, Egor B
2013-01-01
Legally certified sturgeon fisheries require population protection and conservation methods, including DNA tests to identify the source of valuable sturgeon roe. However, the available genetic data are insufficient to distinguish between different sturgeon populations, and are even unable to distinguish between some species. We performed high-throughput single-nucleotide polymorphism (SNP)-genotyping analysis on different populations of Russian (Acipenser gueldenstaedtii), Persian (A. persicus), and Siberian (A. baerii) sturgeon species from the Caspian Sea region (Volga and Ural Rivers), the Azov Sea, and two Siberian rivers. We found that Russian sturgeons from the Volga and Ural Rivers were essentially indistinguishable, but they differed from Russian sturgeons in the Azov Sea, and from Persian and Siberian sturgeons. We identified eight SNPs that were sufficient to distinguish these sturgeon populations with 80% confidence, and that allowed the development of markers to distinguish sturgeon species. Finally, on the basis of our SNP data, we propose that the A. baerii-like mitochondrial DNA found in some Russian sturgeons from the Caspian Sea arose via an introgression event during the Pleistocene glaciation. In summary, high-throughput genotyping analysis of several sturgeon populations was performed, SNP markers for species identification were defined, and a possible explanation for the presence of the baerii-like mitotype in some Russian sturgeons in the Caspian Sea was suggested. PMID:24567827
Segat, Ludovica; Padovan, Lara; Doc, Darja; Petix, Vincenzo; Morgutti, Marcello; Crovella, Sergio; Ricci, Giuseppe
2012-12-01
We describe a real-time polymerase chain reaction (PCR) protocol based on the fluorescent molecule SYBR Green chemistry, for a low- to medium-throughput analysis of Y-chromosome microdeletions, optimized according to the European guidelines and aimed at making the protocol faster, avoiding post-PCR processing, and simplifying the results interpretation. We screened 156 men from the Assisted Reproduction Unit, Department of Obstetrics and Gynecology, Institute for Maternal and Child Health IRCCS Burlo Garofolo (Trieste, Italy), 150 not presenting Y-chromosome microdeletion, and 6 with microdeletions in different azoospermic factor (AZF) regions. For each sample, the Zinc finger Y-chromosomal protein (ZFY), sex-determining region Y (SRY), sY84, sY86, sY127, sY134, sY254, and sY255 loci were analyzed by performing one reaction for each locus. AZF microdeletions were successfully detected in six individuals, confirming the results obtained with commercial kits. Our real-time PCR protocol proved to be a rapid, safe, and relatively cheap method that was suitable for a low- to medium-throughput diagnosis of Y-chromosome microdeletion, which allows an analysis of approximately 10 samples (with the addition of positive and negative controls) in a 96-well plate format, or approximately 46 samples in a 384-well plate for all markers simultaneously, in less than 2 h without the need of post-PCR manipulation.
Cheng, Sy-Chyi; Huang, Min-Zong; Wu, Li-Chieh; Chou, Chih-Chiang; Cheng, Chu-Nian; Jhang, Siou-Sian; Shiea, Jentaie
2012-07-17
Interfacing thin layer chromatography (TLC) with ambient mass spectrometry (AMS) has been an important area of analytical chemistry because of its capability to rapidly separate and characterize the chemical compounds. In this study, we have developed a high-throughput TLC-AMS system using building blocks to deal, deliver, and collect the TLC plate through an electrospray-assisted laser desorption ionization (ELDI) source. This is the first demonstration of the use of building blocks to construct and test the TLC-MS interfacing system. With the advantages of being readily available, cheap, reusable, and extremely easy to modify without consuming any material or reagent, the use of building blocks to develop the TLC-AMS interface is undoubtedly a green methodology. The TLC plate delivery system consists of a storage box, plate dealing component, conveyer, light sensor, and plate collecting box. During a TLC-AMS analysis, the TLC plate was sent to the conveyer from a stack of TLC plates placed in the storage box. As the TLC plate passed through the ELDI source, the chemical compounds separated on the plate would be desorbed by laser desorption and subsequently postionized by electrospray ionization. The samples, including a mixture of synthetic dyes and extracts of pharmaceutical drugs, were analyzed to demonstrate the capability of this TLC-ELDI/MS system for high-throughput analysis.
Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T
2015-08-01
Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Orton, Daniel J.; Tfaily, Malak M.; Moore, Ronald J.
To better understand disease conditions and environmental perturbations, multi-omic studies (i.e. proteomic, lipidomic, metabolomic, etc. analyses) are vastly increasing in popularity. In a multi-omic study, a single sample is typically extracted in multiple ways and numerous analyses are performed using different instruments. Thus, one sample becomes many analyses, making high throughput and reproducible evaluations a necessity. One way to address the numerous samples and varying instrumental conditions is to utilize a flow injection analysis (FIA) system for rapid sample injection. While some FIA systems have been created to address these challenges, many have limitations such as high consumable costs, low pressure capabilities, limited pressure monitoring and fixed flow rates. To address these limitations, we created an automated, customizable FIA system capable of operating at diverse flow rates (~50 nL/min to 500 µL/min) to accommodate low- and high-flow instrument sources. This system can also operate at varying analytical throughputs from 24 to 1200 samples per day to enable different MS analysis approaches. Applications ranging from native protein analyses to molecular library construction were performed using the FIA system. The results from these studies showed a highly robust platform, providing consistent performance over many days without carryover as long as washing buffers specific to each molecular analysis were utilized.
Methods for processing high-throughput RNA sequencing data.
Ares, Manuel
2014-11-03
High-throughput sequencing (HTS) methods for analyzing RNA populations (RNA-Seq) are gaining rapid application to many experimental situations. The steps in an RNA-Seq experiment require thought and planning, especially because the expense in time and materials is currently higher and the protocols are far less routine than those used for other high-throughput methods, such as microarrays. As always, good experimental design will make analysis and interpretation easier. Having a clear biological question, an idea about the best way to do the experiment, and an understanding of the number of replicates needed will make the entire process more satisfying. Whether the goal is capturing transcriptome complexity from a tissue or identifying small fragments of RNA cross-linked to a protein of interest, conversion of the RNA to cDNA followed by direct sequencing using the latest methods is a developing practice, with new technical modifications and applications appearing every day. Even more rapid are the development and improvement of methods for analysis of the very large amounts of data that arrive at the end of an RNA-Seq experiment, making considerations regarding reproducibility, validation, visualization, and interpretation increasingly important. This introduction is designed to review and emphasize a pathway of analysis from experimental design through data presentation that is likely to be successful, with the recognition that better methods are right around the corner. © 2014 Cold Spring Harbor Laboratory Press.
DOT National Transportation Integrated Search
1996-04-01
The study investigates the application of simulation along with field observations for the estimation of exclusive left-turn saturation flow rate and capacity. The research covered the following principal subjects: (1) a saturation flow model ...
Sideband instability analysis based on a one-dimensional high-gain free electron laser model
NASA Astrophysics Data System (ADS)
Tsai, Cheng-Ying; Wu, Juhao; Yang, Chuan; Yoon, Moohyun; Zhou, Guanqun
2017-12-01
When an untapered high-gain free electron laser (FEL) reaches saturation, the exponential growth ceases and the radiation power starts to oscillate about an equilibrium. The FEL radiation power or efficiency can be increased by undulator tapering. For a high-gain tapered FEL, although the power is enhanced after the first saturation, it is known that there is a so-called second saturation where the FEL power growth stops even with a tapered undulator system. The sideband instability is one of the primary reasons leading to this second saturation. In this paper, we provide a quantitative analysis on how the gradient of undulator tapering can mitigate the sideband growth. The study is carried out semianalytically and compared with one-dimensional numerical simulations. The physical parameters are taken from Linac Coherent Light Source-like electron bunch and undulator systems. The sideband field gain and the evolution of the radiation spectra for different gradients of undulator tapering are examined. It is found that a strong undulator tapering (˜10 %) provides effective suppression of the sideband instability in the postsaturation regime.
Bioconductor | Informatics Technology for Cancer Research (ITCR)
Bioconductor provides tools for the analysis and comprehension of high-throughput genomic data. R/Bioconductor will be enhanced to meet the increasing complexity of multiassay cancer genomics experiments.
Hu, Wenping; Boerman, Jacquelyn P.; Aldrich, James M.
2017-01-01
Objective: A meta-analysis was conducted to evaluate the effects of supplemental fat containing saturated free fatty acids (FA) on the milk performance of Holstein dairy cows. Methods: A database was developed from 21 studies published between 1991 and 2016 that included 502 dairy cows and a total of 29 to 30 comparisons between dietary treatment and control without fat supplementation. Only saturated free FA (>80% of total FA) was considered as the supplemental fat. The concentration of the supplemental fat was not higher than 3.5% of diet dry matter (DM). Dairy cows were offered total mixed rations and fed individually. Statistical analysis was conducted using random- or mixed-effects models with the metafor package in R. Results: Sub-group analysis showed that there were no differences between studies using a randomized block design and those using a Latin square/crossover design in DMI and milk production responses to the supplemental fat (all response variables, p≥0.344). The supplemental fat across all studies improved milk yield, milk fat concentration and yield, and milk protein yield by 1.684 kg/d (p<0.001), 0.095 percentage unit (p = 0.003), 0.072 kg/d (p<0.001), and 0.036 kg/d (p<0.001), respectively, but tended to decrease milk protein concentration (mean difference = −0.022 percentage unit; p = 0.063), while DMI (mean difference = 0.061 kg/d; p = 0.768) remained unchanged. The assessment of heterogeneity suggested that no substantial heterogeneity occurred among studies for DMI and milk production responses to the supplemental fat (all response variables, I2≤24.1%; p≥0.166). Conclusion: The effects of saturated free FA were quantitatively evaluated. Higher milk production and yields of milk fat and protein, with DMI remaining unchanged, indicate that saturated free FA, supplemented at ≤3.5% of dietary DM from commercially available fat sources, likely improved the efficiency of milk production. Nevertheless, more studies are needed to assess the variation in production responses to different saturated free FA, either C16:0 or C18:0 alone or in combination at a potentially optimal ratio, when supplemented in dairy cow diets. PMID:28183166
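A random-effects pooling of study-level mean differences of the kind reported here can be sketched with the DerSimonian-Laird estimator, which is one of the models the metafor package fits. The study values below are invented for illustration and do not reproduce this meta-analysis.

```python
# Sketch: DerSimonian-Laird random-effects pooling of mean differences (synthetic studies).
import numpy as np

yi = np.array([1.2, 2.1, 1.5, 1.9, 1.4])        # mean differences in milk yield, kg/d (assumed)
vi = np.array([0.20, 0.35, 0.15, 0.40, 0.25])   # within-study variances (assumed)

w_fixed = 1.0 / vi
mu_fixed = np.sum(w_fixed * yi) / w_fixed.sum()
Q = np.sum(w_fixed * (yi - mu_fixed) ** 2)      # Cochran's Q
C = w_fixed.sum() - np.sum(w_fixed**2) / w_fixed.sum()
tau2 = max(0.0, (Q - (len(yi) - 1)) / C)        # between-study variance estimate

w = 1.0 / (vi + tau2)                           # random-effects weights
mu = np.sum(w * yi) / w.sum()
se = np.sqrt(1.0 / w.sum())
print(f"pooled mean difference: {mu:.2f} kg/d "
      f"(95% CI {mu - 1.96*se:.2f} to {mu + 1.96*se:.2f}), tau^2 = {tau2:.3f}")
```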
NASA Astrophysics Data System (ADS)
Camgoz, Nilgun; Yener, Cengiz
2002-06-01
In order to investigate preference responses for foreground-background color relationships, 85 university undergraduates in Ankara, Turkey, viewed 6 background colors (red, yellow, green, cyan, blue, and magenta) on which color squares of differing hues, saturations, and brightnesses were presented. All the background colors had maximum brightness (100%) and maximum saturation (100%). Subjects were asked to indicate the color square they preferred on each background color, viewed on a computer monitor. The experimental setup consisted of a computer monitor located in a windowless room illuminated with cove lighting. The findings of the experiment show that the brightness 100%-saturation 100% range is significantly the most preferred (p-value < 0.03). Thus, color squares that are most saturated and brightest are preferred on backgrounds of the most saturated and brightest colors. Regardless of the background colors viewed, the subjects preferred blue the most (p-value < 0.01). The findings of the study are also discussed in relation to pertinent research in the field. Through this analysis, an understanding of foreground-background color relationships in terms of preference is sought.
Dutta, Sanjib; Koide, Akiko; Koide, Shohei
2008-01-01
Stability evaluation of many mutants can lead to a better understanding of the sequence determinants of a structural motif and of factors governing protein stability and protein evolution. The traditional biophysical analysis of protein stability is low throughput, limiting our ability to widely explore the sequence space in a quantitative manner. In this study, we have developed a high-throughput library screening method for quantifying stability changes, which is based on protein fragment reconstitution and yeast surface display. Our method exploits the thermodynamic linkage between protein stability and fragment reconstitution and the ability of the yeast surface display technique to quantitatively evaluate protein-protein interactions. The method was applied to a fibronectin type III (FN3) domain. Characterization of fragment reconstitution was facilitated by the co-expression of two FN3 fragments, thus establishing a "yeast surface two-hybrid" method. Importantly, our method does not rely on competition between clones and thus eliminates a common limitation of high-throughput selection methods in which the most stable variants are predominantly recovered. Thus, it allows for the isolation of sequences that exhibits a desired level of stability. We identified over one hundred unique sequences for a β-bulge motif, which was significantly more informative than natural sequences of the FN3 family in revealing the sequence determinants for the β-bulge. Our method provides a powerful means to rapidly assess stability of many variants, to systematically assess contribution of different factors to protein stability and to enhance protein stability. PMID:18674545
Droplet microfluidics--a tool for single-cell analysis.
Joensson, Haakan N; Andersson Svahn, Helene
2012-12-03
Droplet microfluidics allows the isolation of single cells and reagents in monodisperse picoliter liquid capsules and manipulations at a throughput of thousands of droplets per second. These qualities allow many of the challenges in single-cell analysis to be overcome. Monodispersity enables quantitative control of solute concentrations, while encapsulation in droplets provides an isolated compartment for the single cell and its immediate environment. The high throughput allows the processing and analysis of the tens of thousands to millions of cells that must be analyzed to accurately describe a heterogeneous cell population so as to find rare cell types or access sufficient biological space to find hits in a directed evolution experiment. The low volumes of the droplets make very large screens economically viable. This Review gives an overview of the current state of single-cell analysis involving droplet microfluidics and offers examples where droplet microfluidics can further biological understanding. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Hayden, Eric J
2016-08-15
RNA molecules provide a realistic but tractable model of a genotype to phenotype relationship. This relationship has been extensively investigated computationally using secondary structure prediction algorithms. Enzymatic RNA molecules, or ribozymes, offer access to genotypic and phenotypic information in the laboratory. Advancements in high-throughput sequencing technologies have enabled the analysis of sequences in the lab that now rivals what can be accomplished computationally. This has motivated a resurgence of in vitro selection experiments and opened new doors for the analysis of the distribution of RNA functions in genotype space. A body of computational experiments has investigated the persistence of specific RNA structures despite changes in the primary sequence, and how this mutational robustness can promote adaptations. This article summarizes recent approaches that were designed to investigate the role of mutational robustness during the evolution of RNA molecules in the laboratory, and presents theoretical motivations, experimental methods and approaches to data analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
Tissue matrix arrays for high throughput screening and systems analysis of cell function
Beachley, Vince Z.; Wolf, Matthew T.; Sadtler, Kaitlyn; Manda, Srikanth S.; Jacobs, Heather; Blatchley, Michael; Bader, Joel S.; Pandey, Akhilesh; Pardoll, Drew; Elisseeff, Jennifer H.
2015-01-01
Cell and protein arrays have demonstrated remarkable utility in the high-throughput evaluation of biological responses; however, they lack the complexity of native tissue and organs. Here, we describe tissue extracellular matrix (ECM) arrays for screening biological outputs and systems analysis. We spotted processed tissue ECM particles as two-dimensional arrays or incorporated them with cells to generate three-dimensional cell-matrix microtissue arrays. We then investigated the response of human stem, cancer, and immune cells to tissue ECM arrays originating from 11 different tissues, and validated the 2D and 3D arrays as representative of the in vivo microenvironment through quantitative analysis of tissue-specific cellular responses, including matrix production, adhesion and proliferation, and morphological changes following culture. The biological outputs correlated with tissue proteomics, and network analysis identified several proteins linked to cell function. Our methodology enables broad screening of ECMs to connect tissue-specific composition with biological activity, providing a new resource for biomaterials research and translation. PMID:26480475
NASA Astrophysics Data System (ADS)
Yang, Haoyu; Hattori, Ken
2018-03-01
We studied the initial stage of iron deposition on an ethanol-saturated Si(111)7 × 7 surface at room temperature using scanning tunneling microscopy (STM). Statistical analysis of the Si adatom height in empty states for Si(111)-C2H5OH before and after the Fe deposition revealed different types of adatoms: type B (before the deposition) and type B' (after the deposition) assigned to bare adatoms, type D and type D' to C2H5O-terminated adatoms, and type E' to adatoms with Fe. The analysis of the height distribution revealed that the molecule termination protects adatoms from Fe capture at the initial stage. The analysis also indicated the preferential capture of a single Fe atom by a bare center-adatom rather than a bare corner-adatom among those remaining after the C2H5OH saturation, but no selectivity was observed between faulted and unfaulted half unit-cells. This is the first STM-based report demonstrating that a remaining bare adatom, rather than a molecule-terminated adatom, captures a metal atom.
Garst, Andrew D; Bassalo, Marcelo C; Pines, Gur; Lynch, Sean A; Halweg-Edwards, Andrea L; Liu, Rongming; Liang, Liya; Wang, Zhiwen; Zeitoun, Ramsey; Alexander, William G; Gill, Ryan T
2017-01-01
Improvements in DNA synthesis and sequencing have underpinned comprehensive assessment of gene function in bacteria and eukaryotes. Genome-wide analyses require high-throughput methods to generate mutations and analyze their phenotypes, but approaches to date have been unable to efficiently link the effects of mutations in coding regions or promoter elements in a highly parallel fashion. We report that CRISPR-Cas9 gene editing in combination with massively parallel oligomer synthesis can enable trackable editing on a genome-wide scale. Our method, CRISPR-enabled trackable genome engineering (CREATE), links each guide RNA to homologous repair cassettes that both edit loci and function as barcodes to track genotype-phenotype relationships. We apply CREATE to site saturation mutagenesis for protein engineering, reconstruction of adaptive laboratory evolution experiments, and identification of stress tolerance and antibiotic resistance genes in bacteria. We provide preliminary evidence that CREATE will work in yeast. We also provide a webtool to design multiplex CREATE libraries.
System considerations for detection and tracking of small targets using passive sensors
NASA Astrophysics Data System (ADS)
DeBell, David A.
1991-08-01
Passive sensors provide only a few discriminants to assist in the threat assessment of small targets. Tracking the small targets provides additional discriminants. This paper discusses the system considerations for tracking small targets using passive sensors, in particular EO sensors. Tracking helps distinguish good detections from bad ones. Discussed are the requirements to be placed on the sensor system's accuracy with respect to knowledge of the sightline direction. The detection of weak targets sets a requirement for two levels of tracking in order to limit the required processor throughput. A system characteristic is the need to track all detections. For low thresholds, this can mean a heavy track burden; therefore, thresholds must be adaptive in order not to saturate the processors. Second-level tracks must develop a range estimate in order to assess threat. Sensor platform maneuvers are required if the targets are moving. The need for accurate pointing, good stability, and a good update rate will be shown quantitatively, in relation to track accuracy and track association.
Multiplexed precision genome editing with trackable genomic barcodes in yeast.
Roy, Kevin R; Smith, Justin D; Vonesch, Sibylle C; Lin, Gen; Tu, Chelsea Szu; Lederer, Alex R; Chu, Angela; Suresh, Sundari; Nguyen, Michelle; Horecka, Joe; Tripathi, Ashutosh; Burnett, Wallace T; Morgan, Maddison A; Schulz, Julia; Orsley, Kevin M; Wei, Wu; Aiyar, Raeka S; Davis, Ronald W; Bankaitis, Vytas A; Haber, James E; Salit, Marc L; St Onge, Robert P; Steinmetz, Lars M
2018-07-01
Our understanding of how genotype controls phenotype is limited by the scale at which we can precisely alter the genome and assess the phenotypic consequences of each perturbation. Here we describe a CRISPR-Cas9-based method for multiplexed accurate genome editing with short, trackable, integrated cellular barcodes (MAGESTIC) in Saccharomyces cerevisiae. MAGESTIC uses array-synthesized guide-donor oligos for plasmid-based high-throughput editing and features genomic barcode integration to prevent plasmid barcode loss and to enable robust phenotyping. We demonstrate that editing efficiency can be increased more than fivefold by recruiting donor DNA to the site of breaks using the LexA-Fkh1p fusion protein. We performed saturation editing of the essential gene SEC14 and identified amino acids critical for chemical inhibition of lipid signaling. We also constructed thousands of natural genetic variants, characterized guide mismatch tolerance at the genome scale, and ascertained that cryptic Pol III termination elements substantially reduce guide efficacy. MAGESTIC will be broadly useful to uncover the genetic basis of phenotypes in yeast.
NASA Astrophysics Data System (ADS)
Su, Huaizhi; Li, Hao; Kang, Yeyuan; Wen, Zhiping
2018-02-01
Seepage is one of the key factors affecting levee safety. Seepage hazards that are not detected and addressed promptly may lead to severe accidents such as seepage failure, slope instability, and even levee breach. More than 90 percent of levee breaches are caused by seepage. Accurate determination of the saturation line is therefore very important for identifying seepage behavior in levee engineering. Furthermore, the location of the saturation line has a major impact on slope stability in levee engineering. Considering the structural characteristics and service conditions of levee engineering, distributed optical fiber sensing technology is introduced to implement real-time observation of the saturation line in levee engineering. The monitoring principle of the saturation line based on a distributed optical fiber temperature sensor system (DTS) is investigated. An experimental platform, which consists of the DTS, a heating system, a water-supply system, an auxiliary analysis system and a levee model, is designed and constructed. A monitoring experiment of the saturation line in the levee model is implemented on this platform. According to the experimental results, the numerical relationship between moisture content and thermal conductivity in the porous medium is identified. A line-heat-source-based distributed optical fiber method for obtaining the thermal conductivity in the porous medium is developed, and a DTS-based approach is proposed to monitor the saturation line in levee engineering.
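The thermal-conductivity step in heated-fiber methods typically rests on the transient line-heat-source approximation, in which the late-time temperature rise grows linearly with ln(t) and the slope equals q/(4*pi*lambda). The sketch below fits that slope on synthetic data with an assumed heating power; it is not the authors' calibration procedure.

```python
# Sketch: estimate thermal conductivity from the slope of temperature rise vs ln(time),
# using the late-time line-heat-source approximation dT ~ (q / (4*pi*lambda)) * ln(t) + const.
import numpy as np

q = 10.0                                     # heat input per unit fiber length, W/m (assumed)
lam_true = 1.8                               # W/(m*K), used only to synthesize the data
t = np.linspace(30.0, 600.0, 40)             # time since heating started, s
noise = np.random.default_rng(4).normal(0, 0.02, t.size)
dT = q / (4 * np.pi * lam_true) * np.log(t) + 0.3 + noise

slope = np.polyfit(np.log(t), dT, 1)[0]      # K per unit ln(t)
lam_est = q / (4 * np.pi * slope)
print(f"estimated thermal conductivity: {lam_est:.2f} W/(m*K)")
```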
Three types of gas hydrate reservoirs in the Gulf of Mexico identified in LWD data
Lee, Myung Woong; Collett, Timothy S.
2011-01-01
High-quality logging-while-drilling (LWD) well logs were acquired in seven wells drilled during the Gulf of Mexico Gas Hydrate Joint Industry Project Leg II in the spring of 2009. These data help to identify three distinct types of gas hydrate reservoirs: isotropic reservoirs in sands, vertically fractured reservoirs in shale, and horizontally layered reservoirs in silty shale. In general, most gas hydrate-bearing sand reservoirs exhibit isotropic elastic velocities and formation resistivities, and gas hydrate saturations estimated from the P-wave velocity agree well with those from the resistivity. However, in highly gas hydrate-saturated sands, resistivity-derived gas hydrate-saturation estimates appear to be systematically higher by about 5% than those estimated by P-wave velocity, possibly because of the uncertainty associated with the consolidation state of gas hydrate-bearing sands. Small quantities of gas hydrate were observed in vertical fractures in shale. These occurrences are characterized by high formation resistivities with P-wave velocities close to those of water-saturated sediment. Because the formation factor varies significantly with respect to the gas hydrate saturation for vertical fractures at low saturations, an isotropic analysis of the formation factor greatly overestimates the gas hydrate saturation. Small quantities of gas hydrate in horizontal layers in shale are characterized by moderate increases in P-wave velocities and formation resistivities, and either measurement can be used to estimate gas hydrate saturations.
Oxygen saturation in optic nerve head structures by hyperspectral image analysis.
Beach, James; Ning, Jinfeng; Khoobehi, Bahram
2007-02-01
A method is presented for the calculation and visualization of percent blood oxygen saturation from specific tissue structures in hyperspectral images of the optic nerve head (ONH). Trans-pupillary images of the primate optic nerve head and overlying retinal blood vessels were obtained with a hyperspectral imaging (HSI) system attached to a fundus camera. Images were recorded during normal blood flow and after partially interrupting flow to the ONH and retinal circulation by elevation of the intraocular pressure (IOP) from 10 mmHg to 55 mmHg in steps. Percent oxygen saturation was calculated from groups of pixels associated with separate tissue structures, using a linear least-squares curve fit of the recorded hemoglobin spectrum to reference spectra obtained from fully oxygenated and deoxygenated red cell suspensions. Color maps of saturation were obtained from a new algorithm that enables comparison of oxygen saturation from large vessels and tissue areas in hyperspectral images. Percent saturation in retinal vessels and from the average over ONH structures (IOP = 10 mmHg) was (mean +/- SE): artery 81.8 +/- 0.4%, vein 42.6 +/- 0.9%, average ONH 68.3 +/- 0.4%. Raising IOP from 10 mmHg to 55 mmHg for 5 min caused blood oxygen saturation to decrease (mean +/- SE): artery 46.1 +/- 6.2%, vein 36.1 +/- 1.6%, average ONH 41.9 +/- 1.6%. The temporal cup showed the highest saturation at low and high IOP (77.3 +/- 1.0% and 60.1 +/- 4.0%) and the least reduction in saturation at high IOP (22.3%) compared with that of the average ONH (38.6%). A linear relationship was found between saturation indices obtained from the algorithm and percent saturation values obtained by spectral curve fits to calibrated red cell samples. Percent oxygen saturation was determined from hyperspectral images of the ONH tissue and retinal vessels overlying the ONH at normal and elevated IOP. Pressure elevation was shown to reduce blood oxygen saturation in vessels and ONH structures, with the smallest reduction in the ONH observed in the temporal cup. IOP-induced saturation changes were visualized in color maps using an algorithm that follows saturation-dependent changes in the blood spectrum and blood volume differences across tissue. Reduced arterial saturation at high IOP may have resulted from a flow-dependent mechanism.
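The least-squares step described above amounts to expressing the measured spectrum as a mixture of oxygenated and deoxygenated reference spectra and reporting the oxygenated fraction. The sketch below uses synthetic placeholder spectra rather than calibrated red-cell references, so the shapes and numbers are assumptions.

```python
# Sketch: estimate % oxygen saturation by least-squares unmixing of a measured spectrum
# into oxygenated and deoxygenated reference components (synthetic spectra only).
import numpy as np

rng = np.random.default_rng(5)
wavelengths = np.linspace(500, 600, 50)                             # nm
ref_oxy   = np.exp(-((wavelengths - 542) / 12) ** 2) + np.exp(-((wavelengths - 577) / 10) ** 2)
ref_deoxy = np.exp(-((wavelengths - 555) / 18) ** 2)

true_sat = 0.70                                                     # used to synthesize "measured" data
measured = true_sat * ref_oxy + (1 - true_sat) * ref_deoxy + rng.normal(0, 0.01, wavelengths.size)

A = np.column_stack([ref_oxy, ref_deoxy])
coef, *_ = np.linalg.lstsq(A, measured, rcond=None)
coef = np.clip(coef, 0, None)                                       # crude non-negativity guard
saturation = 100 * coef[0] / coef.sum()
print(f"estimated oxygen saturation: {saturation:.1f}%")
```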
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daigle, Hugh; Rice, Mary Anna
Relative permeabilities to water and gas are important parameters for accurate modeling of the formation of methane hydrate deposits and production of methane from hydrate reservoirs. Experimental measurements of gas and water permeability in the presence of hydrate are difficult to obtain. The few datasets that do exist suggest that relative permeability obeys a power law relationship with water or gas saturation with exponents ranging from around 2 to greater than 10. Critical path analysis and percolation theory provide a framework for interpreting the saturation-dependence of relative permeability based on percolation thresholds and the breadth of pore size distributions, which may be determined easily from 3-D images or gas adsorption-desorption hysteresis. We show that the exponent of the permeability-saturation relationship for relative permeability to water is related to the breadth of the pore size distribution, with broader pore size distributions corresponding to larger exponents. Relative permeability to water in well-sorted sediments with narrow pore size distributions, such as Berea sandstone or Toyoura sand, follows percolation scaling with an exponent of 2. On the other hand, pore-size distributions determined from argon adsorption measurements we performed on clays from the Nankai Trough suggest that relative permeability to water in fine-grained intervals may be characterized by exponents as large as 10 as determined from critical path analysis. We also show that relative permeability to the gas phase follows percolation scaling with a quadratic dependence on gas saturation, but the threshold gas saturation for percolation changes with hydrate saturation, which is an important consideration in systems in which both hydrate and gas are present, such as during production from a hydrate reservoir. Our work shows how measurements of pore size distributions from 3-D imaging or gas adsorption may be used to determine relative permeabilities.
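The power-law scaling discussed above can be written generically as kr proportional to ((S - Sc)/(1 - Sc))^n above a threshold saturation Sc. The sketch below tabulates that curve for two assumed exponents (2 for a narrow pore-size distribution, 10 for a broad one); the threshold value is also an assumption for illustration.

```python
# Sketch: generic percolation-style relative permeability curves for two assumed exponents.
import numpy as np

def kr(S, S_threshold, exponent):
    """Relative permeability ~ ((S - S_threshold) / (1 - S_threshold))**exponent above threshold."""
    scaled = np.clip((S - S_threshold) / (1.0 - S_threshold), 0.0, None)
    return scaled ** exponent

Sw = np.linspace(0.0, 1.0, 6)
print("krw, narrow pore-size distribution (exponent 2): ", kr(Sw, 0.10, 2.0).round(3))
print("krw, broad pore-size distribution (exponent 10):", kr(Sw, 0.10, 10.0).round(3))
```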
Determination of diagnostic standards on saturated soil extracts for cut roses grown in greenhouses
Cabrera, Raúl Iskander
2017-01-01
This work comprises the theoretical determination and validation of diagnostic standards for the analysis of saturated soil extracts for cut rose flower crops (Rosa spp.) growing in the Bogota Plateau, Colombia. The data included 684 plant tissue analyses and 684 corresponding analyses of saturated soil extracts, all collected between January 2009 and June 2013. The tissue and soil samples were selected from 13 rose farms, and from cultivars grafted on the 'Natal Briar' rootstock. These concurrent samples of soil and plant tissues represented 251 production units (locations) of approximately 10,000 m2 distributed across the study area. The standards were conceived as a tool to improve the nutritional balance in the leaf tissue of rose plants and thereby define the norms for expressing optimum productive potential relative to nutritional conditions in the soil. To this end, previously determined diagnostic standards for rose leaf tissues were employed to obtain rates of foliar nutritional balance at each analyzed location and used as criteria for determining the diagnostic norms for saturated soil extracts. Applying this methodology to foliar analysis showed a higher significant correlation for the diagnostic indices. A similar behavior was observed for the saturated soil extract analyses, making them a powerful tool for integrated nutritional diagnosis. Leaf analyses determine the nutrients most limiting for high yield, and analyses of saturated soil extracts facilitate the correction of the fertigation formulations applied to soils or substrates. Recommendations are proposed to improve the balance in the soil-plant system and thereby increase the probability of higher yields. The main recommendations to increase and improve rose crop flower yields are: continuously check the pH values of saturated soil extracts, reduce the amounts of P, Fe, Zn and Cu in fertigation solutions, and carefully analyze the status of Mn in the soil-plant system. PMID:28542547
A parametric analysis of waves propagating in a porous solid saturated by a three-phase fluid.
Santos, Juan E; Savioli, Gabriela B
2015-11-01
This paper presents an analysis of a model for the propagation of waves in a poroelastic solid saturated by a three-phase viscous, compressible fluid. The constitutive relations and the equations of motion are stated first. A plane-wave analysis then determines the phase velocities and attenuation coefficients of the four compressional waves and one shear wave that propagate in this type of medium. A procedure to compute the elastic constants in the constitutive relations is defined next. Assuming knowledge of the shear modulus of the dry matrix, the other elastic constants in the stress-strain relations are determined by employing ideal gedanken experiments that generalize those of Biot's theory for single-phase fluids. These experiments yield expressions for the elastic constants in terms of the properties of the individual solid and fluid phases. Finally, the phase velocities and attenuation coefficients of all waves are computed for a sample of Berea sandstone saturated by oil, gas, and water.
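For readers unfamiliar with the plane-wave bookkeeping, the sketch below shows how a phase velocity and attenuation coefficient follow from a complex modulus and a bulk density for a wave of the form exp(i(kx - wt)). The modulus, density, and frequency are placeholder values unrelated to the three-phase model or the Berea sandstone example.

# Illustrative sketch (placeholder values, not the paper's three-phase model):
# for a plane wave exp(i(k x - w t)) with complex wavenumber k = w / v_c, the
# phase velocity is w / Re(k) and the attenuation coefficient is |Im(k)|.
import cmath
import math

def phase_velocity_attenuation(modulus, density, freq_hz):
    """Phase velocity (m/s) and attenuation (1/m) from a complex modulus (Pa)
    and bulk density (kg/m^3); the sign of Im(modulus) depends on the time
    convention, so the attenuation is reported as a magnitude."""
    omega = 2.0 * math.pi * freq_hz
    v_complex = cmath.sqrt(modulus / density)  # complex wave velocity
    k = omega / v_complex                      # complex wavenumber
    return omega / k.real, abs(k.imag)

if __name__ == "__main__":
    M = 12e9 + 0.3e9j   # Pa, hypothetical viscoelastic plane-wave modulus
    rho = 2300.0        # kg/m^3, placeholder fluid-saturated rock density
    v, alpha = phase_velocity_attenuation(M, rho, freq_hz=50.0)
    print(f"phase velocity = {v:.1f} m/s, attenuation = {alpha:.2e} 1/m")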
Report for the NGFA-5 project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jaing, C; Jackson, P; Thissen, J
The objective of this project is to provide DHS with a comprehensive evaluation of current genomic technologies, including genotyping, TaqMan PCR, multiple-locus variable-number tandem-repeat analysis (MLVA), microarrays, and high-throughput DNA sequencing, for the analysis of biothreat agents from complex environmental samples. To compare the sensitivity and specificity of the different genomic technologies effectively, we used SNP TaqMan PCR, MLVA, microarrays, and high-throughput Illumina and 454 sequencing to test various strains of B. anthracis and B. thuringiensis, BioWatch aerosol filter extracts or soil samples spiked with B. anthracis, and samples previously collected during DHS and EPA environmental release exercises that were known to contain B. thuringiensis spores. The results for all samples across the various assays are discussed in this report.
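The sensitivity/specificity bookkeeping underlying such a comparison can be sketched as follows; the per-sample calls are hypothetical stand-ins for assay results on spiked and blank samples, not data from this report.

# Minimal sketch of the sensitivity/specificity comparison described above:
# each assay is scored against samples with a known spike status.

def sensitivity_specificity(results):
    """`results` is a list of (predicted_positive, truly_positive) booleans."""
    tp = sum(1 for pred, truth in results if pred and truth)
    fn = sum(1 for pred, truth in results if not pred and truth)
    tn = sum(1 for pred, truth in results if not pred and not truth)
    fp = sum(1 for pred, truth in results if pred and not truth)
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    return sens, spec

if __name__ == "__main__":
    # Hypothetical calls from one assay on spiked and blank samples.
    calls = [(True, True), (True, True), (False, True), (False, False), (True, False)]
    sens, spec = sensitivity_specificity(calls)
    print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")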
High-Throughput Quantitative Lipidomics Analysis of Nonesterified Fatty Acids in Plasma by LC-MS.
Christinat, Nicolas; Morin-Rivron, Delphine; Masoodi, Mojgan
2017-01-01
Nonesterified fatty acids are important biological molecules with multiple functions, including energy storage, gene regulation, and cell signaling. Comprehensive profiling of nonesterified fatty acids in biofluids can facilitate studying and understanding their roles in biological systems. For these reasons, we have developed and validated a high-throughput, nontargeted lipidomics method coupling liquid chromatography to high-resolution mass spectrometry for the quantitative analysis of nonesterified fatty acids. Chromatographic separation is sufficient to resolve positional isomers as well as polyunsaturated and branched-chain species and to quantify a wide range of nonesterified fatty acids in human plasma samples. The method is not limited to these species, however, and also allows untargeted screening of additional nonesterified fatty acids.
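As an illustration of the quantitative step in such an assay, the sketch below performs external-calibration quantification from peak areas. The standard concentrations, peak areas, and the example analyte are placeholders, not data from this method.

# Illustrative sketch of external-calibration quantification for an LC-MS
# assay: fit a straight line through standards, then back-calculate sample
# concentrations from peak areas. All values are placeholders.
import numpy as np

def fit_calibration(concentrations, peak_areas):
    """Least-squares line: area = slope * concentration + intercept."""
    slope, intercept = np.polyfit(concentrations, peak_areas, 1)
    return slope, intercept

def quantify(area, slope, intercept):
    """Back-calculate a concentration from a measured peak area."""
    return (area - intercept) / slope

if __name__ == "__main__":
    std_conc = np.array([0.1, 0.5, 1.0, 5.0, 10.0])            # uM, placeholder
    std_area = np.array([1.1e4, 5.3e4, 1.0e5, 5.1e5, 9.9e5])   # arbitrary units
    slope, intercept = fit_calibration(std_conc, std_area)
    print(f"hypothetical analyte: {quantify(3.2e5, slope, intercept):.2f} uM")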
Engelmann, Brett W
2017-01-01
The Src Homology 2 (SH2) domain family primarily recognizes phosphorylated tyrosine (pY)-containing peptide motifs. The relative affinity preferences among competing SH2 domains for phosphopeptide ligands define "specificity space" and underpin many functional pY-mediated interactions within signaling networks. The degree of promiscuity exhibited and the dynamic range of affinities supported by individual domains or phosphopeptides are best resolved by a carefully executed and controlled quantitative high-throughput experiment. Here, I describe the fabrication and application of a cellulose-peptide conjugate microarray (CPCMA) platform for the quantitative analysis of SH2 domain specificity space. Included herein are instructions for optimal experimental design, with special attention paid to common sources of systematic error, phosphopeptide SPOT synthesis, microarray fabrication, analyte titrations, data capture, and analysis.
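One of the analysis steps mentioned above, fitting an analyte titration to estimate an affinity, can be sketched as follows. The single-site isotherm, the data points, and the initial guesses are illustrative assumptions rather than part of the published CPCMA protocol; SciPy is an assumed dependency.

# Hedged sketch: fit a single-site binding isotherm to a titration series to
# estimate Kd. Data and parameter names are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

def binding_isotherm(conc, bmax, kd):
    """Single-site saturation binding: signal = Bmax * [L] / (Kd + [L])."""
    return bmax * conc / (kd + conc)

if __name__ == "__main__":
    conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])    # uM, placeholder
    signal = np.array([0.09, 0.25, 0.63, 1.2, 1.9, 2.4, 2.7])  # arbitrary units
    (bmax, kd), _ = curve_fit(binding_isotherm, conc, signal, p0=(3.0, 1.0))
    print(f"Bmax = {bmax:.2f}, Kd = {kd:.2f} uM")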
Automated sample area definition for high-throughput microscopy.
Zeder, M; Ellrott, A; Amann, R
2011-04-01
High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications image geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others must cope with inhomogeneously located samples on glass slides; examples include the analysis of microbial communities in aquatic systems, in which samples are filtered onto membrane filters and stained with multiple fluorescent dyes, and the investigation of tissue sections. We therefore developed a strategy for the flexible and fast definition of sample locations through the acquisition of whole-slide overview images followed by automated sample recognition using image analysis. Our approach was tested on different microscopes, and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.
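The freely available programs referenced above implement the full workflow; as an independent, simplified illustration of automated sample recognition on an overview image, the sketch below thresholds a synthetic image with scikit-image (an assumed dependency, not the published tool) and returns bounding boxes of candidate sample regions.

# Minimal sketch of automated sample-area detection on a slide overview image,
# using Otsu thresholding and connected-component labeling on synthetic data.
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def find_sample_regions(overview, min_area=50):
    """Threshold a grayscale overview image and return bounding boxes of
    connected bright regions large enough to be samples."""
    mask = overview > threshold_otsu(overview)
    labeled = label(mask)
    return [r.bbox for r in regionprops(labeled) if r.area >= min_area]

if __name__ == "__main__":
    img = np.zeros((200, 300))
    img[40:120, 60:180] = 1.0              # synthetic "filter piece" on the slide
    img += 0.05 * np.random.rand(*img.shape)
    print(find_sample_regions(img))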
Cuthbertson, Daniel; Piljac-Žegarac, Jasenka; Lange, Bernd Markus
2011-01-01
Herein we report an improved method for the microscale extraction of huperzine A (HupA), an acetylcholinesterase-inhibiting alkaloid, from as little as 3 mg of tissue homogenate from the clubmoss Huperzia squarrosa (G. Forst.) Trevis, with 99.95% recovery. We also validated a novel UHPLC-QTOF-MS method for the high-throughput analysis of H. squarrosa extracts in only 6 min, which, in combination with the very low limit of detection (20 pg on column) and the wide linear range for quantification (20 to 10,000 pg on column), allows highly efficient screening of extracts containing varying amounts of HupA. Use of this methodology has the potential to conserve valuable plant resources. PMID:22275140