Science.gov

Sample records for mirexpress analyzing high-throughput

  1. High throughput assays for analyzing transcription factors.

    PubMed

    Li, Xianqiang; Jiang, Xin; Yaoi, Takuro

    2006-06-01

    Transcription factors are a group of proteins that modulate the expression of genes involved in many biological processes, such as cell growth and differentiation. Alterations in transcription factor function are associated with many human diseases, and therefore these proteins are attractive potential drug targets. A key issue in the development of such therapeutics is the generation of effective tools that can be used for high throughput discovery of the critical transcription factors involved in human diseases, and the measurement of their activities in a variety of disease or compound-treated samples. Here, a number of innovative arrays and 96-well format assays for profiling and measuring the activities of transcription factors will be discussed.

  2. Diversity and distribution of unicellular opisthokonts along the European coast analyzed using high-throughput sequencing

    PubMed Central

    del Campo, Javier; Mallo, Diego; Massana, Ramon; de Vargas, Colomban; Richards, Thomas A.; Ruiz-Trillo, Iñaki

    2015-01-01

    The opisthokonts are one of the major super-groups of eukaryotes, comprising two major clades: 1) the Metazoa and their unicellular relatives and 2) the Fungi and their unicellular relatives. There is, however, little knowledge of the role of opisthokont microbes in many natural environments, especially among the non-metazoan and non-fungal opisthokonts. Here we begin to address this gap by analyzing high-throughput 18S rDNA and 18S rRNA sequencing data from different European coastal sites, sampled at different size fractions and depths. In particular, we analyze the diversity and abundance of choanoflagellates, filastereans, ichthyosporeans, nucleariids, corallochytreans and their related lineages. Our results show the great diversity of choanoflagellates in coastal waters as well as a relevant role for the ichthyosporeans and the uncultured marine opisthokonts (MAOP). Furthermore, we describe a new lineage of marine fonticulids (MAFO) that appears to be abundant in sediments. Our work thus points to a greater ecological role for unicellular opisthokonts in marine environments than previously appreciated, both in the water column and in sediments, and also provides evidence of novel opisthokont phylogenetic lineages. This study highlights the importance of high-throughput sequencing approaches for unraveling the diversity and distribution of both known and novel eukaryotic lineages. PMID:25556908

  3. Compositional analysis: a valid approach to analyze microbiome high-throughput sequencing data.

    PubMed

    Gloor, Gregory B; Reid, Gregor

    2016-08-01

    A workshop held at the 2015 annual meeting of the Canadian Society of Microbiologists highlighted compositional data analysis methods and the importance of exploratory data analysis for the analysis of microbiome data sets generated by high-throughput DNA sequencing. A summary of the content of that workshop, a review of new methods of analysis, and information on the importance of careful analyses are presented herein. The workshop focussed on explaining the rationale behind the use of compositional data analysis, and a demonstration of these methods for the examination of 2 microbiome data sets. A clear understanding of bioinformatics methodologies and the type of data being analyzed is essential, given the growing number of studies uncovering the critical role of the microbiome in health and disease and the need to understand alterations to its composition and function following intervention with fecal transplant, probiotics, diet, and pharmaceutical agents.
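
    A cornerstone of the compositional approach discussed above is replacing raw read counts with log-ratios, most commonly the centered log-ratio (clr) transform. A minimal sketch, not taken from the workshop materials; the pseudocount value is an arbitrary illustrative choice:

```python
import math

def clr(counts, pseudocount=0.5):
    """Centered log-ratio transform of one sample's taxon counts.

    A pseudocount replaces zeros, since log(0) is undefined; the
    value 0.5 is a common but arbitrary choice.
    """
    vals = [c + pseudocount for c in counts]
    logs = [math.log(v) for v in vals]
    mean_log = sum(logs) / len(logs)  # log of the geometric mean
    return [x - mean_log for x in logs]

# Toy taxon counts from one sequencing run; clr values sum to zero
# by construction, which is what makes them comparable across samples.
sample = [120, 30, 0, 850]
print(round(sum(clr(sample)), 6) == 0.0)  # -> True
```

    The zero-sum property is exactly why clr-transformed data sidestep the spurious correlations that plague raw relative abundances.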

  4. Gene expression and splicing alterations analyzed by high throughput RNA sequencing of chronic lymphocytic leukemia specimens.

    PubMed

    Liao, Wei; Jordaan, Gwen; Nham, Phillipp; Phan, Ryan T; Pelegrini, Matteo; Sharma, Sanjai

    2015-10-16

    To determine differentially expressed and spliced RNA transcripts in chronic lymphocytic leukemia (CLL) specimens, a high-throughput RNA-sequencing (HTS RNA-seq) analysis was performed. Ten CLL specimens and five normal peripheral blood CD19+ B-cell samples were analyzed by HTS RNA-seq. Libraries were prepared with the Illumina TruSeq RNA kit and sequenced on an Illumina HiSeq 2000 system. An average of 48.5 million reads for B cells and 50.6 million reads for CLL specimens was obtained, with 10,396 and 10,448 assembled transcripts for normal B cells and primary CLL specimens, respectively. With the Cuffdiff analysis, 2,091 differentially expressed genes (DEGs) between B cells and CLL specimens were identified, based on FPKM (fragments per kilobase of transcript per million reads), a false discovery rate (FDR) q < 0.05, and fold change > 2. Expression of selected DEGs (n = 32) with up-regulated and down-regulated expression in CLL from the RNA-seq data was also analyzed by qRT-PCR in a test cohort of CLL specimens. Even though the fold expression of the DEGs varied between RNA-seq and qRT-PCR, more than 90% of the analyzed genes were validated by qRT-PCR analysis. Splicing alterations in CLL and B cells were analyzed with Multivariate Analysis of Transcript Splicing (MATS). Exon skipping was the most frequent splicing alteration in CLL specimens, with 128 significant events (P < 0.05, minimum inclusion level difference > 0.1). The RNA-seq analysis of CLL specimens identifies novel DEGs and alternatively spliced genes that are potential prognostic markers and therapeutic targets. The high rate of validation by qRT-PCR for a number of DEGs supports the accuracy of this analysis. Global comparison of the transcriptomes of B cells, IGVH non-mutated CLL (U-CLL) and mutated CLL (M-CLL) specimens with multidimensional scaling analysis was able to segregate CLL and B-cell transcriptomes but the M-CLL and U-CLL transcriptomes
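
    The Cuffdiff-style thresholds quoted above (FDR q < 0.05, fold change > 2) amount to a simple per-gene filter over expression values. A hypothetical sketch of that filter; the record layout and gene names are invented for illustration and do not come from the study:

```python
def call_degs(records, fdr_cutoff=0.05, min_fold=2.0):
    """Filter Cuffdiff-style records into differentially expressed genes.

    Each record: (gene, fpkm_a, fpkm_b, q_value). A gene is kept when
    FDR q < fdr_cutoff and the FPKM ratio exceeds min_fold in either
    direction.
    """
    degs = []
    for gene, fpkm_a, fpkm_b, q in records:
        if q >= fdr_cutoff:
            continue  # not significant after multiple-testing correction
        hi, lo = max(fpkm_a, fpkm_b), min(fpkm_a, fpkm_b)
        if lo > 0 and hi / lo > min_fold:
            degs.append(gene)
    return degs

rows = [("BCL2", 5.0, 40.0, 0.001),   # 8-fold change, significant
        ("ACTB", 90.0, 100.0, 0.50)]  # housekeeping-like, unchanged
print(call_degs(rows))  # -> ['BCL2']
```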

  5. A combinatorial approach for analyzing intra-tumor heterogeneity from high-throughput sequencing data

    PubMed Central

    Hajirasouliha, Iman; Mahmoody, Ahmad; Raphael, Benjamin J.

    2014-01-01

    Motivation: High-throughput sequencing of tumor samples has shown that most tumors exhibit extensive intra-tumor heterogeneity, with multiple subpopulations of tumor cells containing different somatic mutations. Recent studies have quantified this intra-tumor heterogeneity by clustering mutations into subpopulations according to the observed counts of DNA sequencing reads containing the variant allele. However, these clustering approaches do not consider that the population frequencies of different tumor subpopulations are correlated by their shared ancestry in the same population of cells. Results: We introduce the binary tree partition (BTP), a novel combinatorial formulation of the problem of constructing the subpopulations of tumor cells from the variant allele frequencies of somatic mutations. We show that finding a BTP is an NP-complete problem; derive an approximation algorithm for an optimization version of the problem; and present a recursive algorithm to find a BTP with errors in the input. We show that the resulting algorithm outperforms existing clustering approaches on simulated and real sequencing data. Availability and implementation: Python and MATLAB implementations of our method are available at http://compbio.cs.brown.edu/software/ Contact: braphael@cs.brown.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24932008
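
    One common way to encode the shared-ancestry constraint that motivates the BTP is a sum condition: a subpopulation's frequency must be at least the sum of its children's frequencies. A toy consistency check under that assumption; this is not the authors' algorithm, and the node names and frequencies are invented:

```python
def valid_tree(freq, children):
    """Check the ancestry condition on a candidate clone tree: each
    node's frequency must be >= the sum of its children's frequencies.

    freq: {node: estimated subpopulation frequency}
    children: {node: list of child nodes} (leaves may be omitted)
    """
    return all(freq[n] >= sum(freq[c] for c in children.get(n, []))
               for n in freq)

# Toy tree: a founding clone at 0.9 splitting into two subclones
freq = {"root": 0.9, "a": 0.5, "b": 0.3}
children = {"root": ["a", "b"]}
print(valid_tree(freq, children))  # -> True
```

    Clustering methods that ignore this condition can produce frequency assignments no tree can realize, which is the gap the BTP formulation addresses.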

  6. Chemiluminescence analyzer of NOx as a high-throughput screening tool in selective catalytic reduction of NO

    PubMed Central

    Oh, Kwang Seok; Woo, Seong Ihl

    2011-01-01

    A chemiluminescence-based analyzer of NOx gas species has been applied for high-throughput screening of a library of catalytic materials. The applicability of the commercial NOx analyzer as a rapid screening tool was evaluated using selective catalytic reduction of NO gas. A library of 60 binary alloys composed of Pt and Co, Zr, La, Ce, Fe or W on an Al2O3 substrate was tested for NOx-removal efficiency using a home-built 64-channel parallel and sequential tubular reactor. The NOx concentrations measured by the NOx analyzer agreed well with the results obtained using micro gas chromatography for a reference catalyst consisting of 1 wt% Pt on γ-Al2O3. Most alloys showed high efficiency at 275 °C, which is typical of Pt-based catalysts for selective catalytic reduction of NO. Screening with the NOx analyzer allowed the selection of Pt-Ce(X) (X = 1–3) and Pt-Fe(2) as the optimal catalysts for NOx removal: 73% NOx conversion was achieved with the Pt-Fe(2) alloy, much better than the results for the reference catalyst and the other library alloys. This study demonstrates a sequential high-throughput method for the practical evaluation of catalysts for the selective reduction of NO. PMID:27877438
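
    The 73% figure above is an ordinary conversion calculation from inlet and outlet NOx concentrations. A sketch, with hypothetical concentrations chosen only to reproduce that value (the abstract does not report the raw readings):

```python
def nox_conversion(c_in_ppm, c_out_ppm):
    """Percent NOx removed, computed from inlet and outlet concentrations."""
    return 100.0 * (c_in_ppm - c_out_ppm) / c_in_ppm

# Hypothetical inlet/outlet readings giving the reported 73% conversion
print(nox_conversion(500.0, 135.0))  # -> 73.0
```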

  7. Evaluation of the automated coagulation analyzer CS-5100 and its utility in high throughput laboratories.

    PubMed

    Ratzinger, Franz; Schmetterer, Klaus G; Haslacher, Helmuth; Perkmann, Thomas; Belik, Sabine; Quehenberger, Peter

    2014-08-01

    Automated analyzers are an important component of modern laboratories. As a representative of the newest generation of coagulation analyzers, the CS-5100 features several technical refinements, including a pre-analytical assessment unit as well as multi-wavelength optical detection units, and is therefore expected to perform a broad panel of coagulation tests rapidly and accurately. In the current study, the CS-5100 was evaluated for precision and practicability in a clinical laboratory setting. Intra- and inter-assay precision were assessed using commercially available control samples. Results for patient samples, including hemolytic, icteric and lipemic specimens, measured on the CS-5100 were compared with the reference analyzers used in our accredited laboratory. The coefficients of variation in the intra- and inter-assay precision analyses were below 5% for most parameters. Results obtained with the CS-5100 showed predominantly high comparability to the reference analyzers, with correlation coefficients ranging from 0.857 to 0.990. Only minor systematic or proportional differences were found in Passing-Bablok regression between the CS-5100 and the reference analyzers for most of the tested parameters. Lipemic samples tended to deteriorate the correlation coefficients, but an overall effect of the sample's triglyceride level could be ruled out. In a routine setting, the analyzer reached a throughput of 160 tests per hour. The CS-5100 is able to measure patient samples rapidly and precisely. No considerable influence on test comparability was found for elevated levels of free hemoglobin, bilirubin or triglycerides.
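
    The precision figures above are coefficients of variation, i.e. the standard deviation divided by the mean, expressed as a percentage. A sketch with hypothetical prothrombin-time replicates (the values are invented; the study's control material and results are not reproduced here):

```python
import statistics

def cv_percent(replicates):
    """Coefficient of variation (%) = sample SD / mean * 100,
    the usual intra-assay precision metric."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical repeated prothrombin-time measurements, in seconds
pt_runs = [12.1, 12.3, 11.9, 12.0, 12.2]
print(round(cv_percent(pt_runs), 2))  # -> 1.31
```

    A CV comfortably below 5%, as reported for most parameters, indicates that repeated measurements of the same control material scatter tightly around their mean.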

  8. Distribution and Diversity of Bacteria and Fungi Colonization in Stone Monuments Analyzed by High-Throughput Sequencing

    PubMed Central

    Li, Qiang; Zhang, Bingjian; He, Zhang; Yang, Xiaoru

    2016-01-01

    The historical and cultural heritage of Qingxing palace and the Lingyin and Kaihua temples, located in Hangzhou, China, includes a large number of exquisite Buddhist statues and ancient stone sculptures which date back to the Northern Song (960–1219 A.D.) and Qing (1636–1912 A.D.) dynasties and are considered some of the best examples of ancient stone sculpting techniques. These monuments were added to the World Heritage List in 2011 because of their unique craftsmanship and importance to the study of ancient Chinese Buddhist culture. However, biodeterioration of the surfaces of the ancient Buddhist statues and white marble pillars not only severely impairs their aesthetic value but also alters their material structure and thermo-hygric properties. In this study, high-throughput sequencing was utilized to identify the microbial communities colonizing the stone monuments. The diversity and distribution of the microbial communities in six samples, collected from three different environmental conditions with signs of deterioration, were analyzed by means of bioinformatics software and diversity indices. In addition, the impact of environmental factors, including temperature, light intensity, air humidity, and the concentrations of NO2 and SO2, on the diversity and distribution of the microbial communities was evaluated. The results indicate that the presence of predominantly phototrophic microorganisms was correlated with light and humidity, while nitrifying bacteria and Thiobacillus were associated with NO2 and SO2 from air pollution. PMID:27658256

  9. Distribution and Diversity of Bacteria and Fungi Colonization in Stone Monuments Analyzed by High-Throughput Sequencing.

    PubMed

    Li, Qiang; Zhang, Bingjian; He, Zhang; Yang, Xiaoru

    The historical and cultural heritage of Qingxing palace and the Lingyin and Kaihua temples, located in Hangzhou, China, includes a large number of exquisite Buddhist statues and ancient stone sculptures which date back to the Northern Song (960-1219 A.D.) and Qing (1636-1912 A.D.) dynasties and are considered some of the best examples of ancient stone sculpting techniques. These monuments were added to the World Heritage List in 2011 because of their unique craftsmanship and importance to the study of ancient Chinese Buddhist culture. However, biodeterioration of the surfaces of the ancient Buddhist statues and white marble pillars not only severely impairs their aesthetic value but also alters their material structure and thermo-hygric properties. In this study, high-throughput sequencing was utilized to identify the microbial communities colonizing the stone monuments. The diversity and distribution of the microbial communities in six samples, collected from three different environmental conditions with signs of deterioration, were analyzed by means of bioinformatics software and diversity indices. In addition, the impact of environmental factors, including temperature, light intensity, air humidity, and the concentrations of NO2 and SO2, on the diversity and distribution of the microbial communities was evaluated. The results indicate that the presence of predominantly phototrophic microorganisms was correlated with light and humidity, while nitrifying bacteria and Thiobacillus were associated with NO2 and SO2 from air pollution.

  10. Evaluation of genomic high-throughput sequencing data generated on Illumina HiSeq and Genome Analyzer systems

    PubMed Central

    2011-01-01

    Background The generation and analysis of high-throughput sequencing data are becoming a major component of many studies in molecular biology and medical research. Illumina's Genome Analyzer (GA) and HiSeq instruments are currently the most widely used sequencing devices. Here, we comprehensively evaluate properties of genomic HiSeq and GAIIx data derived from two plant genomes and one virus, with read lengths of 95 to 150 bases. Results We provide quantifications and evidence for GC bias, error rates, error sequence context, effects of quality filtering, and the reliability of quality values. By combining different filtering criteria we reduced error rates 7-fold at the expense of discarding 12.5% of alignable bases. While overall error rates are low in HiSeq data we observed regions of accumulated wrong base calls. Only 3% of all error positions accounted for 24.7% of all substitution errors. Analyzing the forward and reverse strands separately revealed error rates of up to 18.7%. Insertions and deletions occurred at very low rates on average but increased to up to 2% in homopolymers. A positive correlation between read coverage and GC content was found depending on the GC content range. Conclusions The errors and biases we report have implications for the use and the interpretation of Illumina sequencing data. GAIIx and HiSeq data sets show slightly different error profiles. Quality filtering is essential to minimize downstream analysis artifacts. Supporting previous recommendations, the strand-specificity provides a criterion to distinguish sequencing errors from low abundance polymorphisms. PMID:22067484

  11. GeoChip 3.0: A High Throughput Tool for Analyzing Microbial Community, Composition, Structure, and Functional Activity

    SciTech Connect

    He, Zhili; Deng, Ye; Nostrand, Joy Van; Tu, Qichao; Xu, Meiying; Hemme, Chris; Wu, Liyou; Hazen, Terry; Zhou, Jizhong; Li, Xingyuan; Gentry, Terry; Yin, Yifeng; Liebich, Jost

    2010-05-17

    Microarray-based genomic technology has been widely used for microbial community analysis, and it is expected to revolutionize the analysis of microbial community structure, function and dynamics. A new generation of functional gene arrays (GeoChip 3.0) has been developed, with 27,812 probes covering 56,990 gene variants from 292 functional gene families involved in carbon, nitrogen, phosphorus and sulfur cycles, energy metabolism, antibiotic resistance, metal resistance, and organic contaminant degradation. The probes were derived from 2,744, 140, and 262 species of bacteria, archaea, and fungi, respectively. GeoChip 3.0 has several other distinct features, such as a common oligo reference standard (CORS) for data normalization and comparison, a software package for data management and future updating, and the gyrB gene for phylogenetic analysis. Our computational evaluation of probe specificity indicated that all designed probes had high specificity for their corresponding targets. Experimental analysis with synthesized oligonucleotides and genomic DNAs showed false-positive rates of only 0.0036%–0.025%, suggesting that the designed probes are highly specific under the experimental conditions examined. In addition, GeoChip 3.0 was applied to analyze soil microbial communities in a multifactor grassland ecosystem in Minnesota, USA, which demonstrated that the structure, composition, and potential activity of soil microbial communities changed significantly with plant species diversity. All results indicate that GeoChip 3.0 is a powerful high-throughput tool for studying microbial community functional structure and for linking microbial communities to ecosystem processes and functioning. To our knowledge, GeoChip 3.0 is the most comprehensive microarray currently available for studying microbial communities associated with geobiochemical cycling, global climate change, bioenergy

  12. EXPath tool-a system for comprehensively analyzing regulatory pathways and coexpression networks from high-throughput transcriptome data.

    PubMed

    Zheng, Han-Qin; Wu, Nai-Yun; Chow, Chi-Nga; Tseng, Kuan-Chieh; Chien, Chia-Hung; Hung, Yu-Cheng; Li, Guan-Zhen; Chang, Wen-Chi

    2017-03-13

    Next-generation sequencing (NGS) has become the mainstream approach for monitoring gene expression levels in parallel with various experimental treatments. Unfortunately, there has been no systematic web server for comprehensively performing further analysis on the huge amount of preliminary data obtained after gene annotation. A user-friendly and effective system is therefore required to mine important genes and regulatory pathways under specific conditions from high-throughput transcriptome data. EXPath Tool (available at: http://expathtool.itps.ncku.edu.tw/) was developed for the pathway annotation and comparative analysis of user-customized gene expression profiles derived from microarray or NGS platforms under various conditions, to infer metabolic pathways for all organisms in the KEGG database. EXPath Tool provides several functions: retrieval of gene expression patterns and candidate co-expressed genes; identification of differentially expressed genes (DEGs) between two conditions (DEG search); functional grouping by pathway and GO terms (pathway/GO enrichment analysis); correlation networks (co-expression analysis); and visualization of the expression patterns of genes involved in specific pathways, to infer the effects of a treatment. Additionally, the effectiveness of EXPath Tool was demonstrated in a case study on IAA-responsive genes. The results showed that critical hub genes under IAA treatment could be efficiently identified.

  13. An informatic pipeline for managing high-throughput screening experiments and analyzing data from stereochemically diverse libraries

    NASA Astrophysics Data System (ADS)

    Mulrooney, Carol A.; Lahr, David L.; Quintin, Michael J.; Youngsaye, Willmen; Moccia, Dennis; Asiedu, Jacob K.; Mulligan, Evan L.; Akella, Lakshmi B.; Marcaurelle, Lisa A.; Montgomery, Philip; Bittker, Joshua A.; Clemons, Paul A.; Brudz, Stephen; Dandapani, Sivaraman; Duvall, Jeremy R.; Tolliday, Nicola J.; De Souza, Andrea

    2013-05-01

    Integration of flexible data-analysis tools with cheminformatics methods is a prerequisite for successful identification and validation of "hits" in high-throughput screening (HTS) campaigns. We have designed, developed, and implemented a suite of robust yet flexible cheminformatics tools to support HTS activities at the Broad Institute, three of which are described herein. The "hit-calling" tool allows a researcher to set a hit threshold that can be varied during downstream analysis. The results from the hit-calling exercise are reported to a database for record keeping and further data analysis. The "cherry-picking" tool enables creation of an optimized list of hits for confirmatory and follow-up assays from an HTS hit list. This tool allows filtering by computed chemical property and by substructure. In addition, similarity searches can be performed on hits of interest and sets of related compounds can be selected. The third tool, an "S/SAR viewer," has been designed specifically for the Broad Institute's diversity-oriented synthesis (DOS) collection. The compounds in this collection are rich in chiral centers and the full complement of all possible stereoisomers of a given compound are present in the collection. The S/SAR viewer allows rapid identification of both structure/activity relationships and stereo-structure/activity relationships present in HTS data from the DOS collection. Together, these tools enable the prioritization and analysis of hits from diverse compound collections, and enable informed decisions for follow-up biology and chemistry efforts.
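
    A hit-calling step of the kind described, with a threshold that remains adjustable during downstream analysis, can be sketched as a plate-normalized z-score filter. The well IDs, readout values, and the z-score criterion are assumptions for illustration; the Broad tool's actual scoring scheme is not described in the abstract:

```python
import statistics

def call_hits(activities, z_cutoff=3.0):
    """Return wells whose activity deviates from the plate mean by at
    least z_cutoff standard deviations; keeping the cutoff a parameter
    mirrors a threshold that can be revisited after the initial pass."""
    values = list(activities.values())
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return {well: (v - mu) / sd
            for well, v in activities.items()
            if abs(v - mu) >= z_cutoff * sd}

# Hypothetical single-readout plate: 19 inactive wells and one strong well
wells = {f"A{i:02d}": 1.0 for i in range(1, 20)}
wells["B01"] = 50.0
print(sorted(call_hits(wells)))  # -> ['B01']
```

    Returning the z-scores alongside the well IDs lets a downstream cherry-picking step re-rank or re-threshold hits without re-reading the raw plate data.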

  14. Chicken skin virome analyzed by high-throughput sequencing shows a composition highly different from human skin.

    PubMed

    Denesvre, Caroline; Dumarest, Marine; Rémy, Sylvie; Gourichon, David; Eloit, Marc

    2015-10-01

    Recent studies show that human skin at homeostasis is a complex ecosystem whose virome includes circular DNA viruses, especially papillomaviruses and polyomaviruses. To characterize the chicken skin virome in comparison with the human skin virome, a pooled swab sample from fifteen healthy indoor chickens of five genetic backgrounds was examined for the presence of DNA viruses by high-throughput sequencing (HTS). The results indicate a predominance of herpesviruses of the Mardivirus genus, of either vaccinal origin or from presumably asymptomatic infection. Despite the high sensitivity of the HTS method used herein for detecting small circular DNA viruses, we did not detect any papillomaviruses, polyomaviruses or circoviruses, indicating that these viruses may not be residents of chicken skin. The results suggest that turkey herpesvirus is a resident of chicken skin in vaccinated chickens. This study indicates major differences between the skin viromes of chickens and humans. The origin of this difference remains to be studied further in relation to skin physiology, environment, and virus population dynamics.

  15. mzGroupAnalyzer--predicting pathways and novel chemical structures from untargeted high-throughput metabolomics data.

    PubMed

    Doerfler, Hannes; Sun, Xiaoliang; Wang, Lei; Engelmeier, Doris; Lyon, David; Weckwerth, Wolfram

    2014-01-01

    The metabolome is a highly dynamic entity and the final readout of the genotype x environment x phenotype (GxExP) relationship of an organism. Monitoring metabolite dynamics over time thus theoretically encrypts the whole range of possible chemical and biochemical transformations of small molecules involved in metabolism. The bottleneck, however, is the sheer number of unidentified structures in these samples. This represents the next challenge for metabolomics technology and is comparable with genome sequencing 30 years ago. At the same time, it is impossible to handle the amount of data involved in a metabolomics analysis manually. Algorithms are therefore imperative to allow for automated m/z feature extraction and subsequent structure or pathway assignment. Here we provide an automated pathway inference strategy comprising measurements of metabolome time series using LC-MS with high resolution and high mass accuracy. An algorithm, called mzGroupAnalyzer, was developed to automatically explore the metabolome for metabolite transformations caused by biochemical or chemical modifications. Pathways are extracted directly from the data and putative novel structures can be identified. The detected m/z features can be mapped on a van Krevelen diagram according to their H/C and O/C ratios for pattern recognition and to visualize oxidative processes and biochemical transformations. This method was applied to Arabidopsis thaliana treated simultaneously with cold and high light. Due to a protective antioxidant response, the plants turn from green to purple via the accumulation of flavonoid structures. The detection of potential biochemical pathways resulted in 15 putatively new compounds involved in the flavonoid pathway. These compounds were further validated by product ion spectra from the same data. The mzGroupAnalyzer is implemented in the graphical user interface (GUI) of the metabolomics toolbox COVAIN (Sun & Weckwerth, 2012, Metabolomics 8: 81).
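
    The van Krevelen mapping mentioned above needs only the H/C and O/C ratios of each assigned elemental formula. A sketch of that calculation; the parser is deliberately simplistic and assumes plain C/H/O-style formulas without parentheses, charges, or isotope labels:

```python
import re

def hc_oc(formula):
    """H/C and O/C ratios from a molecular formula string,
    the two axes of a van Krevelen diagram."""
    counts = {el: int(n or 1)
              for el, n in re.findall(r"([A-Z][a-z]?)(\d*)", formula)}
    c = counts["C"]
    return counts.get("H", 0) / c, counts.get("O", 0) / c

# Quercetin, a flavonoid: C15H10O7
h_c, o_c = hc_oc("C15H10O7")
print(round(h_c, 3), round(o_c, 3))  # -> 0.667 0.467
```

    Compound classes occupy characteristic regions of the (O/C, H/C) plane, which is why oxidative transformations show up as directed shifts in the diagram.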

  16. mzGroupAnalyzer-Predicting Pathways and Novel Chemical Structures from Untargeted High-Throughput Metabolomics Data

    PubMed Central

    Wang, Lei; Engelmeier, Doris; Lyon, David; Weckwerth, Wolfram

    2014-01-01

    The metabolome is a highly dynamic entity and the final readout of the genotype x environment x phenotype (GxExP) relationship of an organism. Monitoring metabolite dynamics over time thus theoretically encrypts the whole range of possible chemical and biochemical transformations of small molecules involved in metabolism. The bottleneck, however, is the sheer number of unidentified structures in these samples. This represents the next challenge for metabolomics technology and is comparable with genome sequencing 30 years ago. At the same time, it is impossible to handle the amount of data involved in a metabolomics analysis manually. Algorithms are therefore imperative to allow for automated m/z feature extraction and subsequent structure or pathway assignment. Here we provide an automated pathway inference strategy comprising measurements of metabolome time series using LC-MS with high resolution and high mass accuracy. An algorithm, called mzGroupAnalyzer, was developed to automatically explore the metabolome for metabolite transformations caused by biochemical or chemical modifications. Pathways are extracted directly from the data and putative novel structures can be identified. The detected m/z features can be mapped on a van Krevelen diagram according to their H/C and O/C ratios for pattern recognition and to visualize oxidative processes and biochemical transformations. This method was applied to Arabidopsis thaliana treated simultaneously with cold and high light. Due to a protective antioxidant response, the plants turn from green to purple via the accumulation of flavonoid structures. The detection of potential biochemical pathways resulted in 15 putatively new compounds involved in the flavonoid pathway. These compounds were further validated by product ion spectra from the same data. The mzGroupAnalyzer is implemented in the graphical user interface (GUI) of the metabolomics toolbox COVAIN (Sun & Weckwerth, 2012, Metabolomics 8: 81).

  17. A new perspective on studying burial environment before archaeological excavation: analyzing bacterial community distribution by high-throughput sequencing

    PubMed Central

    Xu, Jinjin; Wei, Yanfei; Jia, Hanqing; Xiao, Lin; Gong, Decai

    2017-01-01

    Burial conditions play a crucial role in archaeological heritage preservation; in particular, microorganisms are considered a leading cause of the degradation and loss of historic materials. In this article, we analyzed bacterial diversity and community structure in tomb M1 of Wangshanqiao using 16S rRNA gene amplicon sequencing. The results indicated that the microbial communities in the burial environment differed among the four samples. The samples from the robber hole differed most markedly in community structure, in both alpha and beta diversity. The dominant phyla in the different samples were Proteobacteria, Actinobacteria and Bacteroidetes, respectively. Moreover, the study implied that the preservation state of historical materials is connected with the distribution of the bacterial community. At the genus level, Acinetobacter may have a high capacity for degrading organic cultural heritage under burial conditions, whereas Bacteroides was closely associated with favorable preservation conditions. This approach helps recover information that would be irretrievably lost after excavation, and it will help to explore microbial degradation of precious organic cultural heritage and to further our understanding of the archaeological burial environment. The study also indicates that tomb robbery has a serious negative impact on burial remains. PMID:28169321

  18. A new perspective on studying burial environment before archaeological excavation: analyzing bacterial community distribution by high-throughput sequencing.

    PubMed

    Xu, Jinjin; Wei, Yanfei; Jia, Hanqing; Xiao, Lin; Gong, Decai

    2017-02-07

    Burial conditions play a crucial role in archaeological heritage preservation; in particular, microorganisms are considered a leading cause of the degradation and loss of historic materials. In this article, we analyzed bacterial diversity and community structure in tomb M1 of Wangshanqiao using 16S rRNA gene amplicon sequencing. The results indicated that the microbial communities in the burial environment differed among the four samples. The samples from the robber hole differed most markedly in community structure, in both alpha and beta diversity. The dominant phyla in the different samples were Proteobacteria, Actinobacteria and Bacteroidetes, respectively. Moreover, the study implied that the preservation state of historical materials is connected with the distribution of the bacterial community. At the genus level, Acinetobacter may have a high capacity for degrading organic cultural heritage under burial conditions, whereas Bacteroides was closely associated with favorable preservation conditions. This approach helps recover information that would be irretrievably lost after excavation, and it will help to explore microbial degradation of precious organic cultural heritage and to further our understanding of the archaeological burial environment. The study also indicates that tomb robbery has a serious negative impact on burial remains.

  19. High-throughput simultaneous determination of plasma water deuterium and 18-oxygen enrichment using a high-temperature conversion elemental analyzer with isotope ratio mass spectrometry.

    PubMed

    Richelle, M; Darimont, C; Piguet-Welsch, C; Fay, L B

    2004-01-01

    This paper presents a high-throughput method for the simultaneous determination of the deuterium and oxygen-18 (18O) enrichment of water samples isolated from blood. The method enables rapid and simple determination of these enrichments in microgram quantities of water. Water is converted into hydrogen and carbon monoxide gases using a high-temperature conversion elemental analyzer (TC-EA); the gases are then transferred on-line into the isotope ratio mass spectrometer. Accuracy determined with standard light Antarctic precipitation (SLAP) and Greenland ice sheet precipitation (GISP) is reliable for deuterium and 18O enrichments. The range of linearity is from 0 up to 0.09 atom percent excess (APE, i.e. -78 up to 5725 delta per mil (dpm)) for deuterium enrichment and from 0 up to 0.17 APE (-11 up to 890 dpm) for 18O enrichment. Memory effects do exist but can be avoided by analyzing the biological samples in quintuplicate. This method allows the determination of 1440 samples per week, i.e. 288 biological samples per week.
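
    For orientation, the atom-percent and delta-per-mil scales quoted above are related through the isotope ratio R = AP / (100 - AP) and the reference ratio of the standard. The sketch below uses the standard VSMOW reference ratios; the exact APE-to-delta correspondence also depends on the sample's baseline abundance, so this is illustrative rather than a reproduction of the paper's calibration.

```python
VSMOW_D_H = 155.76e-6        # VSMOW reference D/H ratio
VSMOW_18O_16O = 2005.20e-6   # VSMOW reference 18O/16O ratio

def isotope_ratio(atom_percent):
    """R = heavy/light isotope ratio from the atom percent of the heavy isotope."""
    return atom_percent / (100.0 - atom_percent)

def delta_permil(atom_percent, r_ref):
    """Delta notation: (R_sample / R_reference - 1) * 1000, in per mil."""
    return (isotope_ratio(atom_percent) / r_ref - 1.0) * 1000.0

# Natural-abundance VSMOW water sits at 0 per mil by construction
ap_natural_d = 100.0 * VSMOW_D_H / (1.0 + VSMOW_D_H)
delta_natural = delta_permil(ap_natural_d, VSMOW_D_H)
```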

  20. High-throughput proteomics

    NASA Astrophysics Data System (ADS)

    Lesley, Scott A.; Nasoff, Marc; Kreusch, Andreas; Spraggon, Glen

    2001-04-01

    Proteomics has become a major focus as researchers attempt to understand the vast amount of genomic information. Protein complexity makes identifying and understanding gene function inherently difficult. The challenge of studying proteins in a global way is driving the development of new technologies for systematic and comprehensive analysis of protein structure and function. We are addressing this challenge through instrumentation and approaches to rapidly express, purify, crystallize, and mutate large numbers of human gene products. Our approach applies the principles of HTS technologies commonly used in pharmaceutical development. Genes are cloned, expressed, and purified in parallel to achieve a throughput potential of hundreds per day. Our instrumentation allows us to produce tens of milligrams of protein from 96 separate clones simultaneously. Purified protein is used for several applications including a high-throughput crystallographic screening approach for structure determination using automated image analysis. To further understand protein function, we are integrating a mutagenesis and screening approach. By combining these key technologies, we hope to provide a fundamental basis for understanding gene function at the protein level.

  1. High throughput optical scanner

    SciTech Connect

    Basiji, David A.; van den Engh, Gerrit J.

    2001-01-01

    A scanning apparatus is provided to obtain automated, rapid and sensitive scanning of substrate fluorescence, optical density or phosphorescence. The scanner uses a constant path length optical train, which enables the combination of a moving beam for high speed scanning with phase-sensitive detection for noise reduction, comprising a light source, a scanning mirror to receive light from the light source and sweep it across a steering mirror, a steering mirror to receive light from the scanning mirror and reflect it to the substrate, whereby it is swept across the substrate along a scan arc, and a photodetector to receive emitted or scattered light from the substrate, wherein the optical path length from the light source to the photodetector is substantially constant throughout the sweep across the substrate. The optical train can further include a waveguide or mirror to collect emitted or scattered light from the substrate and direct it to the photodetector. For phase-sensitive detection the light source is intensity modulated and the detector is connected to phase-sensitive detection electronics. A scanner using a substrate translator is also provided. For two dimensional imaging the substrate is translated in one dimension while the scanning mirror scans the beam in a second dimension. For a high throughput scanner, stacks of substrates are loaded onto a conveyor belt from a tray feeder.

  2. High throughput screening informatics.

    PubMed

    Ling, Xuefeng Bruce

    2008-03-01

    High throughput screening (HTS), an industrial effort to leverage developments in the areas of modern robotics, data analysis and control software, liquid handling devices, and sensitive detectors, has played a pivotal role in the drug discovery process, allowing researchers to efficiently screen millions of compounds to identify tractable small molecule modulators of a given biological process or disease state and advance them into high quality leads. As HTS throughput has significantly increased the volume, complexity, and information content of datasets, lead discovery research demands a clear corporate strategy for scientific computing and subsequent establishment of robust enterprise-wide (usually global) informatics platforms, which enable complicated HTS work flows, facilitate HTS data mining, and drive effective decision-making. The purpose of this review is, from the data analysis and handling perspective, to examine key elements in HTS operations and some essential data-related activities supporting or interfacing the screening process, and outline properties that various enabling software should have. Additionally, some general advice for corporate managers with system procurement responsibilities is offered.

  3. BiQ Analyzer HiMod: an interactive software tool for high-throughput locus-specific analysis of 5-methylcytosine and its oxidized derivatives.

    PubMed

    Becker, Daniel; Lutsik, Pavlo; Ebert, Peter; Bock, Christoph; Lengauer, Thomas; Walter, Jörn

    2014-07-01

    Recent data suggest important biological roles for oxidative modifications of methylated cytosines, specifically hydroxymethylation, formylation and carboxylation. Several assays are now available for profiling these DNA modifications genome-wide as well as in targeted, locus-specific settings. Here we present BiQ Analyzer HiMod, a user-friendly software tool for sequence alignment, quality control and initial analysis of locus-specific DNA modification data. The software supports four different assay types, and it leads the user from raw sequence reads to DNA modification statistics and publication-quality plots. BiQ Analyzer HiMod combines the well-established graphical user interface of its predecessor tool, BiQ Analyzer HT, with new and extended analysis modes. BiQ Analyzer HiMod also includes an updated analysis workspace, an intuitive interface, a custom vector graphics engine, and support for additional input and output data formats. The tool is freely available as a stand-alone installation package from http://biq-analyzer-himod.bioinf.mpi-inf.mpg.de/.

  4. Diel Growth Cycle of Isolated Leaf Discs Analyzed with a Novel, High-Throughput Three-Dimensional Imaging Method Is Identical to That of Intact Leaves1[W

    PubMed Central

    Biskup, Bernhard; Scharr, Hanno; Fischbach, Andreas; Wiese-Klinkenberg, Anika; Schurr, Ulrich; Walter, Achim

    2009-01-01

    Dicot leaves grow with pronounced diel (24-h) cycles that are controlled by a complex network of factors. It is an open question to what extent leaf growth dynamics are controlled by long-range or by local signals. To address this question, we established a stereoscopic imaging system, GROWSCREEN 3D, which quantifies surface growth of isolated leaf discs floating on nutrient solution in wells of microtiter plates. A total of 458 leaf discs of tobacco (Nicotiana tabacum) were cut at different developmental stages, incubated, and analyzed for their relative growth rates. The camera system was automatically displaced across the array of leaf discs; visualization and camera displacement took about 12 s for each leaf disc, resulting in a time interval of 1.5 h for consecutive size analyses. Leaf discs showed a comparable diel leaf growth cycle as intact leaves but weaker peak growth activity. Hence, it can be concluded that the timing of leaf growth is regulated by local rather than by systemic control processes. This conclusion was supported by results from leaf discs of Arabidopsis (Arabidopsis thaliana) Landsberg erecta wild-type plants and starch-free1 mutants. At night, utilization of transitory starch leads to increased growth of Landsberg erecta wild-type discs compared with starch-free1 discs. Moreover, the decrease of leaf disc growth when exposed to different concentrations of glyphosate showed an immediate dose-dependent response. Our results demonstrate that a dynamic leaf disc growth analysis as we present it here is a promising approach to uncover the effects of internal and external cues on dicot leaf development. PMID:19168641
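
    The relative growth rate reported for each leaf disc is conventionally computed as the difference of log areas divided by the time interval. A minimal sketch with invented area values (the 1.5 h interval follows the paper's imaging cycle):

```python
import numpy as np

def relative_growth_rate(areas, times):
    """RGR between consecutive observations: (ln A2 - ln A1) / (t2 - t1).
    Area units cancel; with times in hours, RGR is per hour."""
    a = np.log(np.asarray(areas, float))
    t = np.asarray(times, float)
    return np.diff(a) / np.diff(t)

# Hypothetical disc areas sampled at a 1.5 h imaging interval
times = [0.0, 1.5, 3.0, 4.5]            # hours
areas = [100.0, 103.0, 106.1, 109.3]    # arbitrary units
rgr = relative_growth_rate(areas, times)
```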

  5. High-throughput TILLING for functional genomics.

    PubMed

    Till, Bradley J; Colbert, Trenton; Tompa, Rachel; Enns, Linda C; Codomo, Christine A; Johnson, Jessica E; Reynolds, Steven H; Henikoff, Jorja G; Greene, Elizabeth A; Steine, Michael N; Comai, Luca; Henikoff, Steven

    2003-01-01

    Targeting-induced local lesions in genomes (TILLING) is a general strategy for identifying induced point mutations that can be applied to almost any organism. Here, we describe the basic methodology for high-throughput TILLING. Gene segments are amplified using fluorescently tagged primers, and products are denatured and reannealed to form heteroduplexes between the mutated sequence and its wild-type counterpart. These heteroduplexes are substrates for cleavage by the endonuclease CEL I. Following cleavage, products are analyzed on denaturing polyacrylamide gels using the LI-COR DNA analyzer system. High-throughput TILLING has been adopted by the Arabidopsis TILLING Project (ATP) to provide allelic series of point mutations for the general Arabidopsis community.

  6. High-throughput TILLING for Arabidopsis.

    PubMed

    Till, Bradley J; Colbert, Trenton; Codomo, Christine; Enns, Linda; Johnson, Jessica; Reynolds, Steven H; Henikoff, Jorja G; Greene, Elizabeth A; Steine, Michael N; Comai, Luca; Henikoff, Steven

    2006-01-01

    Targeting induced local lesions in genomes (TILLING) is a general strategy for identifying induced point mutations that can be applied to almost any organism. In this chapter, we describe the basic methodology for high-throughput TILLING. Gene segments are amplified using fluorescently tagged primers, and products are denatured and reannealed to form heteroduplexes between the mutated sequence and its wild-type counterpart. These heteroduplexes are substrates for cleavage by the endonuclease CEL I. Following cleavage, products are analyzed on denaturing polyacrylamide gels using the LI-COR DNA analyzer system. High-throughput TILLING has been adopted by the Arabidopsis TILLING Project (ATP) to provide allelic series of point mutations for the general Arabidopsis community.

  7. High Throughput Transcriptomics @ USEPA (Toxicology ...

    EPA Pesticide Factsheets

    The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose-dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  8. Clustering of High Throughput Gene Expression Data

    PubMed Central

    Pirim, Harun; Ekşioğlu, Burak; Perkins, Andy; Yüceer, Çetin

    2012-01-01

    High throughput biological data need to be processed, analyzed, and interpreted to address problems in life sciences. Bioinformatics, computational biology, and systems biology deal with biological problems using computational methods. Clustering is one of the methods used to gain insight into biological processes, particularly at the genomics level. Clearly, clustering can be used in many areas of biological data analysis. However, this paper presents a review of the current clustering algorithms designed especially for analyzing gene expression data. It is also intended to introduce one of the main problems in bioinformatics - clustering gene expression data - to the operations research community. PMID:23144527
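
    As a concrete illustration of the clustering task, a minimal k-means in plain NumPy can separate two invented expression patterns. The algorithms reviewed in the paper are considerably more sophisticated (model-based, graph-based, biclustering, etc.); this sketch only shows the basic assign-and-update loop.

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Minimal k-means for a (genes x conditions) expression matrix."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each gene to its nearest cluster center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute centers as the mean expression profile of each cluster
        new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

# Toy expression matrix: two obvious patterns (high-early vs. high-late)
X = np.array([[5., 4., 1., 0.],
              [5., 5., 0., 1.],
              [0., 1., 5., 4.],
              [1., 0., 4., 5.]])
labels, _ = kmeans(X, k=2)
```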

  9. High-Throughput Sequencing Technologies

    PubMed Central

    Reuter, Jason A.; Spacek, Damek; Snyder, Michael P.

    2015-01-01

    Summary The human genome sequence has profoundly altered our understanding of biology, human diversity and disease. The path from the first draft sequence to our nascent era of personal genomes and genomic medicine has been made possible only because of the extraordinary advancements in DNA sequencing technologies over the past ten years. Here, we discuss commonly used high-throughput sequencing platforms, the growing array of sequencing assays developed around them as well as the challenges facing current sequencing platforms and their clinical application. PMID:26000844

  10. High throughput protein production screening

    DOEpatents

    Beernink, Peter T.; Coleman, Matthew A.; Segelke, Brent W.

    2009-09-08

    Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic or eukaryotic sequences, including human cDNAs, using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high-throughput analysis of protein expression levels, interactions, and functional states.

  11. High-throughput discovery metabolomics.

    PubMed

    Fuhrer, Tobias; Zamboni, Nicola

    2015-02-01

    Non-targeted metabolomics by mass spectrometry has become established as the method of choice for investigating metabolic phenotypes in basic and applied research. Compared to other omics, metabolomics provides broad scope and yet direct information on the integrated cellular response, with low demands on material and sample preparation. These features render non-targeted metabolomics ideally suited for large-scale screens and discovery. Here we review the achievements and potential in high-throughput, non-targeted metabolomics. We found that routine and precise analysis of thousands of small molecular features in thousands of complex samples per day and instrument is already reality, and ongoing developments in microfluidics and integrated interfaces will likely further boost throughput in the next few years. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. High Throughput Plasma Water Treatment

    NASA Astrophysics Data System (ADS)

    Mujovic, Selman; Foster, John

    2016-10-01

    The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  13. High-throughput hyperdimensional vertebrate phenotyping

    PubMed Central

    Pardo-Martin, Carlos; Allalou, Amin; Medina, Jaime; Eimon, Peter M.; Wählby, Carolina; Yanik, Mehmet Fatih

    2013-01-01

    Most gene mutations and biologically active molecules cause complex responses in animals that cannot be predicted by cell culture models. Yet animal studies remain too slow and their analyses are often limited to only a few readouts. Here we demonstrate high-throughput optical projection tomography with micrometer resolution and hyperdimensional screening of entire vertebrates in tens of seconds using a simple fluidic system. Hundreds of independent morphological features and complex phenotypes are automatically captured in three dimensions with unprecedented speed and detail in semi-transparent zebrafish larvae. By clustering quantitative phenotypic signatures, we can detect and classify even subtle alterations in many biological processes simultaneously. We term our approach hyperdimensional in vivo phenotyping (HIP). To illustrate the power of HIP, we have analyzed the effects of several classes of teratogens on cartilage formation using 200 independent morphological measurements and identified similarities and differences that correlate well with their known mechanisms of actions in mammals. PMID:23403568

  14. High-Throughput Analysis of Enzyme Activities

    SciTech Connect

    Lu, Guoxin

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of enzyme immobilized on nylon screen were detected with a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product need not have optical properties different from those of the substrate. UV absorption detection allows nearly universal detection of organic molecules, so no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used to screen for local variations in the distribution of specific biomolecules in a tissue, or to screen multiple immobilized catalysts. A second high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD): the surface of the enzyme microarray is focused onto the CCD with an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives the reaction rate. The same microarray can be reused many times, making high-throughput kinetic studies of hundreds of catalytic reactions possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different
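
    The rate extraction described above, analyzing the change in light intensity over time on an enzyme spot, amounts to fitting the early, approximately linear part of the intensity trace. A minimal sketch with synthetic readout values (not data from the dissertation):

```python
import numpy as np

def initial_rate(t, intensity):
    """Reaction rate from the slope of a straight-line fit to the early,
    approximately linear part of an intensity-vs-time trace."""
    slope, _intercept = np.polyfit(t, intensity, 1)
    return slope

# Hypothetical CCD readout for one enzyme spot (arbitrary units)
t = np.arange(0.0, 10.0, 1.0)        # seconds
intensity = 100.0 + 2.5 * t          # product forming at 2.5 a.u./s
rate = initial_rate(t, intensity)
```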

  15. High Throughput Sequence Analysis for Disease Resistance in Maize

    USDA-ARS?s Scientific Manuscript database

    Preliminary results of a computational analysis of high throughput sequencing data from Zea mays and the fungus Aspergillus are reported. The Illumina Genome Analyzer was used to sequence RNA samples from two strains of Z. mays (Va35 and Mp313) collected over a time course as well as several specie...

  16. Application of a High-Throughput Analyzer in Evaluating Solid Adsorbents for Post-Combustion Carbon Capture via Multicomponent Adsorption of CO2, N2, and H2O

    SciTech Connect

    Mason, JA; McDonald, TM; Bae, TH; Bachman, JE; Sumida, K; Dutton, JJ; Kaye, SS; Long, JR

    2015-04-15

    Despite the large number of metal-organic frameworks that have been studied in the context of post-combustion carbon capture, adsorption equilibria of gas mixtures including CO2, N2, and H2O, which are the three biggest components of the flue gas emanating from a coal- or natural gas-fired power plant, have never been reported. Here, we disclose the design and validation of a high-throughput multicomponent adsorption instrument that can measure equilibrium adsorption isotherms for mixtures of gases at conditions that are representative of an actual flue gas from a power plant. This instrument is used to study 15 different metal-organic frameworks, zeolites, mesoporous silicas, and activated carbons representative of the broad range of solid adsorbents that have received attention for CO2 capture. While the multicomponent results presented in this work provide many interesting fundamental insights, only adsorbents functionalized with alkylamines are shown to have any significant CO2 capacity in the presence of N2 and H2O at equilibrium partial pressures similar to those expected in a carbon capture process. Most significantly, the amine-appended metal-organic framework mmen-Mg2(dobpdc) (mmen = N,N'-dimethylethylenediamine, dobpdc (4-) = 4,4'-dioxido-3,3'-biphenyldicarboxylate) exhibits a record CO2 capacity of 4.2 ± 0.2 mmol/g (16 wt %) at 0.1 bar and 40 °C in the presence of a high partial pressure of H2O.

  17. Application of a high-throughput analyzer in evaluating solid adsorbents for post-combustion carbon capture via multicomponent adsorption of CO2, N2, and H2O.

    PubMed

    Mason, Jarad A; McDonald, Thomas M; Bae, Tae-Hyun; Bachman, Jonathan E; Sumida, Kenji; Dutton, Justin J; Kaye, Steven S; Long, Jeffrey R

    2015-04-15

    Despite the large number of metal-organic frameworks that have been studied in the context of post-combustion carbon capture, adsorption equilibria of gas mixtures including CO2, N2, and H2O, which are the three biggest components of the flue gas emanating from a coal- or natural gas-fired power plant, have never been reported. Here, we disclose the design and validation of a high-throughput multicomponent adsorption instrument that can measure equilibrium adsorption isotherms for mixtures of gases at conditions that are representative of an actual flue gas from a power plant. This instrument is used to study 15 different metal-organic frameworks, zeolites, mesoporous silicas, and activated carbons representative of the broad range of solid adsorbents that have received attention for CO2 capture. While the multicomponent results presented in this work provide many interesting fundamental insights, only adsorbents functionalized with alkylamines are shown to have any significant CO2 capacity in the presence of N2 and H2O at equilibrium partial pressures similar to those expected in a carbon capture process. Most significantly, the amine-appended metal organic framework mmen-Mg2(dobpdc) (mmen = N,N'-dimethylethylenediamine, dobpdc (4-) = 4,4'-dioxido-3,3'-biphenyldicarboxylate) exhibits a record CO2 capacity of 4.2 ± 0.2 mmol/g (16 wt %) at 0.1 bar and 40 °C in the presence of a high partial pressure of H2O.
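
    Equilibrium isotherms of the kind this instrument measures are often summarized with a single-site Langmuir model. The sketch below evaluates and fits one on synthetic data (not the paper's measurements; multicomponent equilibria require richer treatments such as IAST or competitive isotherm models):

```python
import numpy as np

def langmuir(P, q_sat, b):
    """Single-site Langmuir isotherm: q = q_sat * b*P / (1 + b*P)."""
    return q_sat * b * P / (1.0 + b * P)

def fit_langmuir(P, q):
    """Recover (q_sat, b) from the linearized form
    P/q = (1/q_sat) * P + 1/(b * q_sat)."""
    slope, intercept = np.polyfit(P, P / q, 1)
    q_sat = 1.0 / slope
    return q_sat, 1.0 / (intercept * q_sat)

# Synthetic isotherm with known parameters (illustrative values only)
P = np.linspace(0.01, 1.0, 20)           # pressure, bar
q = langmuir(P, q_sat=5.0, b=8.0)        # loading, mmol/g
q_sat_fit, b_fit = fit_langmuir(P, q)
```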

  18. Practical High-Throughput Experimentation for Chemists

    PubMed Central

    2017-01-01

    Large arrays of hypothesis-driven, rationally designed experiments are powerful tools for solving complex chemical problems. Conceptual and practical aspects of chemical high-throughput experimentation are discussed. A case study in the application of high-throughput experimentation to a key synthetic step in a drug discovery program and subsequent optimization for the first large scale synthesis of a drug candidate is exemplified. PMID:28626518

  19. Uncertainty Quantification in High Throughput Screening ...

    EPA Pesticide Factsheets

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
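
    The bootstrap procedure described, resampling concentration-response data and refitting to obtain confidence intervals, can be sketched minimally as follows. The Hill curve with fixed top and slope, the grid-search fit, and the synthetic data are simplifications for illustration, not the ToxCast pipeline (which fits several models with robust criteria).

```python
import numpy as np

def hill(c, ec50):
    """Hill curve with top fixed at 1 and slope fixed at 1."""
    return 1.0 / (1.0 + ec50 / c)

def fit_ec50(c, r, grid=np.logspace(-2, 2, 200)):
    """Coarse grid-search least-squares fit of the EC50."""
    sse = [((hill(c, e) - r) ** 2).sum() for e in grid]
    return grid[int(np.argmin(sse))]

rng = np.random.default_rng(1)
conc = np.logspace(-2, 2, 9)                                   # concentrations
resp = hill(conc, ec50=1.0) + rng.normal(0, 0.03, conc.size)   # noisy responses

# Bootstrap: resample (conc, resp) pairs with replacement, refit each time
boot_ec50 = []
for _ in range(200):
    idx = rng.integers(0, conc.size, conc.size)
    boot_ec50.append(fit_ec50(conc[idx], resp[idx]))
lo, hi = np.percentile(boot_ec50, [2.5, 97.5])   # 95% CI for the EC50
```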

  20. High Throughput Determination of Critical Human Dosing ...

    EPA Pesticide Factsheets

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data into predicted human equivalent doses that can be linked with biologically relevant exposure scenarios. Thus, HTTK provides essential data for risk prioritization for thousands of chemicals that lack TK data. One critical HTTK parameter that can be measured in vitro is the unbound fraction of a chemical in plasma (Fub). However, for chemicals that bind strongly to plasma, Fub is below the limits of detection (LOD) for high throughput analytical chemistry, and therefore cannot be quantified. A novel method for quantifying Fub was implemented for 85 strategically selected chemicals: measurement of Fub was attempted at 10%, 30%, and 100% of physiological plasma concentrations using rapid equilibrium dialysis assays. Varying plasma concentrations instead of chemical concentrations makes high throughput analytical methodology more likely to be successful. Assays at 100% plasma concentration were unsuccessful for 34 chemicals. For 12 of these 34 chemicals, Fub could be quantified at 10% and/or 30% plasma concentrations; these results imply that the assay failure at 100% plasma concentration was caused by plasma protein binding for these chemicals. Assay failure for the remaining 22 chemicals may
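
    A common way to use measurements made in diluted plasma (not necessarily the exact back-calculation applied in this study) is the standard single-site dilution correction, which recovers the fraction unbound at 100% plasma from the value measured at dilution D:

```python
def fub_undiluted(fu_measured, dilution):
    """Back-calculate fraction unbound in 100% plasma from a measurement in
    diluted plasma (dilution=10 means 10% plasma), using the standard
    single-site dilution correction:
        fu = (1/D) / ((1/fu_d - 1) + 1/D)
    """
    d = 1.0 / dilution
    return d / ((1.0 / fu_measured - 1.0) + d)

# A chemical with fu_d = 0.5 measured in 10% plasma is far more bound
# in full-strength plasma:
fu_100 = fub_undiluted(0.5, dilution=10)   # 1/11, about 0.091
```

Measuring at 10% or 30% plasma moves strongly bound chemicals above the analytical limit of detection; the correction then maps the result back to physiological conditions.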

  1. High-Throughput Contact Flow Lithography.

    PubMed

    Le Goff, Gaelle C; Lee, Jiseok; Gupta, Ankur; Hill, William Adam; Doyle, Patrick S

    2015-10-01

    High-throughput fabrication of graphically encoded hydrogel microparticles is achieved by combining flow contact lithography in a multichannel microfluidic device and a high capacity 25 mm LED UV source. Production rates of chemically homogeneous particles are improved by two orders of magnitude. Additionally, the custom-built contact lithography instrument provides an affordable solution for patterning complex microstructures on surfaces.

  2. High-throughput computing in the sciences.

    PubMed

    Morgan, Mark; Grimshaw, Andrew

    2009-01-01

    While it is true that the modern computer is many orders of magnitude faster than that of yesteryear, this tremendous growth in CPU clock rates is now over. Unfortunately, however, the growth in demand for computational power has not abated; whereas researchers a decade ago could simply wait for computers to get faster, today the only solution to the growing need for more powerful computational resources lies in the exploitation of parallelism. Software parallelization falls generally into two broad categories--"true parallel" and high-throughput computing. This chapter focuses on the latter of these two types of parallelism. With high-throughput computing, users can run many copies of their software at the same time across many different computers. This technique for achieving parallelism is powerful in its ability to provide high degrees of parallelism, yet simple in its conceptual implementation. This chapter covers various patterns of high-throughput computing usage and the skills and techniques necessary to take full advantage of them. By utilizing numerous examples and sample codes and scripts, we hope to provide the reader not only with a deeper understanding of the principles behind high-throughput computing, but also with a set of tools and references that will prove invaluable as she explores software parallelism with her own software applications and research.
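
    The pattern described, running many independent copies of a program at once, can be sketched with a local worker pool. A real high-throughput deployment would distribute the same independent tasks across many machines via a scheduler such as HTCondor; the trivial workload here is a stand-in.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(seed):
    """Stand-in for one copy of a researcher's program, keyed by its input.
    (Hypothetical workload; real jobs would be long-running programs.)"""
    return seed, sum(i * i for i in range(seed))

# Many independent copies run concurrently -- the essence of
# high-throughput computing, here on local workers rather than a grid.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(simulate, range(1, 9)))
```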

  3. Evaluation of High-Throughput Chemical Exposure Models ...

    EPA Pesticide Factsheets

    The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer parent chemical exposures from biomonitoring measurements and forward models to predict multi-pathway exposures from chemical use information and/or residential media concentrations. Here, both forward and reverse modeling methods are used to characterize the relationship between matched near-field environmental (air and dust) and biomarker measurements. Indoor air, house dust, and urine samples from a sample of 120 females (aged 60 to 80 years) were analyzed. In the measured data, 78% of the residential media measurements (across 80 chemicals) and 54% of the urine measurements (across 21 chemicals) were censored, i.e. below the limit of quantification (LOQ). Because of the degree of censoring, we applied a Bayesian approach to impute censored values for 69 chemicals having at least 15% of measurements above LOQ. This resulted in 10 chemicals (5 phthalates, 5 pesticides) with matched air, dust, and urine metabolite measurements. The population medians of indoor air and dust concentrations were compared to population median exposures inferred from urine metabolites concentrations using a high-throughput reverse-dosimetry approach. Median air and dust concentrations were found to be correl
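
    The Bayesian imputation used in the study is more involved, but the core idea of modeling below-LOQ censoring rather than substituting a fixed value can be illustrated with a maximum-likelihood fit of a lognormal to left-censored data (synthetic data; a coarse grid search stands in for a proper optimizer):

```python
import numpy as np
from math import erf, log, sqrt

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def loglik(mu, sigma, detects, n_cens, loq):
    """Censored-lognormal log-likelihood (constants dropped):
    density terms for detects plus the probability mass below the LOQ."""
    z = (np.log(detects) - mu) / sigma
    ll = -np.sum(0.5 * z ** 2) - detects.size * log(sigma)
    cens_p = max(norm_cdf((log(loq) - mu) / sigma), 1e-300)
    return ll + n_cens * log(cens_p)

def fit_censored(detects, n_cens, loq):
    """Coarse grid-search MLE for (mu, sigma) of the underlying lognormal."""
    best = (-np.inf, None, None)
    for mu in np.linspace(-3, 3, 121):
        for sigma in np.linspace(0.1, 3, 60):
            ll = loglik(mu, sigma, detects, n_cens, loq)
            if ll > best[0]:
                best = (ll, mu, sigma)
    return best[1], best[2]

rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=300)   # true mu=0, sigma=1
loq = 0.5                                          # values below this are censored
mu_hat, sigma_hat = fit_censored(x[x >= loq], int((x < loq).sum()), loq)
```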

  4. High Throughput Assays for Exposure Science (NIEHS OHAT ...

    EPA Pesticide Factsheets

    High throughput screening (HTS) data that characterize chemically induced biological activity have been generated for thousands of chemicals by the US interagency Tox21 and the US EPA ToxCast programs. In many cases there are no data available for comparing bioactivity from HTS with relevant human exposures. The EPA’s ExpoCast program is developing high-throughput approaches to generate the needed exposure estimates using existing databases and new, high-throughput measurements. The exposure pathway (i.e., the route of chemical from manufacture to human intake) significantly impacts the level of exposure. The presence, concentration, and formulation of chemicals in consumer products and articles of commerce (e.g., clothing) can therefore provide critical information for estimating risk. We have found that there are only limited data available on the chemical constituents (e.g., flame retardants, plasticizers) within most articles of commerce. Furthermore, the presence of some chemicals in otherwise well characterized products may be due to product packaging. We are analyzing sample consumer products using 2D gas chromatograph (GC) x GC Time of Flight Mass Spectrometry (GCxGCTOF/MS), which is suited for forensic investigation of chemicals in complex matrices (including toys, cleaners, and food). In parallel, we are working to create a reference library of retention times and spectral information for the entire Tox21 chemical library. In an examination of five p

  5. NCBI GEO: archive for high-throughput functional genomic data.

    PubMed

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron

    2009-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.

  6. High-throughput sequence alignment using Graphics Processing Units.

    PubMed

    Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh

    2007-12-10

    The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.

  7. MIrExpress: A Database for Gene Coexpression Correlation in Immune Cells Based on Mutual Information and Pearson Correlation.

    PubMed

    Wang, Luman; Mo, Qiaochu; Wang, Jianxin

    2015-01-01

    Most current gene coexpression databases support the analysis of linear correlation of gene pairs, but not nonlinear correlation, which hinders precise evaluation of gene-gene coexpression strengths. Here, we report a new database, MIrExpress, which takes advantage of information theory, as well as the Pearson linear correlation method, to measure the linear correlation, nonlinear correlation, and their hybrid for cell-specific gene coexpression in immune cells. For a given gene pair or probe set pair input by web users, both mutual information (MI) and the Pearson correlation coefficient (r) are calculated, and several corresponding values are reported to reflect the nature of the coexpression correlation, including the MI and r values, their respective rank orderings, their rank comparison, and their hybrid correlation value. Furthermore, for a given gene, the 10 genes most relevant to it are displayed from the MI, r, or hybrid perspective, respectively. The database currently includes 16 human cell groups, involving 20,283 human genes. The expression data and the calculated correlation results are interactively accessible on the web page and can be used in other related applications and research.
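
    The MI-versus-r distinction that MIrExpress exploits can be illustrated with a short sketch on synthetic data (not drawn from the database): a quadratic relationship has near-zero Pearson r but substantial mutual information, which is exactly the nonlinear dependence a linear-only database would miss.

    ```python
    import math
    import random
    from collections import Counter

    def pearson_r(xs, ys):
        """Pearson linear correlation coefficient."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    def mutual_info(xs, ys, bins=8):
        """Histogram-based mutual information estimate, in bits."""
        def discretize(vs):
            lo, hi = min(vs), max(vs)
            return [min(bins - 1, int((v - lo) / (hi - lo) * bins)) for v in vs]
        bx, by = discretize(xs), discretize(ys)
        n = len(xs)
        pxy, px, py = Counter(zip(bx, by)), Counter(bx), Counter(by)
        return sum((c / n) * math.log2((c / n) / ((px[i] / n) * (py[j] / n)))
                   for (i, j), c in pxy.items())

    # synthetic expression profiles: one linear partner, one purely nonlinear one
    random.seed(0)
    x = [random.uniform(-1.0, 1.0) for _ in range(2000)]
    linear = [xi + random.gauss(0.0, 0.1) for xi in x]
    quadratic = [xi * xi + random.gauss(0.0, 0.05) for xi in x]
    ```

    Here `pearson_r(x, quadratic)` is close to zero while `mutual_info(x, quadratic)` is well above zero — a toy version of the hybrid linear/nonlinear view the database reports.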

  8. Microfabricated high-throughput electronic particle detector.

    PubMed

    Wood, D K; Requa, M V; Cleland, A N

    2007-10-01

    We describe the design, fabrication, and use of a radio frequency reflectometer integrated with a microfluidic system, applied to the very high-throughput measurement of micron-scale particles passing through the sensor region in a microfluidic channel. The device operates as a microfabricated Coulter counter [U.S. Patent No. 2656508 (1953)], similar to a design we have described previously, but with significantly improved electrode geometry as well as electronic tuning of the reflectometer; together these two improvements yield more than a factor of 10 gain in signal-to-noise ratio and in the diametric discrimination of single particles. We demonstrate the high-throughput discrimination of polystyrene beads with diameters in the 4-10 µm range, achieving diametric resolutions comparable to the intrinsic spread of diameters in the bead distribution, at rates in excess of 15 × 10^6 beads/h.
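
    The Coulter principle behind such detectors can be sketched in a few lines: the resistive pulse height scales roughly with displaced particle volume, so diameter is recovered from a cube root. The calibration constant and amplitudes below are hypothetical, not from the paper.

    ```python
    def calibrate_k(ref_diameter_um, ref_amplitude):
        """Per-device constant from a reference bead: amplitude ≈ k * d**3."""
        return ref_amplitude / ref_diameter_um ** 3

    def diameter_um(amplitude, k):
        """Coulter principle (small-particle approximation): pulse height
        scales with particle volume, so diameter follows from the cube
        root of the amplitude."""
        return (amplitude / k) ** (1.0 / 3.0)

    # hypothetical calibration: a 10 um reference bead gives amplitude 1000
    k = calibrate_k(10.0, 1000.0)
    ```

    With that calibration, a pulse of amplitude 64 maps to a 4 µm bead — the kind of per-particle sizing the device performs at millions of beads per hour.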

  9. High Throughput Determination of Tetramine in Drinking ...

    EPA Pesticide Factsheets

    The sampling and analytical procedure (SAP) presented herein describes a method for the high throughput determination of tetramethylene disulfotetramine (tetramine) in drinking water by solid phase extraction and isotope dilution gas chromatography/mass spectrometry. This method, which will be included in the SAM, is expected to provide the Water Laboratory Alliance, as part of EPA’s Environmental Response Laboratory Network, with a more reliable and faster means of analyte collection and measurement.

  10. High-throughput in vivo vertebrate screening

    PubMed Central

    Pardo-Martin, Carlos; Chang, Tsung-Yao; Koo, Bryan Kyo; Gilleland, Cody L.; Wasserman, Steven C.; Yanik, Mehmet Fatih

    2010-01-01

    We demonstrate a high-throughput platform for cellular-resolution in vivo pharmaceutical and genetic screens on zebrafish larvae. The system automatically loads animals from reservoirs or multiwell plates, and positions and orients them for high-speed confocal imaging and laser manipulation of both superficial and deep organs within 19 seconds without damage. We show small-scale test screening of retinal axon guidance mutants and neuronal regeneration assays in combination with femtosecond laser microsurgery. PMID:20639868

  11. N-terminal H3/D3-acetylation for improved high-throughput peptide sequencing by matrix-assisted laser desorption/ionization mass spectrometry with a time-of-flight/time-of-flight analyzer.

    PubMed

    Noga, Marek J; Asperger, Arndt; Silberring, Jerzy

    2006-01-01

    A novel method for peptide sequencing by matrix-assisted laser desorption/ionization mass spectrometry with a time-of-flight/time-of-flight analyzer (MALDI-TOF/TOF) is presented. A stable isotope label introduced at the peptide N-terminus by derivatization with a 1:1 mixture of acetic anhydride and deuterated acetic anhydride allows easy and unambiguous assignment of ions in the product ion spectrum to either the N- or the C-terminal ion series, significantly simplifying sequence assignment. The performance of this technique was demonstrated by successfully sequencing the contents of several peptide maps. A similar approach was recently applied to nanoelectrospray ionization (nanoESI) and nano-liquid chromatography/tandem mass spectrometry (LC/MS/MS). The MALDI-TOF/TOF technique allows fast, direct sequencing of modified peptides in proteomics samples, and is complementary to the nanoESI and nanoLC/MS/MS approaches.
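
    The doublet logic can be sketched as follows: with a 1:1 H3/D3 label, N-terminal fragment ions show up as peak pairs split by ~3.019 Da (the CD3CO vs. CH3CO mass difference), while C-terminal ions stay single. The peak list below is hypothetical.

    ```python
    def find_nterm_doublets(peaks, delta=3.0188, tol=0.02):
        """Flag candidate N-terminal fragment ions: after 1:1 H3/D3-acetylation,
        N-terminal ions appear as doublets split by ~3.019 Da (three H vs.
        three D), while C-terminal ions remain single peaks."""
        peaks = sorted(peaks)
        doublets = []
        for i, m1 in enumerate(peaks):
            for m2 in peaks[i + 1:]:
                gap = m2 - m1
                if abs(gap - delta) <= tol:
                    doublets.append((m1, m2))
                elif gap > delta + tol:
                    break  # peaks are sorted; no later partner can match
        return doublets

    # hypothetical product-ion m/z list with two embedded N-terminal doublets
    peaks = [175.119, 300.160, 303.179, 445.200, 448.219]
    doublets = find_nterm_doublets(peaks)
    ```

    Peaks that pair up are assigned to the N-terminal series; the unpaired 175.119 peak would be read as C-terminal.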

  12. High-throughput neuro-imaging informatics.

    PubMed

    Miller, Michael I; Faria, Andreia V; Oishi, Kenichi; Mori, Susumu

    2013-01-01

    This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high-dimensional neuroinformatic representation index containing O(1000-10,000) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, (ii) integration of image and personal medical record non-image information for diagnosis and prognosis.

  14. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  16. High Throughput PBTK: Open-Source Data and Tools for ...

    EPA Pesticide Factsheets

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy.

  17. A high volume, high throughput volumetric sorption analyzer

    NASA Astrophysics Data System (ADS)

    Soo, Y. C.; Beckner, M.; Romanos, J.; Wexler, C.; Pfeifer, P.; Buckley, P.; Clement, J.

    2011-03-01

    In this talk we will present an overview of our new Hydrogen Test Fixture (HTF), constructed by the Midwest Research Institute for The Alliance for Collaborative Research in Alternative Fuel Technology to test activated carbon monoliths for hydrogen gas storage. The HTF is an automated, computer-controlled volumetric instrument for rapid screening and manipulation of monoliths under an inert atmosphere (to exclude degradation of the carbon from exposure to oxygen). The HTF allows us to measure a large quantity (up to 500 g) of sample in a 0.5 l test tank, making our results less sensitive to sample inhomogeneity. The HTF can measure isotherms at pressures ranging from 1 to 300 bar at room temperature. For comparison, other volumetric instruments such as Hiden Isochema's HTP-1 Volumetric Analyser can only measure carbon samples up to 150 mg at pressures up to 200 bar. Work supported by US DOD Contract # N00164-08-C-GS37.
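
    The volumetric (Sieverts-type) measurement such an instrument automates reduces to gas bookkeeping: the amount adsorbed is the gas dosed from the manifold minus the gas left in the void volume after equilibration. The sketch below uses the ideal-gas law and invented volumes and pressures; a real instrument at 300 bar would use a real-gas equation of state.

    ```python
    R = 8.314  # J/(mol*K)

    def moles(p_bar, v_l, t_k):
        """Ideal-gas moles in a volume (simplification; see note above)."""
        return (p_bar * 1e5) * (v_l * 1e-3) / (R * t_k)

    def excess_adsorbed(p_dose, p_eq, v_manifold, v_void, t_k):
        """Sieverts-type bookkeeping: moles dosed from the manifold minus
        moles remaining in the gas phase after equilibration across the
        manifold plus the sample-cell void volume (assumed pre-evacuated)."""
        return moles(p_dose, v_manifold, t_k) - moles(p_eq, v_manifold + v_void, t_k)
    ```

    If the equilibrium pressure is exactly what free expansion into the void would predict, the computed uptake is zero; any pressure deficit beyond that is attributed to gas taken up by the sample.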

  18. Adaptive Sampling for High Throughput Data Using Similarity Measures

    SciTech Connect

    Bulaevskaya, V.; Sales, A. P.

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
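
    One simple instantiation of similarity-based adaptive sampling (an illustrative greedy distance filter, not necessarily the authors' measure) decides each point's fate in a single pass:

    ```python
    import math

    def stream_sample(points, min_dist=1.0):
        """Greedy similarity-based filter: keep an incoming point only if it
        lies at least min_dist from every point retained so far, so nearly
        redundant observations are dropped as the stream arrives."""
        kept = []
        for p in points:
            if all(math.dist(p, q) >= min_dist for q in kept):
                kept.append(p)
        return kept

    # densely spaced 2-D stream; only every third point survives the filter
    pts = [(i * 0.4, 0.0) for i in range(10)]
    kept = stream_sample(pts, min_dist=1.0)
    ```

    The per-point cost grows with the retained set, so practical variants bound or index the comparison set to keep the decision fast enough for high-rate streams.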

  19. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and for the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  20. High throughput screening technologies for ion channels

    PubMed Central

    Yu, Hai-bo; Li, Min; Wang, Wei-ping; Wang, Xiao-liang

    2016-01-01

    Ion channels are involved in a variety of fundamental physiological processes, and their malfunction causes numerous human diseases. Therefore, ion channels represent a class of attractive drug targets and a class of important off-targets for in vitro pharmacological profiling. In the past decades, the rapid progress in developing functional assays and instrumentation has enabled high throughput screening (HTS) campaigns on an expanding list of channel types. Chronologically, HTS methods for ion channels include the ligand binding assay, flux-based assay, fluorescence-based assay, and automated electrophysiological assay. In this review we summarize the current HTS technologies for different ion channel classes and their applications. PMID:26657056

  1. High throughput chemical munitions treatment system

    DOEpatents

    Haroldsen, Brent L. (Manteca, CA); Stofleth, Jerome H. (Albuquerque, NM); Didlake, John E., Jr.; Wu, Benjamin C-P. (San Ramon, CA)

    2011-11-01

    A new High-Throughput Explosive Destruction System is disclosed. The system comprises two side-by-side detonation containment vessels, each comprising first and second halves, that feed into a single agent treatment vessel. Both detonation containment vessels further comprise a surrounding ventilation facility. Moreover, the detonation containment vessels are designed to separate into two half-shells, wherein one shell can be moved axially away from the fixed second half for ease of access and loading. The vessels are closed by means of surrounding, clam-shell-type locking seal mechanisms.

  2. Preliminary High-Throughput Metagenome Assembly

    SciTech Connect

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or which fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).

  3. Economic consequences of high throughput maskless lithography

    NASA Astrophysics Data System (ADS)

    Hartley, John G.; Govindaraju, Lakshmi

    2005-11-01

    Many people in the semiconductor industry bemoan the high costs of masks and view mask cost as one of the significant barriers to bringing new chip designs to market. All that is needed is a viable maskless technology and the problem will go away. Numerous sites around the world are working on maskless lithography but inevitably, the question asked is "Wouldn't a one wafer per hour maskless tool make a really good mask writer?" Of course, the answer is yes; the hesitation you hear in the answer isn't based on technology concerns, it's financial. The industry needs maskless lithography because mask costs are too high. Mask costs are too high because mask pattern generators (PG's) are slow and expensive. If mask PG's become much faster, mask costs go down, the maskless market goes away and the PG supplier is faced with an even smaller tool demand from the mask shops. Technical success becomes financial suicide - or does it? In this paper we will present the results of a model that examines some of the consequences of introducing high throughput maskless pattern generation. Specific features in the model include tool throughput for masks and wafers, market segmentation by node for masks and wafers, and mask cost as an entry barrier to new chip designs. How does the availability of low cost masks and maskless tools affect the industry's tool makeup, and what is the ultimate potential market for high throughput maskless pattern generators?

  4. Incorporating High-Throughput Exposure Predictions with ...

    EPA Pesticide Factsheets

    We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast™ HTS data by translating in vitro bioactivity concentrations to oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compared against regulatory exposure estimates, providing an activity-to-exposure ratio (AER) useful for a risk-based ranking strategy. As ToxCast™ efforts expand (i.e., Phase II) beyond food-use pesticides towards a wider chemical domain that lacks exposure and toxicity information, prediction tools become increasingly important. In this study, in vitro hepatic clearance and plasma protein binding were measured to estimate OEDs for a subset of Phase II chemicals. OEDs were compared against high-throughput (HT) exposure predictions generated using probabilistic modeling and Bayesian approaches generated by the U.S. EPA ExpoCast™ program. This approach incorporated chemical-specific use and national production volume data with biomonitoring data to inform the exposure predictions. This HT exposure modeling approach provided predictions for all Phase II chemicals assessed in this study whereas estimates from regulatory sources were available for only 7% of chemicals. Of the 163 chemicals assessed in this study, three or 13 chemicals possessed AERs <1 or <100, respectively. Diverse bioactivity across a range of assays and concentrations was also noted across the wider chemical space su
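
    The activity-to-exposure ratio (AER) ranking described above reduces to simple arithmetic once an OED and an exposure prediction are in hand. The sketch below uses invented chemical names, OEDs, and exposure values purely for illustration:

    ```python
    def activity_to_exposure_ratio(oed_mg_kg_day, exposure_mg_kg_day):
        """AER: oral equivalent dose producing bioactivity divided by the
        predicted exposure; values near or below 1 flag potential concern."""
        return oed_mg_kg_day / exposure_mg_kg_day

    # hypothetical chemicals: (OED, predicted exposure), both in mg/kg/day
    chemicals = {
        "chem_A": (0.5, 1.0),    # bioactive below predicted exposure -> AER < 1
        "chem_B": (10.0, 0.001),
        "chem_C": (2.0, 0.05),
    }
    # risk-based ranking: smallest AER first
    ranked = sorted(chemicals, key=lambda c: activity_to_exposure_ratio(*chemicals[c]))
    ```

    Sorting by AER puts the chemical whose predicted exposure overlaps its bioactive dose range at the top of the priority list.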

  5. Modeling Steroidogenesis Disruption Using High-Throughput ...

    EPA Pesticide Factsheets

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  6. A high-throughput radiometric kinase assay

    PubMed Central

    Duong-Ly, Krisna C.; Peterson, Jeffrey R.

    2016-01-01

    Aberrant kinase signaling has been implicated in a number of diseases. While kinases have become attractive drug targets, only a small fraction of human protein kinases have validated inhibitors. Screening libraries of compounds against a kinase or kinases of interest is routinely performed during kinase inhibitor development to identify promising scaffolds for a particular target and to identify kinase targets for compounds of interest. Screening of more focused compound libraries may also be conducted in the later stages of inhibitor development to improve potency and optimize selectivity. The dot blot kinase assay is a robust, high-throughput kinase assay that can be used to screen a number of small molecule compounds against one kinase of interest or several kinases. Here, a protocol for a dot blot kinase assay used for measuring insulin receptor kinase activity is presented. This protocol can be readily adapted for use with other protein kinases. PMID:26501904

  7. High-Throughput Nonlinear Optical Microscopy

    PubMed Central

    So, Peter T.C.; Yew, Elijah Y.S.; Rowlands, Christopher

    2013-01-01

    High-resolution microscopy methods based on different nonlinear optical (NLO) contrast mechanisms are finding numerous applications in biology and medicine. While the basic implementations of these microscopy methods are relatively mature, an important direction of continuing technological innovation lies in improving the throughput of these systems. Throughput improvement is expected to be important for studying fast kinetic processes, for enabling clinical diagnosis and treatment, and for extending the field of image informatics. This review will provide an overview of the fundamental limitations on NLO microscopy throughput. We will further cover several important classes of high-throughput NLO microscope designs with discussions on their strengths and weaknesses and their key biomedical applications. Finally, this review will close with a perspective of potential future technological improvements in this field. PMID:24359736

  8. A high-throughput neutron spectrometer

    NASA Astrophysics Data System (ADS)

    Stampfl, Anton; Noakes, Terry; Bartsch, Friedl; Bertinshaw, Joel; Veliscek-Carolan, Jessica; Nateghi, Ebrahim; Raeside, Tyler; Yethiraj, Mohana; Danilkin, Sergey; Kearley, Gordon

    2010-03-01

    A cross-disciplinary high-throughput neutron spectrometer is currently under construction at OPAL, ANSTO's open pool light-water research reactor. The spectrometer is based on the design of a Be-filter spectrometer (FANS) operating at the National Institute of Standards and Technology research reactor in the USA. The ANSTO filter-spectrometer will be switched in and out with another neutron spectrometer, the triple-axis spectrometer Taipan. Thus two distinct types of neutron spectrometers will be accessible: one specialised for phonon dispersion analysis and the other, the filter-spectrometer, designed specifically to measure vibrational density of states. A summary of the design will be given along with a detailed ray-tracing analysis. Some preliminary results will be presented from the spectrometer.

  9. Sequential stopping for high-throughput experiments.

    PubMed

    Rossell, David; Müller, Peter

    2013-01-01

    In high-throughput experiments, the sample size is typically chosen informally. Most formal sample-size calculations depend critically on prior knowledge. We propose a sequential strategy that, by updating knowledge when new data are available, depends less critically on prior assumptions. Experiments are stopped or continued based on the potential benefits in obtaining additional data. The underlying decision-theoretic framework guarantees the design to proceed in a coherent fashion. We propose intuitively appealing, easy-to-implement utility functions. As in most sequential design problems, an exact solution is prohibitive. We propose a simulation-based approximation that uses decision boundaries. We apply the method to RNA-seq, microarray, and reverse-phase protein array studies and show its potential advantages. The approach has been added to the Bioconductor package gaga.
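
    A toy version of sequential stopping (a crude standard-error rule, not the authors' decision-theoretic boundaries) illustrates the stop-or-continue structure: after each batch, the benefit of more data is weighed against a precision target.

    ```python
    import math
    import random

    def sequential_run(stream, batch=50, max_n=2000, se_target=0.1):
        """Toy sequential design: after each batch, stop if the standard
        error of the mean falls below se_target -- a crude stand-in for
        comparing the expected benefit of more data against its cost."""
        data = []
        for x in stream:
            data.append(x)
            n = len(data)
            if n % batch == 0:
                mean = sum(data) / n
                var = sum((v - mean) ** 2 for v in data) / (n - 1)
                if math.sqrt(var / n) < se_target or n >= max_n:
                    return n, mean
        return len(data), sum(data) / len(data)

    # synthetic stream of noisy measurements; stopping adapts to their spread
    random.seed(42)
    stream = (random.gauss(0.0, 1.0) for _ in range(10_000))
    n_used, estimate = sequential_run(stream)
    ```

    Because the rule re-evaluates after every batch, noisier streams automatically run longer and cleaner ones stop early — the informal payoff of sequential over fixed-sample design.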

  10. High Throughput Screening Tools for Thermoelectric Materials

    NASA Astrophysics Data System (ADS)

    Wong-Ng, W.; Yan, Y.; Otani, M.; Martin, J.; Talley, K. R.; Barron, S.; Carroll, D. L.; Hewitt, C.; Joress, H.; Thomas, E. L.; Green, M. L.; Tang, X. F.

    2015-06-01

    A suite of complementary high-throughput screening systems for combinatorial films was developed at the National Institute of Standards and Technology to facilitate the search for efficient thermoelectric materials. These custom-designed capabilities include a facility for combinatorial thin film synthesis and a suite of tools for screening the Seebeck coefficient, electrical resistance (electrical resistivity), and thermal effusivity (thermal conductivity) of these films. The Seebeck coefficient and resistance are measured via custom-built automated apparatus at both ambient and high temperatures. Thermal effusivity is measured using a frequency domain thermoreflectance technique. This paper will discuss applications using these tools on representative thermoelectric materials, including combinatorial composition-spread films, conventional films, single crystals, and ribbons.

  11. High-throughput electrophysiology with Xenopus oocytes

    PubMed Central

    Papke, Roger L.; Smith-Maxwell, Cathy

    2010-01-01

    Voltage-clamp techniques are typically used to study the plasma membrane proteins, such as ion channels and transporters that control bioelectrical signals. Many of these proteins have been cloned and can now be studied as potential targets for drug development. The two approaches most commonly used for heterologous expression of cloned ion channels and transporters involve either transfection of the genes into small cells grown in tissue culture or the injection of the genetic material into larger cells. The standard large cells used for the expression of cloned cDNA or synthetic RNA are the egg progenitor cells (oocytes) of the African frog, Xenopus laevis. Until recently, cellular electrophysiology was performed manually, one cell at a time by a single operator. However, methods of high-throughput electrophysiology have been developed which are automated and permit data acquisition and analysis from multiple cells in parallel. These methods are breaking a bottleneck in drug discovery, useful in some cases for primary screening as well as for thorough characterization of new drugs. Increasing throughput of high-quality functional data greatly augments the efficiency of academic research and pharmaceutical drug development. Some examples of studies that benefit most from high-throughput electrophysiology include pharmaceutical screening of targeted compound libraries, secondary screening of identified compounds for subtype selectivity, screening mutants of ligand-gated channels for changes in receptor function, scanning mutagenesis of protein segments, and mutant-cycle analysis. We describe here the main features and potential applications of OpusXpress, an efficient commercially available system for automated recording from Xenopus oocytes. We show some types of data that have been gathered by this system and review realized and potential applications. PMID:19149490

  12. Origin and evolution of high throughput screening

    PubMed Central

    Pereira, D A; Williams, J A

    2007-01-01

    This article reviews the origin and evolution of high throughput screening (HTS) through the experience of an individual pharmaceutical company, revealing some of the mysteries of the early stages of drug discovery to the wider pharmacology audience. HTS in this company (Pfizer, Groton, USA) had its origin in natural products screening in 1986, by substituting fermentation broths with dimethyl sulphoxide solutions of synthetic compounds, using 96-well plates and reduced assay volumes of 50-100 μl. A nominal 30 mM source compound concentration provided high μM assay concentrations. Starting at 800 compounds each week, the process reached a steady state of 7200 compounds per week by 1989. Screening in the Applied Biotechnology and Screening Group was centralized with screens operating in lock-step to maximize efficiency. Initial screens were full files run in triplicate. Autoradiography and image analysis were introduced for 125I receptor ligand screens. Reverse transcriptase (RT) coupled with quantitative PCR and multiplexing addressed several targets in a single assay. By 1992 HTS produced 'hits' as starting matter for approximately 40% of the Discovery portfolio. In 1995, the HTS methodology was expanded to include ADMET targets. ADME targets required each compound to be physically detected, leading to the development of automated high throughput LC-MS. In 1996, 90 compounds/week were screened in microsomal, protein binding and serum stability assays. Subsequently, the mutagenic Ames assay was adapted to a 96-well plate liquid assay and novel algorithms permitted automated image analysis of the micronucleus assay. By 1999 ADME HTS was fully integrated into the discovery cycle. PMID:17603542

  13. High-Throughput Methods for Electron Crystallography

    PubMed Central

    Stokes, David L.; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas

    2013-01-01

Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing the natural environment of a lipid membrane. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, images and diffraction can be recorded by electron microscopy. The corresponding data can be combined to produce a three-dimensional reconstruction which, under favorable conditions, can extend to atomic resolution. Like X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative and potentially complementary methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on detergent complexation by cyclodextrin; a specialized pipetting robot has been designed not only to titrate cyclodextrin, but to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described. PMID:23132066

  14. Compression of Structured High-Throughput Sequencing Data

    PubMed Central

    Campagne, Fabien; Dorff, Kevin C.; Chambwe, Nyasha; Robinson, James T.; Mesirov, Jill P.

    2013-01-01

Large biological datasets are being produced at a rapid pace and create substantial storage challenges, particularly in the domain of high-throughput sequencing (HTS). Most approaches currently used to store HTS data are either unable to quickly adapt to the requirements of new sequencing or analysis methods (because they do not support schema evolution), or fail to provide state-of-the-art compression of the datasets. We have devised new approaches to store HTS data that support seamless data schema evolution and compress datasets substantially better than existing approaches. Building on these new approaches, we discuss and demonstrate how a multi-tier data organization can dramatically reduce the storage, computational and network burden of collecting, analyzing, and archiving large sequencing datasets. For instance, we show that spliced RNA-Seq alignments can be stored in less than 4% of the size of a BAM file with perfect data fidelity. Compared to the previous compression state of the art, these methods reduce dataset size more than 40% when storing exome, gene expression or DNA methylation datasets. The approaches have been integrated in a comprehensive suite of software tools (http://goby.campagnelab.org) that support common analyses for a range of high-throughput sequencing assays. PMID:24260313
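    The gains available from structure-aware storage can be illustrated with a toy delta encoding of coordinate-sorted alignment start positions. This is a sketch of the general idea only, not the published format; `zlib` stands in for the actual codecs, and the positions are synthetic:

```python
import struct
import zlib

def pack(positions):
    """Compress positions stored naively as raw 32-bit little-endian ints."""
    return zlib.compress(b"".join(struct.pack("<I", p) for p in positions))

def pack_delta(positions):
    """Delta-encode sorted positions before compressing.

    Coordinate-sorted alignment starts are mostly small increments, and a
    stream of small repeated deltas compresses far better than the raw
    absolute coordinates.
    """
    deltas = [positions[0]] + [b - a for a, b in zip(positions, positions[1:])]
    return zlib.compress(b"".join(struct.pack("<I", d) for d in deltas))

# Synthetic sorted alignment start positions (illustrative only)
positions = list(range(1_000_000, 1_100_000, 37))
raw, delta = len(pack(positions)), len(pack_delta(positions))
print(raw, delta)  # expect the delta stream to be markedly smaller
```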

  15. High throughput inclusion body sizing: Nano particle tracking analysis.

    PubMed

    Reichelt, Wieland N; Kaineder, Andreas; Brillmann, Markus; Neutsch, Lukas; Taschauer, Alexander; Lohninger, Hans; Herwig, Christoph

    2017-06-01

The expression of pharmaceutically relevant proteins in Escherichia coli frequently triggers inclusion body (IB) formation caused by protein aggregation. In the scientific literature, substantial effort has been devoted to the quantification of IB size. However, particle-based methods used up to this point to analyze the physical properties of representative numbers of IBs lack sensitivity and/or orthogonal verification. Using high pressure freezing and automated freeze substitution for transmission electron microscopy (TEM) the cytosolic inclusion body structure was preserved within the cells. TEM imaging in combination with manual grey scale image segmentation allowed the quantification of relative areas covered by the inclusion body within the cytosol. As a high-throughput method, nanoparticle tracking analysis (NTA) enables one to derive the diameter of inclusion bodies in cell homogenate based on a measurement of the Brownian motion. The NTA analysis of fixated (glutaraldehyde) and non-fixated IBs suggests that high pressure homogenization annihilates the native physiological shape of IBs. Nevertheless, the ratio of particle counts of non-fixated and fixated samples could potentially serve as a factor for particle stickiness. In this contribution, we establish image segmentation of TEM pictures as an orthogonal method to size biologic particles in the cytosol of cells. More importantly, NTA has been established as a particle-based, fast and high throughput method (1000-3000 particles), thus constituting a much more accurate and representative analysis than currently available methods. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
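    The conversion from tracked Brownian motion to particle diameter rests on the Stokes-Einstein relation: the diffusion coefficient is estimated from the mean squared displacement (MSD) of each track, then inverted to a hydrodynamic diameter. A minimal sketch with illustrative inputs (the MSD and frame interval below are hypothetical, not values from the study):

```python
import math

def hydrodynamic_diameter(msd_um2, dt_s, temp_k=293.15, viscosity_pa_s=1.0e-3):
    """Estimate a particle's hydrodynamic diameter (nm) from 2-D tracking.

    msd_um2 : mean squared displacement per frame interval (um^2)
    dt_s    : time between frames (s)
    In two dimensions MSD = 4*D*dt, so D = MSD / (4*dt); the
    Stokes-Einstein relation then gives d = k_B*T / (3*pi*eta*D).
    Defaults assume water at 20 degC.
    """
    k_b = 1.380649e-23                    # Boltzmann constant, J/K
    msd_m2 = msd_um2 * 1e-12              # um^2 -> m^2
    diff = msd_m2 / (4.0 * dt_s)          # diffusion coefficient, m^2/s
    d_m = k_b * temp_k / (3.0 * math.pi * viscosity_pa_s * diff)
    return d_m * 1e9                      # m -> nm

# Hypothetical track: ~0.1 um^2 displacement per 33 ms video frame
print(round(hydrodynamic_diameter(0.1, 0.033), 1))
```

    Larger particles diffuse more slowly, so a smaller per-frame MSD maps to a larger diameter, which is why the fixation-induced shape changes discussed above matter for the measurement.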

  16. Structuring intuition with theory: The high-throughput way

    NASA Astrophysics Data System (ADS)

    Fornari, Marco

    2015-03-01

First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple framework to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies to guide the discovery process by quantitatively structuring scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.

  17. High-throughput screening to enhance oncolytic virus immunotherapy

    PubMed Central

    Allan, KJ; Stojdl, David F; Swift, SL

    2016-01-01

    High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. PMID:27579293

  18. A high throughput respirometric assay for mitochondrial biogenesis and toxicity

    PubMed Central

    Beeson, Craig C.; Beeson, Gyda C.; Schnellmann, Rick G.

    2010-01-01

Mitochondria are a common target of toxicity for drugs and other chemicals, and such toxicity results in decreased aerobic metabolism and cell death. In contrast, mitochondrial biogenesis restores cell vitality, and there is a need for new agents to induce biogenesis. Current cell-based models of mitochondrial biogenesis or toxicity are inadequate because cultured cell lines are highly glycolytic with minimal aerobic metabolism and altered mitochondrial physiology. In addition, there are no high-throughput, real-time assays that assess mitochondrial function. We adapted primary cultures of renal proximal tubular cells (RPTC) that exhibit in vivo levels of aerobic metabolism, are not glycolytic, and retain higher levels of differentiated functions and used the Seahorse Biosciences analyzer to measure mitochondrial function in real time in multi-well plates. Using uncoupled respiration as a marker of electron transport chain (ETC) integrity, the nephrotoxicants cisplatin, HgCl2 and gentamicin exhibited mitochondrial toxicity prior to decreases in basal respiration and cell death. Conversely, using FCCP-uncoupled respiration as a marker of maximal ETC activity, 1-(2,5-dimethoxy-4-iodophenyl)-2-aminopropane (DOI), SRT1720, resveratrol, daidzein, and metformin produced mitochondrial biogenesis in RPTC. The merger of the RPTC model and multi-well respirometry results in a single high throughput assay to measure mitochondrial biogenesis and toxicity, and nephrotoxic potential. PMID:20465991
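    The basal and FCCP-uncoupled respiration readouts described above are commonly combined into simple derived metrics such as spare respiratory capacity. A minimal sketch with hypothetical oxygen consumption rate (OCR) values, not numbers from the study:

```python
def spare_respiratory_capacity(basal_ocr, fccp_ocr):
    """Derive common respirometry metrics from two OCR measurements.

    basal_ocr : oxygen consumption rate before treatment (pmol O2/min)
    fccp_ocr  : maximal OCR after FCCP uncoupling (pmol O2/min)
    Returns the absolute spare capacity and the maximal/basal ratio;
    a ratio near or below 1 suggests a compromised electron transport chain.
    """
    spare = fccp_ocr - basal_ocr
    ratio = fccp_ocr / basal_ocr
    return spare, ratio

# Hypothetical well readings from a multi-well respirometry plate
spare, ratio = spare_respiratory_capacity(120.0, 300.0)
print(spare, round(ratio, 2))
```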

  19. High-Throughput Screening in Primary Neurons

    PubMed Central

    Sharma, Punita; Ando, D. Michael; Daub, Aaron; Kaye, Julia A.; Finkbeiner, Steven

    2013-01-01

Despite years of incremental progress in our understanding of diseases such as Alzheimer's disease (AD), Parkinson's disease (PD), Huntington's disease (HD), and amyotrophic lateral sclerosis (ALS), there are still no disease-modifying therapeutics. The discrepancy between the number of lead compounds and approved drugs may partially be a result of the methods used to generate the leads and highlights the need for new technology to obtain more detailed and physiologically relevant information on cellular processes in normal and diseased states. Our high-throughput screening (HTS) system in a primary neuron model can help address this unmet need. HTS allows scientists to assay thousands of conditions in a short period of time, which can reveal completely new aspects of biology and identify potential therapeutics in the span of a few months when conventional methods could take years or fail altogether. HTS in primary neurons combines the advantages of HTS with the biological relevance of intact, fully differentiated neurons which can capture the critical cellular events or homeostatic states that make neurons uniquely susceptible to disease-associated proteins. We detail methodologies of our primary neuron HTS assay workflow from sample preparation to data reporting. We also discuss our adaptation of our HTS system into high-content screening (HCS), a type of HTS that uses multichannel fluorescence images to capture biological events in situ, and is uniquely suited to study dynamical processes in living cells. PMID:22341232

  20. AOPs and Biomarkers: Bridging High Throughput Screening ...

    EPA Pesticide Factsheets

As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will benefit from additional data sources that connect the magnitude of perturbation from the in vitro system to a level of concern at the organism or population level. The adverse outcome pathway (AOP) concept provides an ideal framework for combining these complementary data. Recent international efforts under the auspices of the Organization for Economic Co-operation and Development (OECD) have resulted in an AOP wiki designed to house formal descriptions of AOPs suitable for use in regulatory decision making. Recent efforts have built upon this to include an ontology describing the AOP with linkages to biological pathways, physiological terminology, and taxonomic applicability domains. Incorporation of an AOP network tool developed by the U.S. Army Corps of Engineers also allows consideration of cumulative risk from chemical and non-chemical stressors. Biomarkers are an important complement to formal AOP descriptions, particularly when dealing with susceptible subpopulations or lifestages in human health risk assessment. To address the issue of nonchemical stressors that may modify effects of criteria air pollutants, a novel method was used to integrate blood gene expression data with hema

  1. High-Throughput Enzyme Kinetics Using Microarrays

    SciTech Connect

    Guoxin Lu; Edward S. Yeung

    2007-11-01

We report a microanalytical method to study enzyme kinetics. The technique involves immobilizing horseradish peroxidase on a poly-L-lysine (PLL)-coated glass slide in a microarray format, followed by applying substrate solution onto the enzyme microarray. Enzyme molecules are immobilized on the PLL-coated glass slide through electrostatic interactions, and no further modification of the enzyme or glass slide is needed. In situ detection of the products generated on the enzyme spots is made possible by monitoring the light intensity of each spot using a scientific-grade charge-coupled device (CCD). Reactions of substrate solutions of various types and concentrations can be carried out sequentially on one enzyme microarray. To account for the loss of enzyme from washing in between runs, a standard substrate solution is used for calibration. Substantially reduced amounts of substrate solution are consumed for each reaction on each enzyme spot. The Michaelis constant Km obtained by using this method is comparable to the result for homogeneous solutions. Absorbance detection allows universal monitoring, and no chemical modification of the substrate is needed. High-throughput studies of native enzyme kinetics for multiple enzymes are therefore possible in a simple, rapid, and low-cost manner.
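    A Michaelis constant comparable to homogeneous-solution values can be extracted from per-spot rate measurements at several substrate concentrations. A minimal fitting sketch using the classical Lineweaver-Burk linearization (the data below are synthetic, not values from the study):

```python
def michaelis_menten_fit(s, v):
    """Estimate Km and Vmax from substrate concentrations s and rates v.

    Uses the Lineweaver-Burk linearization 1/v = (Km/Vmax)*(1/s) + 1/Vmax
    and ordinary least squares on (1/s, 1/v). Fine for a sketch; direct
    nonlinear fitting is preferred in practice because the double
    reciprocal exaggerates errors at low substrate concentrations.
    """
    x = [1.0 / si for si in s]
    y = [1.0 / vi for vi in v]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    km = slope * vmax
    return km, vmax

# Synthetic rates generated from Km = 2.0, Vmax = 10.0 (illustrative units)
s = [0.5, 1.0, 2.0, 4.0, 8.0]
v = [10.0 * si / (2.0 + si) for si in s]
km, vmax = michaelis_menten_fit(s, v)
print(round(km, 3), round(vmax, 3))
```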

  3. High-throughput rod-induced electrospinning

    NASA Astrophysics Data System (ADS)

    Wu, Dezhi; Xiao, Zhiming; Teh, Kwok Siong; Han, Zhibin; Luo, Guoxi; Shi, Chuan; Sun, Daoheng; Zhao, Jinbao; Lin, Liwei

    2016-09-01

A high throughput electrospinning process, directly from flat polymer solution surfaces induced by a moving insulating rod, has been proposed and demonstrated. Different rods made of either phenolic resin or paper, with a diameter of 1-3 cm and a resistance of about 100-500 MΩ, have been successfully utilized in the process. The rod is placed approximately 10 mm above the flat polymer solution surface with a moving speed of 0.005-0.4 m s-1; this causes the solution to generate multiple liquid jets under an applied voltage of 15-60 kV for the tip-less electrospinning process. The local electric field induced by the rod can boost electrohydrodynamic instability in order to generate Taylor cones and liquid jets. Experimentally, it is found that a large rod diameter and a small solution-to-rod distance can enhance the local electrical field to reduce the magnitude of the applied voltage. In the prototype setup with poly(ethylene oxide) polymer solution, an area of 5 cm  ×  10 cm and under an applied voltage of 60 kV, the maximum throughput of nanofibers is recorded to be approximately 144 g m-2 h-1.
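    The reported areal throughput translates directly into an absolute production rate for the prototype's collection area. A quick check using the figures quoted in the abstract:

```python
# Figures quoted in the abstract for the prototype setup
areal_rate = 144.0         # nanofiber throughput, g per m^2 per hour
area_m2 = 0.05 * 0.10      # 5 cm x 10 cm solution surface, in m^2
total_rate_g_per_h = areal_rate * area_m2
print(total_rate_g_per_h)  # grams of nanofiber produced per hour
```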

  4. New High Throughput Methods to Estimate Chemical ...

    EPA Pesticide Factsheets

    EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing and screening chemicals. A recent report by the National Research Council of the National Academies, Exposure Science in the 21st Century: A Vision and a Strategy (NRC 2012) laid out a number of applications in chemical evaluation of both toxicity and risk in critical need of quantitative exposure predictions, including screening and prioritization of chemicals for targeted toxicity testing, focused exposure assessments or monitoring studies, and quantification of population vulnerability. Despite these significant needs, for the majority of chemicals (e.g. non-pesticide environmental compounds) there are no or limited estimates of exposure. For example, exposure estimates exist for only 7% of the ToxCast Phase II chemical list. In addition, the data required for generating exposure estimates for large numbers of chemicals is severely lacking (Egeghy et al. 2012). This SAP reviewed the use of EPA's ExpoCast model to rapidly estimate potential chemical exposures for prioritization and screening purposes. The focus was on bounded chemical exposure values for people and the environment for the Endocrine Disruptor Screening Program (EDSP) Universe of Chemicals. In addition to exposure, the SAP

  5. High-throughput Crystallography for Structural Genomics

    PubMed Central

    Joachimiak, Andrzej

    2009-01-01

Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress and now over 53,000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact. PMID:19765976

  7. Interactive Visual Analysis of High Throughput Text Streams

    SciTech Connect

    Steed, Chad A; Potok, Thomas E; Patton, Robert M; Goodall, John R; Maness, Christopher S; Senter, James K; Potok, Thomas E

    2012-01-01

The scale, velocity, and dynamic nature of large scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  8. Microfluidics for High-Throughput Quantitative Studies of Early Development.

    PubMed

    Levario, Thomas J; Lim, Bomyi; Shvartsman, Stanislav Y; Lu, Hang

    2016-07-11

    Developmental biology has traditionally relied on qualitative analyses; recently, however, as in other fields of biology, researchers have become increasingly interested in acquiring quantitative knowledge about embryogenesis. Advances in fluorescence microscopy are enabling high-content imaging in live specimens. At the same time, microfluidics and automation technologies are increasing experimental throughput for studies of multicellular models of development. Furthermore, computer vision methods for processing and analyzing bioimage data are now leading the way toward quantitative biology. Here, we review advances in the areas of fluorescence microscopy, microfluidics, and data analysis that are instrumental to performing high-content, high-throughput studies in biology and specifically in development. We discuss a case study of how these techniques have allowed quantitative analysis and modeling of pattern formation in the Drosophila embryo.

  9. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

We have shown that by covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high throughput crystallizations. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, since only the protein or protein-containing structures show up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. We are now testing the use of high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider
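    The screening logic described above, bright connected blobs against a dark drop, reduces to thresholding plus connected-component counting. A toy sketch on a small synthetic intensity grid (the image, threshold, and function name are illustrative, not the authors' pipeline):

```python
def bright_regions(image, threshold):
    """Count 4-connected regions of above-threshold pixels in a 2-D grid.

    Fluorescent crystals appear as bright connected blobs against a dark
    background, so counting blobs above an intensity cutoff flags
    candidate hits without delineating the drop boundary first.
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                regions += 1
                stack = [(r, c)]  # flood-fill this blob
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < rows and 0 <= x < cols
                            and image[y][x] >= threshold and not seen[y][x]):
                        seen[y][x] = True
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return regions

# Synthetic 5x6 "drop image" containing two bright blobs
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 0, 0, 8, 0],
    [0, 0, 0, 0, 8, 0],
    [0, 0, 0, 0, 0, 0],
]
print(bright_regions(img, 5))
```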

  10. High Throughput Determination of Critical Human Dosing Parameters (SOT)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...
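    In its simplest form, the reverse dosimetry step mentioned above reduces to a single scaling under a linear-toxicokinetics assumption. A minimal sketch; the function name and values are illustrative, not EPA's actual model:

```python
def oral_equivalent_dose(ac50_um, css_um):
    """Convert an in vitro AC50 (uM) to an oral equivalent dose (mg/kg/day).

    css_um is the predicted steady-state plasma concentration (uM) that a
    constant 1 mg/kg/day exposure would produce. Assuming linear
    toxicokinetics, dose scales proportionally with concentration, so the
    dose needed to reach the AC50 in plasma is AC50 / Css.
    """
    return ac50_um / css_um  # mg/kg/day

# Hypothetical chemical: AC50 of 3 uM, predicted Css of 1.5 uM per 1 mg/kg/day
print(oral_equivalent_dose(3.0, 1.5))
```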

  11. An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery

    PubMed Central

    Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing

    2010-01-01

    The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897

  12. Applications of ambient mass spectrometry in high-throughput screening.

    PubMed

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation. Recently, its applications in high-throughput screening have been in rapid progress. During the past decade, various ambient ionization techniques have been developed and applied in high-throughput screening. This review discusses typical applications of AMS, including DESI (desorption electrospray ionization), DART (direct analysis in real time), EESI (extractive electrospray ionization), etc., in high-throughput screening (HTS).

  13. High-throughput techniques for compound characterization and purification.

    PubMed

    Kyranos, J N; Cai, H; Zhang, B; Goetzinger, W K

    2001-11-01

    A new paradigm in drug discovery is the synthesis of structurally diverse collections of compounds, so-called libraries, followed by high-throughput biological screening. High-throughput characterization and purification techniques are required to provide high-quality compounds and reliable biological data, which has led to the development of faster methods, system automation and parallel approaches. This review summarizes recent advances in support of analytical characterization and preparative purification technologies. Notably, mass spectrometry (MS) and supercritical fluid chromatography (SFC) are among the areas where new developments have had a major impact on defining these high-throughput applications.

  14. Massively Parallel Rogue Cell Detection Using Serial Time-Encoded Amplified Microscopy of Inertially Ordered Cells in High Throughput Flow

    DTIC Science & Technology

    2012-08-01

    To show the utility of the STEAM flow analyzer, we used it to demonstrate high-throughput screening of Saccharomyces cerevisiae, commonly known as...protein therapeutics. Growth in yeast can be studied and optimized by flow cytometry or microscopy – both of which possess specific limitations...and provides poor characterization of the asymmetric growth of yeast in comparison with imaging. A high-throughput microscopy technique such as our

  15. High-Throughput Pharmacokinetics for Environmental Chemicals (SOT)

    EPA Science Inventory

    High throughput screening (HTS) promises to allow prioritization of thousands of environmental chemicals with little or no in vivo information. For bioactivity identified by HTS, toxicokinetic (TK) models are essential to predict exposure thresholds below which no significant bio...

  16. Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)

    EPA Science Inventory

    High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...

  17. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  1. HIGH THROUGHPUT ASSESSMENTS OF CONVENTIONAL AND ALTERNATIVE COMPOUNDS

    EPA Science Inventory

    High throughput approaches for quantifying chemical hazard, exposure, and sustainability have the potential to dramatically impact the pace and nature of risk assessments. Integrated evaluation strategies developed at the US EPA incorporate inherency, bioactivity, bioavailability, ...

  2. MIPHENO: Data normalization for high throughput metabolic analysis.

    EPA Science Inventory

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  3. Development of A High Throughput Method Incorporating Traditional Analytical Devices

    PubMed Central

    White, C. C.; Embree, E.; Byrd, W. E; Patel, A. R.

    2004-01-01

    A high-throughput system and a companion informatics system have been developed and implemented. High throughput is defined here as the ability to autonomously evaluate large numbers of samples, while the informatics system provides software control of the physical devices, in addition to the organization and storage of the generated electronic data. This high-throughput system integrates both an ultraviolet-visible spectrometer (UV-Vis) and a Fourier transform infrared spectrometer (FTIR) with a multi-sample positioning table. The method is designed to quantify changes in polymeric materials occurring from controlled temperature, humidity and high-flux UV exposures. The integration of the software control of these analytical instruments within a single computer system is presented. Challenges in extending the system to include additional analytical devices are discussed. PMID:27366626

  6. Automatic Dendritic Length Quantification for High Throughput Screening of Mature Neurons

    PubMed Central

    Smafield, Timothy; Pasupuleti, Venkat; Sharma, Kamal; Huganir, Richard L.; Ye, Bing

    2015-01-01

    High-throughput automated fluorescent imaging and screening are important for studying neuronal development, function, and pathogenesis. An automatic approach to analyzing images acquired in an automated fashion and quantifying dendritic characteristics is critical for making such screens high-throughput. However, automatic and effective algorithms and tools, especially for images of mature mammalian neurons with complex arbors, have been lacking. Here, we present algorithms and a tool for quantifying dendritic length, which is fundamental for analyzing the growth of neuronal networks. We employ a divide-and-conquer framework that tackles the challenges of high-throughput images of neurons and enables the integration of multiple automatic algorithms. Within this framework, we developed algorithms that adapt to local properties to detect faint branches. We also developed a path search that preserves curvature changes to accurately measure dendritic length across arbor branches and turns. In addition, we propose an ensemble strategy of three estimation algorithms to further improve the overall efficacy. We tested our tool on images of cultured mouse hippocampal neurons immunostained with a dendritic marker for a high-throughput screen. The results demonstrate the effectiveness of the proposed method, whose accuracy compares favorably with that of previous methods. The software has been implemented as an ImageJ plugin and is available for use. PMID:25854493
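    At its core, the length of a traced dendrite is the sum of distances between consecutive centerline points. A minimal sketch of that measurement (not the paper's divide-and-conquer algorithm, which adds faint-branch detection and a curvature-preserving path search):

```python
import math

def path_length(points):
    """Sum Euclidean distances between consecutive (x, y) points
    along a traced dendrite centerline."""
    return sum(math.dist(p, q) for p, q in zip(points, points[1:]))

# A right-angle turn: (0,0) -> (3,0) -> (3,4) has length 3 + 4 = 7
trace = [(0.0, 0.0), (3.0, 0.0), (3.0, 4.0)]
print(path_length(trace))  # 7.0
```

    Real tracings would supply many closely spaced points per branch, with branch lengths summed over the whole arbor.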

  7. High-throughput quantification of hydroxyproline for determination of collagen.

    PubMed

    Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan

    2011-10-15

    An accurate and high-throughput assay for collagen is essential for collagen research and the development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The current method for determining hydroxyproline is limited by the time required for sample preparation, namely acid hydrolysis and neutralization prior to assay. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline.
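    Once absorbance is read, hydroxyproline is back-calculated from a standard curve and scaled to collagen. A minimal sketch; the slope, intercept and the 7.46 mass-conversion factor (hydroxyproline taken as roughly 13.4% of collagen by mass) are illustrative literature-style values, not figures from this paper:

```python
def hydroxyproline_to_collagen(absorbance, slope, intercept, factor=7.46):
    """Back-calculate hydroxyproline (ug) from a colorimetric standard
    curve A = slope*c + intercept, then scale to collagen content.
    The 7.46 factor is a commonly cited conversion, assumed here."""
    hyp = (absorbance - intercept) / slope
    return hyp * factor

# Hypothetical standard curve: slope 0.05 AU/ug, intercept 0.02 AU
print(round(hydroxyproline_to_collagen(0.52, 0.05, 0.02), 2))  # 74.6
```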

  8. RABiT-II: Implementation of a High-Throughput Micronucleus Biodosimetry Assay on Commercial Biotech Robotic Systems.

    PubMed

    Repin, Mikhail; Pampou, Sergey; Karan, Charles; Brenner, David J; Garty, Guy

    2017-02-23

    We demonstrate the use of high-throughput biodosimetry platforms based on commercial high-throughput/high-content screening robotic systems. The cytokinesis-block micronucleus (CBMN) assay, using only 20 μl whole blood from a fingerstick, was implemented on a PerkinElmer cell::explorer and General Electric IN Cell Analyzer 2000. On average 500 binucleated cells per sample were detected by our FluorQuantMN software. A calibration curve was generated in the radiation dose range up to 5.0 Gy using the data from 8 donors and 48,083 binucleated cells in total. The study described here demonstrates that high-throughput radiation biodosimetry is practical using current commercial high-throughput/high-content screening robotic systems, which can be readily programmed to perform and analyze robotics-optimized cytogenetic assays. Application to other commercial high-throughput/high-content screening systems beyond the ones used in this study is clearly practical. This approach will allow much wider access to high-throughput biodosimetric screening for large-scale radiological incidents than is currently available.
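    Micronucleus yields are conventionally fit to a linear-quadratic dose response, Y = c + aD + bD^2, to build such a calibration curve. A least-squares sketch with hypothetical data (the study's own calibration values are not reproduced here):

```python
def fit_quadratic(doses, yields):
    """Least-squares fit of Y = c + a*D + b*D**2 via normal equations,
    returning [c, a, b]."""
    n = 3  # design-matrix columns: 1, D, D**2
    xtx = [[sum(d ** (i + j) for d in doses) for j in range(n)] for i in range(n)]
    xty = [sum(y * d ** i for d, y in zip(doses, yields)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(xtx[r][col]))
        xtx[col], xtx[piv] = xtx[piv], xtx[col]
        xty[col], xty[piv] = xty[piv], xty[col]
        for r in range(col + 1, n):
            f = xtx[r][col] / xtx[col][col]
            for c in range(col, n):
                xtx[r][c] -= f * xtx[col][c]
            xty[r] -= f * xty[col]
    coef = [0.0] * n
    for r in range(n - 1, -1, -1):
        coef[r] = (xty[r] - sum(xtx[r][c] * coef[c] for c in range(r + 1, n))) / xtx[r][r]
    return coef

# Hypothetical micronucleus yields generated from Y = 0.01 + 0.05*D + 0.03*D**2
doses = [0.0, 0.5, 1.0, 2.0, 3.0, 4.0, 5.0]
yields = [0.01 + 0.05 * d + 0.03 * d ** 2 for d in doses]
c, a, b = fit_quadratic(doses, yields)
```

    With real scoring data the yields would be noisy binucleated-cell counts, and the fitted curve would then be inverted to estimate dose from an observed yield.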

  9. Improving the specificity of high-throughput ortholog prediction

    PubMed Central

    Fulton, Debra L; Li, Yvonne Y; Laird, Matthew R; Horsman, Benjamin GS; Roche, Fiona M; Brinkman, Fiona SL

    2006-01-01

    Background Orthologs (genes that have diverged after a speciation event) tend to have similar function, and so their prediction has become an important component of comparative genomics and genome annotation. The gold standard phylogenetic analysis approach of comparing available organismal phylogeny to gene phylogeny is not easily automated for genome-wide analysis; therefore, ortholog prediction for large genome-scale datasets is typically performed using a reciprocal-best-BLAST-hits (RBH) approach. One problem with RBH is that it will incorrectly predict a paralog as an ortholog when incomplete genome sequences or gene loss is involved. In addition, there is an increasing interest in identifying orthologs most likely to have retained similar function. Results To address these issues, we present here a high-throughput computational method named Ortholuge that further evaluates previously predicted orthologs (including those predicted using an RBH-based approach) – identifying which orthologs most closely reflect species divergence and may more likely have similar function. Ortholuge analyzes phylogenetic distance ratios involving two comparison species and an outgroup species, noting cases where relative gene divergence is atypical. It also identifies some cases of gene duplication after species divergence. Through simulations of incomplete genome data/gene loss, we show that the vast majority of genes falsely predicted as orthologs by an RBH-based method can be identified. Ortholuge was then used to estimate the number of false-positives (predominantly paralogs) in selected RBH-predicted ortholog datasets, identifying approximately 10% paralogs in a eukaryotic data set (mouse-rat comparison) and 5% in a bacterial data set (Pseudomonas putida – Pseudomonas syringae species comparison). Higher quality (more precise) datasets of orthologs, which we term "ssd-orthologs" (supporting-species-divergence-orthologs), were also constructed. These datasets, as well as
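    The reciprocal-best-BLAST-hits (RBH) prediction that Ortholuge post-processes can be sketched in a few lines; the table layout here is illustrative:

```python
def reciprocal_best_hits(hits_ab, hits_ba):
    """Given best-hit tables between genomes A and B, each mapping
    {query: (best_subject, bitscore)}, return gene pairs that are
    each other's best hit -- the RBH ortholog prediction."""
    pairs = set()
    for a, (b, _) in hits_ab.items():
        back = hits_ba.get(b)
        if back is not None and back[0] == a:
            pairs.add((a, b))
    return pairs

# Toy example: geneA1/geneB1 are reciprocal best hits; geneA2 is not,
# because its best hit geneB2 prefers geneA3 (e.g. a paralog).
hits_ab = {"geneA1": ("geneB1", 900.0), "geneA2": ("geneB2", 410.0)}
hits_ba = {"geneB1": ("geneA1", 880.0), "geneB2": ("geneA3", 500.0)}
print(reciprocal_best_hits(hits_ab, hits_ba))  # {('geneA1', 'geneB1')}
```

    Ortholuge's contribution is the step after this: checking phylogenetic distance ratios against an outgroup to flag RBH pairs whose divergence is atypical.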

  10. A simple and sensitive high-throughput GFP screening in woody and herbaceous plants.

    PubMed

    Hily, Jean-Michel; Liu, Zongrang

    2009-03-01

    Green fluorescent protein (GFP) has been used widely as a powerful bioluminescent reporter, but its visualization by existing methods in tissues or whole plants and its utilization for high-throughput screening remains challenging in many species. Here, we report a fluorescence image analyzer-based method for GFP detection and its utility for high-throughput screening of transformed plants. Of three detection methods tested, the Typhoon fluorescence scanner was able to detect GFP fluorescence in all Arabidopsis thaliana tissues and apple leaves, while regular fluorescence microscopy detected it only in Arabidopsis flowers and siliques but barely in the leaves of either Arabidopsis or apple. The hand-held UV illumination method failed in all tissues of both species. Additionally, the Typhoon imager was able to detect GFP fluorescence in both green and non-green tissues of Arabidopsis seedlings as well as in imbibed seeds, qualifying it as a high-throughput screening tool, which was further demonstrated by screening the seedlings of primary transformed T(0) seeds. Of the 30,000 germinating Arabidopsis seedlings screened, at least 69 GFP-positive lines were identified, accounting for an approximately 0.23% transformation efficiency. About 14,000 seedlings grown in 16 Petri plates could be screened within an hour, making the screening process significantly more efficient and robust than any other existing high-throughput screening method for transgenic plants.

  11. Translational informatics: enabling high-throughput research paradigms

    PubMed Central

    Embi, Peter J.; Sen, Chandan K.

    2009-01-01

    A common thread throughout the clinical and translational research domains is the need to collect, manage, integrate, analyze, and disseminate large-scale, heterogeneous biomedical data sets. However, well-established and broadly adopted theoretical and practical frameworks and models intended to address such needs are conspicuously absent in the published literature or other reputable knowledge sources. Instead, the development and execution of multidisciplinary, clinical, or translational studies are significantly limited by the propagation of “silos” of both data and expertise. Motivated by this fundamental challenge, we report upon the current state and evolution of biomedical informatics as it pertains to the conduct of high-throughput clinical and translational research and will present both a conceptual and practical framework for the design and execution of informatics-enabled studies. The objective of presenting such findings and constructs is to provide the clinical and translational research community with a common frame of reference for discussing and expanding upon such models and methodologies. PMID:19737991

  12. High-throughput screening of chemicals as functional ...

    EPA Pesticide Factsheets

    Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer

  13. Analysis of DNA Sequence Variants Detected by High Throughput Sequencing

    PubMed Central

    Adams, David R; Sincan, Murat; Fajardo, Karin Fuentes; Mullikin, James C; Pierson, Tyler M; Toro, Camilo; Boerkoel, Cornelius F; Tifft, Cynthia J; Gahl, William A; Markello, Tom C

    2014-01-01

    The Undiagnosed Diseases Program at the National Institutes of Health uses High Throughput Sequencing (HTS) to diagnose rare and novel diseases. HTS techniques generate large numbers of DNA sequence variants, which must be analyzed and filtered to find candidates for disease causation. Despite the publication of an increasing number of successful exome-based projects, there has been little formal discussion of the analytic steps applied to HTS variant lists. We present the results of our experience with over 30 families for whom HTS sequencing was used in an attempt to find clinical diagnoses. For each family, exome sequence was augmented with high-density SNP-array data. We present a discussion of the theory and practical application of each analytic step and provide example data to illustrate our approach. The paper is designed to provide an analytic roadmap for variant analysis, thereby enabling a wide range of researchers and clinical genetics practitioners to perform direct analysis of HTS data for their patients and projects. PMID:22290882
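    A typical first-pass step in such an analytic roadmap keeps rare, protein-altering calls; the thresholds and field names below are illustrative, not the program's actual pipeline:

```python
def filter_variants(variants, max_pop_af=0.01):
    """Keep rare, protein-altering candidate variants -- a generic
    first filter in the spirit of exome analysis pipelines
    (effect vocabulary and cutoff are illustrative)."""
    damaging = {"missense", "nonsense", "frameshift", "splice_site"}
    return [
        v for v in variants
        if v["pop_af"] <= max_pop_af and v["effect"] in damaging
    ]

candidates = filter_variants([
    {"gene": "GENE1", "effect": "missense",   "pop_af": 0.0002},
    {"gene": "GENE2", "effect": "synonymous", "pop_af": 0.0001},  # silent
    {"gene": "GENE3", "effect": "missense",   "pop_af": 0.35},    # common
])
print([v["gene"] for v in candidates])  # ['GENE1']
```

    Later filters in a real workflow would add inheritance-model checks (e.g. using the SNP-array data mentioned above) and segregation within the family.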

  14. Efficient Management of High-Throughput Screening Libraries with SAVANAH.

    PubMed

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen; Christiansen, Helle; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2017-02-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such screens are molecular libraries, that is, microtiter plates with solubilized reagents such as siRNAs, shRNAs, miRNA inhibitors or mimics, and sgRNAs, or small compounds, that is, drugs. These reagents are typically condensed to provide enough material for covering several screens. Library plates thus need to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links sample information from the library to experimental results from the assay plates. All results can be exported to the R statistical environment or piped into HiTSeekR ( http://hitseekr.compbio.sdu.dk ) for comprehensive follow-up analyses. In summary, SAVANAH supports the HTS community in managing and analyzing HTS experiments with an emphasis on serially diluted molecular libraries.
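    The core bookkeeping problem is tracking concentration across the dilution series; a minimal sketch (function and parameter names are illustrative, not SAVANAH's API):

```python
def dilution_series(stock_mM, factor, steps):
    """Concentrations obtained by serially diluting a library stock
    `steps` times by 1:`factor` -- the per-well arithmetic behind
    linking a library plate to its derived assay plates."""
    concs = [stock_mM]
    for _ in range(steps):
        concs.append(concs[-1] / factor)
    return concs

# A 10 mM library stock diluted 1:2 three times
print(dilution_series(10.0, 2, 3))  # [10.0, 5.0, 2.5, 1.25]
```

    A full library manager additionally has to map each diluted well back to its source well and attach assay readouts to that lineage, which is the part SAVANAH automates.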

  15. High-throughput screening and biophysical interrogation of hepatotropic AAV.

    PubMed

    Murphy, Samuel L; Bhagwat, Anand; Edmonson, Shyrie; Zhou, Shangzhen; High, Katherine A

    2008-12-01

    We set out to analyze the fundamental biological differences between AAV2 and AAV8 that may contribute to their different performances in vivo. High-throughput protein interaction screens were used to identify binding partners for each serotype. Of the >8,000 proteins probed, 115 and 134 proteins were identified that interact with AAV2 and AAV8, respectively. Notably, 76 of these protein interactions were shared between the two serotypes. CDK2/cyclinA kinase was identified as a binding partner for both serotypes in the screen. Subsequent analysis confirmed direct binding of CDK2/cyclinA by AAV2 and AAV8. Inhibition of CDK2/cyclinA resulted in increased levels of vector transduction. Biophysical study of vector particle stability and genome uncoating demonstrated slightly greater thermostability for AAV8 than for AAV2. Heat-induced genome uncoating occurred at the same temperature as particle degradation, suggesting that these two processes may be intrinsically related for adeno-associated virus (AAV). Together, these analyses provide insight into commonalities and divergences in the biology of functionally distinct hepatotropic AAV serotypes.

  16. Hypothesis testing in high-throughput screening for drug discovery.

    PubMed

    Prummer, Michael

    2012-04-01

    Following the success of small-molecule high-throughput screening (HTS) in drug discovery, other large-scale screening techniques are currently revolutionizing the biological sciences. Powerful new statistical tools have been developed to analyze the vast amounts of data in DNA chip studies, but have not yet found their way into compound screening. In HTS, characterization of single-point hit lists is often done only in retrospect after the results of confirmation experiments are available. However, for prioritization, for optimal use of resources, for quality control, and for comparison of screens it would be extremely valuable to predict the rates of false positives and false negatives directly from the primary screening results. Making full use of the available information about compounds and controls contained in HTS results and replicated pilot runs, the Z score and from it the p value can be estimated for each measurement. Based on this consideration, we have applied the concept of p-value distribution analysis (PVDA), which was originally developed for gene expression studies, to HTS data. PVDA allowed prediction of all relevant error rates as well as the rate of true inactives, and excellent agreement with confirmation experiments was found.
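    The per-well Z score and its p value can be computed directly from the negative-control distribution; a sketch of that first step (PVDA then analyzes the shape of the resulting p-value distribution, which is not reproduced here):

```python
import statistics
from math import erf, sqrt

def z_scores(values, neg_controls):
    """Z score of each compound well relative to the negative-control
    distribution on the same plate (a simplified sketch)."""
    mu = statistics.mean(neg_controls)
    sd = statistics.stdev(neg_controls)
    return [(v - mu) / sd for v in values]

def p_value(z):
    """Two-sided p value for a Z score under a normal null."""
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))

controls = [100, 98, 102, 101, 99]   # hypothetical control readouts
zs = z_scores([100, 75], controls)   # an inactive well and a strong hit
print([round(p_value(z), 3) for z in zs])  # [1.0, 0.0]
```

    In PVDA, the excess of small p values over the uniform background estimates the true-hit rate, from which false-positive and false-negative rates follow.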

  17. Savant: genome browser for high-throughput sequencing data.

    PubMed

    Fiume, Marc; Williams, Vanessa; Brook, Andrew; Brudno, Michael

    2010-08-15

    The advent of high-throughput sequencing (HTS) technologies has made it affordable to sequence many individuals' genomes. Simultaneously the computational analysis of the large volumes of data generated by the new sequencing machines remains a challenge. While a plethora of tools are available to map the resulting reads to a reference genome, and to conduct primary analysis of the mappings, it is often necessary to visually examine the results and underlying data to confirm predictions and understand the functional effects, especially in the context of other datasets. We introduce Savant, the Sequence Annotation, Visualization and ANalysis Tool, a desktop visualization and analysis browser for genomic data. Savant was developed for visualizing and analyzing HTS data, with special care taken to enable dynamic visualization in the presence of gigabases of genomic reads and references the size of the human genome. Savant supports the visualization of genome-based sequence, point, interval and continuous datasets, and multiple visualization modes that enable easy identification of genomic variants (including single nucleotide polymorphisms, structural and copy number variants), and functional genomic information (e.g. peaks in ChIP-seq data) in the context of genomic annotations. Savant is freely available at http://compbio.cs.toronto.edu/savant.

  18. High Throughput Sequencing: An Overview of Sequencing Chemistry.

    PubMed

    Ambardar, Sheetal; Gupta, Rikita; Trakroo, Deepika; Lal, Rup; Vakhlu, Jyoti

    2016-12-01

    In the present century, sequencing is to DNA science what gel electrophoresis was to it in the last century. From 1977 to 2016, three generations of sequencing technologies of various types have been developed. Second- and third-generation sequencing technologies, commonly referred to as next-generation sequencing, have evolved significantly with increasing sequencing speed and decreasing sequencing cost since their inception in 2004. GS FLX by 454 Life Sciences/Roche Diagnostics; Genome Analyzer, HiSeq, MiSeq and NextSeq by Illumina, Inc.; SOLiD by ABI; and Ion Torrent by Life Technologies are among the platforms available for second-generation sequencing. The platforms available for third-generation sequencing include the Helicos™ Genetic Analysis System by SeqLL, LLC; SMRT Sequencing by Pacific Biosciences; Nanopore sequencing by Oxford Nanopore; Complete Genomics by Beijing Genomics Institute; and GnuBIO by BioRad, to name a few. The present article is an overview of the principle and sequencing chemistry of these high throughput sequencing technologies, along with a brief comparison of the various sequencing platforms available.

  19. RNA Secondary Structure Prediction Using High-throughput SHAPE

    PubMed Central

    Purzycka, Katarzyna J.; Rausch, Jason W.; Le Grice, Stuart F.J.

    2013-01-01

    Understanding the function of RNA involved in biological processes requires a thorough knowledge of RNA structure. Toward this end, the methodology dubbed "high-throughput selective 2' hydroxyl acylation analyzed by primer extension", or SHAPE, allows prediction of RNA secondary structure with single nucleotide resolution. This approach utilizes chemical probing agents that preferentially acylate single stranded or flexible regions of RNA in aqueous solution. Sites of chemical modification are detected by reverse transcription of the modified RNA, and the products of this reaction are fractionated by automated capillary electrophoresis (CE). Since reverse transcriptase pauses at those RNA nucleotides modified by the SHAPE reagents, the resulting cDNA library indirectly maps those ribonucleotides that are single stranded in the context of the folded RNA. Using ShapeFinder software, the electropherograms produced by automated CE are processed and converted into nucleotide reactivity tables that are themselves converted into pseudo-energy constraints used in the RNAStructure (v5.3) prediction algorithm. The two-dimensional RNA structures obtained by combining SHAPE probing with in silico RNA secondary structure prediction have been found to be far more accurate than structures obtained using either method alone. PMID:23748604
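    The conversion from reactivity to pseudo-energy constraint commonly takes the form dG = m*ln(reactivity + 1) + b; the values m = 2.6 and b = -0.8 kcal/mol are widely used literature defaults, assumed here rather than taken from this protocol:

```python
from math import log

def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
    """Convert a SHAPE reactivity into a pseudo-free-energy term
    (kcal/mol) added to base-pair stacks during structure prediction.
    Positive values penalize pairing of reactive (flexible) sites."""
    return m * log(reactivity + 1.0) + b

# An unreactive nucleotide is slightly rewarded for pairing,
# a highly reactive one is penalized:
print(round(shape_pseudo_energy(0.0), 2))  # -0.8
print(round(shape_pseudo_energy(2.0), 2))  # 2.06
```

    These per-nucleotide terms are what ShapeFinder-processed reactivity tables feed into the RNAstructure prediction algorithm as soft constraints.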

  1. Experimental Design for Combinatorial and High Throughput Materials Development

    NASA Astrophysics Data System (ADS)

    Cawse, James N.

    2002-12-01

    In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: -High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space -Combinatorial Mapping of Polymer Blends Phase Behavior -Split-Plot Designs -Artificial Neural Networks in Catalyst Development -The Monte Carlo Approach to Library Design and Redesign This book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.

  2. High-throughput screening of a Corynebacterium glutamicum mutant library on genomic and metabolic level.

    PubMed

    Reimer, Lorenz C; Spura, Jana; Schmidt-Hohagen, Kerstin; Schomburg, Dietmar

    2014-01-01

    Due to impressive achievements in genomic research, the number of genome sequences has risen quickly, followed by an increasing number of genes with unknown or hypothetical function. This strongly calls for development of high-throughput methods in the fields of transcriptomics, proteomics and metabolomics. Of these platforms, metabolic profiling has the strongest correlation with the phenotype. We previously published a high-throughput metabolic profiling method for C. glutamicum as well as the automatic GC/MS processing software MetaboliteDetector. Here, we added a high-throughput transposon insertion determination for our C. glutamicum mutant library. The combination of these methods allows the parallel analysis of genotype/phenotype correlations for a large number of mutants. In a pilot project we analyzed the insertion points of 722 transposon mutants and found that 36% of the affected genes have unknown functions. This underlines the need for further information gathered by high-throughput techniques. We therefore measured the metabolic profiles of 258 randomly chosen mutants. The MetaboliteDetector software processed this large amount of GC/MS data within a few hours with a low relative error of 11.5% for technical replicates. Pairwise correlation analysis of metabolites over all genotypes showed dependencies of known and unknown metabolites. For a first insight into this large data set, a screening for interesting mutants was done by a pattern search, focusing on mutants with changes in specific pathways. We show that our transposon mutant library is not biased with respect to insertion points. A comparison of the results for specific mutants with previously published metabolic results on a deletion mutant of the same gene confirmed the concept of high-throughput metabolic profiling. Altogether, the described method could be applied to whole mutant libraries and thereby help to gain comprehensive information about genes with unknown, hypothetical and known
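    The pairwise correlation step can be sketched with a plain Pearson coefficient over metabolite intensity profiles; the data below are hypothetical:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation of two metabolite intensity profiles
    measured across the same set of mutants."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two hypothetical metabolites that rise and fall together across mutants
met_a = [1.0, 2.0, 3.0, 4.0]
met_b = [2.1, 3.9, 6.2, 8.0]
print(round(pearson(met_a, met_b), 3))  # 0.999
```

    Applied to all metabolite pairs over all genotypes, such a matrix highlights dependencies between known and unknown metabolites, as described above.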

  3. Combinatorial and high-throughput screening approaches for strain engineering.

    PubMed

    Liu, Wenshan; Jiang, Rongrong

    2015-03-01

    Microbes have long been used in the industry to produce valuable biochemicals. Combinatorial engineering approaches, new strain engineering tools derived from inverse metabolic engineering, have started to attract attention in recent years, including genome shuffling, error-prone DNA polymerase, global transcription machinery engineering (gTME), random knockout/overexpression libraries, ribosome engineering, multiplex automated genome engineering (MAGE), customized optimization of metabolic pathways by combinatorial transcriptional engineering (COMPACTER), and library construction of "tunable intergenic regions" (TIGR). Since combinatorial approaches and high-throughput screening methods are fundamentally interconnected, color/fluorescence-based, growth-based, and biosensor-based high-throughput screening methods have been reviewed. We believe that with the help of metabolic engineering tools and new combinatorial approaches, plus effective high-throughput screening methods, researchers will be able to achieve better results on improving microorganism performance under stress or enhancing biochemical yield.

  4. High-throughput minor histocompatibility antigen prediction.

    PubMed

    DeLuca, David S; Eiz-Vesper, Britta; Ladas, Nektarios; Khattab, Barbara Anna-Maria; Blasczyk, Rainer

    2009-09-15

    Minor histocompatibility antigens (mHags) are a diverse collection of MHC-bound peptides that have immunological implications in the context of allogeneic transplantation because of their differential presence in donor and host, and thus play a critical role in the induction of the detrimental graft-versus-host disease (GvHD) or in the development of the beneficial graft-versus-leukemia (GvL) effect. Therefore, the search for mHags has implications not only for preventing GvHD, but also for therapeutic applications involving leukemia-specific T cells. We have created a web-based system, named PeptideCheck, which aims to augment the experimental discovery of mHags using bioinformatic means. Analyzing peptide elution data to search for mHags and predicting mHags from polymorphism and protein databases are the core features. Comparison with known mHag data reveals that some but not all of the previously known mHags can be reproduced. By applying a system of filtering and ranking, we were able to produce an ordered list of potential mHag candidates in which HA-1, HA-3 and HA-8 occur in the best 0.25%. By combining single nucleotide polymorphism, protein, tissue expression and genotypic frequency data, together with antigen presentation prediction algorithms, we propose a list of the best peptide candidates which could potentially induce the GvL effect without causing GvHD. http://www.peptidecheck.org.
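    The filtering-and-ranking idea can be sketched as sorting candidates by a combined score; the fields, example peptides and weighting below are illustrative, not PeptideCheck's actual scheme:

```python
def rank_mhag_candidates(candidates):
    """Order candidate peptides by a combined score of predicted MHC
    binding strength and donor/host mismatch frequency -- a toy
    stand-in for the pipeline described above."""
    def score(c):
        return c["binding_score"] * c["mismatch_freq"]
    return sorted(candidates, key=score, reverse=True)

ranked = rank_mhag_candidates([
    {"peptide": "VLHDDLLEA", "binding_score": 0.9, "mismatch_freq": 0.4},
    {"peptide": "KECVLHDDL", "binding_score": 0.7, "mismatch_freq": 0.1},
    {"peptide": "RTLDKVLEV", "binding_score": 0.8, "mismatch_freq": 0.5},
])
print([c["peptide"] for c in ranked])  # ['RTLDKVLEV', 'VLHDDLLEA', 'KECVLHDDL']
```

    A real ranking would also fold in tissue-expression filters to favor hematopoietic-restricted antigens, the property that separates GvL from GvHD.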

  5. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    PubMed

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland (FIMM) was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate-based chemical screening and high-content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screening are performed at different scales, primarily in multiwell plate-based assays with a wide range of readout possibilities and a focus on ultraminiaturization to make screening affordable for academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, in line with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is focused on high throughput systems biology platforms for functional profiling of patient cells in personalized and precision medicine projects.

  6. Perspective: Data infrastructure for high throughput materials discovery

    NASA Astrophysics Data System (ADS)

    Pfeif, E. A.; Kroenlein, K.

    2016-05-01

    Computational capability has enabled materials design to evolve from trial-and-error towards more informed methodologies that require large amounts of data. Expert-designed tools and their underlying databases facilitate modern-day high throughput computational methods. Standard data formats and communication standards increase the impact of traditional data, and applying these technologies to a high throughput experimental design provides dense, targeted materials data that are valuable for material discovery. Integrated computational materials engineering requires both experimentally and computationally derived data. Harvesting these comprehensively requires different methods of varying degrees of automation to accommodate variety and volume. Issues of data quality persist independent of type.

  7. Screening and synthesis: high throughput technologies applied to parasitology.

    PubMed

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  8. Implementation of high throughput experimentation techniques for kinetic reaction testing.

    PubMed

    Nagy, Anton J

    2012-02-01

    Successful implementation of high throughput experimentation (HTE) tools has resulted in their increased acceptance as essential tools in chemical, petrochemical and polymer R&D laboratories. This article provides a number of concrete examples of HTE systems that have been designed and successfully implemented in studies focused on deriving reaction kinetic data. The implementation of HTE tools for performing kinetic studies of both catalytic and non-catalytic systems results in significantly faster acquisition of the high-quality kinetic modeling data required to quantitatively predict the behavior of complex, multistep reactions.
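As a minimal illustration of the kind of kinetic-parameter estimation such experimentation systems feed, the sketch below recovers a first-order rate constant from concentration-time pairs via the linearized form ln(C0/C) = k·t. The data points are synthetic, generated to follow an approximate exponential decay.

```python
# Fit a first-order rate constant k from (time, concentration) data
# using a least-squares slope through the origin on ln(C0/C) vs t.
import math

data = [(0.0, 1.00), (10.0, 0.61), (20.0, 0.37), (30.0, 0.22)]  # (t, C)
c0 = data[0][1]
num = sum(t * math.log(c0 / c) for t, c in data if t > 0)
den = sum(t * t for t, _ in data if t > 0)
k = num / den
print(round(k, 3))  # ≈ 0.050 in inverse time units for this synthetic decay
```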

  9. Advances in high throughput DNA sequence data compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz

    2016-06-01

    Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genome and reads compression. Algorithms are categorized as referential or reference free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.
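To make the referential/reference-free distinction above concrete, here is a toy referential encoder: each read is stored as an alignment position plus only the bases that differ from the reference, rather than as the full sequence. Real compressors use proper alignment indexes and entropy coding; the sequences below are made-up examples.

```python
# Toy referential compression of a sequencing read against a reference.
def compress_read(read, reference):
    pos = reference.find(read[:8])  # naive anchoring; real tools use indexes
    window = reference[pos:pos + len(read)]
    mismatches = [(i, b) for i, (a, b) in enumerate(zip(window, read))
                  if a != b]
    return pos, len(read), mismatches

def decompress_read(entry, reference):
    pos, length, mismatches = entry
    bases = list(reference[pos:pos + length])
    for i, b in mismatches:
        bases[i] = b
    return "".join(bases)

reference = "ACGTACGTTTGACCGTAGGCTAACGGT"
read = "TTGACCGTAGGATAACG"   # one substitution relative to the reference
entry = compress_read(read, reference)
assert decompress_read(entry, reference) == read
print(entry)  # position, length, and the single differing base
```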

  10. Droplet microfluidics for high-throughput biological assays.

    PubMed

    Guo, Mira T; Rotem, Assaf; Heyman, John A; Weitz, David A

    2012-06-21

    Droplet microfluidics offers significant advantages for performing high-throughput screens and sensitive assays. Droplets allow sample volumes to be significantly reduced, leading to concomitant reductions in cost. Manipulation and measurement at kilohertz speeds enable up to 10^8 samples to be screened in one day. Compartmentalization in droplets increases assay sensitivity by increasing the effective concentration of rare species and decreasing the time required to reach detection thresholds. Droplet microfluidics combines these powerful features to enable currently inaccessible high-throughput screening applications, including single-cell and single-molecule assays.
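A quick back-of-envelope check of the throughput claim above: droplet processing at kilohertz rates sustained for a full day does indeed approach the quoted figure.

```python
# Sanity-check the "kilohertz speeds -> ~1e8 samples/day" claim.
droplets_per_second = 1_000        # 1 kHz, low end of "kilohertz speeds"
seconds_per_day = 24 * 60 * 60     # 86,400 s
per_day = droplets_per_second * seconds_per_day
print(f"{per_day:.1e} droplets/day")  # ~8.6e7, approaching 10^8
```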

  11. Insights to transcriptional networks by using high throughput RNAi strategies.

    PubMed

    Mattila, Jaakko; Puig, Oscar

    2010-01-01

    RNA interference (RNAi) is a powerful method to unravel the role of a given gene in eukaryotic cells. The development of high throughput assay platforms such as fluorescence plate readers and high throughput microscopy has allowed the design of genome wide RNAi screens to systemically discern members of regulatory networks around various cellular processes. Here we summarize the different strategies employed in RNAi screens to reveal regulators of transcriptional networks. We focus our discussion in experimental approaches designed to uncover regulatory interactions modulating transcription factor activity.

  12. High-throughput screening for modulators of cellular contractile force†

    PubMed Central

    Park, Chan Young; Zhou, Enhua H.; Tambe, Dhananjay; Chen, Bohao; Lavoie, Tera; Dowell, Maria; Simeonov, Anton; Maloney, David J.; Marinkovic, Aleksandar; Tschumperlin, Daniel J.; Burger, Stephanie; Frykenberg, Matthew; Butler, James P.; Stamer, W. Daniel; Johnson, Mark; Solway, Julian; Fredberg, Jeffrey J.

    2015-01-01

    When cellular contractile forces are central to pathophysiology, these forces comprise a logical target of therapy. Nevertheless, existing high-throughput screens are limited to upstream signalling intermediates with poorly defined relationships to such a physiological endpoint. Using cellular force as the target, here we report a new screening technology and demonstrate its applications using human airway smooth muscle cells in the context of asthma and Schlemm's canal endothelial cells in the context of glaucoma. This approach identified several drug candidates for both asthma and glaucoma. We attained rates of 1000 compounds per screening day, thus establishing a force-based cellular platform for high-throughput drug discovery. PMID:25953078

  13. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    SciTech Connect

    Ni, Jing

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores surface enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman bands, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, simultaneous detection in a dual-analyte immunoassay was demonstrated. The use of this scheme for quantitative analysis was also studied, and preliminary dose-response curves from an immunoassay of a

  14. An improved high throughput sequencing method for studying oomycete communities.

    PubMed

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-03-01

    Culture-independent studies using next generation sequencing have revolutionized microbial ecology; however, oomycete ecology in soils is severely lagging behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomycete communities. The well-known primer sets ITS4, ITS6 and ITS7 were used in a semi-nested PCR approach to target the internal transcribed spacer (ITS) 1 of ribosomal DNA in a next generation sequencing protocol. These primers have been used in similar studies before, but with limited success. We were able to increase the proportion of retrieved oomycete sequences dramatically, mainly by increasing the annealing temperature during PCR. The optimized protocol was validated using three mock communities, and the method was further evaluated using total DNA from 26 soil samples collected from different agricultural fields in Denmark and 11 samples from carrot tissue with symptoms of Pythium infection. Sequence data from the Pythium and Phytophthora mock communities showed that our strategy successfully detected all included species. Taxonomic assignments of OTUs from the 26 soil samples showed that 95% of the sequences could be assigned to oomycetes, including Pythium, Aphanomyces, Peronospora, Saprolegnia and Phytophthora. A high proportion of oomycete reads was consistently present in all 26 soil samples, showing the versatility of the strategy. A large diversity of Pythium species, including pathogenic and saprophytic species, was dominant in cultivated soil. Finally, we analyzed amplicons from carrots with symptoms of cavity spot. This resulted in 94% of the reads belonging to oomycetes, with a dominance of Pythium species that are known to be involved in causing cavity spot, thus demonstrating the usefulness of the method not only in soil DNA but also in a plant DNA background. In conclusion, we demonstrate a successful approach for pyrosequencing of oomycete

  15. Automated Segmentation and Classification of High Throughput Yeast Assay Spots

    PubMed Central

    Jafari-Khouzani, Kourosh; Soltanian-Zadeh, Hamid; Fotouhi, Farshad; Parrish, Jodi R.; Finley, Russell L.

    2009-01-01

    Several technologies for characterizing genes and proteins from humans and other organisms use yeast growth or color development as readouts. The yeast two-hybrid assay, for example, detects protein-protein interactions by measuring the growth of yeast on a specific solid medium, or the ability of the yeast to change color when grown on a medium containing a chromogenic substrate. Current systems for analyzing the results of these types of assays rely on subjective and inefficient scoring of growth or color by human experts. Here an image analysis system is described for scoring yeast growth and color development in high throughput biological assays. The goal is to locate the spots and score them in color images of two types of plates named “X-Gal” and “growth assay” plates, with uniformly placed spots (cell areas) on each plate (both plates in one image). The scoring system relies on color for the X-Gal spots, and texture properties for the growth assay spots. A maximum likelihood projection-based segmentation is developed to automatically locate spots of yeast on each plate. Then color histogram and wavelet texture features are extracted for scoring using an optimal linear transformation. Finally, an artificial neural network is used to score the X-Gal and growth assay spots using the extracted features. The performance of the system is evaluated using spots from 60 images. After training the networks using training and validation sets, the system was assessed on the test set. Overall accuracies of 95.4% and 88.2% are achieved, respectively, for scoring the X-Gal and growth assay spots. PMID:17948730
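The extract-features-then-classify pipeline described above can be sketched schematically. Here a toy nearest-centroid classifier stands in for the paper's neural network, the "spots" are tiny synthetic intensity patches, and the two features (mean intensity and mean absolute deviation as a crude texture proxy) are illustrative stand-ins for the color-histogram and wavelet features.

```python
# Schematic spot-scoring pipeline: feature extraction + classification.
def spot_features(patch):
    """Mean intensity plus a crude texture measure (mean abs deviation)."""
    flat = [v for row in patch for v in row]
    mean = sum(flat) / len(flat)
    texture = sum(abs(v - mean) for v in flat) / len(flat)
    return (mean, texture)

def nearest_centroid(features, centroids):
    """Assign the label whose centroid is closest in feature space."""
    return min(centroids, key=lambda label: sum(
        (f - c) ** 2 for f, c in zip(features, centroids[label])))

centroids = {                      # would be learned from scored spots
    "positive": (200.0, 10.0),
    "negative": (50.0, 2.0),
}
patch = [[190, 205], [210, 195]]   # a bright, fairly uniform spot
label = nearest_centroid(spot_features(patch), centroids)
print(label)
```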

  16. Automatic Spot Identification for High Throughput Microarray Analysis

    PubMed Central

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

    High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noise. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
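A simplified 1D sketch of the correlation idea behind automatic spot finding: slide a spot-shaped template over an intensity profile and take the correlation peaks as spot centers. A real implementation works in 2D on the image and handles noise and grid geometry; the profile below is a made-up example with two clean spots.

```python
# Template correlation to locate spot centers in a 1D intensity profile.
def correlate(profile, template):
    n = len(template)
    return [sum(profile[i + j] * template[j] for j in range(n))
            for i in range(len(profile) - n + 1)]

profile = [0, 0, 1, 5, 9, 5, 1, 0, 0, 1, 5, 9, 5, 1, 0, 0]
template = [1, 5, 9, 5, 1]          # idealized spot cross-section
scores = correlate(profile, template)
best = max(scores)
centers = [i + len(template) // 2 for i, s in enumerate(scores)
           if s == best]
print(centers)  # indices of the two detected spot centers
```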

  17. Environmental Impact on Vascular Development Predicted by High Throughput Screening

    EPA Science Inventory

    Understanding health risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. High throughput screening (HTS) in EPA’s ToxCast™ project provides vast d...

  18. High Throughput Exposure Estimation Using NHANES Data (SOT)

    EPA Science Inventory

    In the ExpoCast project, high throughput (HT) exposure models enable rapid screening of large numbers of chemicals for exposure potential. Evaluation of these models requires empirical exposure data and due to the paucity of human metabolism/exposure data such evaluations includ...

  20. High-throughput production of two disulphide-bridge toxins.

    PubMed

    Upert, Grégory; Mourier, Gilles; Pastor, Alexandra; Verdenaud, Marion; Alili, Doria; Servent, Denis; Gilles, Nicolas

    2014-08-07

    A quick and efficient production method compatible with high-throughput screening was developed using 36 toxins belonging to four different families of two disulphide-bridge toxins. Final toxins were characterized using HPLC co-elution, CD and pharmacological studies.

  1. High Throughput Assays and Exposure Science (ISES annual meeting)

    EPA Science Inventory

    High throughput screening (HTS) data characterizing chemical-induced biological activity has been generated for thousands of environmentally-relevant chemicals by the US inter-agency Tox21 and the US EPA ToxCast programs. For a limited set of chemicals, bioactive concentrations r...

  2. Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays

    EPA Science Inventory

    High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...

  6. New High Throughput Methods to Estimate Chemical Exposure

    EPA Science Inventory

    EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...

  7. A Functional High-Throughput Assay of Myelination in Vitro

    DTIC Science & Technology

    2014-07-01

    A high-throughput assay of myelination or remyelination would substantially speed the development and testing of potential therapies for myelin disorders such as multiple sclerosis. Tissues engineered from human induced pluripotent stem (iPS) cells may be effective at... Keywords: human induced pluripotent stem cells, hydrogels, 3D culture, electrophysiology, high-throughput assay.

  8. High-throughput screening, predictive modeling and computational embryology

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  9. HTTK: R Package for High-Throughput Toxicokinetics

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...

  10. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  12. Accounting For Uncertainty in The Application Of High Throughput Datasets

    EPA Science Inventory

    The use of high throughput screening (HTS) datasets will need to adequately account for uncertainties in the data generation process and propagate these uncertainties through to ultimate use. Uncertainty arises at multiple levels in the construction of predictors using in vitro ...

  14. 20170612 - Fun with High Throughput Toxicokinetics (CalEPA webinar)

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  16. Evaluating and Refining High Throughput Tools for Toxicokinetics

    EPA Science Inventory

    This poster summarizes efforts of the Chemical Safety for Sustainability's Rapid Exposure and Dosimetry (RED) team to facilitate the development and refinement of toxicokinetics (TK) tools to be used in conjunction with the high throughput toxicity testing data generated as a par...

  19. High-throughput miniaturized microfluidic microscopy with radially parallelized channel geometry.

    PubMed

    Jagannadh, Veerendra Kalyan; Bhat, Bindu Prabhath; Nirupa Julius, Lourdes Albina; Gorthi, Sai Siva

    2016-03-01

    In this article, we present a novel approach to throughput enhancement in miniaturized microfluidic microscopy systems. Using this approach, we demonstrate an inexpensive yet high-throughput analytical instrument capable of analyzing about 125,880 cells per minute, even while employing cost-effective low-frame-rate cameras (120 fps). The throughput achieved here is a notable advance for diagnostics, as it enables rapid quantitative testing and analysis. We demonstrate the applicability of the instrument to point-of-care diagnostics by performing blood cell counting, and report a comparative analysis between the counts (in cells per μl) obtained from our instrument and those of a commercially available hematology analyzer.
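A rough consistency check of the figures above: at 120 frames per second, reaching ~125,880 cells per minute implies roughly 17-18 cells imaged per frame, which is the gain that radially parallelized channels provide over a single channel.

```python
# Cells-per-frame implied by the quoted throughput and frame rate.
fps = 120
frames_per_minute = fps * 60          # 7,200 frames/min
cells_per_minute = 125_880
cells_per_frame = cells_per_minute / frames_per_minute
print(round(cells_per_frame, 1))      # ~17.5 cells imaged per frame
```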

  20. Application of high-throughput sequencing in understanding human oral microbiome related with health and disease

    PubMed Central

    Chen, Hui; Jiang, Wen

    2014-01-01

    The oral microbiome is one of the most diverse habitats in the human body and is closely related to oral health and disease. As sequencing technology has developed, high-throughput sequencing has become a popular approach for oral microbial analysis. Oral bacterial profiles have been studied to explore the relationship between microbial diversity and oral diseases such as caries and periodontal disease. This review describes the application of high-throughput sequencing for characterizing the oral microbiota and analyzing changes in the microbiome in states of health and disease. A deeper understanding of the microbiota will pave the way for more effective preventive dentistry and contribute to the development of personalized dental medicine. PMID:25352835

  1. Targeted DNA methylation analysis by high throughput sequencing in porcine peri-attachment embryos.

    PubMed

    Morrill, Benson H; Cox, Lindsay; Ward, Anika; Heywood, Sierra; Prather, Randall S; Isom, S Clay

    2013-01-01

    The purpose of this experiment was to implement and evaluate the effectiveness of a next-generation sequencing-based method for DNA methylation analysis in porcine embryonic samples. Fourteen discrete genomic regions were amplified by PCR using bisulfite-converted genomic DNA derived from day 14 in vivo-derived (IVV) and parthenogenetic (PA) porcine embryos as template DNA. Resulting PCR products were subjected to high-throughput sequencing using the Illumina Genome Analyzer IIx platform. The average depth of sequencing coverage was 14,611 for IVV and 17,068 for PA. Quantitative analysis of the methylation profiles of both input samples for each genomic locus showed distinct differences in methylation profiles between IVV and PA samples for six of the target loci, and subtle differences in four loci. It was concluded that high throughput sequencing technologies can be effectively applied to provide a powerful, cost-effective approach to targeted DNA methylation analysis of embryonic and other reproductive tissues.
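The quantitative core of bisulfite-based methylation analysis can be sketched briefly: bisulfite conversion turns unmethylated cytosines into thymines while methylated cytosines remain cytosines, so per-site methylation is the C fraction over the read depth at that position. The aligned reads below are fabricated examples, not data from this study.

```python
# Per-site methylation fraction from aligned bisulfite-converted reads.
def methylation_fraction(reads, cpg_position):
    calls = [r[cpg_position] for r in reads if len(r) > cpg_position]
    c = calls.count("C")   # methylated (protected from conversion)
    t = calls.count("T")   # unmethylated (converted C -> T)
    return c / (c + t) if (c + t) else float("nan")

reads = ["ATCGTT", "ATTGTT", "ATCGTT", "ATCGTT"]  # aligned reads
print(methylation_fraction(reads, 2))  # C in 3 of 4 reads -> 0.75
```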

  2. High-throughput analysis of algal crude oils using high resolution mass spectrometry.

    PubMed

    Lee, Young Jin; Leverence, Rachael C; Smith, Erica A; Valenstein, Justin S; Kandel, Kapil; Trewyn, Brian G

    2013-03-01

    Lipid analysis often needs to be specifically optimized for each class of compounds due to their wide variety of chemical and physical properties. This becomes a serious bottleneck in the development of algae-based next generation biofuels, where high-throughput analysis is essential for the optimization of various process conditions. We propose a high-resolution mass spectrometry-based high-throughput assay as a 'quick-and-dirty' protocol to monitor various lipid classes in algal crude oils. Atmospheric pressure chemical ionization was determined to be most effective for this purpose to cover a wide range of lipid classes. With an autosampler-LC pump set-up, we could analyze algal crude samples every one and a half minutes, monitoring several lipid species such as TAG, DAG, squalene, sterols, and chlorophyll a. The high mass resolution and high mass accuracy of the orbitrap mass analyzer provide confidence in the identification of these lipid compounds. MS/MS and MS3 analysis could be performed in parallel for further structural information, as demonstrated for TAG and DAG. This high-throughput method was successfully demonstrated for semi-quantitative analysis of algal oils after treatment with various nanoparticles.
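A small sketch of how high mass accuracy supports identification: match an observed m/z against candidate species within a parts-per-million tolerance. The masses below are approximate protonated monoisotopic values and the 5 ppm window is an assumed instrument tolerance, not a figure from this study.

```python
# Match an observed m/z to candidate lipids within a ppm tolerance.
def ppm_error(observed, theoretical):
    return (observed - theoretical) / theoretical * 1e6

candidates = {                       # approximate [M+H]+ monoisotopic m/z
    "squalene [M+H]+": 411.3985,
    "chlorophyll a [M+H]+": 893.5430,
}
observed = 411.3990
matches = [name for name, mz in candidates.items()
           if abs(ppm_error(observed, mz)) <= 5.0]
print(matches)  # only the candidate within 5 ppm survives
```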

  3. An improved high-throughput lipid extraction method for the analysis of human brain lipids.

    PubMed

    Abbott, Sarah K; Jenner, Andrew M; Mitchell, Todd W; Brown, Simon H J; Halliday, Glenda M; Garner, Brett

    2013-03-01

    We have developed a protocol suitable for high-throughput lipidomic analysis of human brain samples. The traditional Folch extraction (using chloroform and glass-glass homogenization) was compared to a high-throughput method combining methyl-tert-butyl ether (MTBE) extraction with mechanical homogenization utilizing ceramic beads. This high-throughput method significantly reduced sample handling time and increased efficiency compared to glass-glass homogenizing. Furthermore, replacing chloroform with MTBE is safer (less carcinogenic/toxic), with lipids dissolving in the upper phase, allowing for easier pipetting and the potential for automation (i.e., robotics). Both methods were applied to the analysis of human occipital cortex. Lipid species (including ceramides, sphingomyelins, choline glycerophospholipids, ethanolamine glycerophospholipids and phosphatidylserines) were analyzed via electrospray ionization mass spectrometry and sterol species were analyzed using gas chromatography mass spectrometry. No differences in lipid species composition were evident when the lipid extraction protocols were compared, indicating that MTBE extraction with mechanical bead homogenization provides an improved method for the lipidomic profiling of human brain tissue.

  4. Automated High-Throughput Identification and Characterization of Clinically Important Bacteria and Fungi using Rapid Evaporative Ionization Mass Spectrometry.

    PubMed

    Bolt, Frances; Cameron, Simon J S; Karancsi, Tamas; Simon, Daniel; Schaffer, Richard; Rickards, Tony; Hardiman, Kate; Burke, Adam; Bodai, Zsolt; Perdones-Montero, Alvaro; Rebec, Monica; Balog, Julia; Takats, Zoltan

    2016-10-04

    Rapid evaporative ionization mass spectrometry (REIMS) has been shown to quickly and accurately speciate microorganisms based upon their species-specific lipid profile. Previous work by members of this group showed that the use of a hand-held bipolar probe allowed REIMS to analyze microbial cultures directly from culture plates without any prior preparation. However, this method of analysis would likely be unsuitable for a high-throughput clinical microbiology laboratory. Here, we report the creation of a customized platform that enables automated, high-throughput REIMS analysis that requires minimal user input and operation and is suitable for use in clinical microbiology laboratories. The ability of this high-throughput platform to speciate clinically important microorganisms was tested through the analysis of 375 different clinical isolates collected from distinct patient samples from 25 microbial species. After optimization of our data analysis approach, we achieved substantially similar results between the two REIMS approaches. For hand-held bipolar probe REIMS, a speciation accuracy of 96.3% was achieved, whereas for high-throughput REIMS, an accuracy of 93.9% was achieved. Thus, high-throughput REIMS offers an alternative mass spectrometry based method for the rapid and accurate identification of clinically important microorganisms in clinical laboratories without any preanalysis preparative steps.

  5. Digital fragment analysis of short tandem repeats by high-throughput amplicon sequencing.

    PubMed

    Darby, Brian J; Erickson, Shay F; Hervey, Samuel D; Ellis-Felege, Susan N

    2016-07-01

    High-throughput sequencing has been proposed as a method to genotype microsatellites and overcome the four main technical drawbacks of capillary electrophoresis: amplification artifacts, imprecise sizing, length homoplasy, and limited multiplex capability. The objective of this project was to test a high-throughput amplicon sequencing approach to fragment analysis of short tandem repeats and characterize its advantages and disadvantages against traditional capillary electrophoresis. We amplified and sequenced 12 muskrat microsatellite loci from 180 muskrat specimens and analyzed the sequencing data for precision of allele calling, propensity for amplification or sequencing artifacts, and for evidence of length homoplasy. Of the 294 total alleles we detected by sequencing, only 164 would have been detected by capillary electrophoresis, as the remaining 130 alleles (44%) would have been hidden by length homoplasy. The ability to detect a greater number of unique alleles resulted in the ability to resolve greater population genetic structure. The primary advantages of fragment analysis by sequencing are the ability to precisely size fragments, resolve length homoplasy, multiplex many individuals and many loci into a single high-throughput run, and compare data across projects and across laboratories (present and future) with minimal technical calibration. A significant disadvantage of fragment analysis by sequencing is that the method is only practical and cost-effective when performed on batches of several hundred samples with multiple loci. Future work is needed to optimize throughput while minimizing costs and to update existing microsatellite allele calling and analysis programs to accommodate sequence-aware microsatellite data.
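
    The length-homoplasy problem described above is easy to illustrate in code. The sketch below (Python, with made-up reads rather than the study's data) contrasts allele calling by fragment length alone, as in capillary electrophoresis, with calling by exact sequence, as in amplicon sequencing:

```python
def call_alleles_by_length(reads):
    """Capillary-style calling: alleles distinguished by fragment length only."""
    return {len(seq) for seq in reads}

def call_alleles_by_sequence(reads):
    """Sequencing-based calling: alleles distinguished by exact sequence."""
    return set(reads)

# Two reads of identical length but different internal sequence
# (length homoplasy): electrophoresis sees one allele, sequencing sees two.
reads = ["ACACACACAT", "ACACACATAC", "ACACAC"]
lengths = call_alleles_by_length(reads)      # {10, 6} -> 2 alleles
sequences = call_alleles_by_sequence(reads)  # 3 distinct alleles
```

    Here two same-length reads with different internal sequences collapse into a single length-based allele but remain distinct sequence-based alleles, which is exactly the hidden diversity the study recovered by sequencing.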

  6. Ice-cap. A high-throughput method for capturing plant tissue samples for genotype analysis.

    PubMed

    Krysan, Patrick

    2004-07-01

    High-throughput genotype screening is rapidly becoming a standard research tool in the post-genomic era. A major bottleneck currently exists, however, that limits the utility of this approach in the plant sciences. The rate-limiting step in current high-throughput pipelines is that tissue samples from living plants must be collected manually, one plant at a time. In this article I describe a novel method for harvesting tissue samples from living seedlings that eliminates this bottleneck. The method has been named Ice-Cap to reflect the fact that ice is used to capture the tissue samples. The planting of seeds, growth of seedlings, and harvesting of tissue are all performed in a 96-well format. I demonstrate the utility of this system by using tissue harvested by Ice-Cap to genotype a population of Arabidopsis seedlings that is segregating a previously characterized mutation. Because the harvesting of tissue is performed in a nondestructive manner, plants with the desired genotype can be transferred to soil and grown to maturity. I also show that Ice-Cap can be used to analyze genomic DNA from rice (Oryza sativa) seedlings. It is expected that this method will be applicable to high-throughput screening with many different plant species, making it a useful technology for performing marker assisted selection.

  7. Applications of Luminex xMAP technology for rapid, high-throughput multiplexed nucleic acid detection.

    PubMed

    Dunbar, Sherry A

    2006-01-01

    As we enter the post-genome sequencing era and begin to sift through the enormous amount of genetic information now available, the need for technologies that allow rapid, cost-effective, high-throughput detection of specific nucleic acid sequences becomes apparent. Multiplexing technologies, which allow for simultaneous detection of multiple nucleic acid sequences in a single reaction, can greatly reduce the time, cost and labor associated with single reaction detection technologies. The Luminex xMAP system is a multiplexed microsphere-based suspension array platform capable of analyzing and reporting up to 100 different reactions in a single reaction vessel. This technology provides a new platform for high-throughput nucleic acid detection and is being utilized with increasing frequency. Here we review specific applications of xMAP technology for nucleic acid detection in the areas of single nucleotide polymorphism (SNP) genotyping, genetic disease screening, gene expression profiling, HLA DNA typing and microbial detection. These studies demonstrate the speed, efficiency and utility of xMAP technology for simultaneous, rapid, sensitive and specific nucleic acid detection, and its capability to meet the current and future requirements of the molecular laboratory for high-throughput nucleic acid detection.

  8. High throughput and automatic colony formation assay based on impedance measurement technique.

    PubMed

    Lei, Kin Fong; Kao, Chich-Hao; Tsang, Ngan-Ming

    2017-03-02

    To predict the response of in vivo tumors, in vitro culture of cell colonies has been suggested as a standard assay with high clinical relevance. The most widely used method for quantifying colony responses is to count the number and size of cell colonies under a microscope, which makes the colony formation assay difficult to automate or run at high throughput. In this work, in situ analysis of cell colonies suspended in soft hydrogel was developed based on an impedance measurement technique. Cell colonies cultured between a pair of parallel plate electrodes were successfully analyzed by coating a layer of base hydrogel on one side of the electrode. Real-time and label-free monitoring of cell colonies was realized during the culture course. Impedance magnitude and phase angle represented, respectively, the summed colony response and the size of the colonies. In addition, the dynamic response of drug-treated colonies was demonstrated. A high-throughput, automatic colony formation assay was thus realized to facilitate more objective assessments in cancer research. Graphical Abstract: High-throughput, automatic colony formation assay realized by in situ impedimetric analysis across a pair of parallel plate electrodes in a culture chamber. Cell colonies suspended in soft hydrogel were cultured under the tested substance, and their dynamic response was represented by impedance data.
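
    The two impedance readouts used above, magnitude and phase angle, are both derived from a single complex impedance measurement. A minimal sketch (the component values are hypothetical, not the paper's data):

```python
import math

def impedance_readout(z):
    """Split a complex impedance into magnitude (ohms) and phase angle (degrees)."""
    return abs(z), math.degrees(math.atan2(z.imag, z.real))

# Hypothetical electrode measurement: a resistive part plus a capacitive
# (negative-imaginary) part, as seen across parallel plate electrodes.
z = complex(1200.0, -800.0)  # ohms
mag, phase = impedance_readout(z)  # ~1442 ohms, ~-33.7 degrees
```

    In the assay, the magnitude tracks the summed colony response while shifts in the phase angle reflect colony size; this sketch only shows how the two quantities are separated from one measurement.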

  9. A high-throughput microcultivation protocol for FTIR spectroscopic characterization and identification of fungi.

    PubMed

    Shapaval, Volha; Møretrø, Trond; Suso, Henri-Pierre; Asli, Anette Wold; Schmitt, Jürgen; Lillehaug, Dag; Martens, Harald; Böcker, Ulrike; Kohler, Achim

    2010-08-01

    Characterization and identification of fungi in the food industry is an important issue both for routine analysis and for trouble-shooting incidents. Present microbial techniques for fungal characterization suffer from low throughput and are time-consuming. In this study we present a protocol for high-throughput microcultivation and spectral characterization of fungi by Fourier transform infrared (FTIR) spectroscopy. For the study, 11 species from a total of five different fungal genera (Alternaria, Aspergillus, Mucor, Paecilomyces, and Phoma) were analyzed by FTIR spectroscopy. All the strains were isolated from trouble-shooting incidents in the production of low- and high-acid beverages. The cultivation was performed in malt extract broth (liquid medium) in a Bioscreen C system, allowing high-throughput cultivation of 200 samples at the same time. Mycelium was subsequently investigated by high-throughput Fourier transform infrared spectroscopy. Four spectral regions, fatty acid + lipid (3200-2800 cm(-1), 1300-1000 cm(-1)), protein-lipid (1800-1200 cm(-1)), carbohydrate (1200-700 cm(-1)) and "fingerprint" (900-700 cm(-1)), were evaluated for reproducibility and discrimination ability. The results show that all spectral regions evaluated can be used as spectroscopic biomarkers for differentiation of fungi by FTIR. The influence of different growth times on the ability to discriminate species by FTIR spectroscopy was investigated, and optimal separation of all five genera was observed after five days of growth. This work presents a novel concept for high-throughput cultivation of fungi for FTIR spectroscopy that enables characterization or identification of hundreds of strains per day.

  10. Demonstration of High-Throughput Water Isotopologue Measurements Using Cavity Ring-Down Spectroscopy

    NASA Astrophysics Data System (ADS)

    van Pelt, A. D.; Gupta, P.; Green, I.

    2009-12-01

    The ability to measure the δ18O and δD isotopic content of water has long relied on cumbersome methods that require well equipped laboratories, highly qualified technicians and frequently calibrated instruments. The advent of commercial analyzers based on Wavelength Scanned Cavity Ring-Down Spectroscopy (WS-CRDS) for isotopic water measurements has opened up new possibilities for mobile laboratory and field deployable isotopic instruments. For many laboratories, sample throughput has been a major bottleneck - either real-time sampling of stream flow or simply the number of samples gathered during a campaign can be a daunting challenge. It is not uncommon for users to have a huge backlog of water samples that need to be analyzed within a short period of time. We present results of a new high throughput water analyzer based on WS-CRDS technology. This high throughput method comes with negligible impact on the precision and memory and absolutely no impact on the drift characteristics of the analyzer. In order to provide confidence in the data collected, even in the most challenging environments, there can be no compromise on the consistency or reproducibility of the instrument performance. The new high throughput isotopic water analyzer measures isotopologues of water with a typical precision of better than 0.15‰ for δ18O and better than 0.6‰ for δD and can execute over 380 injections per day. The analyzer has extremely low drift of < ±0.3‰ for δ18O and < ±0.9‰ for δD. This presentation demonstrates these capabilities of the high throughput isotopic water analyzer. This water isotope analyzer can be configured to analyze water vapor, liquid, or alternate between vapor and liquid. The alternating configuration enables the periodic recalibration of water vapor measurements using liquid water isotopic standards. The results of this study clearly demonstrate that the precision of the analyzer is very high and the memory and drift are exceptional even

  11. High-throughput Titration of Luciferase-expressing Recombinant Viruses

    PubMed Central

    Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon

    2014-01-01

    Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay. PMID:25285536

  12. High-throughput titration of luciferase-expressing recombinant viruses.

    PubMed

    Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon

    2014-09-19

    Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay.
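
    The final step described above, converting luminescence readings to viral titers, is typically done against a standard curve built from a titered reference stock. A hedged sketch of that conversion by log-log interpolation (the curve values are hypothetical; the published protocol may use a different fit):

```python
import math

def luminescence_to_titer(rlu, curve):
    """Convert a luminescence reading (RLU) to an infectious titer (PFU/ml)
    by log-log linear interpolation against a standard curve of (rlu, titer)
    pairs measured from a reference virus dilution series."""
    pts = sorted(curve)
    lx = math.log10(rlu)
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= rlu <= x1:
            f = (lx - math.log10(x0)) / (math.log10(x1) - math.log10(x0))
            ly = math.log10(y0) + f * (math.log10(y1) - math.log10(y0))
            return 10 ** ly
    raise ValueError("reading outside standard curve")

# Hypothetical standard curve: 10-fold dilutions of a stock of known titer.
curve = [(1e3, 1e4), (1e4, 1e5), (1e5, 1e6), (1e6, 1e7)]
titer = luminescence_to_titer(3.16e4, curve)  # ~3.16e5 PFU/ml
```

    In a plate-based screen this conversion would be applied to every well's luminescence reading in one pass, which is what makes the method amenable to high sample volumes.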

  13. High-throughput patterning of photonic structures with tunable periodicity

    PubMed Central

    Kempa, Thomas J.; Bediako, D. Kwabena; Kim, Sun-Kyung; Park, Hong-Gyu; Nocera, Daniel G.

    2015-01-01

    A patterning method termed "RIPPLE" (reactive interface patterning promoted by lithographic electrochemistry) is applied to the fabrication of arrays of dielectric and metallic optical elements. This method uses cyclic voltammetry to impart patterns onto the working electrode of a standard three-electrode electrochemical setup. Using this technique and a template stripping process, periodic arrays of Ag circular Bragg gratings are patterned in a high-throughput fashion over large substrate areas. By varying the scan rate of the cyclically applied voltage ramps, the periodicity of the gratings can be tuned in situ over micrometer and submicrometer length scales. Characterization of the periodic grating arrays identified point-like and annular scattering modes at different planes above the structured surface. Facile, reliable, and rapid patterning techniques like RIPPLE may enable the high-throughput and low-cost fabrication of photonic elements and metasurfaces for energy conversion and sensing applications. PMID:25870280

  14. A System for Performing High Throughput Assays of Synaptic Function

    PubMed Central

    Hempel, Chris M.; Sivula, Michael; Levenson, Jonathan M.; Rose, David M.; Li, Bing; Sirianni, Ana C.; Xia, Eva; Ryan, Timothy A.; Gerber, David J.; Cottrell, Jeffrey R.

    2011-01-01

    Unbiased, high-throughput screening has proven invaluable for dissecting complex biological processes. Application of this general approach to synaptic function would have a major impact on neuroscience research and drug discovery. However, existing techniques for studying synaptic physiology are labor intensive and low-throughput. Here, we describe a new high-throughput technology for performing assays of synaptic function in primary neurons cultured in microtiter plates. We show that this system can perform 96 synaptic vesicle cycling assays in parallel with high sensitivity, precision, uniformity, and reproducibility and can detect modulators of presynaptic function. By screening libraries of pharmacologically defined compounds on rat forebrain cultures, we have used this system to identify novel effects of compounds on specific aspects of presynaptic function. As a system for unbiased compound as well as genomic screening, this technology has significant applications for basic neuroscience research and for the discovery of novel, mechanism-based treatments for central nervous system disorders. PMID:21998743

  15. High-throughput physical organic chemistry--Hammett parameter evaluation.

    PubMed

    Portal, Christophe F; Bradley, Mark

    2006-07-15

    High-throughput analysis techniques were developed to allow the rapid assessment of a range of Hammett parameters utilizing positive electrospray mass spectrometry (ESI+-MS) as the sole quantitative tool, with the core of the approach being a so-called "analytical construct". Hammett substituent parameters were determined for a range of meta- and para-substituted anilines by high-throughput (HT) assessment of relative reaction rates in a competitive amide bond formation reaction, with up to five parameters determined in a single-pot reaction. Sensitivity of the reaction to substituent effects (quantified by Hammett's rho parameter) was determined first; HT assessment of Hammett's sigma substituent parameters was then carried out successfully for over 30 anilines, with excellent correlation observed between the HT ESI+-MS determinations and literature values.
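
    The relationship being exploited is the Hammett equation, log10(kX/kH) = rho * sigma, so once rho has been fixed from reference compounds, each measured relative rate yields a sigma value directly. A small illustration (all numbers hypothetical, not the paper's measurements):

```python
import math

def hammett_sigma(k_rel, rho):
    """Extract a substituent constant sigma from a relative rate kX/kH,
    given the reaction's sensitivity parameter rho:
    log10(kX/kH) = rho * sigma."""
    return math.log10(k_rel) / rho

# Hypothetical competitive amidation: rho < 0 means electron-withdrawing
# substituents slow the reaction (illustrative values only).
rho = -2.8
k_rel = 0.1  # substituted aniline reacts 10x slower than aniline
sigma = hammett_sigma(k_rel, rho)  # log10(0.1) / -2.8 ~ 0.357
```

    In the competitive MS setup, many such k_rel values are read out of a single pot, so one experiment yields several sigma values at once.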

  16. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts are presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with the propagation of quantified uncertainty to downstream ICME modeling and simulations.
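
    At the core of CALPHAD databases are polynomial descriptions of phase thermodynamics, such as the Redlich-Kister expansion for the excess Gibbs energy of a binary solution phase, G_ex = xA * xB * sum_k Lk * (xA - xB)^k. A minimal evaluation sketch (the interaction parameters are illustrative, not from any assessed database):

```python
def excess_gibbs(x_a, L):
    """Redlich-Kister excess Gibbs energy (J/mol) of a binary A-B phase:
    G_ex = xA * xB * sum_k L_k * (xA - xB)^k, the polynomial form CALPHAD
    uses to parameterize solution-phase thermodynamics."""
    x_b = 1.0 - x_a
    return x_a * x_b * sum(Lk * (x_a - x_b) ** k for k, Lk in enumerate(L))

# Hypothetical regular + subregular interaction parameters (J/mol).
L = [-20000.0, 5000.0]  # L0, L1
g = excess_gibbs(0.5, L)  # at x = 0.5 only L0 contributes: -5000 J/mol
```

    High-throughput assessment then amounts to fitting parameters like L0 and L1 to first-principles and experimental data, and uncertainty quantification to propagating the uncertainty in those parameters into computed phase equilibria.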

  17. Direct assembling methodologies for high-throughput bioscreening

    PubMed Central

    Rodríguez-Dévora, Jorge I.; Shi, Zhi-dong; Xu, Tao

    2012-01-01

    Over the last few decades, high-throughput (HT) bioscreening, a technique that allows rapid screening of biochemical compound libraries against biological targets, has been widely used in drug discovery, stem cell research, development of new biomaterials, and genomics research. To achieve these ambitions, scaffold-free (or direct) assembly of biological entities of interest has become critical. Appropriate assembling methodologies are required to build an efficient HT bioscreening platform. The development of contact and non-contact assembling systems as a practical solution has been driven by a variety of essential attributes of the bioscreening system, such as miniaturization, high throughput, and high precision. The present article reviews recent progress on these assembling technologies utilized for the construction of HT bioscreening platforms. PMID:22021162

  18. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
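
    Spotsizer's core measurement, colony areas in a binarized plate image, comes down to connected-component labeling. A stdlib-only Python stand-in on a toy image (Spotsizer itself does considerably more, e.g. grid alignment and artifact handling):

```python
def colony_sizes(grid):
    """Measure colony areas in a binarized plate image (1 = colony pixel)
    via 4-connected flood fill, a minimal stand-in for the labeling step
    a tool like Spotsizer performs on arrayed colony images."""
    h, w = len(grid), len(grid[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for i in range(h):
        for j in range(w):
            if grid[i][j] and not seen[i][j]:
                stack, area = [(i, j)], 0
                seen[i][j] = True
                while stack:
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and grid[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(area)
    return sorted(sizes)

# Toy two-colony image.
grid = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
]
areas = colony_sizes(grid)  # [3, 3]
```

    Batch processing is then just running this measurement over every plate image in a directory, which is the mode the tool provides for high-throughput screens.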

  19. A high-throughput multiplex method adapted for GMO detection.

    PubMed

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences"): endogenous reference genes of the relevant taxa, GMO construct targets (screening, construct-specific, and event-specific), and finally sequences from donor organisms. This assay avoids certain shortcomings of the multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.
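
    The readout of such a multiplex assay is essentially a presence/absence call for each signature sequence in each sample. A toy sketch of that logic (the signatures below are invented and short; real ones are ~70 bp and are detected by hybridization chemistry, not substring search):

```python
def screen_signatures(sample_seq, signatures):
    """Report which GMO 'signature sequences' are present in a sample.
    A real assay hybridizes probes; this sketch does exact substring
    matching only to illustrate the multiplex presence/absence readout."""
    return {name: (sig in sample_seq) for name, sig in signatures.items()}

# Hypothetical signatures named after common GMO screening targets.
signatures = {
    "p35S_promoter": "GATGTGATATCTCC",
    "tNOS_terminator": "GAATCCTGTTGCCG",
    "lectin_endogene": "GCCCTCTACTCCAC",
}
sample = "AAAGATGTGATATCTCCTTTGCCCTCTACTCCACGG"
hits = screen_signatures(sample, signatures)
```

    A sample positive for the endogenous reference gene but negative for all construct targets would be called non-GM; the reference-gene call guards against false negatives from degraded DNA.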

  20. High-throughput screening in the C. elegans nervous system.

    PubMed

    Kinser, Holly E; Pincus, Zachary

    2016-06-03

    The nematode Caenorhabditis elegans is widely used as a model organism in the field of neurobiology. The wiring of the C. elegans nervous system has been entirely mapped, and the animal's optical transparency allows for in vivo observation of neuronal activity. The nematode is also small in size, self-fertilizing, and inexpensive to cultivate and maintain, greatly lending to its utility as a whole-animal model for high-throughput screening (HTS) in the nervous system. However, the use of this organism in large-scale screens presents unique technical challenges, including reversible immobilization of the animal, parallel single-animal culture and containment, automation of laser surgery, and high-throughput image acquisition and phenotyping. These obstacles require significant modification of existing techniques and the creation of new C. elegans-based HTS platforms. In this review, we outline these challenges in detail and survey the novel technologies and methods that have been developed to address them.

  1. High-throughput theoretical design of lithium battery materials

    NASA Astrophysics Data System (ADS)

    Shi-Gang, Ling; Jian, Gao; Rui-Juan, Xiao; Li-Quan, Chen

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of an automated simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving improved performance, and some representative criteria are higher energy density, better safety, and faster charge/discharge speed. Project supported by the National Natural Science Foundation of China (Grant Nos. 11234013 and 51172274) and the National High Technology Research and Development Program of China (Grant No. 2015AA034201).
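
    The screening step, filtering a structure database against search criteria such as capacity, volume strain, and voltage, can be sketched as a simple filter. The entries and thresholds below are illustrative placeholders, not values from the review:

```python
def screen_candidates(db, min_capacity, max_strain, min_voltage):
    """Filter a (toy) structure database against cathode search criteria,
    mimicking the automated screening flow of high-throughput design."""
    return [m["name"] for m in db
            if m["capacity_mAh_g"] >= min_capacity
            and m["volume_strain_pct"] <= max_strain
            and m["avg_voltage_V"] >= min_voltage]

# Hypothetical entries; a real screen would pull these properties from
# first-principles calculations over a structure database.
db = [
    {"name": "LiCoO2",  "capacity_mAh_g": 140, "volume_strain_pct": 2.0, "avg_voltage_V": 3.9},
    {"name": "LiFePO4", "capacity_mAh_g": 170, "volume_strain_pct": 6.8, "avg_voltage_V": 3.4},
    {"name": "CandX",   "capacity_mAh_g": 210, "volume_strain_pct": 3.1, "avg_voltage_V": 3.8},
]
hits = screen_candidates(db, min_capacity=150, max_strain=5.0, min_voltage=3.5)  # ["CandX"]
```

    In practice the filter runs over thousands of computed structures, and survivors are passed to more expensive calculations or experiments.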

  2. High throughput screening of starch structures using carbohydrate microarrays

    PubMed Central

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Motawia, Mohammed Saddik; Shaik, Shahnoor Sultana; Mikkelsen, Maria Dalgaard; Krunic, Susanne Langgaard; Fangel, Jonatan Ulrik; Willats, William George Tycho; Blennow, Andreas

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structural features, including amylose content and branching degree, could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Extremely branched or extremely linear structures were typically detected at lower levels than normal starch structures. The method offers the potential for rapidly analysing resistant and slowly digested dietary starches. PMID:27468930

  3. Spotsizer: High-throughput quantitative analysis of microbial growth

    PubMed Central

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  4. A high-throughput label-free nanoparticle analyser

    NASA Astrophysics Data System (ADS)

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M.; Ruoslahti, Erkki; Cleland, Andrew N.

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10⁻⁶ l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.
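
    A power-law size distribution like the one reported for plasma nanoparticles is commonly characterized by a maximum-likelihood fit of its exponent. A sketch using the standard continuous MLE, alpha = 1 + n / sum(ln(x_i / x_min)) (the diameters below are synthetic, not the paper's data):

```python
import math

def powerlaw_exponent(sizes, x_min):
    """Continuous maximum-likelihood estimate of the exponent alpha in
    P(x) ~ x^(-alpha) for x >= x_min:
    alpha = 1 + n / sum(ln(x_i / x_min))."""
    xs = [x for x in sizes if x >= x_min]
    return 1.0 + len(xs) / sum(math.log(x / x_min) for x in xs)

# Synthetic particle diameters (nm); real input would be the per-particle
# sizes streamed from the analyser at up to 500,000 particles per second.
diameters = [30.0, 45.0, 60.0, 38.0, 90.0, 33.0, 51.0, 120.0]
alpha = powerlaw_exponent(diameters, x_min=30.0)
```

    The estimator is only as good as the choice of x_min, which in an instrument like this is set by the smallest particle size the sensor can resolve.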

  5. Applications of high-throughput DNA sequencing to benign hematology

    PubMed Central

    Gallagher, Patrick G.

    2013-01-01

    The development of novel technologies for high-throughput DNA sequencing is having a major impact on our ability to measure and define normal and pathologic variation in humans. This review discusses advances in DNA sequencing that have been applied to benign hematologic disorders, including those affecting the red blood cell, the neutrophil, and other white blood cell lineages. Relevant examples of how these approaches have been used for disease diagnosis, gene discovery, and studying complex traits are provided. High-throughput DNA sequencing technology holds significant promise for impacting clinical care. This includes development of improved disease detection and diagnosis, better understanding of disease progression and stratification of risk of disease-specific complications, and development of improved therapeutic strategies, particularly patient-specific pharmacogenomics-based therapy, with monitoring of therapy by genomic biomarkers. PMID:24021670

  6. A high-throughput microRNA expression profiling system.

    PubMed

    Guo, Yanwen; Mastriano, Stephen; Lu, Jun

    2014-01-01

    As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.

  7. Sensitivity study of reliable, high-throughput resolution metricsfor photoresists

    SciTech Connect

    Anderson, Christopher N.; Naulleau, Patrick P.

    2007-07-30

    The resolution of chemically amplified resists is becoming an increasing concern, especially for lithography in the extreme ultraviolet (EUV) regime. Large-scale screening and performance-based down-selection is currently underway to identify resist platforms that can support shrinking feature sizes. Resist screening efforts, however, are hampered by the absence of reliable resolution metrics that can objectively quantify resist resolution in a high-throughput fashion. Here we examine two high-throughput metrics for resist resolution determination. After summarizing their details and justifying their utility, we characterize the sensitivity of both metrics to two of the main experimental uncertainties associated with lithographic exposure tools, namely: limited focus control and limited knowledge of optical aberrations. For an implementation at EUV wavelengths, we report aberration and focus limited error bars in extracted resolution of ≈1.25 nm RMS for both metrics making them attractive candidates for future screening and down-selection efforts.

  8. High-throughput single-molecule optofluidic analysis

    PubMed Central

    Kim, Soohong; Streets, Aaron M; Lin, Ron R; Quake, Stephen R; Weiss, Shimon; Majumdar, Devdoot S

    2011-01-01

    We describe a high-throughput, automated single-molecule measurement system, equipped with microfluidics. The microfluidic mixing device has integrated valves and pumps to accurately accomplish titration of biomolecules with picoliter resolution. We demonstrate that the approach enabled rapid sampling of the biomolecular conformational landscape and of enzymatic activity, in the form of transcription by Escherichia coli RNA polymerase, as a function of the chemical environment. PMID:21297618

  9. Generating barcoded libraries for multiplex high-throughput sequencing.

    PubMed

    Knapp, Michael; Stiller, Mathias; Meyer, Matthias

    2012-01-01

    Molecular barcoding is an essential tool to use the high throughput of next generation sequencing platforms optimally in studies involving more than one sample. Various barcoding strategies allow for the incorporation of short recognition sequences (barcodes) into sequencing libraries, either by ligation or polymerase chain reaction (PCR). Here, we present two approaches optimized for generating barcoded sequencing libraries from low copy number extracts and amplification products typical of ancient DNA studies.
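The demultiplexing step that such barcoding enables can be sketched as follows. This is a minimal illustration, not the authors' protocol; the barcode sequences, read layout, and mismatch tolerance below are all hypothetical:

```python
# Demultiplex reads by a 7-bp barcode at the 5' end of each read.
# Barcode sequences and the read layout are illustrative only.
BARCODES = {"ACGTGAT": "sample_1", "TGCATCA": "sample_2"}

def demultiplex(reads, barcodes, max_mismatches=1):
    """Assign each read to a sample by its leading barcode, trimming it off."""
    def mismatches(a, b):
        return sum(x != y for x, y in zip(a, b))

    binned = {sample: [] for sample in barcodes.values()}
    binned["unassigned"] = []
    bc_len = len(next(iter(barcodes)))
    for read in reads:
        prefix = read[:bc_len]
        hits = [s for bc, s in barcodes.items()
                if mismatches(prefix, bc) <= max_mismatches]
        if len(hits) == 1:          # unambiguous match only
            binned[hits[0]].append(read[bc_len:])
        else:
            binned["unassigned"].append(read)
    return binned

reads = ["ACGTGATTTTTACG", "TGCATCAGGGACGT", "AAAAAAACCCCCCC"]
out = demultiplex(reads, BARCODES)
# out["sample_1"] -> ["TTTTACG"]; the third read stays unassigned
```

Allowing a single mismatch, as here, is a common way to tolerate sequencing errors in the barcode while still rejecting ambiguous assignments.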

  10. A fully automated robotic system for high throughput fermentation.

    PubMed

    Zimmermann, Hartmut F; Rieth, Jochen

    2007-03-01

    High throughput robotic systems have been used since the 1990s to carry out biochemical assays in microtiter plates. However, before the application of such systems in industrial fermentation process development, some important specific demands should be taken into account. These are sufficient oxygen supply, optimal growth temperature, minimized sample evaporation, avoidance of contaminations, and simple but reliable process monitoring. A fully automated solution where all these aspects have been taken into account is presented.

  11. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    PubMed Central

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-01-01

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925

  12. High throughput screening operations at the University of Kansas.

    PubMed

    Roy, Anuradha

    2014-05-01

    The High Throughput Screening Laboratory at University of Kansas plays a critical role in advancing academic interest in the identification of chemical probes as tools to better understand the biological and biochemical basis of new therapeutic targets. The HTS laboratory has an open service policy and collaborates with internal and external academia as well as for-profit organizations to execute projects requiring HTS-compatible assay development and screening of chemical libraries for target validation, probe selection, hit identification and lead optimization.

  13. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy.

    PubMed

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-06-09

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery.

  14. Rapid Methods for High-Throughput Detection of Sulfoxides

    PubMed Central

    Shainsky, Janna; Derry, Netta-Lee; Leichtmann-Bardoogo, Yael; Wood, Thomas K.; Fishman, Ayelet

    2009-01-01

    Enantiopure sulfoxides are prevalent in drugs and are useful chiral auxiliaries in organic synthesis. The biocatalytic enantioselective oxidation of prochiral sulfides is a direct and economical approach for the synthesis of optically pure sulfoxides. The selection of suitable biocatalysts requires rapid and reliable high-throughput screening methods. Here we present four different methods for detecting sulfoxides produced via whole-cell biocatalysis, three of which were exploited for high-throughput screening. Fluorescence detection based on the acid activation of omeprazole was utilized for high-throughput screening of mutant libraries of toluene monooxygenases, but no active variants have been discovered yet. The second method is based on the reduction of sulfoxides to sulfides, with the coupled release and measurement of iodine. The availability of solvent-resistant microtiter plates enabled us to modify the method to a high-throughput format. The third method, selective inhibition of horse liver alcohol dehydrogenase, was used to rapidly screen highly active and/or enantioselective variants at position V106 of toluene ortho-monooxygenase in a saturation mutagenesis library, using methyl-p-tolyl sulfide as the substrate. A success rate of 89% (i.e., 11% false positives) was obtained, and two new mutants were selected. The fourth method is based on the colorimetric detection of adrenochrome, a back-titration procedure which measures the concentration of the periodate-sensitive sulfide. Due to low sensitivity during whole-cell screening, this method was found to be useful only for determining the presence or absence of sulfoxide in the reaction. The methods described in the present work are simple and inexpensive and do not require special equipment. PMID:19465532

  15. Web-based visual analysis for high-throughput genomics.

    PubMed

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics.
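Efficiently querying very large datasets for browser display, as described above, typically means reducing per-base data to fixed-width summary bins on the server before anything crosses the wire. A minimal sketch of that reduction (the data and bin width are illustrative, not the platform's actual API):

```python
# Summarize per-base read coverage into fixed-width bins: the kind of
# server-side reduction that keeps browser payloads small when a genome
# browser is zoomed out. Values and bin width are illustrative.
def bin_coverage(coverage, bin_width):
    """Return (mean, max) per bin for a per-base coverage list."""
    bins = []
    for start in range(0, len(coverage), bin_width):
        window = coverage[start:start + bin_width]
        bins.append((sum(window) / len(window), max(window)))
    return bins

coverage = [0, 0, 4, 8, 8, 8, 2, 0]      # per-base read depths
summary = bin_coverage(coverage, 4)
# summary -> [(3.0, 8), (4.5, 8)]
```

Keeping both the mean and the max per bin lets a viewer draw a smooth coverage track while still revealing narrow spikes that the mean alone would hide.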

  16. Novel High-throughput Approach for Purification of Infectious Virions

    PubMed Central

    James, Kevin T.; Cooney, Brad; Agopsowicz, Kate; Trevors, Mary Ann; Mohamed, Adil; Stoltz, Don; Hitt, Mary; Shmulevitz, Maya

    2016-01-01

    Viruses are extensively studied as pathogens and exploited as molecular tools and therapeutic agents. Existing methods to purify viruses such as gradient ultracentrifugation or chromatography have limitations, for example demands for technical expertise or specialized equipment, high time consumption, and restricted capacity. Our laboratory explores mutations in oncolytic reovirus that could improve oncolytic activity, and makes routine use of numerous virus variants, genome reassortants, and reverse engineered mutants. Our research pace was limited by the lack of high-throughput virus purification methods that efficiently remove confounding cellular contaminants such as cytokines and proteases. To overcome this shortcoming, we evaluated a commercially available resin (Capto Core 700) that captures molecules smaller than 700 kDa. Capto Core 700 chromatography produced virion purity and infectivity indistinguishable from CsCl density gradient ultracentrifugation as determined by electron microscopy, gel electrophoresis analysis and plaque titration. Capto Core 700 resin was then effectively adapted to a rapid in-slurry pull-out approach for high-throughput purification of reovirus and adenovirus. The in-slurry purification approach offered substantially increased virus purity over crude cell lysates, media, or high-spin preparations and would be especially useful for high-throughput virus screening applications where density gradient ultracentrifugation is not feasible. PMID:27827454

  17. Fluorescent biosensors for high throughput screening of protein kinase inhibitors.

    PubMed

    Prével, Camille; Pellerano, Morgan; Van, Thi Nhu Ngoc; Morris, May C

    2014-02-01

    High throughput screening assays aim to identify small molecules that interfere with protein function, activity, or conformation, which can serve as effective tools for chemical biology studies of targets involved in physiological processes or pathways of interest or disease models, as well as templates for development of therapeutics in medicinal chemistry. Fluorescent biosensors constitute attractive and powerful tools for drug discovery programs, from high throughput screening assays, to postscreen characterization of hits, optimization of lead compounds, and preclinical evaluation of candidate drugs. They provide a means of screening for inhibitors that selectively target enzymatic activity, conformation, and/or function in vitro. Moreover, fluorescent biosensors constitute useful tools for cell- and image-based, multiplex and multiparametric, high-content screening. Application of fluorescence-based sensors to screen large and complex libraries of compounds in vitro, in cell-based formats or whole organisms requires several levels of optimization to establish robust and reproducible assays. In this review, we describe the different fluorescent biosensor technologies which have been applied to high throughput screens, and discuss the prerequisite criteria underlying their successful application. Special emphasis is placed on protein kinase biosensors, since these enzymes constitute one of the most important classes of therapeutic targets in drug discovery.

  18. FLASH assembly of TALENs for high-throughput genome editing.

    PubMed

    Reyon, Deepak; Tsai, Shengdar Q; Khayter, Cyd; Foden, Jennifer A; Sander, Jeffry D; Joung, J Keith

    2012-05-01

    Engineered transcription activator–like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published, and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the fast ligation-based automatable solid-phase high-throughput (FLASH) system, a rapid and cost-effective method for large-scale assembly of TALENs. We tested 48 FLASH-assembled TALEN pairs in a human cell–based EGFP reporter system and found that all 48 possessed efficient gene-modification activities. We also used FLASH to assemble TALENs for 96 endogenous human genes implicated in cancer and/or epigenetic regulation and found that 84 pairs were able to efficiently introduce targeted alterations. Our results establish the robustness of TALEN technology and demonstrate that FLASH facilitates high-throughput genome editing at a scale not currently possible with other genome modification technologies.

  19. High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila

    PubMed Central

    Chiaraviglio, Lucius

    2015-01-01

    Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. PMID:26392509
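The two-drug synergy testing mentioned above is commonly scored against a null model of independent drug action. A minimal sketch of one such score, the Bliss independence excess (the inhibition values below are illustrative, not from the study):

```python
# Bliss independence score for one well of a two-drug checkerboard:
# compare observed combined inhibition against the inhibition expected
# if the drugs acted independently. Values here are illustrative.
def bliss_excess(fa, fb, fab):
    """Positive values suggest synergy, negative suggest antagonism.

    fa, fb : fractional inhibition of each drug alone (0..1)
    fab    : observed fractional inhibition of the combination
    """
    expected = fa + fb - fa * fb   # P(A or B) under independence
    return fab - expected

# Example well: 40% and 30% inhibition alone, 80% in combination.
score = bliss_excess(0.40, 0.30, 0.80)
# score is about +0.22, i.e. more inhibition than independence predicts
```

Evaluating this score across a full concentration grid yields the kind of two-dimensional synergy surface that isobologram analysis summarizes.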

  20. A microdroplet dilutor for high-throughput screening

    NASA Astrophysics Data System (ADS)

    Niu, Xize; Gielen, Fabrice; Edel, Joshua B.; Demello, Andrew J.

    2011-06-01

    Pipetting and dilution are universal processes used in chemical and biological laboratories to assay and experiment. In microfluidics such operations are equally in demand, but difficult to implement. Recently, droplet-based microfluidics has emerged as an exciting new platform for high-throughput experimentation. However, it is challenging to vary the concentration of droplets rapidly and controllably. To this end, we developed a dilution module for high-throughput screening using droplet-based microfluidics. Briefly, a nanolitre-sized sample droplet of defined concentration is trapped within a microfluidic chamber. Through a process of droplet merging, mixing and re-splitting, this droplet is combined with a series of smaller buffer droplets to generate a sequence of output droplets that define a digital concentration gradient. Importantly, the formed droplets can be merged with other reagent droplets to enable rapid chemical and biological screens. As a proof of concept, we used the dilutor to perform a high-throughput homogeneous DNA-binding assay using only nanolitres of sample.
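The merge/mix/re-split cycle described above has simple arithmetic: if a trapped sample droplet of volume V absorbs a buffer droplet of volume v, mixes, and re-splits back to volume V, each cycle scales the concentration by V/(V+v). A sketch with illustrative volumes (not the paper's actual device parameters):

```python
# Model of the droplet merge/mix/re-split dilution cycle.
# Each cycle multiplies the trapped droplet's concentration by
# V / (V + v). Volumes and starting concentration are illustrative.
def dilution_series(c0, sample_nl, buffer_nl, cycles):
    """Concentration after each merge/mix/split cycle."""
    factor = sample_nl / (sample_nl + buffer_nl)
    series, c = [], c0
    for _ in range(cycles):
        c *= factor
        series.append(c)
    return series

# A 10 nL sample droplet merged with 10 nL buffer droplets halves
# the concentration on every cycle.
concs = dilution_series(100.0, 10.0, 10.0, 3)
# concs -> [50.0, 25.0, 12.5]
```

Because the per-cycle factor is fixed by droplet geometry rather than pipetting accuracy, the resulting gradient is digital and reproducible, which is what makes the module suited to screening.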

  1. A microdroplet dilutor for high-throughput screening.

    PubMed

    Niu, Xize; Gielen, Fabrice; Edel, Joshua B; deMello, Andrew J

    2011-06-01

    Pipetting and dilution are universal processes used in chemical and biological laboratories to assay and experiment. In microfluidics such operations are equally in demand, but difficult to implement. Recently, droplet-based microfluidics has emerged as an exciting new platform for high-throughput experimentation. However, it is challenging to vary the concentration of droplets rapidly and controllably. To this end, we developed a dilution module for high-throughput screening using droplet-based microfluidics. Briefly, a nanolitre-sized sample droplet of defined concentration is trapped within a microfluidic chamber. Through a process of droplet merging, mixing and re-splitting, this droplet is combined with a series of smaller buffer droplets to generate a sequence of output droplets that define a digital concentration gradient. Importantly, the formed droplets can be merged with other reagent droplets to enable rapid chemical and biological screens. As a proof of concept, we used the dilutor to perform a high-throughput homogeneous DNA-binding assay using only nanolitres of sample.

  2. High-Throughput Toxicity Testing: New Strategies for ...

    EPA Pesticide Factsheets

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information, and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it ...

  3. MEGARes: an antimicrobial resistance database for high throughput sequencing

    PubMed Central

    Lakin, Steven M.; Dean, Chris; Noyes, Noelle R.; Dettenwanger, Adam; Ross, Anne Spencer; Doster, Enrique; Rovira, Pablo; Abdo, Zaid; Jones, Kenneth L.; Ruiz, Jaime; Belk, Keith E.; Morley, Paul S.; Boucher, Christina

    2017-01-01

    Antimicrobial resistance has become an imminent concern for public health. As methods for detection and characterization of antimicrobial resistance move from targeted culture and polymerase chain reaction to high throughput metagenomics, appropriate resources for the analysis of large-scale data are required. Currently, antimicrobial resistance databases are tailored to smaller-scale, functional profiling of genes using highly descriptive annotations. Such characteristics do not facilitate the analysis of large-scale, ecological sequence datasets such as those produced with the use of metagenomics for surveillance. In order to overcome these limitations, we present MEGARes (https://megares.meglab.org), a hand-curated antimicrobial resistance database and annotation structure that provides a foundation for the development of high throughput acyclical classifiers and hierarchical statistical analysis of big data. MEGARes can be browsed as a stand-alone resource through the website or can be easily integrated into sequence analysis pipelines through download. Also via the website, we provide documentation for AmrPlusPlus, a user-friendly Galaxy pipeline for the analysis of high throughput sequencing data that is pre-packaged for use with the MEGARes database. PMID:27899569
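The hierarchical annotation structure described above lends itself to rolling per-gene hit counts up through the levels of the hierarchy for statistical analysis. A minimal sketch of that aggregation; the pipe-delimited header layout and the counts below are illustrative, not MEGARes' actual header format:

```python
# Aggregate per-gene alignment counts up an acyclical annotation
# hierarchy (class -> mechanism -> group). The 'class|mechanism|group'
# header layout and the hit counts are illustrative only.
from collections import Counter

def rollup(hits):
    """hits: {header: count} with header = 'class|mechanism|group'."""
    levels = {"class": Counter(), "mechanism": Counter(), "group": Counter()}
    for header, n in hits.items():
        cls, mech, grp = header.split("|")
        levels["class"][cls] += n
        levels["mechanism"][mech] += n
        levels["group"][grp] += n
    return levels

hits = {
    "betalactams|betalactamase|TEM": 12,
    "betalactams|betalactamase|SHV": 3,
    "tetracyclines|efflux|TETA": 5,
}
levels = rollup(hits)
# levels["class"]["betalactams"] -> 15
```

Because each gene maps to exactly one node per level, counts aggregated this way are never double-counted, which is the property an acyclical hierarchy buys over an ontology with multiple parents.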

  4. High throughput biotechnology in traditional fermented food industry.

    PubMed

    Yang, Yong; Xu, Rong-man; Song, Jia; Wang, Wei-min

    2010-11-01

    Traditional fermented foods are not only staple foods in most developing countries but also key health foods in developed countries. As the health-promoting functions of these foods are gradually discovered, more and more high throughput biotechnologies are being applied to advance this old yet evolving industry. As a result, knowledge of the microflora, manufacturing processes and health functions of these foods has been pushed forward in both depth and breadth. The application and progress of high throughput biotechnologies differ across the traditional fermented food industries, and are reviewed here by category: fermented milk products (yogurt, cheese), fermented sausages, fermented vegetables (kimchi, sauerkraut), fermented cereals (sourdough) and fermented beans (tempeh, natto). With further support from high throughput biotechnologies, the middle- and down-stream processes of traditional fermented foods can be optimized, and the industrialization of local traditional fermented foods that carry many functional factors but are produced in small quantities can be accelerated. The article also presents some promising patents on the traditional fermented food industry.

  5. MEGARes: an antimicrobial resistance database for high throughput sequencing.

    PubMed

    Lakin, Steven M; Dean, Chris; Noyes, Noelle R; Dettenwanger, Adam; Ross, Anne Spencer; Doster, Enrique; Rovira, Pablo; Abdo, Zaid; Jones, Kenneth L; Ruiz, Jaime; Belk, Keith E; Morley, Paul S; Boucher, Christina

    2017-01-04

    Antimicrobial resistance has become an imminent concern for public health. As methods for detection and characterization of antimicrobial resistance move from targeted culture and polymerase chain reaction to high throughput metagenomics, appropriate resources for the analysis of large-scale data are required. Currently, antimicrobial resistance databases are tailored to smaller-scale, functional profiling of genes using highly descriptive annotations. Such characteristics do not facilitate the analysis of large-scale, ecological sequence datasets such as those produced with the use of metagenomics for surveillance. In order to overcome these limitations, we present MEGARes (https://megares.meglab.org), a hand-curated antimicrobial resistance database and annotation structure that provides a foundation for the development of high throughput acyclical classifiers and hierarchical statistical analysis of big data. MEGARes can be browsed as a stand-alone resource through the website or can be easily integrated into sequence analysis pipelines through download. Also via the website, we provide documentation for AmrPlusPlus, a user-friendly Galaxy pipeline for the analysis of high throughput sequencing data that is pre-packaged for use with the MEGARes database.

  6. High-throughput 2D root system phenotyping platform facilitates genetic analysis of root growth and development

    USDA-ARS?s Scientific Manuscript database

    High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyze...

  7. Uncertainty Quantification in High Throughput Screening: Applications to Models of Endocrine Disruption, Cytotoxicity, and Zebrafish Development (GRC Drug Safety)

    EPA Science Inventory

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical...

  8. Evaluation of High-throughput Genotoxicity Assays Used in Profiling the US EPA ToxCast Chemicals

    EPA Science Inventory

    Three high-throughput screening (HTS) genotoxicity assays-GreenScreen HC GADD45a-GFP (Gentronix Ltd.), CellCiphr p53 (Cellumen Inc.) and CellSensor p53RE-bla (Invitrogen Corp.)-were used to analyze the collection of 320 predominantly pesticide active compounds being tested in Pha...

  10. ScreenMill: a freely available software suite for growth measurement, analysis and visualization of high-throughput screen data.

    PubMed

    Dittmar, John C; Reid, Robert Jd; Rothstein, Rodney

    2010-06-28

    Many high-throughput genomic experiments, such as Synthetic Genetic Array and yeast two-hybrid, use colony growth on solid media as a screen metric. These experiments routinely generate over 100,000 data points, making data analysis a time consuming and painstaking process. Here we describe ScreenMill, a new software suite that automates image analysis and simplifies data review and analysis for high-throughput biological experiments. The ScreenMill software suite includes three software tools or "engines": an open source Colony Measurement Engine (CM Engine) to quantitate colony growth data from plate images, a web-based Data Review Engine (DR Engine) to validate and analyze quantitative screen data, and a web-based Statistics Visualization Engine (SV Engine) to visualize screen data with statistical information overlaid. The methods and software described here can be applied to any screen in which growth is measured by colony size. In addition, the DR Engine and SV Engine can be used to visualize and analyze other types of quantitative high-throughput data. ScreenMill automates quantification, analysis and visualization of high-throughput screen data. The algorithms implemented in ScreenMill are transparent allowing users to be confident about the results ScreenMill produces. Taken together, the tools of ScreenMill offer biologists a simple and flexible way of analyzing their data, without requiring programming skills.
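The core of measuring growth by colony size is labeling connected foreground pixels in a thresholded plate image and reporting each blob's area. A toy version of that step (the synthetic image below is illustrative; CM Engine's actual algorithm also handles grid registration, background correction, and the like):

```python
# Toy colony-size quantitation: flood-fill 4-connected foreground
# pixels in a thresholded plate image and report each colony's area
# in pixels. The image is synthetic and purely illustrative.
def colony_areas(img):
    """img: list of equal-length '0'/'1' strings; returns sorted blob areas."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if img[r][c] == "1" and not seen[r][c]:
                stack, area = [(r, c)], 0
                seen[r][c] = True
                while stack:                       # iterative flood fill
                    y, x = stack.pop()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] == "1" and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return sorted(areas)

plate = ["110010",
         "110010",
         "000000",
         "011000"]
areas = colony_areas(plate)
# areas -> [2, 2, 4]  (three colonies of 2, 2 and 4 pixels)
```

In a real screen these pixel areas, taken per position in the pinned colony grid, become the growth metric that downstream review and statistics tools operate on.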

  11. ScreenMill: A freely available software suite for growth measurement, analysis and visualization of high-throughput screen data

    PubMed Central

    2010-01-01

    Background Many high-throughput genomic experiments, such as Synthetic Genetic Array and yeast two-hybrid, use colony growth on solid media as a screen metric. These experiments routinely generate over 100,000 data points, making data analysis a time consuming and painstaking process. Here we describe ScreenMill, a new software suite that automates image analysis and simplifies data review and analysis for high-throughput biological experiments. Results The ScreenMill software suite includes three software tools or "engines": an open source Colony Measurement Engine (CM Engine) to quantitate colony growth data from plate images, a web-based Data Review Engine (DR Engine) to validate and analyze quantitative screen data, and a web-based Statistics Visualization Engine (SV Engine) to visualize screen data with statistical information overlaid. The methods and software described here can be applied to any screen in which growth is measured by colony size. In addition, the DR Engine and SV Engine can be used to visualize and analyze other types of quantitative high-throughput data. Conclusions ScreenMill automates quantification, analysis and visualization of high-throughput screen data. The algorithms implemented in ScreenMill are transparent allowing users to be confident about the results ScreenMill produces. Taken together, the tools of ScreenMill offer biologists a simple and flexible way of analyzing their data, without requiring programming skills. PMID:20584323

  12. High-throughput time-stretch microscopy with morphological and chemical specificity

    NASA Astrophysics Data System (ADS)

    Lei, Cheng; Ugawa, Masashi; Nozawa, Taisuke; Ideguchi, Takuro; Di Carlo, Dino; Ota, Sadao; Ozeki, Yasuyuki; Goda, Keisuke

    2016-03-01

    Particle analysis is an effective method in analytical chemistry for sizing and counting microparticles such as emulsions, colloids, and biological cells. However, conventional methods for particle analysis, which fall into two extreme categories, have severe limitations. Sieving and Coulter counting are capable of analyzing particles with high throughput, but due to their lack of detailed information such as morphological and chemical characteristics, they can only provide statistical results with low specificity. On the other hand, CCD or CMOS image sensors can be used to analyze individual microparticles with high content, but due to their slow charge readout, the frame rate (hence, the throughput) is significantly limited. Here by integrating a time-stretch optical microscope with a three-color fluorescent analyzer on top of an inertial-focusing microfluidic device, we demonstrate an optofluidic particle analyzer with a sub-micrometer spatial resolution down to 780 nm and a high throughput of 10,000 particles/s. In addition to its morphological specificity, the particle analyzer provides chemical specificity to identify chemical expressions of particles via fluorescence detection. Our results indicate that we can identify different species of microparticles with high specificity without sacrificing throughput. Our method holds promise for high-precision statistical particle analysis in chemical industry and pharmaceutics.

  13. Controlling high-throughput manufacturing at the nano-scale

    NASA Astrophysics Data System (ADS)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12-inch wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  14. A GUINIER CAMERA FOR SR POWDER DIFFRACTION: HIGH RESOLUTION AND HIGH THROUGHPUT.

    SciTech Connect

    SIDDONS,D.P.; HULBERT, S.L.; STEPHENS, P.W.

    2006-05-28

    This paper describes a new powder diffraction instrument for synchrotron radiation sources that combines the high throughput of a position-sensitive detector system with the high resolution normally provided only by a crystal analyzer. It uses the Guinier geometry, which is traditionally used with an x-ray tube source. This geometry adapts well to the synchrotron source, provided proper beam conditioning is applied. The high brightness of the SR source allows high resolution to be achieved. When combined with a photon-counting silicon microstrip detector array, the system becomes a powerful instrument for radiation-sensitive samples or time-dependent phase transition studies.

  15. High-throughput karyotyping of human pluripotent stem cells.

    PubMed

    Lund, Riikka J; Nikula, Tuomas; Rahkonen, Nelly; Närvä, Elisa; Baker, Duncan; Harrison, Neil; Andrews, Peter; Otonkoski, Timo; Lahesmaa, Riitta

    2012-11-01

    Genomic integrity of human pluripotent stem cell (hPSC) lines requires routine monitoring. We report here that a novel karyotyping assay, utilizing bead-bound bacterial artificial chromosome probes, provides a fast and easy tool for the detection of chromosomal abnormalities in hPSC lines. The analysis can be performed on low amounts of DNA isolated from whole cell pools, with a simple data analysis interface. The method enables routine screening of stem cell lines in a cost-efficient, high-throughput manner. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. High-throughput karyotyping of human pluripotent stem cells

    PubMed Central

    Lund, Riikka J.; Nikula, Tuomas; Rahkonen, Nelly; Närvä, Elisa; Baker, Duncan; Harrison, Neil; Andrews, Peter; Otonkoski, Timo; Lahesmaa, Riitta

    2012-01-01

    Genomic integrity of human pluripotent stem cell (hPSC) lines requires routine monitoring. We report here that a novel karyotyping assay, utilizing bead-bound bacterial artificial chromosome probes, provides a fast and easy tool for the detection of chromosomal abnormalities in hPSC lines. The analysis can be performed on low amounts of DNA isolated from whole cell pools, with a simple data analysis interface. The method enables routine screening of stem cell lines in a cost-efficient, high-throughput manner. PMID:22877823

  17. Developing soluble polymers for high-throughput synthetic chemistry.

    PubMed

    Spanka, Carsten; Wentworth, Paul; Janda, Kim D

    2002-05-01

    Soluble polymers have emerged as viable alternatives to resin supports across the broad spectrum of high-throughput organic chemistry. As the application of these supports becomes more widespread, issues such as broad-spectrum solubility and loading are becoming limiting factors, and therefore new polymers are required to overcome such limitations. This article details the approach taken within our group to new soluble polymer supports and specifically focuses on parallel libraries of block copolymers, de novo poly(styrene-co-chloromethylstyrene), PEG-stealth stars, and substituted poly(norbornylene)s.

  18. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).

  19. Extended length microchannels for high density high throughput electrophoresis systems

    DOEpatents

    Davidson, James C.; Balch, Joseph W.

    2000-01-01

    High-throughput electrophoresis systems that provide extended well-to-read distances on smaller substrates, thus compacting the overall systems. The electrophoresis systems utilize a high density array of microchannels for electrophoresis analysis with extended read lengths. The microchannel geometries can be used individually or in combination to increase the effective length of a separation channel while minimally impacting the packing density of channels. One embodiment uses sinusoidal microchannels, while another embodiment uses plural microchannels interconnected by a via. The extended channel systems can be applied to virtually any type of channel-confined chromatography.

  20. Quick 96FASP for high throughput quantitative proteome analysis.

    PubMed

    Yu, Yanbao; Bekele, Shiferaw; Pieper, Rembert

    2017-08-23

    Filter aided sample preparation (FASP) is becoming a central method for proteomic sample cleanup and peptide generation prior to LC-MS analysis. We previously adapted this method to a 96-well filter plate and applied it to prepare protein digests from cell lysate and body fluid samples in a high-throughput quantitative manner. While the 96FASP approach is scalable and can handle multiple samples simultaneously (two key advantages over single FASP), it is also time-consuming: the centrifugation-based liquid transfer on the filter plate takes 3-5 times longer than with a single filter. To address this limitation, we now present a quick 96FASP (named q96FASP) approach that, relying on the use of filter membranes with a large MWCO size (~30 kDa), significantly reduces centrifugal times. We show that q96FASP allows the generation of protein digests derived from whole cell lysates and body fluids in a quality similar to that of the single FASP method. Processing a sample in multiple wells in parallel, we observed excellent experimental repeatability by a label-free quantitation approach. We conclude that the q96FASP approach promises to be a cost- and time-effective method for shotgun proteomics and will be particularly useful in large-scale biomarker discovery studies. High-throughput sample processing is of particular interest for quantitative proteomics. The previously developed 96FASP is high throughput and appealing; however, it is time-consuming in the context of centrifugation-based liquid transfer (~1.5 h per spin). This study presents a truly high-throughput sample preparation method based on a large cut-off 96-well filter plate, which shortens the spin time to ~20 min. To our knowledge, this is the first multi-well method that is entirely comparable with conventional FASP. This study thoroughly examined two types of filter plates and performed side-by-side comparisons with single FASP. Two types of samples, whole cell lysate of a UTI (urinary tract infection

  1. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  2. SSFinder: high throughput CRISPR-Cas target sites prediction tool.

    PubMed

    Upadhyay, Santosh Kumar; Sharma, Shailesh

    2014-01-01

    The clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein (Cas) system facilitates targeted genome editing in organisms. Despite high demand for this system, finding a reliable tool for the determination of specific target sites in large genomic data has remained challenging. Here, we report SSFinder, a python script to perform high-throughput detection of specific target sites in large nucleotide datasets. SSFinder is a user-friendly tool, compatible with Windows, Mac OS, and Linux operating systems, and freely available online.
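The abstract does not give SSFinder's matching rules. As a minimal sketch of the underlying task for SpCas9 (a 20-nt protospacer immediately followed by an NGG PAM, scanned on both strands), assuming the function name and defaults below as illustrative choices:

```python
import re

def find_cas9_sites(seq, guide_len=20):
    """Candidate SpCas9 target sites: a `guide_len`-nt protospacer
    immediately followed by an NGG PAM, on both strands.
    Returns (strand, 0-based protospacer start, protospacer, PAM);
    minus-strand coordinates refer to the reverse complement."""
    comp = str.maketrans("ACGT", "TGCA")
    hits = []
    for strand, s in (("+", seq), ("-", seq.translate(comp)[::-1])):
        # Zero-width lookahead so overlapping sites are all reported.
        pat = re.compile(r"(?=([ACGT]{%d})([ACGT]GG))" % guide_len)
        for m in pat.finditer(s):
            hits.append((strand, m.start(), m.group(1), m.group(2)))
    return hits

seq = "GT" * 12 + "TGGAA"  # one plus-strand site ending in the TGG PAM
print(find_cas9_sites(seq))  # [('+', 4, 'GTGTGTGTGTGTGTGTGTGT', 'TGG')]
```

A genome-scale tool would additionally score off-target similarity, but site detection itself is this pattern scan.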

  3. Genomic outlier detection in high-throughput data analysis.

    PubMed

    Ghosh, Debashis

    2013-01-01

    In the analysis of high-throughput data, a very common goal is the detection of genes showing differential expression between two groups or classes. A recent finding from the scientific literature in prostate cancer demonstrates that by searching for a different pattern of differential expression, new candidate oncogenes might be found. In this chapter, we discuss the statistical problem, termed oncogene outlier detection, and a variety of proposed approaches to it. A statistical model in the multiclass situation is described; links with multiple testing concepts are established. Some new nonparametric procedures are described and compared to existing methods using simulation studies.

  4. High Throughput WAN Data Transfer with Hadoop-based Storage

    NASA Astrophysics Data System (ADS)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    The Hadoop distributed file system (HDFS) is becoming more popular in recent years as a key building block of integrated grid storage solutions in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations for large high energy physics experiments to manage, share and process petabyte-scale datasets in a highly distributed grid computing environment. In this paper, we present the experience of high-throughput WAN data transfer with an HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  5. High-throughput expression in microplate format in Saccharomyces cerevisiae.

    PubMed

    Holz, Caterina; Lang, Christine

    2004-01-01

    We have developed a high-throughput technology that allows parallel expression, purification, and analysis of large numbers of cloned cDNAs in the yeast Saccharomyces cerevisiae. The technology is based on a vector for intracellular protein expression under control of the inducible CUP1 promoter, where the gene products are fused to specific peptide sequences. These N-terminal and C-terminal epitope tags allow the immunological identification and purification of the gene products independent of the protein produced. By introducing the method of recombinational cloning, we avoid time-consuming re-cloning steps and enable easy switching between different expression vectors and host systems.

  6. A High-Throughput Strategy for Dissecting Mammalian Genetic Interactions

    PubMed Central

    Stockman, Victoria B.; Ghamsari, Lila; Lasso, Gorka; Honig, Barry

    2016-01-01

    Comprehensive delineation of complex cellular networks requires high-throughput interrogation of genetic interactions. To address this challenge, we describe the development of a multiplex combinatorial strategy to assess pairwise genetic interactions using CRISPR-Cas9 genome editing and next-generation sequencing. We characterize the performance of combinatorial genome editing and analysis using different promoter and gRNA designs, and identify regions of the chimeric RNA that are compatible with next-generation sequencing preparation and quantification. This approach is an important step towards elucidating genetic networks relevant to human diseases and the development of more efficient Cas9-based therapeutics. PMID:27936040

  7. Elimination of redundant protein identifications in high throughput proteomics.

    PubMed

    Kearney, Robert; Blondeau, Francois; McPherson, Peter; Bell, Alex; Servant, Florence; Drapeau, Mathieu; de Grandpre, Sebastien; Jm Bergeron, John

    2005-01-01

    Tandem mass spectrometry followed by database search is the preferred method for protein identification in high-throughput proteomics. However, standard analysis methods give rise to highly redundant lists of proteins, with many proteins identified by the same sets of peptides. In essence, this is a list of all proteins that might be present in the sample. Here we present an algorithm that eliminates redundancy and determines the minimum number of proteins needed to explain the peptides observed. We demonstrate that application of the algorithm results in a significantly smaller set of proteins and greatly reduces the number of "shared" peptides.
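The abstract does not spell out the algorithm. Finding a minimal set of proteins that explains all observed peptides is an instance of set cover, and one standard approach, sketched here with illustrative names and data (not necessarily the authors' exact method), is the greedy approximation:

```python
def minimal_protein_set(protein_peptides):
    """Greedy set cover: repeatedly keep the protein that explains the
    most not-yet-covered peptides, until every observed peptide is
    explained. `protein_peptides` maps protein -> set of its peptides."""
    uncovered = set().union(*protein_peptides.values())
    kept = []
    while uncovered:
        # Iterate in sorted order so ties break deterministically.
        best = max(sorted(protein_peptides),
                   key=lambda p: len(protein_peptides[p] & uncovered))
        kept.append(best)
        uncovered -= protein_peptides[best]
    return kept

ids = {
    "P1": {"pepA", "pepB"},
    "P2": {"pepB"},          # redundant: its peptides are all in P1
    "P3": {"pepC"},
}
print(minimal_protein_set(ids))  # ['P1', 'P3']
```

Greedy set cover is not guaranteed to find the true minimum in pathological cases, but it is the usual practical choice for this kind of redundancy elimination.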

  8. High-Throughput Sequencing: A Roadmap Toward Community Ecology

    PubMed Central

    Poisot, Timothée; Péquin, Bérangère; Gravel, Dominique

    2013-01-01

    High-throughput sequencing is becoming increasingly important in microbial ecology, yet it is surprisingly under-used to generate or test biogeographic hypotheses. In this contribution, we highlight how adding these methods to the ecologist's toolbox will allow the detection of new patterns, and will help our understanding of the structure and dynamics of diversity. Starting with a review of ecological questions that can be addressed, we move on to the technical and analytical issues that will benefit from an increased collaboration between different disciplines. PMID:23610649

  9. Live Cell Optical Sensing for High Throughput Applications

    NASA Astrophysics Data System (ADS)

    Fang, Ye

    Live cell optical sensing employs label-free optical biosensors to non-invasively measure stimulus-induced dynamic mass redistribution (DMR) in live cells within the sensing volume of the biosensor. The resultant DMR signal is an integrated cellular response, and reflects cell signaling mediated through the cellular target(s) with which the stimulus intervenes. This article describes the uses of live cell optical sensing for probing cell biology and ligand pharmacology, with an emphasis of resonant waveguide grating biosensor cellular assays for high throughput applications.

  10. Orchestrating high-throughput genomic analysis with Bioconductor.

    PubMed

    Huber, Wolfgang; Carey, Vincent J; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D; Irizarry, Rafael A; Lawrence, Michael; Love, Michael I; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-02-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors.

  11. Computational Proteomics: High-throughput Analysis for Systems Biology

    SciTech Connect

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers the global profiling of proteins from a biological system. The HTP technological advances are fueling a revolution in biology, enabling analyses at the scale of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems-level investigations rely more and more on computational analyses, especially in proteomics, which generates large-scale global data.

  12. Analysis of High-Throughput ELISA Microarray Data

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    2011-02-23

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).

  13. High-throughput quantitative real-time PCR.

    PubMed

    Arany, Zoltan P

    2008-07-01

    Recent technical advances in quantitative real-time PCR (qRT-PCR) have allowed for extensive miniaturization, thereby rendering the technique amenable to high-throughput assays. Large numbers of different nucleic acids can now rapidly be measured quantitatively. Many investigations can benefit from this approach, including determination of gene expression in hundreds of samples, determination of hundreds of genes in a few samples, or even quantification of nucleic acids other than mRNA. A simple technique is described here to quantify 1880 transcripts of choice from any number of starting RNA samples.
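As one concrete example of quantification from qRT-PCR data (a standard calculation, not specific to this paper), relative expression between a treated and a control sample is commonly computed with the delta-delta-Ct method, normalizing the target gene's cycle threshold (Ct) against a reference gene in each sample:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control, efficiency=2.0):
    """Relative expression by the delta-delta-Ct method:
    fold change = E ** -((Ct_target,treated - Ct_ref,treated)
                         - (Ct_target,control - Ct_ref,control)),
    assuming equal amplification efficiency E for both assays
    (E = 2 means perfect doubling each cycle)."""
    ddct = ((ct_target_treated - ct_ref_treated)
            - (ct_target_control - ct_ref_control))
    return efficiency ** -ddct

# Target crosses threshold 2 cycles earlier in treated cells,
# with an unchanged reference gene: 2**2 = 4-fold up-regulation.
print(fold_change(23.0, 18.0, 25.0, 18.0))  # 4.0
```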

  14. Human transcriptome array for high-throughput clinical studies.

    PubMed

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N; Schweitzer, Anthony C; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D; Moldawer, Lyle L; Maier, Ronald V; Tompkins, Ronald G; Wong, Wing Hung; Davis, Ronald W; Xiao, Wenzhong

    2011-03-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq at sufficient depth for the discovery of transcriptome elements relevant to the disease process, followed by high-throughput and reliable screening of these elements on thousands of patient samples using custom-designed arrays.

  15. Evaluation of a high throughput starch analysis optimised for wood.

    PubMed

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  16. Genotype-Frequency Estimation from High-Throughput Sequencing Data.

    PubMed

    Maruki, Takahiro; Lynch, Michael

    2015-10-01

    Rapidly improving high-throughput sequencing technologies provide unprecedented opportunities for carrying out population-genomic studies with various organisms. To take full advantage of these methods, it is essential to correctly estimate allele and genotype frequencies, and here we present a maximum-likelihood method that accomplishes these tasks. The proposed method fully accounts for uncertainties resulting from sequencing errors and biparental chromosome sampling and yields essentially unbiased estimates with minimal sampling variances at moderately high depths of coverage, regardless of the mating system and structure of the population. Moreover, we have developed statistical tests for examining the significance of polymorphisms and their genotypic deviations from Hardy-Weinberg equilibrium. We examine the performance of the proposed method by computer simulations and apply it to low-coverage human data generated by high-throughput sequencing. The results show that the proposed method improves our ability to carry out population-genomic analyses in important ways. The software package of the proposed method is freely available from https://github.com/Takahiro-Maruki/Package-GFE.
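The authors' full model is not reproduced in the abstract. The toy sketch below illustrates only the core idea at a single biallelic site: binomial read-count likelihoods under a sequencing-error rate, combined across individuals by an EM loop to give maximum-likelihood genotype frequencies (all names, parameters, and data are illustrative):

```python
from math import comb

def genotype_likelihoods(n_ref, n_alt, err=0.01):
    """P(read data | genotype) at a biallelic site, treating reads as
    independent draws with sequencing-error rate `err`.
    Genotypes: 0 = ref/ref, 1 = het, 2 = alt/alt."""
    n = n_ref + n_alt
    liks = []
    for p_alt in (err, 0.5, 1.0 - err):   # P(alt read | genotype)
        liks.append(comb(n, n_alt) * p_alt**n_alt * (1 - p_alt)**n_ref)
    return liks

def em_genotype_freqs(counts, iters=200):
    """ML genotype frequencies from per-individual (n_ref, n_alt) read
    counts. E-step: posterior genotype probabilities per individual;
    M-step: genotype frequencies = average of the posteriors."""
    freqs = [1/3, 1/3, 1/3]
    for _ in range(iters):
        post_sum = [0.0, 0.0, 0.0]
        for n_ref, n_alt in counts:
            lik = genotype_likelihoods(n_ref, n_alt)
            w = [f * l for f, l in zip(freqs, lik)]
            s = sum(w)
            for g in range(3):
                post_sum[g] += w[g] / s
        freqs = [x / len(counts) for x in post_sum]
    return freqs

# Four individuals at moderate coverage: two ref/ref, one het, one alt/alt.
counts = [(10, 0), (9, 0), (5, 5), (0, 8)]
print([round(f, 2) for f in em_genotype_freqs(counts)])  # [0.5, 0.25, 0.25]
```

Averaging genotype posteriors directly, rather than assuming Hardy-Weinberg proportions, is what lets such estimators remain unbiased regardless of mating system.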

  17. High-throughput gene mapping in Caenorhabditis elegans.

    PubMed

    Swan, Kathryn A; Curtis, Damian E; McKusick, Kathleen B; Voinov, Alexander V; Mapa, Felipa A; Cancilla, Michael R

    2002-07-01

    Positional cloning of mutations in model genetic systems is a powerful method for the identification of targets of medical and agricultural importance. To facilitate the high-throughput mapping of mutations in Caenorhabditis elegans, we have identified a further 9602 putative new single nucleotide polymorphisms (SNPs) between two C. elegans strains, Bristol N2 and the Hawaiian mapping strain CB4856, by sequencing inserts from a CB4856 genomic DNA library and using an informatics pipeline to compare sequences with the canonical N2 genomic sequence. When combined with data from other laboratories, our marker set of 17,189 SNPs provides even coverage of the complete worm genome. To date, we have confirmed >1099 evenly spaced SNPs (one every 91 +/- 56 kb) across the six chromosomes and validated the utility of our SNP marker set and new fluorescence polarization-based genotyping methods for systematic and high-throughput identification of genes in C. elegans by cloning several proprietary genes. We illustrate our approach by recombination mapping and confirmation of the mutation in the cloned gene, dpy-18.

  18. High-Throughput Gene Mapping in Caenorhabditis elegans

    PubMed Central

    Swan, Kathryn A.; Curtis, Damian E.; McKusick, Kathleen B.; Voinov, Alexander V.; Mapa, Felipa A.; Cancilla, Michael R.

    2002-01-01

    Positional cloning of mutations in model genetic systems is a powerful method for the identification of targets of medical and agricultural importance. To facilitate the high-throughput mapping of mutations in Caenorhabditis elegans, we have identified a further 9602 putative new single nucleotide polymorphisms (SNPs) between two C. elegans strains, Bristol N2 and the Hawaiian mapping strain CB4856, by sequencing inserts from a CB4856 genomic DNA library and using an informatics pipeline to compare sequences with the canonical N2 genomic sequence. When combined with data from other laboratories, our marker set of 17,189 SNPs provides even coverage of the complete worm genome. To date, we have confirmed >1099 evenly spaced SNPs (one every 91 ± 56 kb) across the six chromosomes and validated the utility of our SNP marker set and new fluorescence polarization-based genotyping methods for systematic and high-throughput identification of genes in C. elegans by cloning several proprietary genes. We illustrate our approach by recombination mapping and confirmation of the mutation in the cloned gene, dpy-18. [The sequence data described in this paper have been submitted to the NCBI dbSNP data library under accession nos. 4388625–4389689 and GenBank dbSTS under accession nos. 973810–974874. The following individuals and institutions kindly provided reagents, samples, or unpublished information as indicated in the paper: The C. elegans Sequencing Consortium and The Caenorhabditis Genetics Center.] PMID:12097347

  19. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    PubMed Central

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for a high-throughput routine analysis (35 samples a day) of specimens with a starch content between 40 mg and 21 µg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  20. Discriminative motif analysis of high-throughput dataset

    PubMed Central

    Yao, Zizhen; MacQuarrie, Kyle L.; Fong, Abraham P.; Tapscott, Stephen J.; Ruzzo, Walter L.; Gentleman, Robert C.

    2014-01-01

    Motivation: High-throughput ChIP-seq studies typically identify thousands of peaks for a single transcription factor (TF). It is common for traditional motif discovery tools to predict motifs that are statistically significant against a naïve background distribution but are of questionable biological relevance. Results: We describe a simple yet effective algorithm for discovering differential motifs between two sequence datasets that is effective in eliminating systematic biases and scalable to large datasets. Tested on 207 ENCODE ChIP-seq datasets, our method identifies correct motifs in 78% of the datasets with known motifs, demonstrating improvement in both accuracy and efficiency compared with DREME, another state-of-the-art discriminative motif discovery tool. More interestingly, on the remaining more challenging datasets, we identify common technical or biological factors that compromise the motif search results and use advanced features of our tool to control for these factors. We also present case studies demonstrating the ability of our method to detect single base pair differences in DNA specificity of two similar TFs. Lastly, we demonstrate discovery of key TF motifs involved in tissue specification by examination of high-throughput DNase accessibility data. Availability: The motifRG package is publicly available via the Bioconductor repository. Contact: yzizhen@fhcrc.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24162561

  1. Plant chip for high-throughput phenotyping of Arabidopsis.

    PubMed

    Jiang, Huawei; Xu, Zhen; Aluru, Maneesha R; Dong, Liang

    2014-04-07

    We report on the development of a vertical and transparent microfluidic chip for high-throughput phenotyping of Arabidopsis thaliana plants. Multiple Arabidopsis seeds can be germinated and grown hydroponically over more than two weeks in the chip, thus enabling large-scale and quantitative monitoring of plant phenotypes. The novel vertical arrangement of this microfluidic device not only allows for normal gravitropic growth of the plants but also, more importantly, makes it convenient to continuously monitor phenotypic changes in plants at the whole organismal level, including seed germination and root and shoot growth (hypocotyls, cotyledons, and leaves), as well as at the cellular level. We also developed a hydrodynamic trapping method to automatically place single seeds into seed holding sites of the device and to avoid potential damage to seeds that might occur during manual loading. We demonstrated general utility of this microfluidic device by showing clear visible phenotypes of the immutans mutant of Arabidopsis, and we also showed changes occurring during plant-pathogen interactions at different developmental stages. Arabidopsis plants grown in the device maintained normal morphological and physiological behaviour, and distinct phenotypic variations consistent with a priori data were observed via high-resolution images taken in real time. Moreover, the timeline for different developmental stages for plants grown in this device was highly comparable to growth using a conventional agar plate method. This prototype plant chip technology is expected to lead to the establishment of a powerful experimental and cost-effective framework for high-throughput and precise plant phenotyping.

  2. A Quantitative High Throughput Assay for Identifying Gametocytocidal Compounds

    PubMed Central

    Tanaka, Takeshi Q.; Dehdashti, Seameen J.; Nguyen, Dac-Trung; McKew, John C.; Zheng, Wei; Williamson, Kim C.

    2013-01-01

    Current antimalarial drug treatment does not effectively kill mature Plasmodium falciparum gametocytes, the parasite stage responsible for malaria transmission from human to human via a mosquito. Consequently, following standard therapy malaria can still be transmitted for over a week after the clearance of asexual parasites. A new generation of malaria drugs with gametocytocidal properties, or a gametocytocidal drug that could be used in combination therapy with currently available antimalarials, is needed to control the spread of the disease and facilitate eradication efforts. We have developed a 1,536-well gametocyte viability assay for the high throughput screening of large compound collections to identify novel compounds with gametocytocidal activity. The signal-to-basal ratio and Z′-factor for this assay were 3.2-fold and 0.68, respectively. The IC50 value of epoxomicin, the positive control compound, was 1.42 ± 0.09 nM, which is comparable to previously reported values. This miniaturized assay significantly reduces the number of gametocytes required for the alamarBlue viability assay, and enables high throughput screening for lead discovery efforts. Additionally, the screen does not require a specialized parasite line; gametocytes from any strain, including field isolates, can be tested. A pilot screen utilizing the commercially available LOPAC library, consisting of 1,280 known compounds, revealed two selective gametocytocidal compounds with 54- and 7.8-fold gametocytocidal selectivity relative to their cytotoxicity against the mammalian SH-SY5Y cell line. PMID:23454872
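
    The signal-to-basal ratio and Z′-factor quoted above follow the standard screening-assay quality definitions (Z′ = 1 − 3(σ+ + σ−)/|μ+ − μ−|, per Zhang et al.). A minimal sketch, using hypothetical control-well readings rather than data from the paper:

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor: 1 - 3*(sd_pos + sd_neg)/|mean_pos - mean_neg|.
    Values above ~0.5 generally indicate an excellent screening assay."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

def signal_to_basal(pos, neg):
    """Ratio of mean signal to mean basal (background) reading."""
    return mean(pos) / mean(neg)

# Hypothetical control-well fluorescence readings (arbitrary units)
viable = [3200, 3150, 3300, 3250]   # untreated gametocytes (signal)
killed = [1000, 980, 1050, 1020]    # positive-control-treated (basal)
zp = z_prime(viable, killed)
sb = signal_to_basal(viable, killed)
```

    A large window between control means, with tight replicate spread, is what pushes Z′ toward 1 and makes a 1,536-well screen callable well-by-well.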

  3. High-throughput fragment screening by affinity LC-MS.

    PubMed

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, with their own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed in <4 h (corresponding to >3500 fragments per day). Thirty hits were defined, which subsequently entered a secondary screening using an active site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (KD) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore it promises to be a valuable method for fragment screening.

  4. Discovery of novel targets with high throughput RNA interference screening.

    PubMed

    Kassner, Paul D

    2008-03-01

    High throughput technologies have the potential to affect all aspects of drug discovery. Considerable attention is paid to high throughput screening (HTS) for small molecule lead compounds. The identification of the targets that enter those HTS campaigns had been driven by basic research until the advent of genomics level data acquisition such as sequencing and gene expression microarrays. Large-scale profiling approaches (e.g., microarrays, protein analysis by mass spectrometry, and metabolite profiling) can yield vast quantities of data and important information. However, these approaches usually require painstaking in silico analysis and low-throughput basic wet-lab research to identify the function of a gene and validate the gene product as a potential therapeutic drug target. Functional genomic screening offers the promise of direct identification of genes involved in phenotypes of interest. In this review, RNA interference (RNAi)-mediated loss-of-function screens will be discussed, as well as their utility in target identification. Some of the genes identified in these screens should produce similar phenotypes if their gene products are antagonized with drugs. With a carefully chosen phenotype, an understanding of the biology of RNAi and appreciation of the limitations of RNAi screening, there is great potential for the discovery of new drug targets.

  5. High-throughput technology for novel SO2 oxidation catalysts

    PubMed Central

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of some elements used for prescreening and those remaining after the screening of the first catalyst generations. PMID:27877427

  6. Application of computational and high-throughput in vitro ...

    EPA Pesticide Factsheets

    Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models. This talk will review a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for their potential to be endocrine disruptors as part of the Endocrine Disruptor Screening Program (EDSP). These methods all start with the use of in vitro assays, e.g. for activity against the estrogen and androgen receptors (ER and AR) and targets in the steroidogenesis and thyroid signaling pathways. Because all individual assays are subject to a variety of noise processes and technology-specific assay artefacts, we have developed methods to create consensus predictions from multiple assays against the same target. The goal of these models is to both robustly predict in vivo activity, and also to provide quantitative estimates of uncertainty. This talk will describe these models, and how they are validated against both in vitro and in vivo reference chemicals. The U.S. EPA has deemed the in vitro ER model results to be of high enough accuracy to be used as a substitute for the current EDSP Ti

  7. Application of Computational and High-Throughput in vitro ...

    EPA Pesticide Factsheets

    Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models. This talk will review a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for a variety of toxicity endpoints, with a focus on their potential to be endocrine disruptors as part of the Endocrine Disruptor Screening Program (EDSP). These methods all start with the use of in vitro assays, e.g. for activity against the estrogen and androgen receptors (ER and AR) and targets in the steroidogenesis and thyroid signaling pathways. Because all individual assays are subject to a variety of noise processes and technology-specific assay artefacts, we have developed methods to create consensus predictions from multiple assays against the same target. The goal of these models is to both robustly predict in vivo activity, and also to provide quantitative estimates of uncertainty. This talk will describe these models, and how they are validated against both in vitro and in vivo reference chemicals. The U.S. EPA has deemed the in vitro ER model results to be of high enough accuracy t

  8. High throughput instruments, methods, and informatics for systems biology.

    SciTech Connect

    Sinclair, Michael B.; Cowie, Jim R.; Van Benthem, Mark Hilary; Wylie, Brian Neil; Davidson, George S.; Haaland, David Michael; Timlin, Jerilyn Ann; Aragon, Anthony D.; Keenan, Michael Robert; Boyack, Kevin W.; Thomas, Edward Victor; Werner-Washburne, Margaret C.; Mosquera-Caro, Monica P.; Martinez, M. Juanita; Martin, Shawn Bryan; Willman, Cheryl L.

    2003-12-01

    High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes, or, for instance, in responses to therapies. Microarrays represent one such high throughput method, which continues to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data to dramatically improve the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.

  9. High-throughput characterization for solar fuels materials discovery

    NASA Astrophysics Data System (ADS)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts or light absorbers for the PEC do not exist. The mission of HTE is to provide the accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). In this talk I present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  10. Benchmarking Procedures for High-Throughput Context Specific Reconstruction Algorithms

    PubMed Central

    Pacheco, Maria P.; Pfau, Thomas; Sauter, Thomas

    2016-01-01

    Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context specific reconstruction based on generic genome scale models like ReconX or HMR has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context specific reconstruction algorithms were published in the last 10 years, only a fraction of them is suitable for model building based on human high-throughput data. Among other reasons, this might be due to problems arising from the limitation to only one metabolic target function or arbitrary thresholding. This review describes and analyses common validation methods used for testing model building algorithms. Two major methods can be distinguished: consistency testing and comparison based testing. The first is concerned with robustness against noise, e.g., missing data due to the impossibility of distinguishing between the signal and the background of non-specific binding of probes in a microarray experiment, and whether distinct sets of input expressed genes corresponding to, e.g., different tissues yield distinct models. The latter covers methods comparing sets of functionalities, comparison with existing networks or additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms. PMID:26834640
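
    Comparison-based testing as described above often reduces to comparing sets of reactions or functionalities between reconstructed models. A minimal sketch of one such metric, the Jaccard index over reaction sets; the reaction identifiers here are hypothetical, not taken from any particular reconstruction:

```python
def jaccard(a, b):
    """Jaccard index between two sets: |A ∩ B| / |A ∪ B|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Hypothetical reaction identifiers from two context-specific reconstructions
liver_model  = {"HEX1", "PGI", "PFK", "FBA", "TPI", "GAPD"}
muscle_model = {"HEX1", "PGI", "PFK", "FBA", "LDH_L", "PYK"}

similarity = jaccard(liver_model, muscle_model)  # fraction of shared reactions
```

    A benchmarking procedure would expect high similarity between models built from replicate data of one tissue and lower similarity between models of distinct tissues.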

  11. High throughput electrophysiology: new perspectives for ion channel drug discovery.

    PubMed

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter; Jensen, Bo Skaaning; Korsgaard, Mads P G; Christophersen, Palle

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone in current drug discovery is high throughput screening assays, which allow examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with the sufficiently high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of high throughput philosophy are emerging and predicted to make a number of ion channel targets accessible for drug screening. Specifically, genuine HTS parallel processing techniques based on arrays of planar silicon chips are being developed, but lower throughput sequential techniques may also be of value in compound screening, lead optimization, and safety screening. The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  12. A Microchip for High-throughput Axon Growth Drug Screening

    PubMed Central

    Kim, Hyun Soo; Jeong, Sehoon; Koo, Chiwan; Han, Arum; Park, Jaewon

    2016-01-01

    It has recently become clear that not only the presence of inhibitory molecules associated with myelin but also the reduced growth capability of the axons limit mature central nervous system (CNS) axonal regeneration after injury. Conventional axon growth studies are typically conducted using multi-well cell culture plates, in which it is very challenging to investigate localized effects of drugs and throughput is low. Unfortunately, there are currently no other in vitro tools that allow investigating localized axonal responses to biomolecules in high throughput for screening potential drugs that might promote axonal growth. We have developed a compartmentalized neuron culture platform enabling localized biomolecular treatments in parallel to axons that are physically and fluidically isolated from their neuronal somata. The 24 axon compartments in the developed platform are designed to perform four sets of six different localized biomolecular treatments simultaneously on a single device. In addition, the novel microfluidic configuration allows culture medium of all 24 axon compartments to be replenished together by a single aspiration process, making high-throughput drug screening a reality. PMID:27928514

  13. High-throughput assays for DNA gyrase and other topoisomerases

    PubMed Central

    Maxwell, Anthony; Burton, Nicolas P.; O'Hagan, Natasha

    2006-01-01

    We have developed high-throughput microtitre plate-based assays for DNA gyrase and other DNA topoisomerases. These assays exploit the fact that negatively supercoiled plasmids form intermolecular triplexes more efficiently than when they are relaxed. Two assays are presented, one using capture of a plasmid containing a single triplex-forming sequence by an oligonucleotide tethered to the surface of a microtitre plate and subsequent detection by staining with a DNA-specific fluorescent dye. The other uses capture of a plasmid containing two triplex-forming sequences by an oligonucleotide tethered to the surface of a microtitre plate and subsequent detection by a second oligonucleotide that is radiolabelled. The assays are shown to be appropriate for assaying DNA supercoiling by Escherichia coli DNA gyrase and DNA relaxation by eukaryotic topoisomerases I and II, and E.coli topoisomerase IV. The assays are readily adaptable to other enzymes that change DNA supercoiling (e.g. restriction enzymes) and are suitable for use in a high-throughput format. PMID:16936317

  14. 20150325 - Application of High-Throughput In Vitro Assays for ...

    EPA Pesticide Factsheets

    Multiple drivers shape the types of human-health assessments performed on chemicals by U.S. EPA, resulting in chemical assessments that are “fit-for-purpose”, ranging from prioritization for further testing to full risk assessments. Layered on top of the diverse assessment needs are the resource intensive nature of traditional toxicological studies used to test chemicals and the lack of toxicity information on many chemicals. To address these challenges, the Agency initiated the ToxCast program to screen thousands of chemicals across hundreds of high-throughput screening assays in concentration-response format. One of the findings of the project has been that the majority of chemicals interact with multiple biological targets within a narrow concentration range and the extent of interactions increases rapidly near the concentration causing cytotoxicity. This means that application of high-throughput in vitro assays to chemical assessments will need to identify both the relative selectivity with which chemicals interact with biological targets and the concentration at which these interactions perturb signaling pathways. The integrated analyses will be used to both define a point-of-departure for comparison with human exposure estimates and identify which chemicals may benefit from further studies in a mode-of-action or adverse outcome pathway framework. The application of new technologies in a risk-based, tiered manner provides flexibility in matching throughput and cos

  15. High-throughput nanoparticle catalysis: partial oxidation of propylene.

    PubMed

    Duan, Shici; Kahn, Michael; Senkan, Selim

    2007-02-01

    Partial oxidation of propylene was investigated at 1 atm pressure over Rh/TiO2 catalysts as a function of reaction temperature, metal loading and particle size using high-throughput methods. Catalysts were prepared by ablating thin sheets of pure rhodium metal using an excimer laser and by collecting the nanoparticles created on the external surfaces of TiO2 pellets that were placed inside the ablation plume. Rh nanoparticles before the experiments were characterized by transmission electron microscopy (TEM) by collecting them on carbon film. Catalyst evaluations were performed using a high-throughput array channel microreactor system coupled to quadrupole mass spectrometry (MS) and gas chromatography (GC). The reaction conditions were 23% C3H6, 20% O2 and the balance helium in the feed, 20,000 h-1 GHSV and a temperature range of 250-325 °C. The reaction products included primarily acetone (AT) and, to a lesser degree, propionaldehyde (PaL) as the C3 products, together with deep oxidation products COx.

  16. A medium or high throughput protein refolding assay.

    PubMed

    Cowieson, Nathan P; Wensley, Beth; Robin, Gautier; Guncar, Gregor; Forwood, Jade; Hume, David A; Kobe, Bostjan; Martin, Jennifer L

    2008-01-01

    Expression of insoluble protein in E. coli is a major bottleneck of high throughput structural biology projects. Refolding proteins into native conformations from inclusion bodies could significantly increase the number of protein targets that can be taken on to structural studies. This chapter presents a simple assay for screening insoluble protein targets and identifying those that are most amenable to refolding. The assay is based on the observation that when proteins are refolded while bound to metal affinity resin, misfolded proteins are generally not eluted by imidazole. This difference is exploited here to distinguish between folded and misfolded proteins. Two implementations of the assay are described. The assay fits well into a standard high throughput structural biology pipeline, because it begins with the inclusion body preparations that are a byproduct of small-scale, automated expression and purification trials and does not require additional facilities. Two formats of the assay are described, a manual assay that is useful for screening small numbers of targets, and an automated implementation that is useful for large numbers of targets.

  17. High-throughput screening with micro-x-ray fluorescence

    SciTech Connect

    Havrilla, George J.; Miller, Thomasin C.

    2005-06-15

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity.

  18. High-throughput screening with micro-x-ray fluorescence

    NASA Astrophysics Data System (ADS)

    Havrilla, George J.; Miller, Thomasin C.

    2005-06-01

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity.

  19. High throughput SNP detection system based on magnetic nanoparticles separation.

    PubMed

    Liu, Bin; Jia, Yingying; Ma, Man; Li, Zhiyang; Liu, Hongna; Li, Song; Deng, Yan; Zhang, Liming; Lu, Zhuoxuan; Wang, Wei; He, Nongyue

    2013-02-01

    Single-nucleotide polymorphisms (SNPs) are single-base variations in DNA sequence that can often help to identify gene associations for hereditary disease, communicable disease and so on. We developed a high throughput SNP detection system based on magnetic nanoparticles (MNPs) separation and dual-color hybridization or single base extension. This system includes a magnetic separation unit for sample separation, three high precision robot arms for pipetting and microtiter plate transferring, an accurate temperature control unit for PCR and DNA hybridization, and a highly accurate and sensitive optical signal detection unit for fluorescence detection. An SNP genotyping experiment on the cyclooxygenase-2 gene promoter region -65G > C polymorphism locus, covering 48 samples from the northern Jiangsu area, was performed to verify that this system can simplify manual operation for researchers, save time and improve efficiency in SNP genotyping experiments. It performs sample preparation, target sequence amplification, signal detection and data analysis automatically, and can be used in clinical molecular diagnosis, high throughput fluorescence immunological detection and so on.
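
    Dual-color hybridization genotyping, as described above, ultimately assigns a genotype from the relative intensity of two allele-specific channels. A minimal sketch of one common approach, a log-ratio threshold; the threshold and the signal values are hypothetical, not taken from the paper:

```python
from math import log2

def call_genotype(allele1_signal, allele2_signal, threshold=1.0):
    """Call a genotype from two fluorescence channel intensities.
    A log2 ratio beyond +/-threshold is called homozygous; otherwise
    heterozygous. The threshold is a hypothetical tuning parameter."""
    ratio = log2(allele1_signal / allele2_signal)
    if ratio > threshold:
        return "G/G"      # allele 1 homozygote
    if ratio < -threshold:
        return "C/C"      # allele 2 homozygote
    return "G/C"          # heterozygote

g1 = call_genotype(5200, 400)    # strong channel-1 signal
g2 = call_genotype(2100, 1900)   # balanced signals
```

    Production genotyping software typically clusters all samples jointly rather than thresholding each well independently, which is more robust to plate-wide intensity shifts.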

  20. Iterative ACORN as a high throughput tool in structural genomics.

    PubMed

    Selvanayagam, S; Velmurugan, D; Yamane, T

    2006-08-01

    High throughput macromolecular structure determination is essential in structural genomics, as the amount of available sequence information far exceeds the number of available 3D structures. ACORN, a freely available resource in the CCP4 suite of programs, is a comprehensive and efficient program for phasing in the determination of protein structures when atomic resolution data are available. ACORN together with the automatic model-building program ARP/wARP and the refinement program REFMAC is a suitable combination for high throughput structural genomics. ACORN can also be run with secondary structural elements such as helices and sheets as inputs with high resolution data. In situations where ACORN phasing is not sufficient for building the protein model, the fragments (incomplete model/dummy atoms) can again be used as a starting input. Iterative ACORN has proved to work efficiently in the subsequent model-building stages for congerin (PDB ID: 1is3) and catalase (PDB ID: 1gwe), for which models are available.

  1. Computational analysis of high-throughput flow cytometry data

    PubMed Central

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires advances in both hardware and analytical tools. The historical perspective of flow cytometry operation is discussed, as well as how the field has changed and what the key changes have been. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  2. High-throughput technology for novel SO2 oxidation catalysts

    NASA Astrophysics Data System (ADS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F.

    2011-10-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method of monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of some elements used for prescreening and those remaining after the screening of the first catalyst generations.

  3. High-Throughput Models for Exposure-Based Chemical ...

    EPA Pesticide Factsheets

    The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research program to prioritize chemical inventories for potential hazard. Similar capabilities for estimating exposure potential would support rapid risk-based prioritization for chemicals with limited information; here, we propose a framework for high-throughput exposure assessment. To demonstrate application, an analysis was conducted that predicts human exposure potential for chemicals and estimates uncertainty in these predictions by comparison to biomonitoring data. We evaluated 1936 chemicals using far-field mass balance human exposure models (USEtox and RAIDAR) and an indicator for indoor and/or consumer use. These predictions were compared to exposures inferred by Bayesian analysis from urine concentrations for 82 chemicals reported in the National Health and Nutrition Examination Survey (NHANES). Joint regression on all factors provided a calibrated consensus prediction, the variance of which serves as an empirical determination of uncertainty for prioritization on absolute exposure potential. Information on use was found to be most predictive; generally, chemicals above the limit of detection in NHANES had consumer/indoor use. Coupled with hazard HTS, exposure HTS can place risk earlie

  4. A High-Throughput Cidality Screen for Mycobacterium Tuberculosis

    PubMed Central

    Kaur, Parvinder; Ghosh, Anirban; Krishnamurthy, Ramya Vadageri; Bhattacharjee, Deepa Gagwani; Achar, Vijayashree; Datta, Santanu; Narayanan, Shridhar; Anbarasu, Anand; Ramaiah, Sudha

    2015-01-01

    Exposure to Mycobacterium tuberculosis (Mtb) aerosols is a major threat to tuberculosis (TB) researchers, even in bio-safety level-3 (BSL-3) facilities. Automation and high-throughput screens (HTS) in BSL-3 facilities are essential for minimizing manual aerosol-generating interventions and facilitating TB research. In the present study, we report the development and validation of a high-throughput, 24-well ‘spot-assay’ for selecting bactericidal compounds against Mtb. The bactericidal screen concept was first validated in the fast-growing surrogate Mycobacterium smegmatis (Msm) and subsequently confirmed in Mtb using the following reference anti-tubercular drugs: rifampicin, isoniazid, ofloxacin and ethambutol (RIOE, acting on different targets). The potential use of the spot-assay to select bactericidal compounds from a large library was confirmed by screening on Mtb, with parallel plating by the conventional gold standard method (correlation, r² = 0.808). An automated spot-assay further enabled an MBC90 determination on resistant and sensitive Mtb clinical isolates. The implementation of the spot-assay in kinetic screens to enumerate residual Mtb after either genetic silencing (anti-sense RNA, AS-RNA) or chemical inhibition corroborated its ability to detect cidality. This relatively simple, economical and quantitative HTS considerably minimized the bio-hazard risk and enabled the selection of novel vulnerable Mtb targets and mycobactericidal compounds. Thus, spot-assays have great potential to impact the TB drug discovery process. PMID:25693161

  5. A High Throughput Mechanical Screening Device for Cartilage Tissue Engineering

    PubMed Central

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Greg R.; Cosgrove, Brian D.; Dodge, George R.; Mauck, Robert L.

    2014-01-01

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required to test individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying ‘hits’, or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. PMID:24275442

  6. Image quantification of high-throughput tissue microarray

    NASA Astrophysics Data System (ADS)

    Wu, Jiahua; Dong, Junyu; Zhou, Huiyu

    2006-03-01

    Tissue microarray (TMA) technology allows rapid visualization of molecular targets in thousands of tissue specimens at a time and provides valuable information on expression of proteins within tissues at a cellular and sub-cellular level. TMA technology overcomes the bottleneck of traditional tissue analysis and allows it to catch up with the rapid advances in lead discovery. Studies using TMA on immunohistochemistry (IHC) can produce a large number of images for interpretation within a very short time. Manual interpretation does not allow accurate quantitative analysis of staining to be undertaken. Automatic image capture and analysis has been shown to be superior to manual interpretation. The aim of this work is to develop a truly high-throughput and fully automated image capture and analysis system. We develop a robust colour segmentation algorithm using hue-saturation-intensity (HSI) colour space to provide quantification of signal intensity and partitioning of staining on high-throughput TMA. Initial segmentation results and quantification data have been achieved on 16,000 TMA colour images over 23 different tissue types.
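    A minimal sketch of hue-saturation-intensity segmentation of stain colour follows. The hue/saturation window for brown DAB staining is a hypothetical threshold for illustration, not the calibrated values of the paper's algorithm.

```python
# HSI colour segmentation sketch: convert RGB to HSI, then threshold
# on a (hypothetical) hue window to flag stained pixels.
import numpy as np

def rgb_to_hsi(rgb):
    """Convert an (..., 3) float RGB array in [0, 1] to H (degrees), S, I."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0
    s = 1.0 - np.min(rgb, axis=-1) / np.maximum(i, 1e-8)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-8
    h = np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))
    h = np.where(b > g, 360.0 - h, h)  # hue wraps for bluish pixels
    return h, s, i

# toy 2x2 "image": two brownish (stained) and two bluish (counterstain) pixels
img = np.array([[[0.55, 0.35, 0.20], [0.60, 0.40, 0.25]],
                [[0.25, 0.35, 0.60], [0.30, 0.40, 0.65]]])
h, s, i = rgb_to_hsi(img)

# hypothetical hue/saturation window for brown DAB staining
stained = (h < 60.0) & (s > 0.1)
print(stained.sum())  # → 2
```

    A real pipeline would apply the same per-pixel test to each full TMA image and accumulate the stained-area fraction and mean staining intensity per core.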

  7. High resolution hyperspectral imaging with a high throughput virtual slit

    NASA Astrophysics Data System (ADS)

    Gooding, Edward A.; Gunn, Thomas; Cenko, Andrew T.; Hajian, Arsen R.

    2016-05-01

    Hyperspectral imaging (HSI) device users often require both high spectral resolution, on the order of 1 nm, and high light-gathering power. A wide entrance slit assures reasonable étendue but degrades spectral resolution. Spectrometers built using High Throughput Virtual Slit™ (HTVS) technology optimize both parameters simultaneously. Two remote sensing use cases that require high spectral resolution are discussed. The first is detection of atmospheric gases with intrinsically narrow absorption lines, such as hydrocarbon vapors or combustion exhaust gases such as NOx and CO2. Detecting exhaust gas species with high precision has become increasingly important in the light of recent events in the automobile industry. The second is distinguishing reflected daylight from emission spectra in the visible and NIR (VNIR) regions, which is most easily accomplished using the Fraunhofer absorption lines in solar spectra. While ground reflectance spectral features in the VNIR are generally quite broad, the Fraunhofer lines are narrow and provide a signature of intrinsic vs. extrinsic illumination. The High Throughput Virtual Slit enables higher spectral resolution than is achievable with conventional spectrometers by manipulating the beam profile in pupil space. By reshaping the instrument pupil with reflective optics, HTVS-equipped instruments create a tall, narrow image profile at the exit focal plane, typically delivering 5× or better spectral resolution than a conventional design.

  8. High throughput, quantitative analysis of human osteoclast differentiation and activity.

    PubMed

    Diepenhorst, Natalie A; Nowell, Cameron J; Rueda, Patricia; Henriksen, Kim; Pierce, Tracie; Cook, Anna E; Pastoureau, Philippe; Sabatini, Massimo; Charman, William N; Christopoulos, Arthur; Summers, Roger J; Sexton, Patrick M; Langmead, Christopher J

    2017-02-15

    Osteoclasts are multinuclear cells that degrade bone under both physiological and pathophysiological conditions. Osteoclasts are therefore a major target of osteoporosis therapeutics aimed at preserving bone. Consequently, analytical methods for osteoclast activity are useful for the development of novel biomarkers and/or pharmacological agents for the treatment of osteoporosis. The nucleation state of an osteoclast is indicative of its maturation and activity. To date, activity is routinely measured at the population level with only approximate consideration of the nucleation state (an 'osteoclast population' is typically defined as cells with ≥3 nuclei). Using a fluorescent substrate for tartrate-resistant acid phosphatase (TRAP), a routinely used marker of osteoclast activity, we developed a multi-labelled imaging method for quantitative measurement of osteoclast TRAP activity at the single cell level. Automated image analysis enables interrogation of large osteoclast populations in a high throughput manner using open source software. Using this methodology, we investigated the effects of receptor activator of nuclear factor kappa-B ligand (RANK-L) on osteoclast maturation and activity and demonstrated that TRAP activity directly correlates with osteoclast maturity (i.e. nuclei number). This method can be applied to high throughput screening of osteoclast-targeting compounds to determine changes in maturation and activity.

  9. Piezo-thermal Probe Array for High Throughput Applications

    PubMed Central

    Gaitas, Angelo; French, Paddy

    2012-01-01

    Microcantilevers are used in a number of applications including atomic-force microscopy (AFM). In this work, deflection-sensing elements along with heating elements are integrated onto micromachined cantilever arrays to increase sensitivity and to reduce complexity and cost. An array of probes with 5–10 nm gold ultrathin-film sensors on silicon substrates for high throughput scanning probe microscopy is developed. The deflection sensitivity is 0.2 ppm/nm. Plots of the change in resistance of the sensing element with displacement are used to calibrate the probes and determine probe contact with the substrate. Topographical scans demonstrate high throughput and nanometer resolution. The heating elements are calibrated and the thermal coefficient of resistance (TCR) is 655 ppm/K. The melting temperature of a material is measured by locally heating the material with the heating element of the cantilever while monitoring the bending with the deflection-sensing element. The melting point measured with this method is in close agreement with the value reported in the literature. PMID:23641125

  10. High-throughput electrical characterization for robust overlay lithography control

    NASA Astrophysics Data System (ADS)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high throughput and robust overlay measurement is a challenge in current 14nm and advanced upcoming nodes with the transition to 300mm and upcoming 450mm semiconductor manufacturing, where slight deviations in overlay have significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements due to sensitivity, throughput and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance, where the minimum and inflection points are extracted to characterize overlay control and the process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss the challenges of integrating an electrical measurement based approach in semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for assessing overlay together with process window and margins using a robust, high throughput, electrical measurement approach.
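    The parabolic fitting idea can be illustrated with synthetic data: resistance is generated for a macro with a known overlay error, a parabola is fitted, and its minimum recovers the overlay. All values below are invented for illustration.

```python
# Parabolic-fit overlay extraction sketch (synthetic, noise-free data).
import numpy as np

offsets = np.arange(-30, 31, 10, dtype=float)    # programmed misalignment, nm
true_overlay = 4.0                                # nm (synthetic ground truth)
resistance = 100.0 + 0.05 * (offsets - true_overlay) ** 2  # ohms

a, b, c = np.polyfit(offsets, resistance, 2)      # R = a*x^2 + b*x + c
overlay = -b / (2 * a)                            # parabola minimum
print(round(overlay, 2))  # → 4.0
```

    In practice the fit would include measurement noise, and the curvature around the minimum (inflection behaviour) would be examined to characterize the process window.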

  11. New high-throughput methods of investigating polymer electrolytes

    NASA Astrophysics Data System (ADS)

    Alcock, Hannah J.; White, Oliver C.; Jegelevicius, Grazvydas; Roberts, Matthew R.; Owen, John R.

    2011-03-01

    Polymer electrolyte films have been prepared by solution casting techniques from precursor solutions of poly(vinylidene fluoride-co-hexafluoropropylene) (PVdF-HFP), lithium-bis(trifluoromethane) sulfonimide (LiTFSI), and propylene carbonate (PC). Arrays of graded composition were characterised by electrochemical impedance spectroscopy (EIS), differential scanning calorimetry (DSC) and X-ray diffraction (XRD) using high throughput techniques. Impedance analysis showed the resistance of the films as a function of LiTFSI, PC and polymer content. The ternary plot of conductivity shows an area that combines solid-like mechanical stability with high conductivity, 1 × 10⁻⁵ S cm⁻¹ at the composition 0.55/0.15/0.30 PVdF-HFP/LiTFSI/PC (weight fractions), increasing with PC content. In regions with less than a 50 wt% fraction of PVdF-HFP the films were too soft to give meaningful results by this method. The DSC measurements on solvent-free, salt-doped polymers show a reduced crystallinity, and high throughput XRD patterns show that non-polar crystalline phases are suppressed by the presence of LiTFSI and PC.

  12. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  13. Arabidopsis Seed Content QTL Mapping Using High-Throughput Phenotyping: The Assets of Near Infrared Spectroscopy

    PubMed Central

    Jasinski, Sophie; Lécureuil, Alain; Durandet, Monique; Bernard-Moulin, Patrick; Guerche, Philippe

    2016-01-01

    Seed storage compounds are of crucial importance for human diet, feed and industrial uses. In oleo-proteaginous species like rapeseed, seed oil and protein are the qualitative determinants that confer economic value on the harvested seed. To date, although the biosynthesis pathways of oil and storage protein are rather well known, the factors that determine how these types of reserves are partitioned in seeds remain to be identified. With the aim of implementing a quantitative genetics approach, requiring phenotyping of hundreds of plants, our first objective was to establish near-infrared reflectance spectroscopy (NIRS) predictive equations in order to estimate oil, protein, carbon, and nitrogen content in Arabidopsis seed at high throughput. Our results demonstrated that NIRS is a powerful non-destructive, high-throughput method to assess the content of these four major components in Arabidopsis seed. With this tool in hand, we analyzed Arabidopsis natural variation for these four components and showed that they all display a wide range of variation. Finally, NIRS was used to map QTL for these four traits using seeds from the Arabidopsis thaliana Ct-1 × Col-0 recombinant inbred line population. Some QTL co-localized with QTL previously identified, but others mapped to chromosomal regions never before associated with such traits. This paper illustrates the usefulness of NIRS predictive equations for accurate high-throughput phenotyping of Arabidopsis seed content, opening new perspectives in gene identification following QTL mapping and genome-wide association studies. PMID:27891138
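    The idea behind an NIRS predictive equation (a multivariate calibration that maps spectra to a reference-measured trait) can be sketched with a plain least-squares fit on synthetic spectra; real equations are typically built with PLS regression over many wavelength bands, and every number below is simulated.

```python
# Linear calibration sketch: fit spectra -> trait on reference samples,
# then predict the trait for new samples. Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(3)
n_cal, n_bands = 60, 20
spectra = rng.normal(size=(n_cal, n_bands))       # "absorbance" values
true_coefs = rng.normal(size=n_bands)
oil = spectra @ true_coefs + rng.normal(0, 0.1, n_cal)  # reference oil content

# fit the calibration on reference samples
coefs, *_ = np.linalg.lstsq(spectra, oil, rcond=None)

# predict oil content for new, unmeasured seed lots
new_spectra = rng.normal(size=(5, n_bands))
predicted_oil = new_spectra @ coefs
print(predicted_oil.shape)  # → (5,)
```

    Once calibrated against wet-chemistry reference values, such an equation lets every subsequent sample be phenotyped non-destructively from its spectrum alone.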

  14. Scaling and automation of a high-throughput single-cell-derived tumor sphere assay chip.

    PubMed

    Cheng, Yu-Heng; Chen, Yu-Chih; Brien, Riley; Yoon, Euisik

    2016-10-07

    Recent research suggests that cancer stem-like cells (CSCs) are the key subpopulation for tumor relapse and metastasis. Due to cancer plasticity in surface antigen and enzymatic activity markers, functional tumorsphere assays are promising alternatives for CSC identification. To reliably quantify rare CSCs (1-5%), thousands of single-cell suspension cultures are required. While microfluidics is a powerful tool for handling single cells, previous work has provided limited throughput and lacked the automatic data analysis capability required for high-throughput studies. In this study, we present the scaling and automation of high-throughput single-cell-derived tumor sphere assay chips, facilitating the tracking of up to ∼10 000 cells on a chip with a ∼76.5% capture rate. The presented cell capture scheme guarantees sampling of a representative population from the bulk cells. To analyze thousands of single cells with a variety of fluorescent intensities, a highly adaptable analysis program was developed for cell/sphere counting and size measurement. Using a Pluronic® F108 (poly(ethylene glycol)-block-poly(propylene glycol)-block-poly(ethylene glycol)) coating on polydimethylsiloxane (PDMS), a suspension culture environment was created to test a controversial hypothesis: whether larger or smaller cells are more stem-like, as defined by the capability to form single-cell-derived spheres. Different cell lines showed different correlations between sphere formation rate and initial cell size, suggesting heterogeneity in pathway regulation among breast cancer cell lines. More interestingly, by monitoring hundreds of spheres, we identified heterogeneity in sphere growth dynamics, indicating cellular heterogeneity even within CSCs. These preliminary results highlight the power of unprecedented throughput and automation in CSC studies.

  15. MassCode Liquid Arrays as a Tool for Multiplexed High-Throughput Genetic Profiling

    PubMed Central

    Richmond, Gregory S.; Khine, Htet; Zhou, Tina T.; Ryan, Daniel E.; Brand, Tony; McBride, Mary T.; Killeen, Kevin

    2011-01-01

    Multiplexed detection assays that analyze a modest number of nucleic acid targets over large sample sets are emerging as the preferred testing approach in such applications as routine pathogen typing, outbreak monitoring, and diagnostics. However, very few DNA testing platforms have proven to offer a solution for mid-plexed analysis that is high-throughput, sensitive, and with a low cost per test. In this work, an enhanced genotyping method based on MassCode technology was devised and integrated as part of a high-throughput mid-plexing analytical system that facilitates robust qualitative differential detection of DNA targets. Samples are first analyzed using MassCode PCR (MC-PCR) performed with an array of primer sets encoded with unique mass tags. Lambda exonuclease and an array of MassCode probes are then contacted with MC-PCR products for further interrogation and target sequences are specifically identified. Primer and probe hybridizations occur in homogeneous solution, a clear advantage over micro- or nanoparticle suspension arrays. The two cognate tags coupled to resultant MassCode hybrids are detected in an automated process using a benchtop single quadrupole mass spectrometer. The prospective value of using MassCode probe arrays for multiplexed bioanalysis was demonstrated after developing a 14plex proof of concept assay designed to subtype a select panel of Salmonella enterica serogroups and serovars. This MassCode system is very flexible and test panels can be customized to include more, less, or different markers. PMID:21544191

  16. Towards high-throughput single cell/clone cultivation and analysis.

    PubMed

    Lindström, Sara; Larsson, Rolf; Svahn, Helene Andersson

    2008-03-01

    In order to better understand cellular processes and behavior, a controlled way of studying high numbers of single cells and their clone formation is greatly needed. Numerous ways of ordering single cells into arrays have previously been described, but platforms in which each cell/clone can be addressed to an exact position in the microplate, cultivated for weeks and treated separately in a high-throughput manner have until now been missing. Here, a novel microplate developed for high-throughput single cell/clone cultivation and analysis is presented. Rapid single cell seeding into microwells, using conventional flow cytometry, allows several thousands of single cells to be cultivated, short-term (72 h) or long-term (10-14 days), and analyzed individually. By controlled sorting of individual cells to predefined locations in the microplate, analysis of single cell heterogeneity and clonogenic properties related to drug sensitivity can be accomplished. Additionally, the platform requires a remarkably low number of cells, a major advantage when screening limited amounts of patient cell samples. By seeding single cells into the microplate it is possible to analyze the cells for over 14 generations, ending up with more than 10 000 cells in each well. Described here is a proof-of-concept of compartmentalization and cultivation of thousands of individual cells, enabling heterogeneity analysis of various cells/clones and their response to different drugs.

  17. Automatic 3D Cell Analysis in High-Throughput Microarray Using Micropillar and Microwell Chips.

    PubMed

    Lee, Dong Woo; Lee, Moo-Yeal; Ku, Bosung; Nam, Do-Hyun

    2015-10-01

    Area-based and intensity-based 3D cell viability measurement methods are compared in high-throughput screening in order to analyze their effects on the assay results (doubling time and IC50) and on repeatability. Several other 3D cell-based high-throughput screening platforms have been introduced previously, but these did not clearly address the effects of the two methods on assay results and repeatability. In this study, the optimal way to analyze 3D cultured cells is determined by comparing day-to-day doubling times and IC50 values obtained from the two methods. In the experiments, the U251 cell line is grown in chips. The doubling time was 27.8 ± 1.8 h (relative standard deviation: 6.6%) based on the area of the 3D cells and 27.8 ± 3.8 h (relative standard deviation: 13.7%) based on their intensity. The doubling time calculated by area thus shows a smaller standard deviation than that calculated by intensity. IC50 values calculated by the two methods are very similar, agreeing within 3-fold. The IC50 variations of the 12 compounds were similar regardless of the viability measurement method and were highly related to the shape of the dose-response curves.
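    The doubling-time calculation itself is the standard exponential-growth formula. The toy numbers below are chosen only to land near the reported ~27.8 h and are not the U251 data.

```python
# Doubling time from two measurements (area or intensity), assuming
# exponential growth between them. Illustrative values only.
import math

def doubling_time(v0, v1, dt_hours):
    """T_d = dt * ln(2) / ln(v1 / v0) for exponential growth."""
    return dt_hours * math.log(2) / math.log(v1 / v0)

# e.g. measured cell area grows from 1000 to 3300 px^2 over 48 h
td = doubling_time(1000.0, 3300.0, 48.0)
print(round(td, 1))  # → 27.9
```

    The same formula applies whichever viability proxy (area or summed intensity) supplies v0 and v1, which is why the two methods can be compared head-to-head on repeatability.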

  18. Multicenter Evaluation of a New High-Throughput HbA1c Testing Platform.

    PubMed

    Imdahl, R; Roddiger, R; Casis-Saenz, E

    2016-12-01

    This non-interventional, multicenter study with anonymized leftover patient samples was performed to evaluate the reliability and analytical performance of the novel high-throughput cobas c 513 HbA1c analyzer. A performance evaluation was carried out at three sites to validate the overall system functionality, user interaction and analytical performance of the new cobas c 513 analyzer using the Tina-quant® HbA1c Gen. 3 assay. HbA1c applications for both whole blood and hemolysate samples show high precision using both quality control materials and pools of whole blood or hemolysates. The method comparison of HbA1c Gen. 3 on the cobas c 513 with HbA1c Gen. 2 on the Menarini HA-8180V using 249 whole blood samples shows high concordance. Moreover, analyte concentrations as measured by the cobas c 513 are comparable to those from the Tosoh G8 and from HbA1c Gen. 2 on COBAS INTEGRA® 800 CTS. The cobas c 513 has proven to be a reliable system with excellent analytical performance of the Tina-quant® HbA1c Gen. 3 assay in high-throughput laboratories.

  19. High-throughput microcavitation bubble induced cellular mechanotransduction

    NASA Astrophysics Data System (ADS)

    Compton, Jonathan Lee

    inhibitor to IP3-induced Ca2+ release. This capability opens the way to a high-throughput screening platform for molecules that modulate cellular mechanotransduction. We have applied this approach to screen a small set of small molecules in a 96-well plate in less than an hour. These detailed studies offer a basis for the design, development, and implementation of a novel assay to rapidly screen the effect of small molecules on cellular mechanotransduction at high throughput.

  20. High-throughput purification of affinity-tagged recombinant proteins.

    PubMed

    Wiesler, Simone C; Weinzierl, Robert O J

    2012-08-26

    X-ray crystallography is the method of choice for obtaining a detailed view of the structure of proteins. Such studies need to be complemented by further biochemical analyses to obtain detailed insights into structure/function relationships. Advances in oligonucleotide- and gene synthesis technology make large-scale mutagenesis strategies increasingly feasible, including the substitution of target residues by all 19 other amino acids. Gain- or loss-of-function phenotypes then allow systematic conclusions to be drawn, such as the contribution of particular residues to catalytic activity, protein stability and/or protein-protein interaction specificity. In order to attribute the different phenotypes to the nature of the mutation--rather than to fluctuating experimental conditions--it is vital to purify and analyse the proteins in a controlled and reproducible manner. High-throughput strategies and the automation of manual protocols on robotic liquid-handling platforms have created opportunities to perform such complex molecular biological procedures with little human intervention and minimal error rates. Here, we present a general method for the purification of His-tagged recombinant proteins in a high-throughput manner. In a recent study, we applied this method to a detailed structure-function investigation of TFIIB, a component of the basal transcription machinery. TFIIB is indispensable for promoter-directed transcription in vitro and is essential for the recruitment of RNA polymerase into a preinitiation complex. TFIIB contains a flexible linker domain that penetrates the active site cleft of RNA polymerase. This linker domain confers two biochemically quantifiable activities on TFIIB, namely (i) the stimulation of the catalytic activity during the 'abortive' stage of transcript initiation, and (ii) an additional contribution to the specific recruitment of RNA polymerase into the preinitiation complex. We exploited the high-throughput purification method to

  1. Microgradient-heaters as tools for high-throughput experimentation.

    PubMed

    Meyer, Robert; Hamann, Sven; Ehmann, Michael; Thienhaus, Sigurd; Jaeger, Stefanie; Thiede, Tobias; Devi, Anjana; Fischer, Roland A; Ludwig, Alfred

    2012-10-08

    A microgradient-heater (MGH) was developed, and its feasibility as a tool for high-throughput materials science experimentation was tested. The MGH is derived from microhot plate (MHP) systems and allows combinatorial thermal processing on the micro/nanoscale. The temperature gradient is adjustable via the substrate material. For an Au-coated MGH membrane a temperature drop from 605 to 100 °C was measured over a distance of 965 μm, corresponding to an average temperature change of 0.52 K/μm. As a proof of principle, we demonstrate the feasibility of MGHs using a chemical vapor deposition (CVD) process as an example. The results show discontinuous changes in surface morphology within a continuous TiO2 film. Furthermore, the MGH can be used to gain insight into the energetics of film growth processes, giving it potential for microcalorimetry measurements.
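    The quoted average gradient follows directly from the reported endpoints (605 to 100 °C over 965 μm):

```python
# Arithmetic check of the average temperature gradient quoted above.
dT = 605 - 100           # temperature drop, K
dx = 965                 # distance, μm
gradient = dT / dx       # K/μm
print(round(gradient, 2))  # → 0.52
```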

  2. Multifunctional encoded particles for high-throughput biomolecule analysis.

    PubMed

    Pregibon, Daniel C; Toner, Mehmet; Doyle, Patrick S

    2007-03-09

    High-throughput screening for genetic analysis, combinatorial chemistry, and clinical diagnostics benefits from multiplexing, which allows for the simultaneous assay of several analytes but necessitates an encoding scheme for molecular identification. Current approaches for multiplexed analysis involve complicated or expensive processes for encoding, functionalizing, or decoding active substrates (particles or surfaces) and often yield a very limited number of analyte-specific codes. We present a method based on continuous-flow lithography that combines particle synthesis and encoding and probe incorporation into a single process to generate multifunctional particles bearing over a million unique codes. By using such particles, we demonstrate a multiplexed, single-fluorescence detection of DNA oligomers with encoded particle libraries that can be scanned rapidly in a flow-through microfluidic channel. Furthermore, we demonstrate with high specificity the same multiplexed detection using individual multiprobe particles.

  3. High-throughput ballistic injection nanorheology to measure cell mechanics

    PubMed Central

    Wu, Pei-Hsun; Hale, Christopher M; Chen, Wei-Chiang; Lee, Jerry S H; Tseng, Yiider; Wirtz, Denis

    2015-01-01

    High-throughput ballistic injection nanorheology is a method for the quantitative study of cell mechanics. Cell mechanics are measured by ballistic injection of submicron particles into the cytoplasm of living cells and tracking the spontaneous displacement of the particles at high spatial resolution. The trajectories of the cytoplasm-embedded particles are transformed into mean-squared displacements, which are subsequently transformed into frequency-dependent viscoelastic moduli and time-dependent creep compliance of the cytoplasm. This method allows for the study of a wide range of cellular conditions, including cells inside a 3D matrix, cells subjected to shear flows and biochemical stimuli, and cells in a live animal. Ballistic injection lasts < 1 min and is followed by overnight incubation. Multiple particle tracking for one cell lasts < 1 min. Forty cells can be examined in < 1 h. PMID:22222790
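    The trajectory-to-MSD step described above can be sketched as follows, on a synthetic 2D random walk rather than real particle-tracking data.

```python
# Time-averaged mean-squared displacement (MSD) from a particle trajectory.
# The trajectory here is a synthetic random walk, not nanorheology data.
import numpy as np

rng = np.random.default_rng(1)
steps = rng.normal(0.0, 0.05, size=(1000, 2))   # 2D steps, microns
traj = np.cumsum(steps, axis=0)                  # particle trajectory

def msd(track, max_lag):
    """Time-averaged MSD for lags 1..max_lag (frames)."""
    out = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        d = track[lag:] - track[:-lag]
        out[lag - 1] = np.mean(np.sum(d * d, axis=1))
    return out

m = msd(traj, 50)
print(m[:3])  # MSD grows roughly linearly with lag for free diffusion
```

    In the actual method, the lag-dependence of the MSD is then converted into frequency-dependent viscoelastic moduli and creep compliance of the cytoplasm.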

  4. High throughput sequencing reveals a novel fabavirus infecting sweet cherry.

    PubMed

    Villamor, D E V; Pillai, S S; Eastwell, K C

    2017-03-01

    The genus Fabavirus currently consists of five species represented by viruses that infect a wide range of hosts but none reported from temperate climate fruit trees. A virus with genomic features resembling fabaviruses (tentatively named Prunus virus F, PrVF) was revealed by high throughput sequencing of extracts from a sweet cherry tree (Prunus avium). PrVF was subsequently shown to be graft transmissible and further identified in three other non-symptomatic Prunus spp. from different geographical locations. Two genetic variants of RNA1 and RNA2 coexisted in the same samples. RNA1 consisted of 6,165 and 6,163 nucleotides, and RNA2 consisted of 3,622 and 3,468 nucleotides.

  5. Statistically invalid classification of high throughput gene expression data.

    PubMed

    Barbash, Shahar; Soreq, Hermona

    2013-01-01

    Classification analysis based on high throughput data is a common feature in neuroscience and other fields of science, with a rapidly increasing impact on both basic biology and disease-related studies. The outcome of such classifications often serves to delineate novel biochemical mechanisms in health and disease states, identify new targets for therapeutic interference, and develop innovative diagnostic approaches. Given the importance of such studies, we screened 111 recently published high-impact manuscripts involving classification analysis of gene expression and found that 58 of them (53%) based their conclusions on a statistically invalid method that can lead to bias in a statistical sense (a lower true classification accuracy than the reported classification accuracy). In this report we characterize the potential methodological error and its scope, investigate how it is influenced by different experimental parameters, and describe statistically valid methods for avoiding such classification mistakes.
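    A common form of the bias described above arises when features are selected using all samples before cross-validation. The following numpy sketch (a hypothetical nearest-centroid classifier on pure-noise data, not the authors' screening procedure) shows how such "leaky" selection inflates apparent accuracy while selection inside each fold stays near chance:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 60, 2000, 10                 # samples, noise features, features kept
X = rng.normal(size=(n, p))            # pure noise: true accuracy is 50%
y = np.repeat([0, 1], n // 2)

def select(Xtr, ytr):
    # rank features by absolute class-mean difference, keep the top k
    diff = np.abs(Xtr[ytr == 0].mean(0) - Xtr[ytr == 1].mean(0))
    return np.argsort(diff)[-k:]

def nearest_centroid(Xtr, ytr, Xte):
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    d0 = np.linalg.norm(Xte - c0, axis=1)
    d1 = np.linalg.norm(Xte - c1, axis=1)
    return (d1 < d0).astype(int)

def cv_accuracy(select_inside):
    feats_all = select(X, y)           # leaky: uses every sample's label
    hits = 0
    for i in range(n):                 # leave-one-out cross-validation
        tr = np.arange(n) != i
        f = select(X[tr], y[tr]) if select_inside else feats_all
        hits += nearest_centroid(X[tr][:, f], y[tr], X[[i]][:, f])[0] == y[i]
    return hits / n

print(f"leaky CV accuracy:  {cv_accuracy(False):.2f}")   # optimistically biased
print(f"proper CV accuracy: {cv_accuracy(True):.2f}")    # near chance (0.5)
```

    The statistically valid design repeats every data-dependent step, including feature selection, inside each cross-validation fold.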

  6. Predicting Novel Bulk Metallic Glasses via High-Throughput Calculations

    NASA Astrophysics Data System (ADS)

    Perim, E.; Lee, D.; Liu, Y.; Toher, C.; Gong, P.; Li, Y.; Simmons, W. N.; Levy, O.; Vlassak, J.; Schroers, J.; Curtarolo, S.

    Bulk metallic glasses (BMGs) are materials which may combine key properties of crystalline metals, such as high hardness, with others typical of plastics, such as easy processability. However, the cost of the known BMGs poses a significant obstacle to the development of applications, which has led to a long search for novel, economically viable BMGs. The emergence of high-throughput DFT calculations, such as the library provided by the AFLOWLIB consortium, has provided new tools for materials discovery. We have used these data to develop a new glass-forming descriptor combining structural factors with thermodynamics in order to quickly screen a large number of alloy systems in the AFLOWLIB database, identifying the most promising systems and the optimal compositions for glass formation. National Science Foundation (DMR-1436151, DMR-1435820, DMR-1436268).

  7. High-throughput sequencing in veterinary infection biology and diagnostics.

    PubMed

    Belák, S; Karlsson, O E; Leijon, M; Granberg, F

    2013-12-01

    Sequencing methods have improved rapidly since the first versions of the Sanger techniques, facilitating the development of very powerful tools for detecting and identifying various pathogens, such as viruses, bacteria and other microbes. The ongoing development of high-throughput sequencing (HTS; also known as next-generation sequencing) technologies has resulted in a dramatic reduction in DNA sequencing costs, making the technology more accessible to the average laboratory. In this White Paper of the World Organisation for Animal Health (OIE) Collaborating Centre for the Biotechnology-based Diagnosis of Infectious Diseases in Veterinary Medicine (Uppsala, Sweden), several approaches and examples of HTS are summarised, and their diagnostic applicability is briefly discussed. Selected future aspects of HTS are outlined, including the need for bioinformatic resources, with a focus on improving the diagnosis and control of infectious diseases in veterinary medicine.

  8. Resolving postglacial phylogeography using high-throughput sequencing

    PubMed Central

    Emerson, Kevin J.; Merz, Clayton R.; Catchen, Julian M.; Hohenlohe, Paul A.; Cresko, William A.; Bradshaw, William E.; Holzapfel, Christina M.

    2010-01-01

    The distinction between model and nonmodel organisms is becoming increasingly blurred. High-throughput, second-generation sequencing approaches are being applied to organisms based on their interesting ecological, physiological, developmental, or evolutionary properties and not on the depth of genetic information available for them. Here, we illustrate this point using a low-cost, efficient technique to determine the fine-scale phylogenetic relationships among recently diverged populations in a species. This application of restriction site-associated DNA tags (RAD tags) reveals previously unresolved genetic structure and direction of evolution in the pitcher plant mosquito, Wyeomyia smithii, from a southern Appalachian Mountain refugium following recession of the Laurentide Ice Sheet at 22,000–19,000 B.P. The RAD tag method can be used to identify detailed patterns of phylogeography in any organism regardless of existing genomic data, and, more broadly, to identify incipient speciation and genome-wide variation in natural populations in general. PMID:20798348

  9. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    PubMed

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H

    2017-09-21

    A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated, reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparison against high-level theoretical calculations shows that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will improve by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.
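    The final kinetics step can be illustrated with canonical transition state theory in its Eyring form, where the rate constant follows from the activation free energy. This is a generic textbook sketch with an illustrative barrier; AutoTST's actual workflow derives the thermochemistry from the optimized saddle-point geometry:

```python
import math

# physical constants (SI)
KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def tst_rate(delta_g_act, T):
    """Canonical TST (Eyring) rate constant, s^-1, for a unimolecular step.

    delta_g_act: activation free energy in J/mol (hypothetical input here).
    """
    return (KB * T / H) * math.exp(-delta_g_act / (R * T))

# e.g. a 100 kJ/mol barrier at 1000 K gives a rate of order 1e8 s^-1
print(f"{tst_rate(100e3, 1000.0):.3e}")
```

    The kB*T/h prefactor is the universal attempt frequency (about 2.1e13 s^-1 at 1000 K); the exponential carries all of the barrier dependence, which is why accurate saddle-point energies matter so much.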

  10. High-throughput electronic biology: mining information for drug discovery.

    PubMed

    Loging, William; Harland, Lee; Williams-Jones, Bryn

    2007-03-01

    The vast range of in silico resources that are available in life sciences research hold much promise towards aiding the drug discovery process. To fully realize this opportunity, computational scientists must consider the practical issues of data integration and identify how best to apply these resources scientifically. In this article we describe in silico approaches that are driven towards the identification of testable laboratory hypotheses; we also address common challenges in the field. We focus on flexible, high-throughput techniques, which may be initiated independently of 'wet-lab' experimentation, and which may be applied to multiple disease areas. The utility of these approaches in drug discovery highlights the contribution that in silico techniques can make and emphasizes the need for collaboration between the areas of disease research and computational science.

  11. High-throughput plastic microlenses fabricated using microinjection molding techniques

    NASA Astrophysics Data System (ADS)

    Appasamy, Sreeram; Li, Weizhuo; Lee, Se Hwan; Boyd, Joseph T.; Ahn, Chong H.

    2005-12-01

    A novel fabrication scheme to develop high-throughput plastic microlenses using injection-molding techniques is realized. The initial microlens mold is fabricated using the well-known reflow technique. The reflow process is optimized to obtain reliable and repeatable microlens patterns. The master mold insert for the injection-molding process is fabricated using metal electroforming. The electroplating process is optimized for obtaining a low-stress electroform. Two new plastic materials, cyclo olefin copolymer (COC) and Poly IR 2, are introduced in this work for fabricating microlenses. The plastic microlenses have been characterized for their focal lengths, which range from 200 µm to 1.9 mm. This technique enables high-volume production of plastic microlenses, with cycle times for a single chip being of the order of 60 s.
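    For a reflow-formed spherical-cap microlens, the paraxial focal length follows from the cap geometry and the thin-lens lensmaker's equation. A small sketch; the refractive index and lens dimensions below are assumed for illustration, not taken from the paper:

```python
def focal_length(diameter, sag, n=1.53):
    """Paraxial focal length of a plano-convex microlens.

    diameter, sag: lens base diameter and cap height (same units);
    n: refractive index (1.53 assumed here for COC; check the datasheet).
    The spherical-cap radius follows from the chord geometry.
    """
    R = (sag**2 + (diameter / 2) ** 2) / (2 * sag)   # radius of curvature
    return R / (n - 1)                               # plano-convex thin lens

# a 200 um wide, 40 um tall lens cap:
f = focal_length(200.0, 40.0)   # micrometres
print(round(f, 1))              # 273.6
```

    Since the focal length scales inversely with the sag height, controlling the reflow (which sets the cap profile) is what makes the focal lengths repeatable.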

  12. High-throughput drawing and testing of metallic glass nanostructures.

    PubMed

    Hasan, Molla; Kumar, Golden

    2017-03-02

    Thermoplastic embossing of metallic glasses promises direct imprinting of metal nanostructures using templates. However, embossing high-aspect-ratio nanostructures faces unworkable flow resistance due to friction and non-wetting conditions at the template interface. Herein, we show that these inherent challenges of embossing can be reversed by thermoplastic drawing using templates. The flow resistance not only remains independent of wetting but also decreases with increasing feature aspect-ratio. Arrays of assembled nanotips, nanowires, and nanotubes with aspect-ratios exceeding 1000 can be produced through controlled elongation and fracture of metallic glass structures. In contrast to embossing, the drawing approach generates two sets of nanostructures upon final fracture; one set remains anchored to the metallic glass substrate while the second set is assembled on the template. This method can be readily adapted for high-throughput fabrication and testing of nanoscale tensile specimens, enabling rapid screening of size-effects in mechanical behavior.

  13. Towards high throughput screening of nanoparticle flotation collectors.

    PubMed

    Abarca, Carla; Yang, Songtao; Pelton, Robert H

    2015-12-15

    To function as flotation collectors for mineral processing, polymeric nanoparticles require a delicate balance of surface properties to give mineral-specific deposition and colloidal stability in high ionic strength alkaline media, while remaining sufficiently hydrophobic to promote flotation. Combinatorial nanoparticle surface modification, in conjunction with high throughput screening, is a promising approach for nanoparticle development. However, efficient automated screening assays are required to reject ineffective particles without having to undergo time-consuming flotation testing. Herein we demonstrate that determining critical coagulation concentrations of sodium carbonate, in combination with measuring the advancing water contact angle of nanoparticle-saturated glass surfaces, can be used to screen out ineffective nanoparticles. Finally, none of the particles in our first nanoparticle library, based on poly(ethylene glycol) methyl ether methacrylate (PEG-methacrylate), were effective flotation collectors because they were too hydrophilic.

  14. A Colloidal Stability Assay Suitable for High-Throughput Screening.

    PubMed

    Abarca, Carla; Ali, M Monsur; Yang, Songtao; Dong, Xiaofei; Pelton, Robert H

    2016-03-01

    A library of 32 polystyrene copolymer latexes, with diameters ranging between 53 and 387 nm, was used to develop and demonstrate a high-throughput assay using a 96-well microplate platform to measure critical coagulation concentrations, a measure of colloidal stability. The most robust assay involved an automated centrifugation-decantation step to remove latex aggregates before absorbance measurements, eliminating aggregate interference with optical measurements made through the base of the multiwell plates. For smaller nanoparticles (diameter <150 nm), the centrifugation-decantation step was not required as the interference was less than with larger particles. Parallel measurements with a ChemiDoc MP plate scanner gave indications of aggregation; however, the results were less sensitive than the absorbance measurements.

  15. High throughput x-ray optics: an overview.

    PubMed

    Gorenstein, P

    1988-04-15

    Several x-ray astronomy missions of the 1990s will contain focusing telescopes with significantly more collecting power than the Einstein Observatory. There is increasing emphasis on spectroscopy. ESA's XMM, with 10(4) cm(2) of effective area, will be the largest. A high throughput facility with over 10(5) cm(2) of effective area and 20-arcsec angular resolution is ultimately needed for various scientific studies, such as high resolution spectroscopic observations of QSOs. At least one of the techniques currently being developed for fabricating x-ray telescopes, which include automated figuring of flats as parabolic reflectors, replication of cylindrical shells, and the alignment of thin lacquer-coated conical foils, is likely to permit the construction of modular arrays of telescopes with the required area and angular resolution.

  16. Noise and non-linearities in high-throughput data

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Anh; Koukolíková-Nicola, Zdena; Bagnoli, Franco; Lió, Pietro

    2009-01-01

    High-throughput data analyses are becoming common in biology, communications, economics and sociology. The vast amounts of data are usually represented in the form of matrices and can be considered as knowledge networks. Spectra-based approaches have proved useful in extracting hidden information within such networks and for estimating missing data, but these methods are based essentially on linear assumptions. The physical models of matching, when applicable, often suggest non-linear mechanisms that may sometimes be identified as noise. The use of non-linear models in data analysis, however, may require the introduction of many parameters, which lowers the statistical weight of the model. Depending on the quality of the data, a simpler linear analysis may be more convenient than more complex approaches. In this paper, we show how a simple non-parametric Bayesian model may be used to explore the role of non-linearities and noise in synthetic and experimental data sets.

  17. Machine Learning for High-Throughput Stress Phenotyping in Plants.

    PubMed

    Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh Kumar; Sarkar, Soumik

    2016-02-01

    Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits.

  18. Numerical techniques for high-throughput reflectance interference biosensing

    NASA Astrophysics Data System (ADS)

    Sevenler, Derin; Ünlü, M. Selim

    2016-06-01

    We have developed a robust and rapid computational method for processing the raw spectral data collected from thin film optical interference biosensors. We have applied this method to Interference Reflectance Imaging Sensor (IRIS) measurements and observed a 10,000-fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. We also discuss an additional benefit: the LUT method can be used with a wider range of interference layer thicknesses and with experimental configurations that are incompatible with methods that require fitting the spectral response.
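    The lookup-table idea can be sketched as precomputing model spectra over candidate film thicknesses once, so that per-pixel processing reduces to a nearest-spectrum search instead of an iterative fit. The sketch below uses a toy two-beam interference model with assumed indices and thickness range; it is not the actual IRIS optical model:

```python
import numpy as np

# Simplified two-beam interference model (illustrative only):
# R(lam, d) ~ A + B*cos(4*pi*n*d/lam).
N_FILM = 1.46                            # assumed SiO2-like film index
lam = np.linspace(500e-9, 700e-9, 64)    # sampled wavelengths (m)

def model(d):
    return 0.3 + 0.1 * np.cos(4 * np.pi * N_FILM * d / lam)

# 1) Precompute the lookup table once over candidate thicknesses.
thicknesses = np.arange(0.0, 200e-9, 0.1e-9)
lut = np.stack([model(d) for d in thicknesses])    # (n_thk, n_lam)

# 2) Per pixel, fitting collapses to a nearest-spectrum search.
def fit_thickness(spectrum):
    err = np.sum((lut - spectrum) ** 2, axis=1)
    return thicknesses[np.argmin(err)]

measured = model(87.3e-9) + np.random.default_rng(2).normal(0, 0.002, lam.size)
print(round(fit_thickness(measured) * 1e9, 1))     # nm, close to 87.3
```

    The table is built once per optical configuration, so the per-pixel cost is a single vectorized distance computation, which is where the large speedup over naive per-pixel curve fitting comes from.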

  19. High throughput fingerprint analysis of large-insert clones.

    PubMed

    Marra, M A; Kucaba, T A; Dietrich, N L; Green, E D; Brownstein, B; Wilson, R K; McDonald, K M; Hillier, L W; McPherson, J D; Waterston, R H

    1997-11-01

    As part of the Human Genome Project, the Washington University Genome Sequencing Center has commenced systematic sequencing of human chromosome 7. To organize and support the effort, we have undertaken the construction of sequence-ready physical maps for defined chromosomal intervals. Map construction is a serial process composed of three main activities. First, candidate STS-positive large-insert PAC and BAC clones are identified. Next, these candidate clones are subjected to fingerprint analysis. Finally, the fingerprint data are used to assemble sequence-ready maps. The fingerprinting method we have devised is key to the success of the overall approach. We present here the details of the method and show that the fingerprints are of sufficient quality to permit the construction of megabase-size contigs in defined regions of the human genome. We anticipate that the high throughput and precision characteristic of our fingerprinting method will make it of general utility.

  20. UAV-based high-throughput phenotyping in legume crops

    NASA Astrophysics Data System (ADS)

    Sankaran, Sindhuja; Khot, Lav R.; Quirós, Juan; Vandemark, George J.; McGee, Rebecca J.

    2016-05-01

    In plant breeding, one of the biggest obstacles in genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. Therefore, the major objective of this research was to evaluate the feasibility of utilizing high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The plant responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices are strongly correlated (p<0.05) with the seed yield of legume crops. The results endorse the potential of UAV-based sensing technology to rapidly measure these phenotyping traits.
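    A vegetation index of the kind correlated with yield above, the widely used NDVI, is computed directly from the red and near-infrared bands of a multispectral imager. A minimal sketch with hypothetical plot-level reflectances (the abstract does not specify which indices were used):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from multispectral bands.

    nir, red: reflectance values (plot-level means or per-pixel rasters),
    each in [0, 1]. Denser, healthier canopies give values nearer 1.
    """
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-12)   # small epsilon guards 0/0

# plot-mean reflectances for three hypothetical legume plots
print(np.round(ndvi([0.45, 0.50, 0.30], [0.08, 0.06, 0.12]), 2))
```

    In a phenotyping workflow, per-plot index values like these are then regressed against ground-truth traits such as seed yield.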

  1. Proposed high throughput electrorefining treatment for spent N-Reactor fuel

    SciTech Connect

    Gay, E.C.; Miller, W.E.; Laidler, J.J.

    1996-05-01

    A high-throughput electrorefining process is being adapted to treat spent N-Reactor fuel for ultimate disposal in a geologic repository. Anodic dissolution tests were made with unirradiated N-Reactor fuel to determine the type of fragmentation necessary to provide fuel segments suitable for this process. Based on these tests, a conceptual design was produced of a plant-scale electrorefiner. In this design, the diameter of an electrode assembly is about 1.07 m (42 in.). Three of these assemblies in an electrorefiner would accommodate a 3-metric-ton batch of N-Reactor fuel that would be processed at a rate of 42 kg of uranium per hour.

  2. Muscle plasticity and high throughput gene expression studies.

    PubMed

    Reggiani, Carlo; Kronnie, Geertruuy Te

    2004-01-01

    Changes in gene expression are known to contribute to muscle plasticity. Until recently, most studies described differences in one or a few genes at a time; in the last few years, however, the development of new technologies for high throughput mRNA expression analysis has allowed a large fraction, if not all, of the transcripts to be studied in a single experiment. Knowledge of muscle adaptive responses has already been gained from the application of this novel approach, but the most important new findings have come from studies on muscle atrophy. A new and unexpected group of genes, which increase their expression during atrophy and are therefore designated atrogins, has been discovered. In spite of the impressive power of the new technology, many problems remain to be resolved to optimize experimental design and to extract all the information provided by the outcome of global mRNA assessment.

  3. High-throughput ab-initio dilute solute diffusion database

    PubMed Central

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-01-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world. PMID:27434308
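    The practical weight of the reported 0.176 eV RMS barrier error can be gauged through the Arrhenius dependence of diffusivity on the activation barrier. The values below are illustrative, not entries from the database:

```python
import math

KB_EV = 8.617333262e-5   # Boltzmann constant, eV/K

def diffusivity(D0, Q, T):
    """Dilute-solute diffusivity from the Arrhenius form D = D0*exp(-Q/(kB*T)).

    D0: pre-exponential factor (m^2/s), Q: activation barrier (eV),
    T: temperature (K). All inputs here are hypothetical.
    """
    return D0 * math.exp(-Q / (KB_EV * T))

# how a 0.176 eV barrier error shifts predicted D at 600 K:
ratio = diffusivity(1e-5, 1.0, 600.0) / diffusivity(1e-5, 1.0 + 0.176, 600.0)
print(round(ratio, 1))   # roughly a 30x change in predicted diffusivity
```

    Because the barrier sits in an exponential, even a few-tenths-of-an-eV error translates into order-of-magnitude changes in diffusivity at moderate temperatures, which is why the weighted RMS error is the headline accuracy metric.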

  4. Microfluidic cell chips for high-throughput drug screening.

    PubMed

    Chi, Chun-Wei; Ahmed, Ah Rezwanuddin; Dereli-Korkut, Zeynep; Wang, Sihong

    2016-05-01

    The current state of screening methods for drug discovery is still riddled with several inefficiencies. Although some widely used high-throughput screening (HTS) platforms may enhance the drug screening process, their cost and oversimplification of cell-drug interactions pose a translational difficulty. Microfluidic cell chips resolve many issues found in conventional HTS technology, providing benefits such as reduced sample quantity and integration of 3D cell culture that is physically more representative of the physiological/pathological microenvironment. In this review, we introduce the advantages of microfluidic devices in drug screening, and outline the critical factors which influence device design, highlighting recent innovations and advances in the field including a summary of commercialization efforts on microfluidic cell chips. Future perspectives of microfluidic cell devices are also provided based on considerations of present technological limitations and translational barriers.

  5. Single-platelet nanomechanics measured by high-throughput cytometry

    NASA Astrophysics Data System (ADS)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2016-10-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  6. Proton Diffusion Model for High-Throughput Calculations

    NASA Astrophysics Data System (ADS)

    Wisesa, Pandu; Mueller, Tim

    2013-03-01

    Solid oxide fuel cells (SOFCs) have many advantages over other fuel cells, including high efficiency, a wide choice of fuels, and low cost. The main issue, however, is the high operating temperature of SOFCs, which can be lowered by using an electrolyte material with high ionic conductivity, such as a proton-conducting oxide. Our goal is to identify promising proton-conducting materials in a time- and cost-efficient manner through the utilization of high-throughput calculations. We present a model for proton diffusion developed using machine learning techniques, with training data consisting of density functional theory (DFT) calculations on various metal oxides. The model is tested against other DFT results to assess how it performs. The results of the DFT calculations and the performance of the model are discussed, with a focus on hydrogen diffusion pathways inside the bulk material.

  7. Macromolecular Crystallography conventional and high-throughput methods

    SciTech Connect

    Wasserman, Stephen R.; Smith, David W.; D'Amico, Kevin L.; Koss, John W.; Morisco, Laura L.; Burley, Stephen K.

    2007-09-27

    High-throughput data collection requires the seamless interoperation of various hardware components. User-supplied descriptions of protein crystals must also be directly linked with the diffraction data. Such linkages can be achieved efficiently with computer databases. A database that tracks production of the protein samples, crystallization, and diffraction from the resultant crystals serves as the glue that holds the entire gene-to-structure process together. This chapter begins by discussing data collection processes and hardware. It then illustrates how a well-constructed database ensures information flow through the steps of data acquisition. Such a database allows synchrotron beamline measurements to be directly and efficiently integrated into the process of protein crystallographic structure determination.

  8. Automated sample area definition for high-throughput microscopy.

    PubMed

    Zeder, M; Ellrott, A; Amann, R

    2011-04-01

    High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch).

  9. Automated high-throughput nanoliter-scale protein crystallization screening.

    PubMed

    Li, Fenglei; Robinson, Howard; Yeung, Edward S

    2005-12-01

    A highly efficient method is developed for automated high-throughput screening of nanoliter-scale protein crystallization. The system integrates liquid dispensing, crystallization and detection. The automated liquid dispensing system handles nanoliters of protein and various combinations of precipitants in parallel to access diverse regions of the phase diagram. A new detection scheme, native fluorescence, with complementary visible-light detection is employed for monitoring the progress of crystallization. This detection mode can distinguish protein crystals from inorganic crystals in a nondestructive manner. A gas-permeable membrane covering the microwells simplifies evaporation rate control and probes extended conditions in the phase diagram. The system was successfully demonstrated for the screening of lysozyme crystallization under 81 different conditions.

  10. Characterizing immune repertoires by high throughput sequencing: strategies and applications

    PubMed Central

    Calis, Jorg J.A.; Rosenberg, Brad R.

    2014-01-01

    As the key cellular effectors of adaptive immunity, T and B lymphocytes utilize specialized receptors to recognize, respond to, and neutralize a diverse array of extrinsic threats. These receptors (immunoglobulins in B lymphocytes, T cell receptors in T lymphocytes) are incredibly variable, the products of specialized genetic diversification mechanisms that generate complex lymphocyte repertoires with extensive collections of antigen specificities. Recent advances in high throughput sequencing (HTS) technologies have transformed our ability to examine antigen receptor repertoires at single nucleotide, and more recently, single cell, resolution. Here we review current approaches to examining antigen receptor repertoires by HTS, and discuss inherent biological and technical challenges. We further describe emerging applications of this powerful methodology for exploring the adaptive immune system. PMID:25306219

  11. Measuring growth rate in high-throughput growth phenotyping.

    PubMed

    Blomberg, Anders

    2011-02-01

    Growth rate is an important variable and parameter in biology, with a central role in evolutionary, functional genomics, and systems biology studies. In this review, the pros and cons of the different technologies presently available for high-throughput measurement of growth rate are discussed. Growth rate can be measured in liquid microcultivation of individual strains, in competition between strains, as growing colonies on agar, as division of individual cells, and can be estimated from molecular reporters. Irrespective of methodology, statistical issues such as spatial biases and batch effects are crucial to investigate and correct for to ensure low false discovery rates. The rather low correlations between studies indicate that cross-laboratory comparison and standardization are pressing issues for assuring high-quality and comparable growth-rate data.
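    For liquid microcultivation, the growth rate is commonly estimated by a log-linear fit to optical density over the exponential phase. A minimal sketch on synthetic data; the OD values and the assumption that the whole window is exponential are illustrative:

```python
import numpy as np

def growth_rate(t, od):
    """Specific growth rate mu (1/h) from optical density measurements.

    Fits ln(OD) = ln(OD0) + mu*t by least squares over the supplied window;
    in practice the exponential phase should be selected first.
    """
    slope, _ = np.polyfit(t, np.log(od), 1)
    return slope

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # hours
od = 0.05 * np.exp(0.45 * t)               # synthetic culture, mu = 0.45/h
print(round(growth_rate(t, od), 2))        # recovers 0.45
```

    Batch effects and spatial biases of the kind discussed above show up as systematic shifts in these fitted slopes across plates or plate positions, which is why normalization against control wells is standard practice.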

  12. Quantitative High-throughput Luciferase Screening in Identifying CAR Modulators

    PubMed Central

    Lynch, Caitlin; Zhao, Jinghua; Wang, Hongbing; Xia, Menghang

    2017-01-01

    Summary The constitutive androstane receptor (CAR, NR1I3) is responsible for the transcription of multiple drug metabolizing enzymes and transporters. There are two possible methods of activation for CAR, direct ligand binding and a ligand-independent mechanism, which makes this a unique nuclear receptor. Both of these mechanisms require translocation of CAR from the cytoplasm into the nucleus. Interestingly, CAR is constitutively active in immortalized cell lines due to the basal nuclear location of this receptor. This creates an important challenge in most in vitro assay models because immortalized cells cannot be used without inhibiting the basal activity. In this book chapter, we describe in detail how to perform quantitative high-throughput screens to identify hCAR1 modulators through the employment of a double stable cell line. Using this line, we are able to identify activators, as well as deactivators, of this challenging nuclear receptor. PMID:27518621

  13. High-Throughput Automation in Chemical Process Development.

    PubMed

    Selekman, Joshua A; Qiu, Jun; Tran, Kristy; Stevens, Jason; Rosso, Victor; Simmons, Eric; Xiao, Yi; Janey, Jacob

    2017-06-07

    High-throughput (HT) techniques built upon laboratory automation technology and coupled to statistical experimental design and parallel experimentation have enabled the acceleration of chemical process development across multiple industries. HT technologies are often applied to interrogate wide, often multidimensional experimental spaces to inform the design and optimization of any number of unit operations that chemical engineers use in process development. In this review, we outline the evolution of HT technology and provide a comprehensive overview of how HT automation is used throughout different industries, with a particular focus on chemical and pharmaceutical process development. In addition, we highlight the common strategies of how HT automation is incorporated into routine development activities to maximize its impact in various academic and industrial settings.
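    The coupling of statistical experimental design to parallel experimentation mentioned above can be sketched as enumerating a full-factorial grid of process factors, one run per well or reactor. The factor names and levels below are hypothetical:

```python
from itertools import product

# Full-factorial design over three process factors (levels chosen
# for illustration of parallel HT experimentation, not from the review).
factors = {
    "temperature_C": [25, 40, 60],
    "catalyst_mol_pct": [1, 5],
    "solvent": ["THF", "MeCN", "toluene"],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))   # 3 * 2 * 3 = 18 experiments, one per well
```

    For wider, multidimensional spaces, fractional-factorial or optimal designs replace the full grid so that the run count stays within the capacity of the automation platform.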

  14. High-throughput process development for biopharmaceutical drug substances.

    PubMed

    Bhambure, Rahul; Kumar, Kaushal; Rathore, Anurag S

    2011-03-01

    Quality by Design (QbD) is gaining industry acceptance as an approach towards development and commercialization of biotechnology therapeutic products that are expressed via microbial or mammalian cell lines. In QbD, the process is designed and controlled to deliver specified quality attributes consistently. To acquire the enhanced understanding that is necessary to achieve the above, however, requires more extensive experimentation to establish the design space for the process and the product. With biotechnology companies operating under ever-increasing pressure towards lowering the cost of manufacturing, the use of high-throughput tools has emerged as a necessary enabler of QbD in a time- and resource-constrained environment. We review this topic for those in academia and industry that are engaged in drug substance process development. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Native mass spectrometry: towards high-throughput structural proteomics.

    PubMed

    Kondrat, Frances D L; Struwe, Weston B; Benesch, Justin L P

    2015-01-01

    Native mass spectrometry (MS) has become a sensitive method for structural proteomics, allowing practitioners to gain insight into protein self-assembly, including stoichiometry and three-dimensional architecture, as well as complementary thermodynamic and kinetic aspects. Although MS is typically performed in vacuum, a body of literature has described how native solution-state structure is largely retained on the timescale of the experiment. Native MS offers the benefit that it requires substantially smaller quantities of a sample than traditional structural techniques such as NMR and X-ray crystallography, and is therefore well suited to high-throughput studies. Here we first describe the native MS approach and outline the structural proteomic data that it can deliver. We then provide practical details of experiments to examine the structural and dynamic properties of protein assemblies, highlighting potential pitfalls as well as principles of best practice.
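    The stoichiometry measurements described above rest on simple charge-state arithmetic: adjacent peaks in a native spectrum differ by one proton, which lets one infer the charge and hence the neutral mass of the complex. A sketch, using a hypothetical ~801 kDa complex:

```python
PROTON = 1.00728  # proton mass, Da

def mz(neutral_mass, z):
    """m/z of a protonated ion of the given neutral mass and charge."""
    return (neutral_mass + z * PROTON) / z

def charge_from_adjacent_peaks(mz_low, mz_high):
    """Infer the charge of the higher-m/z peak from two adjacent charge states."""
    return round((mz_low - PROTON) / (mz_high - mz_low))

def mass_from_peak(mz_value, z):
    """Recover the neutral mass from one peak once its charge is known."""
    return z * (mz_value - PROTON)

# Hypothetical ~801 kDa assembly observed at the 70+ and 71+ charge states
p70 = mz(801000.0, 70)
p71 = mz(801000.0, 71)
z = charge_from_adjacent_peaks(p71, p70)
```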

  16. EDITORIAL: Combinatorial and High-Throughput Materials Research

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Takeuchi, Ichiro

    2005-01-01

    The success of combinatorial and high-throughput methodologies relies greatly on the availability of various characterization tools with new and improved capabilities [1]. Indeed, how useful can a combinatorial library of 250, 400, 25 000 or 2 000 000 compounds be [2-5] if one is unable to characterize its properties of interest fairly quickly? How useful can a set of thousands of spectra or chromatograms be if one is unable to analyse them in a timely manner? For these reasons, the development of new approaches for materials characterization is one of the most active areas in combinatorial materials science. The importance of this aspect of research in the field has been discussed in numerous conferences including the Pittsburgh Conferences, the American Chemical Society Meetings, the American Physical Society Meetings, the Materials Research Society Symposia and various Gordon Research Conferences. Naturally, the development of new measurement instrumentation attracts the attention not only of practitioners of combinatorial materials science but also of those who design new software for data manipulation and mining. Experimental designs of combinatorial libraries are pursued with available and realistic synthetic and characterization capabilities in mind. It is becoming increasingly critical to link the design of new equipment for high-throughput parallel materials synthesis with integrated measurement tools in order to enhance the efficacy of the overall experimental strategy. We have received an overwhelming response to our proposal and call for papers for this Special Issue on Combinatorial Materials Science. The papers in this issue of Measurement Science and Technology are a very timely collection that captures the state of modern combinatorial materials science. They demonstrate the significant advances that are taking place in the field. In some cases, characterization tools are now being operated in the factory mode. At the same time, major challenges

  17. Automated, high-throughput IgG-antibody glycoprofiling platform.

    PubMed

    Stöckmann, Henning; Adamczyk, Barbara; Hayes, Jerrard; Rudd, Pauline M

    2013-09-17

    One of today's key challenges is the ability to decode the functions of complex carbohydrates in various biological contexts. To generate high-quality glycomics data in a high-throughput fashion, we developed a robotized and low-cost N-glycan analysis platform for glycoprofiling of immunoglobulin G antibodies (IgG), which are central players of the immune system and of vital importance in the biopharmaceutical industry. The key features include (a) rapid IgG affinity purification and sample concentration, (b) protein denaturation and glycan release on a multiwell filtration device, (c) glycan purification on solid-supported hydrazide, and (d) glycan quantification by ultra performance liquid chromatography. The sample preparation workflow was automated using a robotic liquid-handling workstation, allowing the preparation of 96 samples (or multiples thereof) in 22 h with excellent reproducibility and, thus, should greatly facilitate biomarker discovery and glycosylation monitoring of therapeutic IgGs.
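    Quantification by ultra performance liquid chromatography typically reduces to integrated peak areas, and a common downstream step is to normalize each glycan peak to a percentage of total area. A minimal sketch with hypothetical IgG glycoform areas (the peak names are illustrative):

```python
def relative_abundance(peak_areas):
    """Express each integrated glycan peak as a percentage of total peak area."""
    total = sum(peak_areas.values())
    if total <= 0:
        raise ValueError("no integrated signal")
    return {peak: 100.0 * area / total for peak, area in peak_areas.items()}

# Hypothetical integrated areas for three common IgG glycoforms
profile = relative_abundance({"G0F": 30.0, "G1F": 50.0, "G2F": 20.0})
```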

  18. High-throughput ab-initio dilute solute diffusion database

    NASA Astrophysics Data System (ADS)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
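    Two calculations underlie a database like this: an Arrhenius diffusivity from an activation barrier Q, and the weighted RMS error used to compare calculated barriers with experiment. Both formulas are standard; the example numbers below are not from the paper:

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def diffusivity(D0, Q_eV, T_K):
    """Arrhenius dilute-solute diffusivity D = D0 * exp(-Q / (kB * T))."""
    return D0 * math.exp(-Q_eV / (K_B * T_K))

def weighted_barrier_rmse(calculated, experimental, weights):
    """Weighted RMS error between calculated and experimental barriers (eV)."""
    sq = sum(w * (c - e) ** 2 for c, e, w in zip(calculated, experimental, weights))
    return math.sqrt(sq / sum(weights))
```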

  19. Single-platelet nanomechanics measured by high-throughput cytometry

    NASA Astrophysics Data System (ADS)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2017-02-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  20. A high throughput mechanical screening device for cartilage tissue engineering.

    PubMed

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.
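    Per-well mechanical testing of this kind generally reduces to extracting a modulus from stress-strain data; the sketch below fits a least-squares slope for one sample. This is a generic reduction, not the HTMS device's actual analysis pipeline:

```python
def equilibrium_modulus(strains, stresses):
    """Least-squares slope of the stress-strain curve (an effective
    compressive modulus) for one well of the screening plate."""
    n = len(strains)
    mx = sum(strains) / n
    my = sum(stresses) / n
    num = sum((x - mx) * (y - my) for x, y in zip(strains, stresses))
    den = sum((x - mx) ** 2 for x in strains)
    return num / den

# Illustrative equilibrium data for one construct: stress (MPa) vs strain
strains = [0.00, 0.02, 0.04, 0.06]
stresses = [0.000, 0.010, 0.020, 0.030]   # consistent with a 0.5 MPa modulus
E = equilibrium_modulus(strains, stresses)
```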

  1. Interpretation of mass spectrometry data for high-throughput proteomics.

    PubMed

    Chamrad, Daniel C; Koerting, Gerhard; Gobom, Johan; Thiele, Herbert; Klose, Joachim; Meyer, Helmut E; Blueggel, Martin

    2003-08-01

    Recent developments in proteomics have revealed a bottleneck in bioinformatics: high-quality interpretation of acquired MS data. The ability to generate thousands of MS spectra per day, and the demand for this, makes manual methods inadequate for analysis and underlines the need to transfer the advanced capabilities of an expert human user into sophisticated MS interpretation algorithms. The identification rate in current high-throughput proteomics studies is not only a matter of instrumentation. We present software for high-throughput PMF identification, which enables robust and confident protein identification at higher rates. This has been achieved by automated calibration, peak rejection, and use of a meta search approach which employs various PMF search engines. The automatic calibration consists of a dynamic, spectral information-dependent algorithm, which combines various known calibration methods and iteratively establishes an optimised calibration. The peak rejection algorithm filters signals that are unrelated to the analysed protein by use of automatically generated and dataset-dependent exclusion lists. In the "meta search" several known PMF search engines are triggered and their results are merged by use of a meta score. The significance of the meta score was assessed by simulation of PMF identification with 10,000 artificial spectra resembling a data situation close to the measured dataset. By means of this simulation the meta score is linked to expectation values as a statistical measure. The presented software is part of the proteome database ProteinScape which links the information derived from MS data to other relevant proteomics data. We demonstrate the performance of the presented system with MS data from 1891 PMF spectra. As a result of automatic calibration and peak rejection the identification rate increased from 6% to 44%.
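    The meta-search idea can be sketched as score normalization plus an empirical significance estimate against simulated spectra. The engine names, normalization constants, and score scales below are illustrative, not those used in ProteinScape:

```python
def meta_score(engine_scores, score_norms):
    """Merge raw scores from several PMF search engines onto one scale by
    dividing each by that engine's typical top-hit score, then summing."""
    return sum(engine_scores[e] / score_norms[e] for e in engine_scores)

def empirical_p_value(observed, simulated_scores):
    """Fraction of simulated (artificial-spectrum) meta scores reaching the
    observed score; the +1 terms avoid reporting an exact zero."""
    hits = sum(s >= observed for s in simulated_scores)
    return (hits + 1) / (len(simulated_scores) + 1)

# Illustrative raw scores from three engines and their normalization constants
scores = {"mascot": 120.0, "profound": 2.1, "msfit": 900.0}
norms = {"mascot": 60.0, "profound": 1.0, "msfit": 450.0}
combined = meta_score(scores, norms)
p = empirical_p_value(combined, [1.0, 2.5, 3.0, 6.0])
```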

  2. High-Throughput Genomics Enhances Tomato Breeding Efficiency

    PubMed Central

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-01-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper, eggplant, since it exhibits a reduced genomic size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are actually available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assist the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers highly linked to resistance genes and cloned resistance genes are available and could be used for a high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are then exploited for tracking simultaneously favourable allele combinations in breeding programs using high-throughput genomic technologies. This aims at pyramiding in the genetic background of commercial cultivars alleles that increase their performances. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes even for complex traits. PMID:19721805

  3. A robust robotic high-throughput antibody purification platform.

    PubMed

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market with annual sales of more than 40 billion US$ in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns(®) from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO(®) from Tecan) has been a successful approach to establish high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at sub-milligram level with yields of up to 2 mg from 4 mL of cell-free culture supernatant.
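    The yield arithmetic quoted in the abstract (a 500 mg/L titre and 4 mL of supernatant give up to 2 mg of antibody) is easy to encode; the 95% recovery run below is a hypothetical example consistent with the reported >90% figure:

```python
def expected_yield_mg(titre_mg_per_L, supernatant_mL):
    """Expected antibody mass recoverable from a transient-expression supernatant."""
    return titre_mg_per_L * supernatant_mL / 1000.0

def step_recovery_pct(loaded_mg, recovered_mg):
    """Recovery across a purification step, as a percentage of loaded mass."""
    return 100.0 * recovered_mg / loaded_mg

yield_mg = expected_yield_mg(500, 4)                    # the abstract's numbers
overall = step_recovery_pct(yield_mg, 0.95 * yield_mg)  # a hypothetical 95% run
```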

  4. High-throughput identification of protein localization dependency networks.

    PubMed

    Christen, Beat; Fero, Michael J; Hillson, Nathan J; Bowman, Grant; Hong, Sun-Hae; Shapiro, Lucy; McAdams, Harley H

    2010-03-09

    Bacterial cells are highly organized with many protein complexes and DNA loci dynamically positioned to distinct subcellular sites over the course of a cell cycle. Such dynamic protein localization is essential for polar organelle development, establishment of asymmetry, and chromosome replication during the Caulobacter crescentus cell cycle. We used a fluorescence microscopy screen optimized for high-throughput to find strains with anomalous temporal or spatial protein localization patterns in transposon-generated mutant libraries. Automated image acquisition and analysis allowed us to identify genes that affect the localization of two polar cell cycle histidine kinases, PleC and DivJ, and the pole-specific pili protein CpaE, each tagged with a different fluorescent marker in a single strain. Four metrics characterizing the observed localization patterns of each of the three labeled proteins were extracted for hundreds of cell images from each of 854 mapped mutant strains. Using cluster analysis of the resulting set of 12-element vectors for each of these strains, we identified 52 strains with mutations that affected the localization pattern of the three tagged proteins. This information, combined with quantitative localization data from epistasis experiments, also identified all previously known proteins affecting such localization. These studies provide insights into factors affecting the PleC/DivJ localization network and into regulatory links between the localization of the pili assembly protein CpaE and the kinase localization pathway. Our high-throughput screening methodology can be adapted readily to any sequenced bacterial species, opening the potential for databases of localization regulatory networks across species, and investigation of localization network phylogenies.
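    The clustering step can be illustrated with a minimal k-means over per-strain metric vectors. For brevity the sketch uses 2-element vectors rather than the study's 12-element ones, and entirely synthetic data:

```python
import random

def _dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(vectors, k, iters=100, seed=0):
    """Plain k-means: returns a cluster label per vector and the final centers."""
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    labels = [0] * len(vectors)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: _dist2(v, centers[j]))
                  for v in vectors]
        for j in range(k):
            members = [v for v, lab in zip(vectors, labels) if lab == j]
            if members:  # leave an empty cluster's center where it was
                centers[j] = tuple(sum(dim) / len(members) for dim in zip(*members))
    return labels, centers

# Synthetic "localization metric" vectors: two well-separated phenotype groups
vectors = [(0, 0), (0, 1), (1, 0), (1, 1),
           (10, 10), (10, 11), (11, 10), (11, 11)]
assign, centers = kmeans(vectors, k=2)
```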

  5. Surrogate-assisted feature extraction for high-throughput phenotyping.

    PubMed

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision and natural language processing counts, acting as noisy surrogates to the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature extraction for phenotyping procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were remarkably higher than those from the other two, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the needed number of gold-standard labels. SAFE also potentially identifies important features missed by automated feature extraction for phenotyping or experts.
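    The SAFE idea — silver-standard labels derived from two noisy surrogates, then selection of the features most predictive of those labels — can be sketched as follows. The thresholds and the simple correlation ranking are illustrative simplifications of the published method:

```python
def silver_labels(icd_counts, nlp_counts, high=3):
    """Silver-standard labels from two noisy surrogates: a patient is a likely
    case when both counts are high, a likely control when both are zero, and
    is excluded (None) otherwise. The thresholds are illustrative."""
    labels = []
    for icd, nlp in zip(icd_counts, nlp_counts):
        if icd >= high and nlp >= high:
            labels.append(1)
        elif icd == 0 and nlp == 0:
            labels.append(0)
        else:
            labels.append(None)
    return labels

def _pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def select_features(feature_columns, labels, top_k):
    """Rank candidate features by |correlation| with the silver labels."""
    keep = [i for i, lab in enumerate(labels) if lab is not None]
    y = [labels[i] for i in keep]
    score = {name: abs(_pearson([col[i] for i in keep], y))
             for name, col in feature_columns.items()}
    return sorted(score, key=score.get, reverse=True)[:top_k]

# Hypothetical patients: surrogate counts and two candidate feature columns
labels = silver_labels([5, 4, 0, 0, 2], [4, 5, 0, 0, 0])
picked = select_features({"informative": [9, 8, 1, 0, 5],
                          "noise": [1, 9, 2, 8, 3]}, labels, top_k=1)
```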

  6. High-throughput analysis of peptide binding modules

    PubMed Central

    Liu, Bernard A.; Engelmann, Brett; Nash, Piers D.

    2014-01-01

    Modular protein interaction domains that recognize linear peptide motifs are found in hundreds of proteins within the human genome. Some protein interaction domains such as SH2, 14-3-3, Chromo and Bromo domains serve to recognize post-translational modification of amino acids (such as phosphorylation, acetylation, methylation etc.) and translate these into discrete cellular responses. Other modules such as SH3 and PDZ domains recognize linear peptide epitopes and serve to organize protein complexes based on localization and regions of elevated concentration. In both cases, the ability to nucleate specific signaling complexes is in large part dependent on the selectivity of a given protein module for its cognate peptide ligand. High throughput analysis of peptide-binding domains by peptide or protein arrays, phage display, mass spectrometry or other HTP techniques provides new insight into the potential protein-protein interactions prescribed by individual or even whole families of modules. Systems level analyses have also promoted a deeper understanding of the underlying principles that govern selective protein-protein interactions and how selectivity evolves. Lastly, there is a growing appreciation for the limitations and potential pitfalls of high-throughput analysis of protein-peptide interactomes. This review will examine some of the common approaches utilized for large-scale studies of protein interaction domains and suggest a set of standards for the analysis and validation of datasets from large-scale studies of peptide-binding modules. We will also highlight how data from large-scale studies of modular interaction domain families can provide insight into systems level properties such as the linguistics of selective interactions. PMID:22610655

  7. A high-throughput capillary isoelectric focusing immunoassay for fingerprinting protein sialylation.

    PubMed

    Markely, Lam Raga Anggara; Cheung, Lila; Choi, Young Jun; Ryll, Thomas; Estes, Scott; Prajapati, Shashi; Turyan, Iva; Frenkel, Ruth; Sosic, Zoran; Lambropoulos, James; Tescione, Lia; Ryll, Thomas; Berman, Melissa

    2016-01-01

    The serum half-life, biological activity, and solubility of many recombinant glycoproteins depend on their sialylation. Monitoring glycoprotein sialylation during cell culture manufacturing is, therefore, critical to ensure product efficacy and safety. Here a high-throughput method for semi-quantitative fingerprinting of glycoprotein sialylation using a capillary isoelectric focusing immunoassay on the NanoPro (ProteinSimple) platform was developed. The method was specific, sensitive, precise, and robust. It could analyze 2 μL of crude cell culture samples without protein purification, and could automatically analyze from 8 samples in 4 h to 96 samples in 14 h without analyst supervision. Furthermore, its capability to detect various changes in sialylation fingerprints during the cell culture manufacturing process was indispensable to ensure process robustness and consistency. Moreover, the changes in the sialylation fingerprints analyzed by this method showed strong correlations with intact mass analysis using liquid chromatography and mass spectrometry.

  8. High-throughput sequencing of small RNAs and anatomical characteristics associated with leaf development in celery.

    PubMed

    Jia, Xiao-Ling; Li, Meng-Yao; Jiang, Qian; Xu, Zhi-Sheng; Wang, Feng; Xiong, Ai-Sheng

    2015-06-09

    MicroRNAs (miRNAs) exhibit diverse and important roles in plant growth, development, and stress responses and regulate gene expression at the post-transcriptional level. The diversity of miRNAs and their roles in leaf development in celery remain unknown. To elucidate the roles of miRNAs in celery leaf development, we identified leaf development-related miRNAs through high-throughput sequencing. Small RNA libraries were constructed using leaves from three stages (10, 20, and 30 cm) of celery cv. 'Ventura' and then subjected to high-throughput sequencing and bioinformatics analysis. At Stage 1, Stage 2, and Stage 3 of 'Ventura', a total of 333, 329, and 344 conserved miRNAs (belonging to 35, 35, and 32 families, respectively) were identified. A total of 131 miRNAs were identified as novel in 'Ventura'. Potential miRNA target genes were predicted and annotated using the eggNOG, GO, and KEGG databases to explore gene functions. The abundances of five conserved miRNAs and their corresponding potential target genes were validated. Expression profiles of novel potential miRNAs were also detected. Anatomical characteristics of the leaf blades and petioles at three leaf stages were further analyzed. This study contributes to our understanding of the functions and molecular regulatory mechanisms of miRNAs in celery leaf development.

  9. Profiling the main cell wall polysaccharides of grapevine leaves using high-throughput and fractionation methods.

    PubMed

    Moore, John P; Nguema-Ona, Eric; Fangel, Jonatan U; Willats, William G T; Hugo, Annatjie; Vivier, Melané A

    2014-01-01

    Vitis species include Vitis vinifera, the domesticated grapevine, used for wine and grape agricultural production and considered the world's most important fruit crop. A cell wall preparation, isolated from fully expanded photosynthetically active leaves, was fractionated via chemical and enzymatic reagents; and the various extracts obtained were assayed using high-throughput cell wall profiling tools according to a previously optimized and validated workflow. The bulk of the homogalacturonan-rich pectin present was efficiently extracted using CDTA treatment, whereas over half of the grapevine leaf cell wall consisted of vascular veins, composed of xylans and cellulose. The main hemicellulose component was found to be xyloglucan, and an enzymatic oligosaccharide fingerprinting approach was used to analyze the grapevine leaf xyloglucan fraction. When Paenibacillus sp. xyloglucanase was applied, the main subunits released were XXFG and XLFG; whereas the less-specific Trichoderma reesei EGII was also able to release the XXXG motif as well as other oligomers likely of mannan and xylan origin. This latter enzyme would thus be useful to screen for xyloglucan, xylan, and mannan-linked cell wall alterations in laboratory and field grapevine populations. This methodology is well-suited for high-throughput cell wall profiling of grapevine mutant and transgenic plants for investigating a range of biological processes, specifically plant disease studies and plant-pathogen interactions, where the cell wall plays a crucial role. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. A multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans

    PubMed Central

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-01-01

    The booming nanotech industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At the organism level, our automated system analyzes hundreds of individual animals for body length, locomotion speed, and lifespan. To demonstrate the utility of our system, we applied this technology to test the toxicity of 20 nanomaterials under four concentrations. Only fullerene nanoparticles (nC60), fullerol, TiO2, and CeO2 showed little or no toxicity. Various degrees of toxicity were detected from different forms of carbon nanotubes, graphene, carbon black, Ag, and fumed SiO2 nanoparticles. Aminofullerene and UV irradiated nC60 also showed small but significant toxicity. We further investigated the effects of nanomaterial size, shape, surface chemistry, and exposure conditions on toxicity. Our data are publicly available at the open-access nanotoxicity database www.QuantWorm.org/nano. PMID:25611253
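    A simple way to summarize such dose-response data is to flag concentrations at which an endpoint falls below some fraction of the untreated control. The 85% cutoff and the numbers below are illustrative, not the study's statistical analysis:

```python
def toxicity_flags(control_mean, treated_means, min_ratio=0.85):
    """Flag each concentration whose endpoint (e.g. mean body length) drops
    below min_ratio of the untreated control. The cutoff is an illustrative
    choice, not a published criterion."""
    return {conc: mean / control_mean < min_ratio
            for conc, mean in treated_means.items()}

# Hypothetical mean body lengths (mm) for one ENM at three concentrations
flags = toxicity_flags(1.00, {"1 mg/L": 0.98, "10 mg/L": 0.90, "100 mg/L": 0.70})
```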

  11. Application of high-throughput sequencing for studying genomic variations in congenital heart disease.

    PubMed

    Dorn, Cornelia; Grunert, Marcel; Sperling, Silke R

    2014-01-01

    Congenital heart diseases (CHD) represent the most common birth defect in humans. The majority of cases are caused by a combination of complex genetic alterations and environmental influences. In the past, many disease-causing mutations have been identified; however, there is still a large proportion of cardiac malformations with unknown precise origin. High-throughput sequencing technologies established in recent years offer novel opportunities to further study the genetic background underlying the disease. In this review, we provide a roadmap for designing and analyzing high-throughput sequencing studies focused on CHD, but also with general applicability to other complex diseases. The three main next-generation sequencing (NGS) platforms, including their particular advantages and disadvantages, are presented. To identify potentially disease-related genomic variations and genes, different filtering steps and gene prioritization strategies are discussed. In addition, available control datasets based on NGS are summarized. Finally, we provide an overview of current studies already using NGS technologies and showing that these techniques will help to further unravel the complex genetics underlying CHD.
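    A typical filtering step in such a roadmap keeps rare, predicted-damaging variants that are absent from control datasets. The thresholds, consequence terms, and gene names below are illustrative, not the review's recommendations:

```python
def filter_variants(variants, max_af=0.001,
                    damaging=("stop_gained", "frameshift", "missense")):
    """Keep variants that are rare in the population (allele frequency at most
    max_af), have a potentially damaging predicted consequence, and were not
    seen in the control dataset. All cutoffs are illustrative."""
    return [v for v in variants
            if v["af"] <= max_af
            and v["consequence"] in damaging
            and not v["seen_in_controls"]]

# Hypothetical candidate variants from a CHD cohort
candidates = filter_variants([
    {"gene": "NKX2-5", "af": 0.0001, "consequence": "missense", "seen_in_controls": False},
    {"gene": "GATA4",  "af": 0.0500, "consequence": "missense", "seen_in_controls": False},
    {"gene": "TBX5",   "af": 0.0002, "consequence": "synonymous", "seen_in_controls": False},
])
```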

  12. Investigation of the fungal community structures of imported wheat using high-throughput sequencing technology

    PubMed Central

    Wang, Ying; Zhang, Guiming; Gao, Ruifang; Xiang, Caiyu; Feng, Jianjun; Lou, Dingfeng; Liu, Ying

    2017-01-01

    This study introduced the application of high-throughput sequencing techniques to the investigation of microbial diversity in the field of plant quarantine. It examined the microbial diversity of wheat imported into China, and established a bioinformatics database of wheat pathogens based on high-throughput sequencing results. This study analyzed the nuclear ribosomal internal transcribed spacer (ITS) region of fungi through Illumina MiSeq sequencing to investigate the fungal communities of both seeds and sieve-through. A total of 758,129 fungal ITS sequences were obtained from ten samples collected from five batches of wheat imported from the USA. These sequences were classified into 2 different phyla, 15 classes, 33 orders, 41 families, and 78 genera, suggesting a high fungal diversity across samples. A pairwise analysis revealed that the diversity of the fungal community in the sieve-through is significantly higher than that in the seeds. Taxonomic analysis showed that at the class level, Dothideomycetes dominated in the seeds and Sordariomycetes dominated in the sieve-through. In all, this study revealed the fungal community composition in the seeds and sieve-through of the wheat, and identified key differences in the fungal community between the seeds and sieve-through. PMID:28241020
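    Community-diversity comparisons like the seed versus sieve-through contrast are commonly based on the Shannon index computed from per-taxon read counts; a minimal sketch (the read counts are invented):

```python
import math

def shannon(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over taxon read counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical per-genus read counts: seeds dominated by one genus,
# sieve-through spread more evenly across genera
h_seeds = shannon([900, 50, 30, 20])
h_sieve = shannon([300, 250, 200, 150, 100])
```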

  13. High throughput imaging and analysis for biological interpretation of agricultural plants and environmental interaction

    NASA Astrophysics Data System (ADS)

    Hong, Hyundae; Benac, Jasenka; Riggsbee, Daniel; Koutsky, Keith

    2014-03-01

    High throughput (HT) phenotyping of crops is essential to increase yield in environments deteriorated by climate change. The controlled environment of a greenhouse offers an ideal platform to study genotype-to-phenotype linkages for crop screening. Advanced imaging technologies are used to study plants' responses to resource limitations such as water and nutrient deficiency. Advanced imaging technologies coupled with automation make HT phenotyping in the greenhouse not only feasible, but practical. Monsanto has a state-of-the-art automated greenhouse (AGH) facility, in which handling of the soil, pots, water, and nutrients is completely automated. Images of the plants are acquired by multiple hyperspectral and broadband cameras. The hyperspectral cameras cover wavelengths from visible light through short-wave infrared (SWIR). In-house developed software analyzes the images to measure plant morphological and biochemical properties. We measure phenotypic metrics like plant area, height, and width as well as biomass. Hyperspectral imaging allows us to measure biochemical metrics such as chlorophyll, anthocyanin, and foliar water content. The last four years of AGH operations on crops like corn, soybean, and cotton have demonstrated successful application of imaging and analysis technologies for high throughput plant phenotyping. Using HT phenotyping, scientists have shown strong correlations to environmental conditions, such as water and nutrient deficits, as well as the ability to tease apart distinct differences in the genetic backgrounds of crops.

  14. Multi-endpoint, high-throughput study of nanomaterial toxicity in Caenorhabditis elegans.

    PubMed

    Jung, Sang-Kyu; Qu, Xiaolei; Aleman-Meza, Boanerges; Wang, Tianxiao; Riepe, Celeste; Liu, Zheng; Li, Qilin; Zhong, Weiwei

    2015-02-17

    The booming nanotechnology industry has raised public concerns about the environmental health and safety impact of engineered nanomaterials (ENMs). High-throughput assays are needed to obtain toxicity data for the rapidly increasing number of ENMs. Here we present a suite of high-throughput methods to study nanotoxicity in intact animals using Caenorhabditis elegans as a model. At the population level, our system measures food consumption of thousands of animals to evaluate population fitness. At the organism level, our automated system analyzes hundreds of individual animals for body length, locomotion speed, and lifespan. To demonstrate the utility of our system, we applied this technology to test the toxicity of 20 nanomaterials at four concentrations. Only fullerene nanoparticles (nC60), fullerol, TiO2, and CeO2 showed little or no toxicity. Various degrees of toxicity were detected from different forms of carbon nanotubes, graphene, carbon black, Ag, and fumed SiO2 nanoparticles. Aminofullerene and ultraviolet-irradiated nC60 also showed small but significant toxicity. We further investigated the effects of nanomaterial size, shape, surface chemistry, and exposure conditions on toxicity. Our data are publicly available at the open-access nanotoxicity database www.QuantWorm.org/nano.

  15. Recent advances in quantitative high throughput and high content data analysis.

    PubMed

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as in the analysis of cell-by-cell data sets derived from imaging or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges, and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. Ease of use will be important for these tools, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
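
    One concrete example of the robust statistics called for above is outlier-resistant scaling for hit calling. This sketch is not from the article; it assumes per-well activity values and uses the median/MAD z-score with the conventional 1.4826 consistency factor:

```python
import statistics

def robust_z_scores(values):
    """Robust z-score per well: (x - median) / (1.4826 * MAD).
    Far less sensitive to a handful of strong actives than mean/SD scaling."""
    med = statistics.median(values)
    mad = statistics.median(abs(x - med) for x in values)
    scale = 1.4826 * mad
    return [(x - med) / scale for x in values]

# Hypothetical plate readout: six ordinary wells and one strong hit
z = robust_z_scores([98, 101, 99, 100, 102, 100, 35])
```

    The hit in the last well receives a very large negative score while the ordinary wells stay near zero, which is the behavior a plate-level QC or hit-picking step relies on.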

  16. High-throughput synthesis and electrochemical screening of a library of modified electrodes for NADH oxidation.

    PubMed

    Pinczewska, Aleksandra; Sosna, Maciej; Bloodworth, Sally; Kilburn, Jeremy D; Bartlett, Philip N

    2012-10-31

    We report the combinatorial preparation and high-throughput screening of a library of modified electrodes designed to catalyze the oxidation of NADH. Sixty glassy carbon electrodes were covalently modified with ruthenium(II) or zinc(II) complexes bearing the redox active 1,10-phenanthroline-5,6-dione (phendione) ligand by electrochemical functionalization using one of four different linkers, followed by attachment of one of five different phendione metal complexes using combinatorial solid-phase synthesis methodology. This gave a library with three replicates of each of 20 different electrode modifications. This library was electrochemically screened in high-throughput (HTP) mode using cyclic voltammetry. The members of the library were evaluated with regard to the surface coverage, midpeak potential, and voltammetric peak separation for the phendione ligand, and their catalytic activity toward NADH oxidation. The surface coverage was found to depend on the length and flexibility of the linker and the geometry of the metal complex. The choices of linker and metal complex were also found to have significant impact on the kinetics of the reaction between the 1,10-phenanthroline-5,6-dione ligand and NADH. The rate constants for the reaction were obtained by analyzing the catalytic currents as a function of NADH concentration and scan rate, and the influence of the surface molecular architecture on the kinetics was evaluated.

  17. A droplet-based, optofluidic device for high-throughput, quantitative bioanalysis.

    PubMed

    Guo, Feng; Lapsley, Michael Ian; Nawaz, Ahmad Ahsan; Zhao, Yanhui; Lin, Sz-Chin Steven; Chen, Yuchao; Yang, Shikuan; Zhao, Xing-Zhong; Huang, Tony Jun

    2012-12-18

    Analysis of chemical or biomolecular contents in a tiny amount of specimen presents a significant challenge in many biochemical studies and diagnostic applications. In this work, we present a single-layer, optofluidic device for real-time, high-throughput, quantitative analysis of droplet contents. Our device integrates an optical fiber-based, on-chip detection unit with a droplet-based microfluidic unit. It can quantitatively analyze the contents of individual droplets in real-time. It also achieves a detection throughput of 2000 droplets per second, a detection limit of 20 nM, and an excellent reproducibility in its detection results. In a proof-of-concept study, we demonstrate that our device can be used to perform detection of DNA and its mutations by monitoring the fluorescent signal changes of the target DNA/molecular beacon complex in single droplets. Our approach can be immediately extended to a real-time, high-throughput detection of other biomolecules (such as proteins and viruses) in droplets. With its advantages in throughput, functionality, cost, size, and reliability, the droplet-based optofluidic device presented here can be a valuable tool for many medical diagnostic applications.

  18. Representing high throughput expression profiles via perturbation barcodes reveals compound targets

    PubMed Central

    Kutchukian, Peter S.; Li, Jing; Tudor, Matthew

    2017-01-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound’s high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data. PMID:28182661
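
    The barcode in the paper is produced by a trained deep network, which cannot be reproduced in a few lines. Purely to convey the idea of discretizing landmark-gene changes and comparing perturbations by barcode distance, here is a loose, hypothetical stand-in (invented z-score profiles, sign thresholding instead of learned embeddings):

```python
def barcode(z_scores, threshold=1.0):
    """Crude surrogate for a learned perturbation barcode: ternarize each
    landmark gene's z-scored expression change (up / unchanged / down)."""
    return tuple(1 if z > threshold else -1 if z < -threshold else 0 for z in z_scores)

def distance(b1, b2):
    """Hamming distance between two barcodes, for guilt-by-association lookups."""
    return sum(x != y for x, y in zip(b1, b2))

# Invented z-score profiles for three compound treatments
cmpd_a = barcode([2.3, -1.8, 0.2, 0.9, -2.5])
cmpd_b = barcode([1.9, -2.1, 0.4, 1.3, -1.7])   # similar mechanism to A
cmpd_c = barcode([-2.2, 0.1, 2.8, -1.4, 0.3])   # unrelated profile
print(distance(cmpd_a, cmpd_b) < distance(cmpd_a, cmpd_c))  # True
```

    Compounds with similar mechanisms end up with nearby barcodes, which is the property the paper exploits when assigning functions to unknown compounds.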

  19. Representing high throughput expression profiles via perturbation barcodes reveals compound targets.

    PubMed

    Filzen, Tracey M; Kutchukian, Peter S; Hermes, Jeffrey D; Li, Jing; Tudor, Matthew

    2017-02-01

    High throughput mRNA expression profiling can be used to characterize the response of cell culture models to perturbations such as pharmacologic modulators and genetic perturbations. As profiling campaigns expand in scope, it is important to homogenize, summarize, and analyze the resulting data in a manner that captures significant biological signals in spite of various noise sources such as batch effects and stochastic variation. We used the L1000 platform for large-scale profiling of 978 representative genes across thousands of compound treatments. Here, a method is described that uses deep learning techniques to convert the expression changes of the landmark genes into a perturbation barcode that reveals important features of the underlying data, performing better than the raw data in revealing important biological insights. The barcode captures compound structure and target information, and predicts a compound's high throughput screening promiscuity, to a higher degree than the original data measurements, indicating that the approach uncovers underlying factors of the expression data that are otherwise entangled or masked by noise. Furthermore, we demonstrate that visualizations derived from the perturbation barcode can be used to more sensitively assign functions to unknown compounds through a guilt-by-association approach, which we use to predict and experimentally validate the activity of compounds on the MAPK pathway. The demonstrated application of deep metric learning to large-scale chemical genetics projects highlights the utility of this and related approaches to the extraction of insights and testable hypotheses from big, sometimes noisy data.

  20. Quantitative dot blot analysis (QDB), a versatile high throughput immunoblot method.

    PubMed

    Tian, Geng; Tang, Fangrong; Yang, Chunhua; Zhang, Wenfeng; Bergquist, Jonas; Wang, Bin; Mi, Jia; Zhang, Jiandi

    2017-08-29

    The lack of access to an affordable method of high throughput immunoblot analysis for daily use remains a major challenge for scientists worldwide. We propose here Quantitative Dot Blot analysis (QDB) to meet this demand. With a defined linear range, QDB analysis fundamentally transforms the traditional immunoblot method into a truly quantitative assay. Its convenience in analyzing large numbers of samples also enables bench scientists to examine protein expression levels across multiple parameters. In addition, the small amount of sample lysate needed for analysis means significant savings in research resources and effort. The method was evaluated at both the cellular and tissue levels, yielding unexpected observations that would otherwise be hard to achieve using conventional immunoblot methods like Western blot analysis. Using the QDB technique, we were able to observe an age-dependent, significant alteration of CAPG protein expression level in TRAMP mice. We believe that the adoption of QDB analysis will have an immediate impact on biological and biomedical research by providing much-needed high-throughput information at the protein level in this "Big Data" era.
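
    Working "with the defined linear range" amounts to fitting a standard curve and interpolating only those signals that fall inside it. The sketch below is an assumed generic workflow, not code from the paper, and the numbers are invented:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a standard curve of
    known protein amounts (xs) against measured dot-blot signals (ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def quantify(signal, slope, intercept, linear_range):
    """Interpolate a sample signal back to protein amount, rejecting signals
    that fall outside the assay's defined linear range."""
    lo, hi = linear_range
    if not lo <= signal <= hi:
        raise ValueError("signal outside linear range; adjust loading or dilution")
    return (signal - intercept) / slope

# Invented standards: amounts in ng, arbitrary-unit signals
slope, intercept = fit_line([0, 1, 2, 3], [0.1, 2.1, 4.1, 6.1])
amount = quantify(4.1, slope, intercept, linear_range=(0.1, 6.1))
```

    Rejecting out-of-range signals rather than extrapolating is what makes the readout defensibly quantitative.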

  1. High-throughput mouse phenotyping using non-rigid registration and robust principal component analysis

    NASA Astrophysics Data System (ADS)

    Xie, Zhongliu; Kitamoto, Asanobu; Tamura, Masaru; Shiroishi, Toshihiko; Gillies, Duncan

    2016-03-01

    Intensive international efforts are underway towards phenotyping the mouse genome, by knocking out each of its ≈25,000 genes one-by-one for comparative study. With vast amounts of data to analyze, the traditional method of time-consuming histological examination is clearly impractical, leading to an overwhelming demand for a high-throughput phenotyping framework, especially one employing biomedical image informatics to efficiently identify phenotypes concerning morphological abnormality. Existing work has either relied excessively on volumetric analytics, which is insensitive to phenotypes not associated with severe volume variations, or been tailored for specific defects, and thus fails to serve a general phenotyping purpose. Furthermore, the prevailing requirement of an atlas for image segmentation, in contrast to its limited availability, further complicates the issue in practice. In this paper we propose a high-throughput general-purpose phenotyping framework that is able to efficiently perform batch-wise anomaly detection without prior knowledge of the phenotype and without the need for atlas-based segmentation. Anomaly detection is centered on the combined use of group-wise non-rigid image registration and robust principal component analysis (RPCA) for feature extraction and decomposition.

  2. Construction and high-throughput phenotypic screening of Zymoseptoria tritici over-expression strains

    PubMed Central

    Cairns, T.C.; Sidhu, Y.S.; Chaudhari, Y.K.; Talbot, N.J.; Studholme, D.J.; Haynes, K.

    2015-01-01

    Targeted gene deletion has been instrumental in elucidating many aspects of Zymoseptoria tritici pathogenicity. Gene over-expression is a complementary approach that is amenable to rapid strain construction and high-throughput screening, which has not been exploited to analyze Z. tritici, largely due to a lack of available techniques. Here we exploit the Gateway® cloning technology for rapid construction of over-expression vectors and improved homologous integration efficiency of a Z. tritici Δku70 strain to build a pilot over-expression library encompassing 32 genes encoding putative DNA binding proteins, GTPases or kinases. We developed a protocol using a Rotor-HDA robot for rapid and reproducible cell pinning for high-throughput in vitro screening. This screen identified an over-expression strain that demonstrated a marked reduction in hyphal production relative to the isogenic progenitor. This study provides a protocol for rapid generation of Z. tritici over-expression libraries and a technique for functional genomic screening in this important pathogen. PMID:26092797

  3. Construction and high-throughput phenotypic screening of Zymoseptoria tritici over-expression strains.

    PubMed

    Cairns, T C; Sidhu, Y S; Chaudhari, Y K; Talbot, N J; Studholme, D J; Haynes, K

    2015-06-01

    Targeted gene deletion has been instrumental in elucidating many aspects of Zymoseptoria tritici pathogenicity. Gene over-expression is a complementary approach that is amenable to rapid strain construction and high-throughput screening, which has not been exploited to analyze Z. tritici, largely due to a lack of available techniques. Here we exploit the Gateway® cloning technology for rapid construction of over-expression vectors and improved homologous integration efficiency of a Z. tritici Δku70 strain to build a pilot over-expression library encompassing 32 genes encoding putative DNA binding proteins, GTPases or kinases. We developed a protocol using a Rotor-HDA robot for rapid and reproducible cell pinning for high-throughput in vitro screening. This screen identified an over-expression strain that demonstrated a marked reduction in hyphal production relative to the isogenic progenitor. This study provides a protocol for rapid generation of Z. tritici over-expression libraries and a technique for functional genomic screening in this important pathogen.

  4. High-throughput diagnosis of potato cyst nematodes in soil samples.

    PubMed

    Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon

    2015-01-01

    Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.

  5. High-throughput sequencing of the paired human immunoglobulin heavy and light chain repertoire.

    PubMed

    DeKosky, Brandon J; Ippolito, Gregory C; Deschner, Ryan P; Lavinder, Jason J; Wine, Yariv; Rawlings, Brandon M; Varadarajan, Navin; Giesecke, Claudia; Dörner, Thomas; Andrews, Sarah F; Wilson, Patrick C; Hunicke-Smith, Scott P; Willson, C Grant; Ellington, Andrew D; Georgiou, George

    2013-02-01

    Each B-cell receptor consists of a pair of heavy and light chains. High-throughput sequencing can identify large numbers of heavy- and light-chain variable regions (V(H) and V(L)) in a given B-cell repertoire, but information about endogenous pairing of heavy and light chains is lost after bulk lysis of B-cell populations. Here we describe a way to retain this pairing information. In our approach, single B cells (>5 × 10(4) capacity per experiment) are deposited in a high-density microwell plate (125 pl/well) and lysed in situ. mRNA is then captured on magnetic beads, reverse transcribed and amplified by emulsion V(H):V(L) linkage PCR. The linked transcripts are analyzed by Illumina high-throughput sequencing. We validated the fidelity of V(H):V(L) pairs identified by this approach and used the method to sequence the repertoire of three human cell subsets-peripheral blood IgG(+) B cells, peripheral plasmablasts isolated after tetanus toxoid immunization and memory B cells isolated after seasonal influenza vaccination.

  6. Upscaling of hiPS Cell-Derived Neurons for High-Throughput Screening.

    PubMed

    Traub, Stefanie; Stahl, Heiko; Rosenbrock, Holger; Simon, Eric; Heilker, Ralf

    2017-03-01

    The advent of human-induced pluripotent stem (hiPS) cell-derived neurons promised to provide better model cells for drug discovery in the context of the central nervous system. This work demonstrates both the upscaling of cellular expansion and the acceleration of neuronal differentiation to accommodate the immense material needs of a high-throughput screening (HTS) approach. Using GRowth factor-driven expansion and INhibition of NotCH (GRINCH) during maturation, the derived cells are here referred to as GRINCH neurons. GRINCH cells displayed neuronal markers, and their functional activity could be demonstrated by electrophysiological recordings. In an application of GRINCH neurons, the brain-derived neurotrophic factor (BDNF)-mediated activation of tropomyosin receptor kinase (TrkB) was investigated as a promising drug target to treat synaptic dysfunctions. To assess the phosphorylation of endogenous TrkB in the GRINCH cells, the highly sensitive amplified luminescent proximity homogeneous assay LISA (AlphaLISA) format was established as a primary screen. A high-throughput reverse transcription (RT)-PCR format was employed as a secondary assay to analyze TrkB-mediated downstream target gene expression. In summary, an optimized differentiation protocol, highly efficient cell upscaling, and advanced assay miniaturization, combined with increased detection sensitivity, pave the way for a new generation of predictive cell-based drug discovery.

  7. High throughput ab-intio modeling of proton transport in solid electrolytes

    NASA Astrophysics Data System (ADS)

    Balachandran, Janakiraman; Lin, Lianshan; Ganesh, Panchapakesan

    Solid oxide materials that can selectively transport protons have great potential for fuel cell applications. However, several fundamental questions remain unanswered, such as: (a) how do dopants organize at various dopant concentrations; (b) how does the spatial organization of dopants influence the proton migration energy; and (c) how do disorder and strain in a material influence its ionic transport. In this work, we have developed an integrated high throughput framework to calculate proton transport properties by integrating open source packages (such as pymatgen and fireworks). The high throughput framework scales well on supercomputing clusters. We have used this framework to analyze over 100 perovskite compounds with over 12 different dopant atoms. These computational models enable us to obtain insights into how the proton transport properties depend on host and dopant atoms. Further, we also perform ab-initio modeling to understand how dopants spatially organize at different dopant concentrations, and how this spatial organization affects proton conductivity. This analysis enabled us to obtain fundamental insights into why proton conductivity decreases in Y-doped BaZrO3 at high dopant concentrations.
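
    Migration energies from such calculations feed directly into transition-state-theory hop rates, which is one way screening results connect to conductivity. This is a generic sketch, not the authors' workflow; the 10^13 Hz attempt frequency is a conventional assumed value:

```python
import math

KB_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def hop_rate(barrier_ev, temperature_k, attempt_freq_hz=1e13):
    """Proton hop rate from transition-state theory: nu * exp(-Ea / (kB * T))."""
    return attempt_freq_hz * math.exp(-barrier_ev / (KB_EV * temperature_k))

# Lower migration barriers and higher temperatures both speed up hopping
fast = hop_rate(0.4, 900.0)
slow = hop_rate(0.7, 600.0)
```

    Comparing rates across dopant configurations in this way gives a quick, qualitative ranking before any full conductivity model is built.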

  8. Automated cleaning and pre-processing of immunoglobulin gene sequences from high-throughput sequencing

    PubMed Central

    Michaeli, Miri; Noga, Hila; Tabibian-Keissar, Hilla; Barshack, Iris; Mehr, Ramit

    2012-01-01

    High-throughput sequencing (HTS) yields tens of thousands to millions of sequences that require a large amount of pre-processing work to clean various artifacts. Such cleaning cannot be performed manually. Existing programs are not suitable for immunoglobulin (Ig) genes, which are variable and often highly mutated. This paper describes Ig High-Throughput Sequencing Cleaner (Ig-HTS-Cleaner), a program containing a simple cleaning procedure that successfully deals with pre-processing of Ig sequences derived from HTS, and Ig Insertion-Deletion Identifier (Ig-Indel-Identifier), a program for identifying legitimate and artifact insertions and/or deletions (indels). Our programs were designed for analyzing Ig gene sequences obtained by 454 sequencing, but they are applicable to all types of sequences and sequencing platforms. Ig-HTS-Cleaner and Ig-Indel-Identifier have been implemented in Java and saved as executable JAR files, supported on Linux and MS Windows. No special requirements are needed in order to run the programs, except for correctly constructing the input files as explained in the text. The programs' performance has been tested and validated on real and simulated data sets. PMID:23293637

  9. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    PubMed

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features among several thousand that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
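
    A core step in any such differential expression pipeline is correcting for multiple testing across thousands of features. The Benjamini-Hochberg step-up procedure is the standard FDR control used by most open-source tools of this kind; a minimal sketch with invented p-values:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """BH step-up: return the indices of features declared significant while
    controlling the false discovery rate at level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank  # largest rank whose p-value clears its threshold
    return sorted(order[:k])

# Invented per-feature p-values from a differential expression test
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals))  # [0, 1]
```

    Unlike a Bonferroni cut-off, the threshold grows with rank, so BH keeps power at genomic scale while bounding the expected fraction of false positives.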

  10. Improving Hierarchical Models Using Historical Data with Applications in High-Throughput Genomics Data Analysis.

    PubMed

    Li, Ben; Li, Yunxiao; Qin, Zhaohui S

    2017-06-01

    Modern high-throughput biotechnologies such as microarray and next generation sequencing produce a massive amount of information for each sample assayed. However, in a typical high-throughput experiment, only a limited amount of data is observed for each individual feature, hence the classical 'large p, small n' problem. Bayesian hierarchical models, capable of borrowing strength across features within the same dataset, have been recognized as an effective tool for analyzing such data. However, the shrinkage effect, the most prominent feature of hierarchical models, can lead to undesirable over-correction for some features. In this work, we discuss possible causes of the over-correction problem and propose several alternative solutions. Our strategy is rooted in the fact that in the Big Data era, large amounts of historical data are available and should be taken advantage of. Our strategy presents a new framework to enhance the Bayesian hierarchical model. Through simulation and real data analysis, we demonstrate the superior performance of the proposed strategy. Our new strategy also enables borrowing information across different platforms, which could be extremely useful with the emergence of new technologies and the accumulation of data from different platforms in the Big Data era. Our method has been implemented in the R package "adaptiveHM", which is freely available from https://github.com/benliemory/adaptiveHM.

  11. High-throughput assay and engineering of self-cleaving ribozymes by sequencing

    PubMed Central

    Kobori, Shungo; Nomura, Yoko; Miu, Anh; Yokobayashi, Yohei

    2015-01-01

    Self-cleaving ribozymes are found in all domains of life and are believed to play important roles in biology. Additionally, self-cleaving ribozymes have been the subject of extensive engineering efforts for applications in synthetic biology. These studies often involve laborious assays of multiple individual variants that are either designed rationally or discovered through selection or screening. However, these assays provide only a limited view of the large sequence space relevant to the ribozyme function. Here, we report a strategy that allows quantitative characterization of greater than 1000 ribozyme variants in a single experiment. We generated a library of predefined ribozyme variants that were converted to DNA and analyzed by high-throughput sequencing. By counting the number of cleaved and uncleaved reads of every variant in the library, we obtained a complete activity profile of the ribozyme pool which was used to both analyze and engineer allosteric ribozymes. PMID:25829176
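
    The per-variant activity readout described above reduces to a ratio of read counts. A minimal sketch with made-up counts (the study's actual pipeline also involves library design, demultiplexing, and quality filtering):

```python
def cleavage_fraction(read_counts):
    """Per-variant cleavage fraction from (cleaved, uncleaved) sequencing
    read counts; returns None for variants with no reads."""
    out = {}
    for variant, (cleaved, uncleaved) in read_counts.items():
        total = cleaved + uncleaved
        out[variant] = cleaved / total if total else None
    return out

# Hypothetical read counts for three library variants
counts = {"wt": (900, 100), "mutA": (450, 450), "mutB": (30, 970)}
print(cleavage_fraction(counts))  # {'wt': 0.9, 'mutA': 0.5, 'mutB': 0.03}
```

    Computing this fraction for every predefined variant in one sequencing run is what turns the assay into a complete activity profile of the pool.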

  12. On-chip polarimetry for high-throughput screening of nanoliter and smaller sample volumes

    NASA Technical Reports Server (NTRS)

    Bornhop, Darryl J. (Inventor); Dotson, Stephen (Inventor); Bachmann, Brian O. (Inventor)

    2012-01-01

    A polarimetry technique for measuring optical activity that is particularly suited for high throughput screening employs a chip or substrate (22) having one or more microfluidic channels (26) formed therein. A polarized laser beam (14) is directed onto optically active samples that are disposed in the channels. The incident laser beam interacts with the optically active molecules in the sample, which slightly alter the polarization of the laser beam as it passes multiple times through the sample. Interference fringe patterns (28) are generated by the interaction of the laser beam with the sample and the channel walls. A photodetector (34) is positioned to receive the interference fringe patterns and generate an output signal that is input to a computer or other analyzer (38) for analyzing the signal and determining the rotation of plane polarized light by optically active material in the channel from polarization rotation calculations.

  13. Genome-wide estimation of linkage disequilibrium from population-level high-throughput sequencing data.

    PubMed

    Maruki, Takahiro; Lynch, Michael

    2014-08-01

    Rapidly improving sequencing technologies provide unprecedented opportunities for analyzing genome-wide patterns of polymorphisms. In particular, they have great potential for linkage-disequilibrium analyses on both global and local genetic scales, which will substantially improve our ability to derive evolutionary inferences. However, there are some difficulties with analyzing high-throughput sequencing data, including high error rates associated with base reads and complications from the random sampling of sequenced chromosomes in diploid organisms. To overcome these difficulties, we developed a maximum-likelihood estimator of linkage disequilibrium for use with error-prone sampling data. Computer simulations indicate that the estimator is nearly unbiased with a sampling variance at high coverage asymptotically approaching the value expected when all relevant information is accurately estimated. The estimator does not require phasing of haplotypes and enables the estimation of linkage disequilibrium even when all individual reads cover just single polymorphic sites. Copyright © 2014 by the Genetics Society of America.
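
    For intuition about the quantity being estimated, here is the textbook computation of D and r² from idealized, error-free, phased haplotypes; this is exactly the setting the paper's maximum-likelihood estimator relaxes (haplotypes invented, both sites assumed polymorphic):

```python
def ld_stats(haplotypes):
    """Pairwise linkage disequilibrium between two biallelic sites from phased
    haplotypes; alleles coded 0/1. Returns (D, r_squared)."""
    n = len(haplotypes)
    p_a = sum(a for a, _ in haplotypes) / n
    p_b = sum(b for _, b in haplotypes) / n
    p_ab = sum(1 for a, b in haplotypes if a == 1 and b == 1) / n
    d = p_ab - p_a * p_b
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))
    return d, r2

# Six 1-1 haplotypes and four 0-0 haplotypes: alleles in complete association
haps = [(1, 1)] * 6 + [(0, 0)] * 4
d, r2 = ld_stats(haps)
```

    With read errors and unknown phase, these frequencies cannot be counted directly, which is why a likelihood over the error-prone reads is needed instead.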

  14. High-Throughput Spheroid Screens Using Volume, Resazurin Reduction, and Acid Phosphatase Activity.

    PubMed

    Ivanov, Delyan P; Grabowska, Anna M; Garnett, Martin C

    2017-01-01

    Mainstream adoption of physiologically relevant three-dimensional models has been slow in the last 50 years due to long, manual protocols with poor reproducibility, high price, and closed commercial platforms. This chapter describes high-throughput, low-cost, open methods for spheroid viability assessment which use readily available reagents and open-source software to analyze spheroid volume, metabolism, and enzymatic activity. We provide two ImageJ macros for automated spheroid size determination-for both single images and images in stacks. We also share an Excel template spreadsheet allowing users to rapidly process spheroid size data, analyze plate uniformity (such as edge effects and systematic seeding errors), detect outliers, and calculate dose-response. The methods would be useful to researchers in preclinical and translational research planning to move away from simplistic monolayer studies and explore 3D spheroid screens for drug safety and efficacy without substantial investment in money or time.
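
    The volume and plate-uniformity arithmetic such a spreadsheet automates can be sketched directly (function names are mine; the spherical-volume assumption is the usual one for spheroids):

```python
import math
import statistics

def spheroid_volume(area_px, px_size_um):
    """Convert a projected spheroid area (pixels) to volume (um^3),
    assuming the spheroid is approximately spherical."""
    r = math.sqrt(area_px * px_size_um ** 2 / math.pi)   # equivalent radius
    return 4.0 / 3.0 * math.pi * r ** 3

def plate_uniformity(volumes):
    """Coefficient of variation (%) across a plate, plus indices of
    wells deviating by more than two standard deviations."""
    m = statistics.mean(volumes)
    s = statistics.stdev(volumes)
    cv = 100.0 * s / m
    outliers = [i for i, v in enumerate(volumes) if abs(v - m) > 2 * s]
    return cv, outliers
```

    An ImageJ-measured projected area in pixels converts to volume once the pixel size is known; flagged wells point at edge effects or seeding errors worth inspecting.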

  15. A Primer on High-Throughput Computing for Genomic Selection

    PubMed Central

    Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet the increasing computing demands posed by the unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
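
    The batch-versus-pipeline distinction can be sketched with a two-stage producer/consumer queue (threads stand in for cluster nodes here; the stage functions are placeholders):

```python
import queue
import threading

def pipeline(traits, stage1, stage2):
    """Two-stage pipeline: while stage2 (e.g. model training) runs on
    trait k, stage1 (e.g. genotype preprocessing) is already running
    on trait k+1, so the stages overlap instead of running serially."""
    q = queue.Queue(maxsize=2)
    results = {}

    def producer():
        for t in traits:
            q.put((t, stage1(t)))
        q.put(None)                       # sentinel: no more work

    def consumer():
        while (item := q.get()) is not None:
            trait, prepped = item
            results[trait] = stage2(prepped)

    threads = [threading.Thread(target=producer),
               threading.Thread(target=consumer)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

    While stage2 trains the model for one trait, stage1 is already preparing the next, which is what raises throughput relative to plain batch processing.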

  16. Silicon microphysiometer for high-throughput drug screening

    NASA Astrophysics Data System (ADS)

    Verhaegen, Katarina; Baert, Christiaan; Puers, Bob; Sansen, Willy; Simaels, Jeannine; Van Driessche, Veerle; Hermans, Lou; Mertens, Robert P.

    1999-06-01

    We report on a micromachined silicon chip that is capable of providing a high-throughput functional assay based on calorimetry. A prototype twin microcalorimeter based on the Seebeck effect has been fabricated by IC technology and micromachined postprocessing techniques. A biocompatible liquid rubber membrane supports two identical 0.5 × 2 cm² measurement chambers, situated at the cold and hot junctions of a 666-junction aluminum/p+-polysilicon thermopile. The chambers can house up to 10⁶ eukaryotic cells cultured to confluence. The advantage of the device over microcalorimeters on the market is the integration of the measurement channels on chip, rendering microvolume reaction vessels, ranging from 10 to 600 µl, in the closest possible contact with the thermopile sensor (no springs are needed). Power and temperature sensitivity of the sensor are 23 V/W and 130 mV/K, respectively. The small thermal inertia of the microchannels results in a short response time of 70 s when filled with 50 µl of water. Biological experiments were done with cultured kidney cells of Xenopus laevis (A6). The thermal equilibration time of the device is 45 min. Stimulation of transport mechanisms by reducing bath osmolality by 50% increased metabolism by 20%. Our results show that it is feasible to apply this large-area, small-volume whole-cell biosensor for drug discovery, where the binding assays that are commonly used to provide high throughput need to be complemented with a functional assay. Solutions are brought onto the sensor by a simple pipette, making the use of an industrial microtiterplate dispenser feasible on an n×96 array of the microcalorimeter biosensor. Such an array of biosensors has been designed based on a new set of requirements as set forth by people in the field as this project moved on. The results obtained from the prototype large-area sensor were used to obtain an accurate model of the calorimeter, checked against the simulation software ANSYS. At

  17. High throughput optoelectronic smart pixel systems using diffractive optics

    NASA Astrophysics Data System (ADS)

    Chen, Chih-Hao

    1999-12-01

    Recent developments in digital video, multimedia technology and data networks have greatly increased the demand for high bandwidth communication channels and high throughput data processing. Electronics is particularly suited for switching, amplification and logic functions, while optics is more suitable for interconnections and communications with lower energy and crosstalk. In this research, we present the design, testing, integration and demonstration of several optoelectronic smart pixel devices and system architectures. These systems integrate electronic switching/processing capability with parallel optical interconnections to provide high throughput network communication and pipeline data processing. The Smart Pixel Array Cellular Logic processor (SPARCL) is designed in 0.8 µm CMOS and hybrid integrated with Multiple-Quantum-Well (MQW) devices for pipeline image processing. The Smart Pixel Network Interface (SAPIENT) is designed in 0.6 µm GaAs and monolithically integrated with LEDs to implement a highly parallel optical interconnection network. The Translucent Smart Pixel Array (TRANSPAR) design is implemented in two different versions. The first version, TRANSPAR-MQW, is designed in 0.5 µm CMOS and flip-chip integrated with MQW devices to provide 2-D pipeline processing and translucent networking using the Carrier-Sense-Multiple-Access/Collision-Detection (CSMA/CD) protocol. The other version, TRANSPAR-VM, is designed in 1.2 µm CMOS and discretely integrated with VCSEL-MSM (Vertical-Cavity-Surface-Emitting-Laser and Metal-Semiconductor-Metal detectors) chips and driver/receiver chips on a printed circuit board. The TRANSPAR-VM provides an option of using the token ring network protocol in addition to the embedded functions of TRANSPAR-MQW. These optoelectronic smart pixel systems also require micro-optics devices to provide high resolution, high quality optical interconnections and external source arrays. In this research, we describe an innovative

  18. Parallel tools in HEVC for high-throughput processing

    NASA Astrophysics Data System (ADS)

    Zhou, Minhua; Sze, Vivienne; Budagavi, Madhukar

    2012-10-01

    HEVC (High Efficiency Video Coding) is the next-generation video coding standard being jointly developed by the ITU-T VCEG and ISO/IEC MPEG JCT-VC team. In addition to the high coding efficiency, which is expected to provide 50% more bit-rate reduction when compared to H.264/AVC, HEVC has built-in parallel processing tools to address bitrate, pixel-rate and motion estimation (ME) throughput requirements. This paper describes how CABAC, which is also used in H.264/AVC, has been redesigned for improved throughput, and how parallel merge/skip and tiles, which are new tools introduced for HEVC, enable high-throughput processing. CABAC has data dependencies which make it difficult to parallelize and thus limit its throughput. The prediction error/residual, represented as quantized transform coefficients, accounts for the majority of the CABAC workload. Various improvements have been made to the context selection and scans in transform coefficient coding that enable CABAC in HEVC to potentially achieve higher throughput and increased coding gains relative to H.264/AVC. The merge/skip mode is a coding efficiency enhancement tool in HEVC; the parallel merge/skip breaks dependency between the regular and merge/skip ME, which provides flexibility for high throughput and high efficiency HEVC encoder designs. For ultra high definition (UHD) video, such as 4kx2k and 8kx4k resolutions, low-latency and real-time processing may be beyond the capability of a single core codec. Tiles are an effective tool which enables pixel-rate balancing among the cores to achieve parallel processing with a throughput scalable implementation of multi-core UHD video codec. With the evenly divided tiles, a multi-core video codec can be realized by simply replicating single core codec and adding a tile boundary processing core on top of that. These tools illustrate that accounting for implementation cost when designing video coding algorithms can enable higher processing speed and reduce
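
    The pixel-rate balancing that tiles enable can be sketched as an even frame partition (illustrative only: real HEVC tile boundaries must fall on the coding-tree-unit grid, which this sketch ignores):

```python
def make_tiles(width, height, cols, rows):
    """Partition a frame into a cols x rows grid of tiles, balancing
    pixel counts so each core receives a near-equal workload.
    Returns a list of (x, y, tile_width, tile_height) tuples."""
    xs = [round(i * width / cols) for i in range(cols + 1)]
    ys = [round(j * height / rows) for j in range(rows + 1)]
    return [(xs[i], ys[j], xs[i + 1] - xs[i], ys[j + 1] - ys[j])
            for j in range(rows) for i in range(cols)]
```

    With such evenly divided tiles, a multi-core codec can be built by replicating a single-core codec per tile plus a boundary-processing step, as the abstract describes.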

  19. Quantitative high throughput analytics to support polysaccharide production process development.

    PubMed

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  20. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare). This device is available in 96-well format with 2 or 6 μL well size. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the

  2. High Throughput Analysis of Integron Gene Cassettes in Wastewater Environments.

    PubMed

    Gatica, Joao; Tripathi, Vijay; Green, Stefan; Manaia, Celia M; Berendonk, Thomas; Cacace, Damiano; Merlin, Christophe; Kreuzinger, Norbert; Schwartz, Thomas; Fatta-Kassinos, Despo; Rizzo, Luigi; Schwermer, Carsten U; Garelick, Hemda; Jurkevitch, Edouard; Cytryn, Eddie

    2016-11-01

    Integrons are extensively targeted as a proxy for anthropogenic impact in the environment. We developed a novel high-throughput amplicon sequencing pipeline that enables characterization of thousands of integron gene cassette-associated reads, and applied it to acquire a comprehensive overview of gene cassette composition in effluents from wastewater treatment facilities across Europe. Between 38,100 and 172,995 reads per sample were generated and functionally characterized by screening against nr, SEED, ARDB and β-lactamase databases. Over 75% of the reads were characterized as hypothetical, but thousands were associated with toxin-antitoxin systems, DNA repair, cell membrane function, detoxification and aminoglycoside and β-lactam resistance. Among the reads characterized as β-lactamases, the carbapenemase blaOXA was dominant in most of the effluents, except for Cyprus and Israel where blaGES was also abundant. Quantitative PCR assessment of blaOXA and blaGES genes in the European effluents revealed similar trends to those displayed in the integron amplicon sequencing pipeline described above, corroborating the robustness of this method and suggesting that these integron-associated genes may be excellent targets for source tracking of effluents in downstream environments. Further application of the above analyses revealed several order-of-magnitude reductions in effluent-associated β-lactamase genes in effluent-saturated soils, suggesting marginal persistence in the soil microbiome.

  3. The JCSG high-throughput structural biology pipeline

    PubMed Central

    Elsliger, Marc-André; Deacon, Ashley M.; Godzik, Adam; Lesley, Scott A.; Wooley, John; Wüthrich, Kurt; Wilson, Ian A.

    2010-01-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications. PMID:20944202

  4. High throughput illumination systems for solar simulators and photoresist exposure

    NASA Astrophysics Data System (ADS)

    Feldman, Arkady

    2010-08-01

    High throughput illumination systems are a critical component in photolithography, solar simulators, UV curing, microscopy, and spectral analysis. A good refractive condenser system has an F/# of 0.60, or an N.A. of 0.80, but it captures only 10 to 15% of the energy emitted by an incandescent or gas-discharge lamp, as these sources emit light in all directions. Systems with ellipsoidal or parabolic reflectors are much more efficient; they capture up to 80% of the total energy emitted by lamps. However, these reflectors have large aberrations when working with real sources of finite dimensions, resulting in poor light-concentrating capability. These aberrations also increase beam divergence, degrade collimation, and affect edge definition in flood exposure systems. The problem is aggravated by the geometry of high-power arc lamps where, for thermal reasons, the anode has a larger diameter than the cathode and absorbs and obscures part of the energy. This results in an asymmetrical energy distribution emitted by the lamp and makes the efficiency of a lamp-reflector configuration dependent on the orientation of the lamp in the reflector. This paper presents an analysis of different configurations of lamp-reflector systems at different power levels and their energy distribution in the image plane. A configuration that yields a significant improvement in brightness is derived.
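
    For context, the fraction of an isotropic source's output that a condenser of a given numerical aperture can collect is fixed by solid-angle geometry (a back-of-envelope sketch, not taken from the paper):

```python
import math

def capture_fraction(na):
    """Fraction of the total output of an isotropic point source that
    falls inside the collection cone of half-angle theta = arcsin(NA),
    i.e. the cone's solid angle divided by 4*pi: (1 - cos(theta)) / 2."""
    theta = math.asin(na)
    return (1.0 - math.cos(theta)) / 2.0
```

    For N.A. = 0.80 this geometric bound evaluates to 20% of total emission, so the 10 to 15% figure quoted above reflects further real-world losses; reflectors that wrap around the source raise the bound substantially.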

  5. High-throughput literature mining to support read-across ...

    EPA Pesticide Factsheets

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing biological similarity. Many research efforts are underway such as structuring mechanistic data in adverse outcome pathways and investigating the utility of high throughput (HT)/high content (HC) screening data. A largely untapped resource for read-across to date is the biomedical literature. This information has the potential to support read-across by facilitating the identification of valid source analogues with similar biological and toxicological profiles as well as providing the mechanistic understanding for any prediction made. A key challenge in using biomedical literature is to convert and translate its unstructured form into a computable format that can be linked to chemical structure. We developed a novel text-mining strategy to represent literature information for read across. Keywords were used to organize literature into toxicity signatures at the chemical level. These signatures were integrated with HT in vitro data and curated chemical structures. A rule-based algorithm assessed the strength of the literature relationship, providing a mechanism to rank and visualize the signature as literature ToxPIs (LitToxPIs). LitToxPIs were developed for over 6,000 chemicals for a varie

  6. Probabilistic Assessment of High-Throughput Wireless Sensor Networks.

    PubMed

    Kim, Robin E; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F; Song, Junho

    2016-05-31

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieve a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess the network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a larger number of packets are delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily-available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small and a full-bridge wireless network. By performing the proposed analysis in complex sensor networks, an optimized sensor topology can be achieved.
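
    The failure-probability step reduces to a generic Monte Carlo estimate of P[g(X) < 0]; the sketch below uses a hypothetical Gaussian capacity model, not the paper's empirical limit-state function:

```python
import random

def failure_probability(limit_state, sampler, n=100_000, seed=1):
    """Monte Carlo estimate of P[g(X) < 0]: draw samples of the network
    condition, evaluate the limit-state function g, count failures."""
    rng = random.Random(seed)
    fails = sum(1 for _ in range(n) if limit_state(sampler(rng)) < 0)
    return fails / n

# Hypothetical example: g = capacity - demand, with delivered capacity
# ~ N(100, 10) and a required throughput of 80, i.e. a two-sigma margin.
p_fail = failure_probability(lambda cap: cap - 80.0,
                             lambda rng: rng.gauss(100.0, 10.0))
```

    Replacing the toy sampler and limit state with measured communication-quality data recovers the structure of the paper's assessment.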

  7. Validation of high throughput sequencing and microbial forensics applications.

    PubMed

    Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.

  8. High throughput virus plaque quantitation using a flatbed scanner.

    PubMed

    Sullivan, Kate; Kloess, Johannes; Qian, Chen; Bell, Donald; Hay, Alan; Lin, Yi Pu; Gu, Yan

    2012-01-01

    The plaque assay is a standard technique for measuring influenza virus infectivity and inhibition of virus replication. Counting plaque numbers and quantifying virus infection of cells in multiwell plates quickly, accurately and automatically remain a challenge. Visual inspection relies upon experience, is subjective, often time consuming, and has less reproducibility than automated methods. In this paper, a simple, high throughput imaging-based alternative is proposed which uses a flatbed scanner and image processing software to quantify the infected cell population and plaque formation. Quantitation results were evaluated with reference to visual counting and achieved better than 80% agreement. The method was shown to be particularly advantageous in titration of the number of plaques and infected cells when influenza viruses produce a heterogeneous population of small plaques. It was also shown to be insensitive to the densities of plaques in determination of neutralization titres and IC50s of drug susceptibility. In comparison to other available techniques, this approach is cost-effective, relatively accurate, and readily available. Copyright © 2011 Elsevier B.V. All rights reserved.
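
    The counting step, once the scan is thresholded to a binary image, is connected-component labeling; a minimal pure-Python stand-in for the authors' software:

```python
from collections import deque

def count_plaques(binary_image):
    """Count connected foreground regions (candidate plaques) in a
    thresholded scanner image, using 4-connected flood fill.
    binary_image: list of rows of 0/1 values."""
    h, w = len(binary_image), len(binary_image[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if binary_image[y][x] and not seen[y][x]:
                count += 1                      # new region found
                q = deque([(y, x)])
                seen[y][x] = True
                while q:                        # flood-fill the region
                    cy, cx = q.popleft()
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary_image[ny][nx]
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count
```

    A practical pipeline would add size filtering to reject debris and splitting of touching plaques, but the core count is this simple.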

  9. Adaptation to high throughput batch chromatography enhances multivariate screening.

    PubMed

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
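
    Treating every well as an independent experiment amounts to laying a full-factorial design onto the plate (function and factor names below are invented for illustration):

```python
from itertools import product

def plan_plate(loads, washes, elutions, replicates=2):
    """Assign every combination of load, wash and elution condition
    (times `replicates`) to its own well of a 96-well plate, so each
    well is an independent experiment and factor interactions are
    captured in a single run."""
    rows, cols = "ABCDEFGH", range(1, 13)
    wells = [f"{r}{c}" for r in rows for c in cols]       # A1 .. H12
    runs = [combo for combo in product(loads, washes, elutions)
            for _ in range(replicates)]
    if len(runs) > len(wells):
        raise ValueError("design needs more than 96 wells")
    return dict(zip(wells, runs))
```

    With 3 load levels, 2 wash and 2 elution buffers in duplicate, 24 of the 96 wells are used, leaving room for controls or further factors.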

  10. High-throughput charge exchange recombination spectroscopy system on MAST

    SciTech Connect

    Conway, N. J.; Carolan, P. G.; McCone, J.; Walsh, M. J.; Wisse, M.

    2006-10-15

    A major upgrade to the charge exchange recombination spectroscopy system on MAST has recently been implemented. The new system consists of a high-throughput spectrometer coupled to a total of 224 spatial channels, including toroidal and poloidal views of both neutral heating beams on MAST. Radial resolution is ~1 cm, comparable to the ion Larmor radius. The toroidal views are configured with 64 channels per beam, while the poloidal views have 32 channels per beam. Background channels for both poloidal and toroidal views are also provided. A large transmission grating is at the heart of the new spectrometer, with high quality single lens reflex lenses providing excellent imaging performance and permitting the full exploitation of the available etendue of the camera sensor. The charge-coupled device camera chosen has four-tap readout at a maximum aggregate speed of 8.8 MHz, and it is capable of reading out the full set of 224 channels in less than 4 ms. The system normally operates at 529 nm, viewing the C5+ emission line, but can operate at any wavelength in the range of 400-700 nm. Results from operating the system on MAST are shown, including impurity ion temperature and velocity profiles. The system's excellent spatial resolution is ideal for the study of transport barrier phenomena on MAST, an activity which has already been advanced significantly by data from the new diagnostic.
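
    The velocity profiles come from the Doppler shift of the observed emission line; the conversion itself is one line (a generic formula, not the diagnostic's actual analysis code):

```python
def doppler_velocity(lambda_obs, lambda_rest, c=2.998e8):
    """Line-of-sight velocity (m/s) from the Doppler shift of an
    emission line: v = c * (lambda_obs - lambda_rest) / lambda_rest.
    Wavelengths in metres; positive v means motion away from the viewer."""
    return c * (lambda_obs - lambda_rest) / lambda_rest
```

    For example, a 0.05 nm shift of the 529 nm line corresponds to about 28 km/s; ion temperature follows analogously from the Doppler broadening of the same line.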

  11. Tiered High-Throughput Screening Approach to Identify ...

    EPA Pesticide Factsheets

    High-throughput screening (HTS) for potential thyroid–disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprised of 1,074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single concentration screen were retested in concentration-response. Due to high false positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using
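
    The stratification logic, retaining only hits whose AUR-TPO activity is not explained by nonspecific signal loss, can be sketched as a simple filter (the record layout is assumed):

```python
def triage(chemicals):
    """Stratify putative TPO inhibitors: keep hits active in the TPO
    assay but not flagged by the nonspecific enzyme-inhibition
    (luciferase) or cytotoxicity counter-screens.
    Each record: (name, tpo_hit, luciferase_hit, cytotoxic)."""
    selective, nonspecific = [], []
    for name, tpo, luc, tox in chemicals:
        if tpo and not (luc or tox):
            selective.append(name)        # likely true TPO inhibitor
        elif tpo:
            nonspecific.append(name)      # signal loss has another cause
    return selective, nonspecific
```

    In practice the counter-screen calls would themselves come from concentration-response fits rather than booleans, but the tiering logic is the same.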

  12. High Throughput T Epitope Mapping and Vaccine Development

    PubMed Central

    Li Pira, Giuseppina; Ivaldi, Federico; Moretti, Paolo; Manca, Fabrizio

    2010-01-01

    Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost. PMID:20617148
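Generating the overlapping peptide panels mentioned above is a sliding-window computation over the protein sequence. A minimal sketch (toy sequence; 15-mers offset by 4 residues are one common choice, but lengths and offsets vary by study):

```python
def overlapping_peptides(protein, length=15, offset=4):
    """Tile a protein into overlapping peptides; the last window is
    clamped so the C-terminus is always covered."""
    peptides = []
    i = 0
    while True:
        if i + length >= len(protein):
            peptides.append(protein[-length:])   # clamp final window
            break
        peptides.append(protein[i:i + length])
        i += offset
    return peptides

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # 33-residue toy antigen
panel = overlapping_peptides(seq, length=15, offset=4)
print(len(panel), panel[0], panel[-1])
```

For this 33-mer the panel contains six 15-mers, each sharing 11 residues with its neighbor, so every linear epitope of up to 12 residues appears intact in at least one peptide.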

  13. Inter-Individual Variability in High-Throughput Risk ...

    EPA Pesticide Factsheets

We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have little or no existing TK data. Chemicals are prioritized based on model estimates of hazard and exposure, to decide which chemicals should be first in line for further study. Hazard may be estimated with in vitro HT screening assays, e.g., U.S. EPA’s ToxCast program. Bioactive ToxCast concentrations can be extrapolated to doses that produce equivalent concentrations in body tissues using a reverse TK approach in which generic TK models are parameterized with 1) chemical-specific parameters derived from in vitro measurements and predicted from chemical structure; and 2) physiological parameters for a virtual population. Here we draw physiological parameters from realistic estimates of distributions of demographic and anthropometric quantities in the modern U.S. population, based on the most recent CDC NHANES data. A Monte Carlo approach, accounting for the correlation structure in physiological parameters, is used to estimate ToxCast equivalent doses for the most sensitive portion of the population. To quantify risk, ToxCast equivalent doses are compared to estimates of exposure rates based on Bayesian inferences drawn from NHANES urinary analyte biomonitoring data. The inclusion
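The Monte Carlo step above must preserve correlations between physiological parameters. A standard way to do that for jointly normal quantities is to color independent draws with the Cholesky factor of the covariance matrix; the sketch below uses invented means, SDs and correlation (not NHANES-derived values):

```python
import numpy as np

# Draw correlated (body weight, liver blood flow) pairs for a virtual
# population. Numbers are illustrative, not NHANES-derived.
rng = np.random.default_rng(0)
mean = np.array([80.0, 90.0])          # [body weight kg, liver flow L/h]
sd   = np.array([15.0, 20.0])
corr = np.array([[1.0, 0.6],
                 [0.6, 1.0]])
cov = np.outer(sd, sd) * corr

L = np.linalg.cholesky(cov)            # lower-triangular factor of cov
z = rng.standard_normal((10000, 2))    # independent standard normals
samples = mean + z @ L.T               # correlated draws

print(np.corrcoef(samples.T)[0, 1])    # should be close to 0.6
```

Real physiological quantities are often log-normal and bounded, so in practice the correlated normals would be transformed afterwards; the coloring step itself is unchanged.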

  14. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    PubMed

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-12-01

The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects, and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective of reliably accelerating the drug discovery process against Chagas disease, several advances have been made in the last few years. The availability of engineered reporter-gene-expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS), as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bringing better anti-T. cruzi drug candidates in the near future, are reviewed here.

  15. High Throughput Profiling of Molecular Shapes in Crystals

    NASA Astrophysics Data System (ADS)

    Spackman, Peter R.; Thomas, Sajesh P.; Jayatilaka, Dylan

    2016-02-01

Molecular shape is important in both crystallisation and supramolecular assembly, yet its role is not completely understood. We present a computationally efficient scheme to describe and classify the molecular shapes in crystals. The method involves a rotation-invariant description of Hirshfeld surfaces in terms of spherical harmonic functions. Hirshfeld surfaces represent the boundaries of a molecule in the crystalline environment, and are widely used to visualise and interpret crystalline interactions. The spherical harmonic descriptions of molecular shapes are compared and classified by means of principal component analysis and cluster analysis. When applied to a series of metals, the method results in a clear classification based on their lattice type. When applied to around 300 crystal structures comprising series of substituted benzenes, naphthalenes and phenylbenzamides, it shows the capacity to classify structures based on chemical scaffolds, chemical isosterism, and conformational similarity. The computational efficiency of the method is demonstrated with an application to over 14 thousand crystal structures. High throughput screening of molecular shapes and interaction surfaces in the Cambridge Structural Database (CSD) using this method has direct applications in drug discovery, supramolecular chemistry and materials design.
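A standard route to a rotation-invariant descriptor from spherical harmonic coefficients c_lm is to collapse them into per-degree norms p_l = sqrt(sum_m |c_lm|^2), then run PCA on the resulting vectors. The sketch below applies this to synthetic coefficients, not real Hirshfeld-surface expansions:

```python
import numpy as np

rng = np.random.default_rng(1)

def rotation_invariants(coeffs):
    """coeffs: dict mapping degree l to the complex array of its
    2l+1 coefficients c_lm; returns the per-degree norms p_l."""
    return np.array([np.sqrt(np.sum(np.abs(c) ** 2))
                     for _, c in sorted(coeffs.items())])

def random_shape(L=5):
    """Synthetic stand-in for a spherical harmonic surface expansion."""
    return {l: rng.standard_normal(2 * l + 1)
               + 1j * rng.standard_normal(2 * l + 1)
            for l in range(L + 1)}

# 50 synthetic "shapes" -> 6-component invariant vectors -> PCA via SVD
X = np.array([rotation_invariants(random_shape()) for _ in range(50)])
X -= X.mean(axis=0)
_, s, _ = np.linalg.svd(X, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)
print(X.shape, explained[:2])
```

The leading principal components of such invariant vectors are what feeds cluster analysis; with real data each row would come from a fitted Hirshfeld-surface expansion rather than random coefficients.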

  16. High throughput screening for drug discovery of autophagy modulators.

    PubMed

    Shu, Chih-Wen; Liu, Pei-Feng; Huang, Chun-Ming

    2012-11-01

Autophagy is an evolutionarily conserved process by which cells clear abnormal proteins and organelles in a lysosome-dependent manner. A growing body of studies has shown that defective or induced autophagy contributes to many diseases, including aging, neurodegeneration, pathogen infection, and cancer. However, the precise involvement of autophagy in health and disease remains controversial because current theories are built on limited assays and chemical modulators, indicating that the role of autophagy in disease may require further verification. Many Food and Drug Administration (FDA)-approved drugs modulate autophagy signaling, suggesting that modulation of autophagy with pharmacological agonists or antagonists could provide a therapy for autophagy-related diseases; this makes the discovery of chemical modulators of autophagy an attractive goal. High throughput screening (HTS) is becoming a powerful drug discovery tool that may accelerate the identification of specific autophagy modulators and thereby clarify the role of autophagy in disease. Herein, this review lays out current autophagy assays that specifically measure autophagy components such as LC3 (the mammalian homologue of yeast Atg8) and Atg4. These assays are feasible for, or have already succeeded in, HTS with certain chemical libraries, which should be informative for this rapidly growing field, both as research tools and, hopefully, for developing new drugs for autophagy-related diseases.

  17. High-Throughput Screening Using Mass Spectrometry within Drug Discovery.

    PubMed

    Rohman, Mattias; Wingfield, Jonathan

    2016-01-01

In order to detect a biochemical analyte with a mass spectrometer (MS) it is necessary to ionize the analyte of interest. The analyte can be ionized by a number of different mechanisms; however, one common method is electrospray ionization (ESI). Droplets of analyte are sprayed through a highly charged field, the droplets pick up charge, and this is transferred to the analyte. High levels of salt in the assay buffer will potentially steal charge from the analyte and suppress the MS signal. In order to avoid this suppression of signal, salt is often removed from the sample prior to injection into the MS. Traditional ESI MS relies on liquid chromatography (LC) to remove the salt and reduce matrix effects; however, this is a lengthy process. Here we describe the use of RapidFire™ coupled to a triple-quadrupole MS for high-throughput screening. This system uses solid-phase extraction to de-salt samples prior to injection, reducing processing time such that a sample is injected into the MS ~every 10 s.

  18. High-throughput mass spectrometric cytochrome P450 inhibition screening.

    PubMed

    Lim, Kheng B; Ozbal, Can C; Kassel, Daniel B

    2013-01-01

We describe here a high-throughput assay to support rapid evaluation of drug discovery compounds for possible drug-drug interaction (DDI). Each compound is evaluated for its DDI potential by incubating it over a range of eight concentrations and against a panel of six cytochrome P450 (CYP) enzymes: 1A2, 2C8, 2C9, 2C19, 2D6, and 3A4. The method utilizes automated liquid handling for sample preparation, and online solid-phase extraction/tandem mass spectrometry (SPE/MS/MS) for sample analyses. The system is capable of generating two 96-well assay plates in 30 min, and completes the data acquisition and analysis of both plates in about 30 min. Many laboratories that perform CYP inhibition screening automate only part of the process, leaving a throughput bottleneck within the workflow. The protocols described in this chapter aim to streamline the entire process from assay to data acquisition and processing by incorporating automation and utilizing high-precision instruments to maximize throughput and minimize bottlenecks.
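To show how an eight-point concentration series like the one above reduces to a potency estimate, here is a log-linear interpolation at 50% inhibition. This is a deliberate simplification of the nonlinear curve fitting such assays actually use, and the data are invented:

```python
import numpy as np

def ic50_interpolated(conc_uM, pct_inhibition):
    """Estimate IC50 by log-linear interpolation between the two
    concentrations that bracket 50% inhibition; None if never reached."""
    c = np.asarray(conc_uM, float)
    y = np.asarray(pct_inhibition, float)
    above = np.nonzero(y >= 50.0)[0]
    if above.size == 0:
        return None                      # never reaches 50% inhibition
    j = above[0]
    if j == 0:
        return float(c[0])               # already above 50% at lowest dose
    x0, x1 = np.log10(c[j - 1]), np.log10(c[j])
    y0, y1 = y[j - 1], y[j]
    frac = (50.0 - y0) / (y1 - y0)
    return float(10 ** (x0 + frac * (x1 - x0)))

# Eight-point curve, as in the panel assay (hypothetical numbers, uM):
conc  = [0.05, 0.15, 0.5, 1.5, 5, 15, 50, 150]
inhib = [2, 5, 12, 28, 55, 80, 92, 97]
print(round(ic50_interpolated(conc, inhib), 2))  # → 4.0
```

In a production workflow this estimate would come from a four-parameter logistic fit per compound per enzyme, but the bracketing logic above is a useful sanity check on the raw data.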

  19. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    SciTech Connect

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

This was a poster displayed at the Symposium. Advances on previous high throughput screening of biomass recalcitrance methods have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have improved plate-to-plate variation of control biomass recalcitrance and improved confidence in sugar release differences between samples. With smaller errors, plant researchers can have a higher degree of assurance that more low-recalcitrance candidates can be identified. In summary, changes to the plate reactor, control biomass preparation, pretreatment conditions and enzyme have significantly reduced sample and control replicate variability. Reactor plate metallurgy significantly impacts sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to the specific biomass type, i.e. woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers improved conversion levels, likely due to water activity impacts on enzyme structure and substrate interactions; it was not attempted here because of the need to continually desalt and validate precise enzyme concentration and activity.

  20. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools.

  1. Management of High-Throughput DNA Sequencing Projects: Alpheus

    PubMed Central

    Miller, Neil A.; Kingsmore, Stephen F.; Farmer, Andrew; Langley, Raymond J.; Mudge, Joann; Crow, John A.; Gonzalez, Alvaro J.; Schilkey, Faye D.; Kim, Ryan J.; van Velkinburgh, Jennifer; May, Gregory D.; Black, C. Forrest; Myers, M. Kathy; Utsey, John P.; Frost, Nicholas S.; Sugarbaker, David J.; Bueno, Raphael; Gullans, Stephen R.; Baxter, Susan M.; Day, Steve W.; Retzel, Ernest F.

    2009-01-01

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystem’s SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis. PMID:20151039
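The variant-call filtering described above (consistency, expected allele frequency, sequence quality, coverage) amounts to a predicate over per-call statistics. A minimal sketch; field names and thresholds are illustrative, not Alpheus's actual schema:

```python
# Toy variant filter; field names and cutoffs are invented for illustration.

def passes_filters(v, min_coverage=10, min_quality=20, min_allele_frac=0.2):
    """Return True if a variant call survives basic quality filters."""
    if v["coverage"] < min_coverage:
        return False                  # too few reads to trust the call
    if v["mean_base_quality"] < min_quality:
        return False                  # low-quality supporting bases
    if v["alt_reads"] / v["coverage"] < min_allele_frac:
        return False                  # inconsistent with expected allele frequency
    return True

calls = [
    {"pos": 101, "coverage": 42, "alt_reads": 20, "mean_base_quality": 35},
    {"pos": 202, "coverage": 42, "alt_reads": 3,  "mean_base_quality": 35},
    {"pos": 303, "coverage": 6,  "alt_reads": 5,  "mean_base_quality": 35},
]
kept = [c["pos"] for c in calls if passes_filters(c)]
print(kept)  # → [101]
```

Raising the thresholds trades recall for precision, which is exactly the false-positive/true-positive balance the abstract describes.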

  2. Detecting Alu insertions from high-throughput sequencing data

    PubMed Central

    David, Matei; Mustafa, Harun; Brudno, Michael

    2013-01-01

    High-throughput sequencing technologies have allowed for the cataloguing of variation in personal human genomes. In this manuscript, we present alu-detect, a tool that combines read-pair and split-read information to detect novel Alus and their precise breakpoints directly from either whole-genome or whole-exome sequencing data while also identifying insertions directly in the vicinity of existing Alus. To set the parameters of our method, we use simulation of a faux reference, which allows us to compute the precision and recall of various parameter settings using real sequencing data. Applying our method to 100 bp paired Illumina data from seven individuals, including two trios, we detected on average 1519 novel Alus per sample. Based on the faux-reference simulation, we estimate that our method has 97% precision and 85% recall. We identify 808 novel Alus not previously described in other studies. We also demonstrate the use of alu-detect to study the local sequence and global location preferences for novel Alu insertions. PMID:23921633
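The faux-reference evaluation boils down to comparing called insertion sites against a known simulated truth set. A minimal sketch with exact-position matching (real evaluations typically allow some breakpoint tolerance):

```python
def precision_recall(called, truth):
    """Precision and recall of a set of called insertion sites against
    a simulated truth set, as in the faux-reference idea above."""
    called, truth = set(called), set(truth)
    tp = len(called & truth)                      # true positives
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    return precision, recall

truth  = {1000, 2000, 3000, 4000}   # simulated insertion positions
called = {1000, 2000, 3000, 9999}   # detector output (one false call)
print(precision_recall(called, truth))  # → (0.75, 0.75)
```

Sweeping the detector's parameters and recomputing this pair is how a precision/recall operating point like the reported 97%/85% would be chosen.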

  3. Achieving High Throughput for Data Transfer over ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.
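"Keeping the communications pipe full" means keeping at least a bandwidth-delay product of data in flight at all times. A toy calculation with illustrative numbers (not measurements from the testbed):

```python
def bytes_in_flight_needed(bandwidth_bps, rtt_s):
    """Bandwidth-delay product: bytes that must be unacknowledged and
    in flight to keep the link busy for one round-trip time."""
    return bandwidth_bps / 8 * rtt_s

# Example: a 155 Mbit/s OC-3 ATM link with a 10 ms round-trip time.
bdp = bytes_in_flight_needed(155e6, 0.010)
print(f"{bdp / 1024:.0f} KiB in flight")  # ≈ 189 KiB
```

A sender whose outstanding-data window is smaller than this value idles the link between acknowledgements, which is one reason window-limited TCP underperforms on such paths.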

  4. High Throughput Multispectral Image Processing with Applications in Food Science.

    PubMed

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

Recently, machine vision has been gaining attention in food science, as well as in the food industry, for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Towards these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility, lower the cost of information extraction and speed up quality assessment, without human intervention. The outcome of the image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through the evaluation we prove its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples.

  5. Molecular Pathways: Extracting Medical Knowledge from High Throughput Genomic Data

    PubMed Central

    Goldstein, Theodore; Paull, Evan O.; Ellis, Matthew J.; Stuart, Joshua M.

    2013-01-01

    High-throughput genomic data that measures RNA expression, DNA copy number, mutation status and protein levels provide us with insights into the molecular pathway structure of cancer. Genomic lesions (amplifications, deletions, mutations) and epigenetic modifications disrupt biochemical cellular pathways. While the number of possible lesions is vast, different genomic alterations may result in concordant expression and pathway activities, producing common tumor subtypes that share similar phenotypic outcomes. How can these data be translated into medical knowledge that provides prognostic and predictive information? First generation mRNA expression signatures such as Genomic Health's Oncotype DX already provide prognostic information, but do not provide therapeutic guidance beyond the current standard of care – which is often inadequate in high-risk patients. Rather than building molecular signatures based on gene expression levels, evidence is growing that signatures based on higher-level quantities such as from genetic pathways may provide important prognostic and diagnostic cues. We provide examples of how activities for molecular entities can be predicted from pathway analysis and how the composite of all such activities, referred to here as the “activitome,” help connect genomic events to clinical factors in order to predict the drivers of poor outcome. PMID:23430023

  6. New high throughput screening method for drug release measurements.

    PubMed

    Pelczarska, Aleksandra; Delie, Florence; Domańska, Urszula; Carrupt, Pierre-Alain; Martel, Sophie

    2013-09-01

In the field of drug delivery systems, microparticles made of a polymeric matrix appear as an attractive approach. The in vitro release kinetic profile is crucial information when developing new particulate formulations. These data are essential for batch-to-batch comparison and quality control, as well as for anticipating in vivo behavior in order to select the best formulation to take further into preclinical investigations. The methods available present common drawbacks, such as time- and compound-consumption, that do not fit the formulation screening requirements of early development stages. In this study, a new microscale high throughput screening (HTS) method has been developed to investigate drug release kinetics from piroxicam-loaded polylactic acid (PLA) and polylactic-co-glycolic acid (PLGA) microparticles. The method is a sample- and separation-based method where separation is performed by filtration using 96-well micro filter plates. 96 experiments can therefore be performed on one plate at one time, in a fully automated way and with very low sample and particle consumption. The influence of different parameters controlling release profiles was also investigated using this technique. The HTS method gave the same release profile as the standard dialysis method. Shaking, particle concentration, and the nature of the release medium were found to be of influence. The HTS method appears to be a reliable method to evaluate drug release from particles, with smaller standard deviations and less consumption of material.
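One common way to summarize a per-well release profile is to fit a first-order model M(t)/M_inf = 1 - exp(-k*t); linearizing gives ln(1 - M/M_inf) = -k*t, so k falls out of a least-squares slope through the origin. This is a generic analysis sketch with synthetic data, not the paper's method:

```python
import numpy as np

def first_order_rate(t_h, released_frac):
    """Fit k in M(t)/M_inf = 1 - exp(-k t) by linear least squares
    on the log-transformed data (slope through the origin)."""
    t = np.asarray(t_h, float)
    y = np.log(1.0 - np.asarray(released_frac, float))
    return -(t @ y) / (t @ t)          # k = -sum(t*y) / sum(t*t)

t = np.array([1, 2, 4, 8, 24], float)          # sampling times, hours
frac = 1.0 - np.exp(-0.15 * t)                 # synthetic profile, k = 0.15 /h
print(round(float(first_order_rate(t, frac)), 3))  # → 0.15
```

Applied plate-wide, one such fit per well turns 96 raw kinetic curves into 96 comparable rate constants for formulation ranking.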

  7. New Lung Cancer Panel for High-Throughput Targeted Resequencing

    PubMed Central

    Kim, Eun-Hye; Lee, Sunghoon; Park, Jongsun; Lee, Kyusang; Bhak, Jong

    2014-01-01

    We present a new next-generation sequencing-based method to identify somatic mutations of lung cancer. It is a comprehensive mutation profiling protocol to detect somatic mutations in 30 genes found frequently in lung adenocarcinoma. The total length of the target regions is 107 kb, and a capture assay was designed to cover 99% of it. This method exhibited about 97% mean coverage at 30× sequencing depth and 42% average specificity when sequencing of more than 3.25 Gb was carried out for the normal sample. We discovered 513 variations from targeted exome sequencing of lung cancer cells, which is 3.9-fold higher than in the normal sample. The variations in cancer cells included previously reported somatic mutations in the COSMIC database, such as variations in TP53, KRAS, and STK11 of sample H-23 and in EGFR of sample H-1650, especially with more than 1,000× coverage. Among the somatic mutations, up to 91% of single nucleotide polymorphisms from the two cancer samples were validated by DNA microarray-based genotyping. Our results demonstrated the feasibility of high-throughput mutation profiling with lung adenocarcinoma samples, and the profiling method can be used as a robust and effective protocol for somatic variant screening. PMID:25031567

  8. High-throughput purification of single compounds and libraries.

    PubMed

    Schaffrath, Mathias; von Roedern, Erich; Hamley, Peter; Stilz, Hans Ulrich

    2005-01-01

The need for increasing productivity in medicinal chemistry and associated improvements in automated synthesis technologies for compound library production during the past few years have resulted in a major challenge for compound purification technology and its organization. To meet this challenge, we have recently set up three full-service chromatography units with the aid of in-house engineers, different HPLC suppliers, and several companies specializing in custom laboratory automation technologies. Our goal was to combine high-throughput purification with the high attention to detail which would be afforded by a dedicated purification service. The resulting final purification laboratory can purify up to 1000 compounds/week in amounts ranging from 5 to 300 mg, whereas the two intermediate purification service units each take 100 samples per week in amounts from 0.3 to 100 g. The technologies consist of normal-phase and reversed-phase chromatography, robotic fraction pooling and reformatting, a bottling system, an automated external solvent supply and removal system, and a customized, high-capacity freeze-dryer. All work processes are linked by an electronic sample registration and tracking system.

  9. PrimerView: high-throughput primer design and visualization.

    PubMed

    O'Halloran, Damien M

    2015-01-01

High-throughput primer design is routinely performed in a wide number of molecular applications, including genotyping specimens using traditional PCR techniques as well as assembly PCR, nested PCR, and primer walking experiments. Batch primer design is also required in validation experiments for RNA-seq transcriptome sequencing projects, as well as in generating probes for microarray experiments. The growing popularity of next generation sequencing and microarray technology has created a greater need for primer design tools to validate large numbers of candidate genes and markers. To meet these demands I here present a tool called PrimerView that designs forward and reverse primers from multi-sequence datasets, and generates graphical outputs that map the position and distribution of primers to the target sequence. This module operates from the command line and can collect user-defined input for the design phase of each primer. PrimerView is a straightforward-to-use module that implements a primer design algorithm to return forward and reverse primers from any number of FASTA-formatted sequences, generating text-based output of the features of each primer as well as graphical outputs that map the designed primers to the target sequence. PrimerView is freely available without restrictions.
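Two of the basic computations behind any batch primer designer are reverse complementation and a quick melting-temperature estimate. A minimal sketch using the classic Wallace rule, Tm = 2(A+T) + 4(G+C), which holds only for short oligos; the target sequence is invented and this is not PrimerView's algorithm:

```python
def reverse_complement(seq):
    """Reverse complement of a DNA sequence (uppercase A/C/G/T only)."""
    comp = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq.upper()))

def wallace_tm(primer):
    """Wallace rule melting temperature: Tm = 2(A+T) + 4(G+C), in C."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

target = "ATGGCGTACGTTAGCCTAA"          # toy target sequence
fwd = target[:18]                      # forward primer from the 5' end
rev = reverse_complement(target[-18:]) # reverse primer from the 3' end
print(fwd, wallace_tm(fwd))
print(rev, wallace_tm(rev))
```

Real designers add GC-content windows, nearest-neighbor Tm models, and hairpin/dimer checks on top of these primitives.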

  10. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    SciTech Connect

    Kurt Barth

    2008-02-06

AVA Solar has developed a very low cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct and test a production prototype system, fabricate PV modules and test the module performance. The module manufacturing costs in the original proposal were estimated at $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated and installed. Using this automated, high throughput system, high efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained two rounds of private equity funding, expanded to 50 people and initiated the development of a large scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry-leading cost, which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40-$60 Billion by 2012. Currently, a crystalline silicon raw material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  11. High-throughput automated refolding screening of inclusion bodies.

    PubMed

    Vincentelli, Renaud; Canaan, Stéphane; Campanacci, Valérie; Valencia, Christel; Maurin, Damien; Frassinetti, Frédéric; Scappucini-Calvo, Loréna; Bourne, Yves; Cambillau, Christian; Bignon, Christophe

    2004-10-01

One of the main stumbling blocks encountered when attempting to express foreign proteins in Escherichia coli is the occurrence of amorphous aggregates of misfolded proteins, called inclusion bodies (IB). Developing efficient protein native structure recovery procedures based on IB refolding is therefore an important challenge. Unfortunately, there is no "universal" refolding buffer: experience shows that refolding buffer composition varies from one protein to another. In addition, the methods developed so far for finding a suitable refolding buffer suffer from a number of weaknesses. These include the small number of refolding formulations, which often leads to negative results, solubility assays incompatible with high throughput, and experiment formatting not suitable for automation. The present study set out to address some of these limitations. This resulted in the first completely automated IB refolding screening procedure, developed in a 96-well format. The 96 refolding buffers were obtained using a fractional factorial approach. The screening procedure is potentially applicable to any nonmembrane protein, and was validated with 24 proteins in the framework of two Structural Genomics projects. The tests used for this purpose included quality control methods such as circular dichroism, dynamic light scattering, and crystallogenesis. Out of the 24 proteins, 17 remained soluble in at least one of the 96 refolding buffers, 15 passed large-scale purification tests, and five gave crystals.
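The 96-buffer panel above comes from a fractional factorial design over formulation factors. As a rough sketch of the idea, the code below enumerates a full factorial over a few invented factors and takes a systematic half of it; a true fractional factorial is chosen by resolution criteria, which this simple subsetting ignores:

```python
import itertools

# Factors and levels are invented for illustration, not the paper's design.
factors = {
    "pH":         [6.0, 7.5, 9.0],
    "NaCl_mM":    [50, 250, 500],
    "arginine_M": [0.0, 0.4],
    "glycerol_%": [0, 10],
}

# Full factorial: every combination of levels (3 * 3 * 2 * 2 = 36 buffers).
full = [dict(zip(factors, combo))
        for combo in itertools.product(*factors.values())]

# A simple systematic half-fraction (not a resolution-based fraction).
half = full[::2]

print(len(full), len(half))  # → 36 18
```

The appeal of a factorial layout is that each factor's main effect on refolding yield can be estimated from far fewer wells than testing every combination independently.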

  12. A high-throughput screen for antibiotic drug discovery.

    PubMed

    Scanlon, Thomas C; Dostal, Sarah M; Griswold, Karl E

    2014-02-01

We describe an ultra-high-throughput screening platform enabling discovery and/or engineering of natural product antibiotics. The methodology involves creation of hydrogel-in-oil emulsions in which recombinant microorganisms are co-emulsified with bacterial pathogens; antibiotic activity is assayed by use of a fluorescent viability dye. We have successfully utilized both bulk emulsification and microfluidic technology for the generation of hydrogel microdroplets that are size-compatible with conventional flow cytometry. Hydrogel droplets are ∼25 pL in volume, and can be synthesized and sorted at rates exceeding 3,000 drops/s. Using this technique, we have achieved screening throughputs exceeding 5 million clones/day. Proof-of-concept experiments demonstrate efficient selection of antibiotic-secreting yeast from a vast excess of negative controls. In addition, we have successfully used this technique to screen a metagenomic library for secreted antibiotics that kill the human pathogen Staphylococcus aureus. Our results establish the practical utility of the screening platform, and we anticipate that the accessible nature of our methods will enable others seeking to identify and engineer the next generation of antibacterial biomolecules.

  13. Analysis of High Throughput Screening Assays using Cluster Enrichment

    PubMed Central

    Pu, Minya; Hayashi, Tomoko; Cottam, Howard; Mulvaney, Joseph; Arkin, Michelle; Corr, Maripat; Carson, Dennis; Messer, Karen

    2013-01-01

    In this paper we describe implementation and evaluation of a cluster-based enrichment strategy to call hits from a high-throughput screen (HTS), using a typical cell-based assay of 160,000 chemical compounds. Our focus is on statistical properties of the prospective design choices throughout the analysis, including how to choose the number of clusters for optimal power, the choice of test statistic, the significance thresholds for clusters and the activity threshold for candidate hits, how to rank selected hits for carry-forward to the confirmation screen, and how to identify confirmed hits in a data-driven manner. While previously the literature has focused on choice of test statistic or chemical descriptors, our studies suggest cluster size is the more important design choice. We recommend clusters be ranked by enrichment odds ratio, not p-value. Our conceptually simple test statistic is seen to identify the same set of hits as more complex scoring methods proposed in the literature. We prospectively confirm that such a cluster-based approach can outperform the naive top X approach, and estimate that we improved confirmation rates by about 31.5%, from 813 using the Top X approach to 1187 using our cluster-based method. PMID:22763983

  14. Analysis of high-throughput screening assays using cluster enrichment.

    PubMed

    Pu, Minya; Hayashi, Tomoko; Cottam, Howard; Mulvaney, Joseph; Arkin, Michelle; Corr, Maripat; Carson, Dennis; Messer, Karen

    2012-12-30

    In this paper, we describe the implementation and evaluation of a cluster-based enrichment strategy to call hits from a high-throughput screen using a typical cell-based assay of 160,000 chemical compounds. Our focus is on statistical properties of the prospective design choices throughout the analysis, including how to choose the number of clusters for optimal power, the choice of test statistic, the significance thresholds for clusters and the activity threshold for candidate hits, how to rank selected hits for carry-forward to the confirmation screen, and how to identify confirmed hits in a data-driven manner. Whereas previously the literature has focused on choice of test statistic or chemical descriptors, our studies suggest that cluster size is the more important design choice. We recommend clusters to be ranked by enrichment odds ratio, not by p-value. Our conceptually simple test statistic is seen to identify the same set of hits as more complex scoring methods proposed in the literature do. We prospectively confirm that such a cluster-based approach can outperform the naive top X approach and estimate that we improved confirmation rates by about 31.5% from 813 using the top X approach to 1187 using our cluster-based method. Copyright © 2012 John Wiley & Sons, Ltd.
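    The paper's recommendation to rank clusters by enrichment odds ratio rather than p-value can be sketched as follows. The cluster counts below are invented for illustration, and the Haldane 0.5 correction is one common way to guard against empty cells; it is not necessarily what the authors used.

```python
# Toy sketch: rank compound clusters by enrichment odds ratio.
def odds_ratio(hits_in, n_in, hits_total, n_total):
    """Odds of being a primary-screen hit inside the cluster vs. outside."""
    hits_out = hits_total - hits_in
    misses_in = n_in - hits_in
    misses_out = (n_total - n_in) - hits_out
    # Haldane correction of 0.5 avoids division by zero for empty cells.
    return ((hits_in + 0.5) / (misses_in + 0.5)) / (
        (hits_out + 0.5) / (misses_out + 0.5))

clusters = {  # cluster -> (primary-screen hits in cluster, cluster size)
    "A": (12, 40),
    "B": (3, 200),
    "C": (8, 25),
}
hits_total = 23   # hits across the whole screen (invented)
n_total = 1000    # compounds screened (invented)

ranked = sorted(clusters,
                key=lambda c: odds_ratio(*clusters[c], hits_total, n_total),
                reverse=True)
print(ranked)  # small, strongly enriched clusters outrank large dilute ones
```

    Note how cluster B, despite having more members than C, ranks last: its hit odds are below the screen-wide background, which is exactly the behavior an odds-ratio ranking is meant to capture.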

  15. Edge electrospinning for high throughput production of quality nanofibers.

    PubMed

    Thoppey, N M; Bochinski, J R; Clarke, L I; Gorga, R E

    2011-08-26

    A novel, simple geometry for high throughput electrospinning from a bowl edge is presented that utilizes a vessel filled with a polymer solution and a concentric cylindrical collector. Successful fiber formation is presented for two different polymer systems with differing solution viscosity and solvent volatility. The process of jet initiation, resultant fiber morphology and fiber production rate are discussed for this unconfined feed approach. Under high voltage initiation, the jets spontaneously form directly on the fluid surface and rearrange along the circumference of the bowl to provide approximately equal spacing between spinning sites. Nanofibers currently produced from bowl electrospinning are identical in quality to those fabricated by traditional needle electrospinning (TNE) with a demonstrated ∼ 40 times increase in the production rate for a single batch of solution due primarily to the presence of many simultaneous jets. In the bowl electrospinning geometry, the electric field pattern and subsequent effective feed rate are very similar to those parameters found under optimized TNE experiments. Consequently, the electrospinning process per jet is directly analogous to that in TNE and thereby results in the same quality of nanofibers.

  16. A Fully Automated High-Throughput Training System for Rodents

    PubMed Central

    Poddar, Rajesh; Kawai, Risa; Ölveczky, Bence P.

    2013-01-01

    Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal’s home-cage our system dramatically reduces the efforts involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors. PMID:24349451

  17. High-Throughput Single-Cell Manipulation in Brain Tissue

    PubMed Central

    Steinmeyer, Joseph D.; Yanik, Mehmet Fatih

    2012-01-01

    The complexity of neurons and neuronal circuits in brain tissue requires the genetic manipulation, labeling, and tracking of single cells. However, current methods for manipulating cells in brain tissue are limited to either bulk techniques, lacking single-cell accuracy, or manual methods that provide single-cell accuracy but at significantly lower throughputs and repeatability. Here, we demonstrate high-throughput, efficient, reliable, and combinatorial delivery of multiple genetic vectors and reagents into targeted cells within the same tissue sample with single-cell accuracy. Our system automatically loads nanoliter-scale volumes of reagents into a micropipette from multiwell plates, targets and transfects single cells in brain tissues using a robust electroporation technique, and finally preps the micropipette by automated cleaning for repeating the transfection cycle. We demonstrate multi-colored labeling of adjacent cells, both in organotypic and acute slices, and transfection of plasmids encoding different protein isoforms into neurons within the same brain tissue for analysis of their effects on linear dendritic spine density. Our platform could also be used to rapidly deliver, both ex vivo and in vivo, a variety of genetic vectors, including optogenetic and cell-type specific agents, as well as fast-acting reagents such as labeling dyes, calcium sensors, and voltage sensors to manipulate and track neuronal circuit activity at single-cell resolution. PMID:22536416

  18. High-throughput single-cell manipulation in brain tissue.

    PubMed

    Steinmeyer, Joseph D; Yanik, Mehmet Fatih

    2012-01-01

    The complexity of neurons and neuronal circuits in brain tissue requires the genetic manipulation, labeling, and tracking of single cells. However, current methods for manipulating cells in brain tissue are limited to either bulk techniques, lacking single-cell accuracy, or manual methods that provide single-cell accuracy but at significantly lower throughputs and repeatability. Here, we demonstrate high-throughput, efficient, reliable, and combinatorial delivery of multiple genetic vectors and reagents into targeted cells within the same tissue sample with single-cell accuracy. Our system automatically loads nanoliter-scale volumes of reagents into a micropipette from multiwell plates, targets and transfects single cells in brain tissues using a robust electroporation technique, and finally preps the micropipette by automated cleaning for repeating the transfection cycle. We demonstrate multi-colored labeling of adjacent cells, both in organotypic and acute slices, and transfection of plasmids encoding different protein isoforms into neurons within the same brain tissue for analysis of their effects on linear dendritic spine density. Our platform could also be used to rapidly deliver, both ex vivo and in vivo, a variety of genetic vectors, including optogenetic and cell-type specific agents, as well as fast-acting reagents such as labeling dyes, calcium sensors, and voltage sensors to manipulate and track neuronal circuit activity at single-cell resolution.

  19. A Microfluidic, High Throughput Protein Crystal Growth Method for Microgravity

    PubMed Central

    Carruthers Jr, Carl W.; Gerdts, Cory; Johnson, Michael D.; Webb, Paul

    2013-01-01

    The attenuation of sedimentation and convection in microgravity can sometimes decrease irregularities formed during macromolecular crystal growth. Current terrestrial protein crystal growth (PCG) capabilities are very different from those used during the Shuttle era and those currently on the International Space Station (ISS). The focus of this experiment was to demonstrate the use of a commercial off-the-shelf, high throughput PCG method in microgravity. Using Protein BioSolutions’ microfluidic Plug Maker™/CrystalCard™ system, we tested the ability to grow crystals of a regulator of glucose metabolism and adipogenesis, the peroxisome proliferator-activated receptor gamma (apo-hPPAR-γ LBD), as well as several PCG standards. Overall, we sent 25 CrystalCards™ to the ISS, containing ~10,000 individual microgravity PCG experiments in a 3U NanoRacks NanoLab (1U = 10³ cm³). After 70 days on the ISS, our samples were returned with 16 of 25 (64%) microgravity cards having crystals, compared with 12 of 25 (48%) of the ground controls. Encouragingly, there were more apo-hPPAR-γ LBD crystals in the microgravity PCG cards than in the 1g controls. We hope these positive results will introduce the PCG standard of low sample volume and large experimental density to the microgravity environment and provide new opportunities for macromolecular samples that crystallize poorly in standard laboratories. PMID:24278480

  20. A High-Throughput Yeast Halo Assay for Bioactive Compounds.

    PubMed

    Bray, Walter; Lokey, R Scott

    2016-09-01

    When a disk of filter paper is impregnated with a cytotoxic or cytostatic drug and added to solid medium seeded with yeast, a visible clear zone forms around the disk whose size depends on the concentration and potency of the drug. This is the traditional "halo" assay and provides a convenient, if low-throughput, read-out of biological activity that has been the mainstay of antifungal and antibiotic testing for decades. Here, we describe a protocol for a high-throughput version of the halo assay, which uses an array of 384 pins to deliver ∼200 nL of stock solutions from compound plates onto single-well plates seeded with yeast. Using a plate reader in the absorbance mode, the resulting halos can be quantified and the data archived in the form of flat files that can be connected to compound databases with standard software. This assay has the convenience associated with the visual readout of the traditional halo assay but uses far less material and can be automated to screen thousands of compounds per day.
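    Quantifying a halo from an absorbance scan amounts to scoring the absorbance deficit around each pin position relative to the uninhibited yeast lawn. The sketch below uses invented readings and an invented hit threshold; the actual scoring in the published protocol may differ.

```python
# Toy sketch of scoring halos from a plate-reader absorbance scan.
# A cytotoxic compound leaves a low-absorbance (clear) zone in the yeast
# lawn; we score each pin position by its absorbance deficit relative to
# the lawn background. All values and positions are invented.
lawn = 1.0  # background absorbance of uninhibited yeast growth

# absorbance readings in a 3x3 window centred on each pin position
readings = {
    "A1": [0.95, 0.97, 1.02, 0.99, 1.00, 0.98, 1.01, 0.96, 1.03],  # inactive
    "A2": [0.30, 0.25, 0.40, 0.28, 0.22, 0.35, 0.33, 0.27, 0.31],  # clear halo
}

def halo_score(window, background=lawn):
    mean = sum(window) / len(window)
    return max(0.0, background - mean)  # absorbance deficit; 0 = no halo

scores = {well: round(halo_score(r), 3) for well, r in readings.items()}
hits = [w for w, s in scores.items() if s > 0.3]  # illustrative cutoff
print(scores, hits)
```

    Because the readout is a flat numeric score per well, the results can be written directly to the flat files the assay connects to compound databases.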

  1. High-Throughput Genotyping with Single Nucleotide Polymorphisms

    PubMed Central

    Ranade, Koustubh; Chang, Mau-Song; Ting, Chih-Tai; Pei, Dee; Hsiao, Chin-Fu; Olivier, Michael; Pesich, Robert; Hebert, Joan; Chen, Yii-Der I.; Dzau, Victor J.; Curb, David; Olshen, Richard; Risch, Neil; Cox, David R.; Botstein, David

    2001-01-01

    To make large-scale association studies a reality, automated high-throughput methods for genotyping with single-nucleotide polymorphisms (SNPs) are needed. We describe PCR conditions that permit the use of the TaqMan or 5′ nuclease allelic discrimination assay for typing large numbers of individuals with any SNP, together with computational methods that allow genotypes to be assigned automatically. To demonstrate the utility of these methods, we typed >1600 individuals for a G-to-T transversion that results in a glutamate-to-aspartate substitution at position 298 in the endothelial nitric oxide synthase gene, and for a G/C polymorphism (newly identified in our laboratory) in intron 8 of the 11β-hydroxylase gene. The genotyping method is accurate (we estimate an error rate of fewer than 1 in 2000 genotypes), rapid (with five 96-well PCR machines, one fluorescent reader, and no automated pipetting, one person can generate over one thousand genotypes in one day), and flexible (a new SNP can be tested for association in less than one week). Indeed, large-scale genotyping has been accomplished for 23 other SNPs in 13 different genes using this method. In addition, we identified three “pseudo-SNPs” (WIAF1161, WIAF2566, and WIAF335) that are probably the result of a duplication. PMID:11435409
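    Automatic genotype assignment from a two-dye allelic discrimination assay can be reduced to classifying each sample's endpoint fluorescence vector. Real pipelines cluster all samples on a plate jointly; the minimal sketch below instead calls each sample from the angle of its (allele 1, allele 2) signal vector, with thresholds chosen purely for illustration.

```python
import math

# Minimal sketch of automatic allelic-discrimination calls from endpoint
# fluorescence. Thresholds and signal values are assumptions, not taken
# from the paper.
def call_genotype(sig1, sig2, low=0.35, high=1.2):
    if max(sig1, sig2) < 0.2:           # no amplification
        return "undetermined"
    theta = math.atan2(sig2, sig1)      # 0 -> pure allele 1, pi/2 -> pure allele 2
    if theta < low:
        return "1/1"                    # homozygous allele 1
    if theta > high:
        return "2/2"                    # homozygous allele 2
    return "1/2"                        # heterozygous

samples = [(2.1, 0.1), (1.0, 1.1), (0.1, 1.9), (0.05, 0.08)]
print([call_genotype(a, b) for a, b in samples])
```

    A production version would fit the three clusters per plate rather than hard-coding angle cutoffs, which is what makes fully automatic calling robust across assays.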

  2. Probabilistic Assessment of High-Throughput Wireless Sensor Networks

    PubMed Central

    Kim, Robin E.; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F.; Song, Junho

    2016-01-01

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies of WSS network reliability mostly used temporal signal indicators, composed of a small number of packets, to assess network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a large number of packets delivered within the communication, such an approach is not sufficient. Instead, in this study, a model is proposed that can assess, probabilistically, the long-term performance of the network. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. An empirical limit-state function is then determined and used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small wireless network and to a full-bridge wireless network. Performing the proposed analysis on complex sensor networks allows an optimized sensor topology to be achieved. PMID:27258270
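    The core idea, drawing communication-quality samples, applying a limit-state function g, and estimating P(g < 0) by Monte Carlo, can be sketched as follows. The Gaussian link-quality model, its parameters, and the 95% delivery threshold are assumptions for illustration; the paper fits its limit-state function to measured data.

```python
import random

# Monte Carlo estimate of communication-failure probability from a
# limit-state function. Distribution and threshold are illustrative.
random.seed(42)

def limit_state(packet_success_rate, required=0.95):
    # g > 0: the link delivers enough packets for high-throughput SHM data
    return packet_success_rate - required

N = 100_000
failures = 0
for _ in range(N):
    rate = min(1.0, random.gauss(0.97, 0.02))  # assumed link-quality model
    if limit_state(rate) < 0:
        failures += 1

p_fail = failures / N
print(f"estimated P(failure) ~= {p_fail:.3f}")
```

    Repeating this estimate per candidate sensor placement is what lets the probabilistic model drive topology optimization.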

  3. High Throughput Screening for Anti–Trypanosoma cruzi Drug Discovery

    PubMed Central

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-01-01

    The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti–T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti–T. cruzi drug entities in the near future, are reviewed here. PMID:25474364

  4. A rapid transglutaminase assay for high-throughput screening applications.

    PubMed

    Wu, Yu-Wei; Tsai, Yu-Hui

    2006-10-01

    Transglutaminases (TGs) are widely distributed enzymes that catalyze posttranslational modification of proteins by Ca²⁺-dependent cross-linking reactions. Members of the TG family participate in many significant biological processes such as tissue regeneration, cell differentiation, apoptosis, and certain pathologies. A novel technique for the TG activity assay was developed in this study. It is based on the rapid capture, fluorescence quenching, and fast separation of unreacted fluorescent molecules from the macromolecular product with magnetic dextran-coated charcoal. As little as 3 ng of guinea pig liver transglutaminase (gpTG) could be detected by the method, and the activities of 96 TG samples could be measured within an hour. The Km of gpTG determined by this method for monodansylcadaverine (dansyl-CAD) and N,N-dimethylcasein was 14 and 5 μM, respectively. A typical competitive inhibition pattern of cystamine on dansyl-CAD for gpTG activity was also demonstrated. The application of this technique is not limited to the use of dansyl-CAD as the fluorescent substrate of TG; other small fluorophore-labeled TG substrates may substitute for dansyl-CAD. Finally, this method is rapid, highly sensitive, and inexpensive. It is suitable not only for high-throughput screening of enzymes or enzyme inhibitors but also for enzyme kinetic analysis.

  5. Fulcrum: condensing redundant reads from high-throughput sequencing studies

    PubMed Central

    Burriesci, Matthew S.; Lehnert, Erik M.; Pringle, John R.

    2012-01-01

    Motivation: Ultra-high-throughput sequencing produces duplicate and near-duplicate reads, which can consume computational resources in downstream applications. A tool that collapses such reads should reduce storage and assembly complications and costs. Results: We developed Fulcrum to collapse identical and near-identical Illumina and 454 reads (such as those from PCR clones) into single error-corrected sequences; it can process paired-end as well as single-end reads. Fulcrum is customizable and can be deployed on a single machine, a local network or a commercially available MapReduce cluster, and it has been optimized to maximize ease-of-use, cross-platform compatibility and future scalability. Sequence datasets have been collapsed by up to 71%, and the reduced number and improved quality of the resulting sequences allow assemblers to produce longer contigs while using less memory. Availability and implementation: Source code and a tutorial are available at http://pringlelab.stanford.edu/protocols.html under a BSD-like license. Fulcrum was written and tested in Python 2.6, and the single-machine and local-network modes depend on a modified version of the Parallel Python library (provided). Contact: erik.m.lehnert@gmail.com. Supplementary information: Supplementary information is available at Bioinformatics online. PMID:22419786
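    Collapsing identical and near-identical reads into a single error-corrected sequence can be illustrated with a prefix-grouping, majority-vote sketch. This is the general idea only; Fulcrum's actual grouping, error model, and quality handling are considerably more involved.

```python
from collections import Counter, defaultdict

# Sketch of duplicate-read collapsing: reads sharing an identical prefix
# are grouped, and each group is reduced to a single majority-vote
# consensus sequence. Reads here are invented and of equal length.
def collapse(reads, k=4):
    groups = defaultdict(list)
    for r in reads:
        groups[r[:k]].append(r)
    consensus = []
    for members in groups.values():
        # per-position majority vote corrects isolated sequencing errors
        cols = zip(*members)
        consensus.append("".join(Counter(c).most_common(1)[0][0] for c in cols))
    return consensus

reads = ["ACGTACGT", "ACGTACGT", "ACGTACGA",   # one read with an error
         "TTTTCCCC"]
collapsed = collapse(reads)
print(collapsed)  # 4 reads collapsed to 2 consensus sequences
```

    The memory and assembly savings the abstract reports come from exactly this reduction: downstream tools see one clean sequence per clone instead of every duplicate.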

  6. Comprehensive analysis of high-throughput screening data

    NASA Astrophysics Data System (ADS)

    Heyse, Stephan

    2002-06-01

    High-Throughput Screening (HTS) data in its entirety is a valuable raw material for the drug-discovery process. It provides the most complete information about the biological activity of a company's compounds. However, its quantity, complexity and heterogeneity require novel, sophisticated approaches to data analysis. At GeneData, we are developing methods for large-scale, synoptical mining of screening data in a five-step analysis: (1) Quality Assurance: checking data for experimental artifacts and eliminating low-quality data. (2) Biological Profiling: clustering and ranking compounds based on their biological activity, taking into account specific characteristics of HTS data. (3) Rule-based Classification: applying user-defined rules to biological and chemical properties, and providing hypotheses on the biological mode of action of compounds. (4) Joint Biological-Chemical Analysis: associating chemical compound data with HTS data, providing hypotheses for structure-activity relationships. (5) Integration with Genomic and Gene Expression Data: linking into other components of GeneData's bioinformatics platform, and assessing the compounds' modes of action, toxicity, and metabolic properties. These analyses address issues that are crucial for the correct interpretation and full exploitation of screening data. They lead to a sound rating of assays and compounds at an early stage of the lead-finding process.
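    Step (1), quality assurance, is often implemented with the Z'-factor of Zhang et al. (1999): plates whose positive and negative controls are poorly separated are discarded before profiling. This is a standard HTS QC metric, not necessarily the one GeneData uses, and the control values below are invented.

```python
import statistics

# Z'-factor plate QC: 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# Values near 1 mean well-separated controls; by convention, Z' >= 0.5
# marks a screen-quality plate.
def z_prime(pos, neg):
    return 1 - 3 * (statistics.stdev(pos) + statistics.stdev(neg)) / \
               abs(statistics.mean(pos) - statistics.mean(neg))

good_plate = z_prime(pos=[95, 98, 102, 100], neg=[3, 5, 2, 4])
bad_plate  = z_prime(pos=[60, 90, 40, 75],  neg=[10, 35, 5, 30])

print(round(good_plate, 2), round(bad_plate, 2))
```

    A Z' below zero, as in the second plate, means the control distributions overlap and any "hits" from that plate are unreliable.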

  7. Picking Cell Lines for High-Throughput Transcriptomic Toxicity ...

    EPA Pesticide Factsheets

    High throughput, whole genome transcriptomic profiling is a promising approach to comprehensively evaluate chemicals for potential biological effects. To be useful for in vitro toxicity screening, gene expression must be quantified in a set of representative cell types that captures the diversity of potential responses across chemicals. The ideal dataset to select these cell types would consist of hundreds of cell types treated with thousands of chemicals, but does not yet exist. However, basal gene expression data may be useful as a surrogate for representing the relevant biological space necessary for cell type selection. The goal of this study was to identify a small (< 20) number of cell types that capture a large, quantifiable fraction of basal gene expression diversity. Three publicly available collections of Affymetrix U133+2.0 cellular gene expression data were used: 1) 59 cell lines from the NCI60 set; 2) 303 primary cell types from the Mabbott et al (2013) expression atlas; and 3) 1036 cell lines from the Cancer Cell Line Encyclopedia. The data were RMA normalized, log-transformed, and the probe sets mapped to HUGO gene identifiers. The results showed that <20 cell lines capture only a small fraction of the total diversity in basal gene expression when evaluated using either the entire set of 20960 HUGO genes or a subset of druggable genes likely to be chemical targets. The fraction of the total gene expression variation explained was consistent when

  8. High-throughput screening and optimization of photoembossed relief structures.

    PubMed

    Adams, Nico; De Gans, Berend-Jan; Kozodaev, Dimitri; Sanchez, Carlos; Bastiaansen, Cees W M; Broer, Dirk J; Schubert, Ulrich S

    2006-01-01

    A methodology for the rapid design, screening, and optimization of coating systems with surface relief structures, using a combination of statistical experimental design, high-throughput experimentation, data mining, and graphical and mathematical optimization routines was developed. The methodology was applied to photopolymers used in photoembossing applications. A library of 72 films was prepared by dispensing a given amount of sample onto a chemically patterned substrate consisting of hydrophilic areas separated by fluorinated hydrophobic barriers. Film composition and film processing conditions were determined using statistical experimental design. The surface topology of the films was characterized by automated AFM. Subsequently, models explaining the dependence of surface topologies on sample composition and processing parameters were developed and used for screening a virtual 4000-membered in silico library of photopolymer lacquers. Simple graphical optimization or Pareto algorithms were subsequently used to find an ensemble of formulations, which were optimal with respect to a predefined set of properties, such as aspect ratio and shape of the relief structures.

  9. Microfluidic system for high throughput characterisation of echogenic particles.

    PubMed

    Rademeyer, Paul; Carugo, Dario; Lee, Jeong Yu; Stride, Eleanor

    2015-01-21

    Echogenic particles, such as microbubbles and volatile liquid micro/nano droplets, have shown considerable potential in a variety of clinical diagnostic and therapeutic applications. The accurate prediction of their response to ultrasound excitation is however extremely challenging, and this has hindered the optimisation of techniques such as quantitative ultrasound imaging and targeted drug delivery. Existing characterisation techniques, such as ultra-high speed microscopy provide important insights, but suffer from a number of limitations; most significantly difficulty in obtaining large data sets suitable for statistical analysis and the need to physically constrain the particles, thereby altering their dynamics. Here a microfluidic system is presented that overcomes these challenges to enable the measurement of single echogenic particle response to ultrasound excitation. A co-axial flow focusing device is used to direct a continuous stream of unconstrained particles through the combined focal region of an ultrasound transducer and a laser. Both the optical and acoustic scatter from individual particles are then simultaneously recorded. Calibration of the device and example results for different types of echogenic particle are presented, demonstrating a high throughput of up to 20 particles per second and the ability to resolve changes in particle radius down to 0.1 μm with an uncertainty of less than 3%.

  10. Edge electrospinning for high throughput production of quality nanofibers

    NASA Astrophysics Data System (ADS)

    Thoppey, N. M.; Bochinski, J. R.; Clarke, L. I.; Gorga, R. E.

    2011-08-01

    A novel, simple geometry for high throughput electrospinning from a bowl edge is presented that utilizes a vessel filled with a polymer solution and a concentric cylindrical collector. Successful fiber formation is presented for two different polymer systems with differing solution viscosity and solvent volatility. The process of jet initiation, resultant fiber morphology and fiber production rate are discussed for this unconfined feed approach. Under high voltage initiation, the jets spontaneously form directly on the fluid surface and rearrange along the circumference of the bowl to provide approximately equal spacing between spinning sites. Nanofibers currently produced from bowl electrospinning are identical in quality to those fabricated by traditional needle electrospinning (TNE) with a demonstrated ~ 40 times increase in the production rate for a single batch of solution due primarily to the presence of many simultaneous jets. In the bowl electrospinning geometry, the electric field pattern and subsequent effective feed rate are very similar to those parameters found under optimized TNE experiments. Consequently, the electrospinning process per jet is directly analogous to that in TNE and thereby results in the same quality of nanofibers.

  11. High Throughput Multispectral Image Processing with Applications in Food Science

    PubMed Central

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry with respect to food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimating and even predicting food quality but also for detecting adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility, lower the cost of information extraction, and speed up quality assessment, without human intervention. The outcome of the image processing is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for segmentation process optimization. Through our evaluation we demonstrate its efficiency and robustness against currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples. PMID:26466349

  12. High throughput jet singlet oxygen generator for multi kilowatt SCOIL

    NASA Astrophysics Data System (ADS)

    Rajesh, R.; Singhal, Gaurav; Mainuddin; Tyagi, R. K.; Dawar, A. L.

    2010-06-01

    A jet flow singlet oxygen generator (JSOG) capable of handling chlorine flows of nearly 1.5 mol s⁻¹ has been designed, developed, and tested. The generator is designed in a modular configuration taking into consideration the practical aspects of handling high throughput flows without catastrophic BHP carry-over. While a cross-flow configuration has been reported for such high flow rates, the generator utilized in the present study is a counter-flow configuration. A near-vertical extraction of singlet oxygen is effected at the generator exit, followed by a 90° rotation of the flow, forming a novel verti-horizontal COIL scheme. This allows the COIL to be operated with a vertical-extraction SOG followed by the horizontal arrangement of subsequent COIL systems such as the supersonic nozzle, cavity, and supersonic diffuser. This enables a more uniform weight distribution from the point of view of mobile and other platform-mounted systems, which is highly relevant for large-scale systems. The present study discusses the design aspects of the jet singlet oxygen generator along with its test results over various operating ranges. Typically, at the intended design flow rates, the chlorine utilization and singlet oxygen yield have been observed to be ∼94% and ∼64%, respectively.

  13. High-throughput PCR in silicon based microchamber array.

    PubMed

    Nagai, H; Murakami, Y; Yokoyama, K; Tamiya, E

    2001-12-01

    Highly integrated hybridization assays and capillary electrophoresis have improved the throughput of DNA analysis. The shift to high-throughput analysis requires a high-speed DNA amplification system, and several rapid PCR systems have been developed. In these thermal cyclers, the temperature is controlled by more effective means than the large heating/cooling block that prevents rapid thermal cycling. In our research, high-speed PCR was performed using a silicon-based microchamber array and three heat blocks. The highly integrated microchamber array was fabricated by semiconductor microfabrication techniques. The temperature of the PCR microchamber was controlled by alternating between three heat blocks at different temperatures. In general, silicon has excellent thermal conductivity, and the heat capacity of the miniaturized sample volume is small. Hence, the heating/cooling rate was rapid, approximately 16 °C/s. The rapid PCR was therefore completed in 18 min for 40 cycles, reducing the thermal cycling time to one-tenth that of a commercial PCR instrument (Model 9600, PE Applied Biosystems; 3 h).
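    The cycling numbers quoted above can be checked with back-of-envelope arithmetic: 40 cycles in 18 min, ramping at roughly 16 °C/s between three temperature blocks. The denature/anneal/extend temperatures below are typical PCR values assumed for illustration, not taken from the paper.

```python
# Back-of-envelope check of the rapid-PCR cycling numbers.
ramp_rate = 16.0                     # C/s, as reported
temps = [94, 55, 72]                 # C: denature -> anneal -> extend (assumed)

# total temperature swing per cycle: 94 -> 55 -> 72 -> 94
swing = abs(94 - 55) + abs(55 - 72) + abs(72 - 94)   # 78 C
ramp_time = swing / ramp_rate                        # ~4.9 s spent ramping

cycle_time = 18 * 60 / 40            # 27 s per cycle overall
hold_time = cycle_time - ramp_time   # ~22 s left for the three holds

print(f"{cycle_time:.0f} s/cycle, {ramp_time:.1f} s ramping, {hold_time:.1f} s holds")
```

    The point of the fast silicon chamber is visible here: only about a fifth of each 27 s cycle is spent changing temperature, whereas a conventional block cycler ramping at ~1 °C/s would spend over a minute per cycle on ramps alone.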

  14. Enzyme assay design for high-throughput screening.

    PubMed

    Williams, Kevin P; Scott, John E

    2009-01-01

    Enzymes continue to be a major drug target class for the pharmaceutical industry with high-throughput screening the approach of choice for identifying initial active chemical compounds. The development of fluorescent- or absorbance-based readouts typically remains the formats of choice for enzyme screens and a wealth of experience from both industry and academia has led to a comprehensive set of standardized assay development and validation guidelines for enzyme assays. In this chapter, we generalize approaches to developing, validating, and troubleshooting assays that should be applicable in both industrial and academic settings. Real-life examples of various enzyme classes including kinases, proteases, transferases, and phosphatases are used to illustrate assay development approaches and solutions. Practical examples are given for how to deal with low-purity enzyme targets, compound interference, and identification of activators. Assay acceptance criteria and a number of assay notes on pitfalls to avoid should provide pointers on how to develop a suitable enzymatic assay applicable for HTS.

  15. Validation of high throughput sequencing and microbial forensics applications

    PubMed Central

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of the specific application, and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security. PMID:25101166

  16. High Throughput Heuristics for Prioritizing Human Exposure to ...

    EPA Pesticide Factsheets

    The risk posed to human health by any of the thousands of untested anthropogenic chemicals in our environment is a function of both the potential hazard presented by the chemical and the possibility of being exposed. Without the capacity to make quantitative, albeit uncertain, forecasts of exposure, the putative risk of adverse health effects from a chemical cannot be evaluated. We used Bayesian methodology to infer ranges of exposure intakes that are consistent with biomarkers of chemical exposure identified in urine samples from the U.S. population by the National Health and Nutrition Examination Survey (NHANES). We then performed linear regression on inferred exposure for demographic subsets of NHANES demarked by age, gender, and weight, using high-throughput chemical descriptors gleaned from databases and chemical structure-based calculators. We find that five of these descriptors are capable of explaining roughly 50% of the variability across chemicals for all the demographic groups examined, including children aged 6-11. For the thousands of chemicals with no other source of information, this approach allows rapid and efficient prediction of average exposure intake of environmental chemicals. The methods described in this manuscript provide a greatly improved methodology for high-throughput screening of human exposure to environmental chemicals. The manuscript includes a ranking of 7785 environmental chemicals with respect to potential human exposure, including most of the Tox21 in vitro library.
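
    The regression step described above can be sketched as ordinary least squares of inferred intake on chemical descriptors. Everything below is simulated: the five descriptor columns and intake values are placeholders standing in for the NHANES-derived data and database/structure-calculator descriptors.

```python
# OLS of (synthetic) inferred log-intake on five hypothetical descriptors.
import numpy as np

rng = np.random.default_rng(0)
n_chem = 200
X = rng.standard_normal((n_chem, 5))            # five descriptor columns
true_beta = np.array([0.8, -0.5, 0.3, 0.0, 0.2])
log_intake = X @ true_beta + rng.standard_normal(n_chem)  # noise ~ signal

A = np.column_stack([np.ones(n_chem), X])       # add an intercept column
beta_hat, *_ = np.linalg.lstsq(A, log_intake, rcond=None)

resid = log_intake - A @ beta_hat
r2 = 1.0 - resid.var() / log_intake.var()
print(f"R^2 = {r2:.2f}")   # ~0.5 when noise variance matches signal variance
```

    With noise variance comparable to the signal, the fit explains roughly half the variability, mirroring the ~50% figure reported for the five descriptors.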

  17. Surface free energy activated high-throughput cell sorting.

    PubMed

    Zhang, Xinru; Zhang, Qian; Yan, Tao; Jiang, Zeyi; Zhang, Xinxin; Zuo, Yi Y

    2014-09-16

    Cell sorting is an important screening process in microbiology, biotechnology, and clinical research. Existing methods are mainly based on single-cell analysis, as in flow cytometric and microfluidic cell sorters. Here we report a label-free bulk method for sorting cells by differentiating their characteristic surface free energies (SFEs). We demonstrated the feasibility of this method by sorting model binary cell mixtures of various bacterial species, including Pseudomonas putida KT2440, Enterococcus faecalis ATCC 29212, Salmonella Typhimurium ATCC 14028, and Escherichia coli DH5α. This method can effectively separate 10^10 bacterial cells within 30 min. Individual bacterial species can be sorted with up to 96% efficiency, and the cell viability ratio can be as high as 99%. In addition to its capacity for sorting evenly mixed bacterial cells, we demonstrated the feasibility of this method in selecting and enriching cells of minor populations in the mixture (present at only 1% in quantity) to a purity as high as 99%. This SFE-activated method may be used as a stand-alone method for quickly sorting a large quantity of bacterial cells or as a prescreening tool for microbial discrimination. Given its label-free operation, high throughput, low cost, and simplicity, this SFE-activated cell sorting method has potential in various applications for sorting cells and abiotic particles.

  18. A Call for Nominations of Quantitative High-Throughput ...

    EPA Pesticide Factsheets

    The National Research Council of the United States National Academies of Science has recently released a document outlining a long-range vision and strategy for transforming toxicity testing from largely whole-animal-based testing to one based on in vitro assays. “Toxicity Testing in the 21st Century: A Vision and a Strategy” advises a focus on relevant human toxicity pathway assays. Toxicity pathways are defined in the document as “Cellular response pathways that, when sufficiently perturbed, are expected to result in adverse health effects”. Results of such pathway screens would serve as a filter to drive selection of more specific, targeted testing that will complement and validate the pathway assays. In response to this report, the US EPA has partnered with two NIH organizations, the National Toxicology Program and the NIH Chemical Genomics Center (NCGC), in a program named Tox21. A major goal of this collaboration is to screen chemical libraries consisting of known toxicants, chemicals of environmental and occupational exposure concern, and human pharmaceuticals in cell-based pathway assays. Currently, approximately 3000 compounds (increasing to 9000 by the end of 2009) are being validated and screened in quantitative high-throughput (qHTS) format at the NCGC, producing extensive concentration-response data for a diverse set of potential toxicity pathways. The Tox21 collaboration is extremely interested in accessing additional toxicity pathway assays.

  19. A simple high-throughput enzymeimmunoassay for norethisterone (norethindrone).

    PubMed

    Turkes, A; Read, G F; Riad-Fahmy, D

    1982-05-01

    A direct enzymeimmunoassay having the sensitivity required for determining norethisterone concentrations in small aliquots of plasma (10 µl) has been developed. This assay featured a solid-phase antiserum raised against a norethisterone-11α-hemisuccinyl/bovine serum albumin conjugate. The antiserum was coupled to cyanogen bromide-activated magnetisable cellulose, and antibody-bound and free fractions were separated by a simple magnetic device. A norethisterone/horseradish peroxidase conjugate was used as the label, with o-phenylenediamine/hydrogen peroxide as the substrate for colour development. The results obtained by this direct EIA, which allowed processing of at least 100 samples per day, were compared with those of a well-validated enzymeimmunoassay featuring solvent extraction and centrifugal separation of antibody-bound and free steroid; the results were in excellent agreement (n = 30; r > 0.99), suggesting the usefulness of the simple high-throughput procedure for processing the large sample numbers generated by field investigations and pharmacokinetic studies.

  20. Aptamers as reagents for high-throughput screening.

    PubMed

    Green, L S; Bell, C; Janjic, N

    2001-05-01

    The identification of new drug candidates from chemical libraries is a major component of discovery research in many pharmaceutical companies. Given the large size of many conventional and combinatorial libraries and the rapid increase in the number of possible therapeutic targets, the speed with which efficient high-throughput screening (HTS) assays can be developed can be a rate-limiting step in the discovery process. We show here that aptamers, nucleic acids that bind other molecules with high affinity, can be used as versatile reagents in competition binding HTS assays to identify and optimize small-molecule ligands to protein targets. To illustrate this application, we have used labeled aptamers to platelet-derived growth factor B-chain and wheat germ agglutinin to screen two sets of potential small-molecule ligands. In both cases, binding affinities of all ligands tested (small molecules and aptamers) were strongly correlated with their inhibitory potencies in functional assays. The major advantages of using aptamers in HTS assays are speed of aptamer identification, high affinity of aptamers for protein targets, relatively large aptamer-protein interaction surfaces, and compatibility with various labeling/detection strategies. Aptamers may be particularly useful in HTS assays with protein targets that have no known binding partners such as orphan receptors. Since aptamers that bind to proteins are often specific and potent antagonists of protein function, the use of aptamers for target validation can be coupled with their subsequent use in HTS.

  1. PRISM: a data management system for high-throughput proteomics.

    PubMed

    Kiebel, Gary R; Auberry, Ken J; Jaitly, Navdeep; Clark, David A; Monroe, Matthew E; Peterson, Elena S; Tolić, Nikola; Anderson, Gordon A; Smith, Richard D

    2006-03-01

    Advanced proteomic research efforts involving areas such as systems biology or biomarker discovery are enabled by the use of high level informatics tools that allow the effective analysis of large quantities of differing types of data originating from various studies. Performing such analyses on a large scale is not feasible without a computational platform that performs data processing and management tasks. Such a platform must be able to provide high-throughput operation while having sufficient flexibility to accommodate evolving data analysis tools and methodologies. The Proteomics Research Information Storage and Management system (PRISM) provides a platform that serves the needs of the accurate mass and time tag approach developed at Pacific Northwest National Laboratory. PRISM incorporates a diverse set of analysis tools and allows a wide range of operations to be incorporated by using a state machine that is accessible to independent, distributed computational nodes. The system has scaled well as data volume has increased over several years, while allowing adaptability for incorporating new and improved data analysis tools for more effective proteomics research.
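
    A data-management pipeline of this kind typically drives each job through an explicit state machine, so that independent, distributed workers can only move work along legal paths. The sketch below is illustrative only; PRISM's actual states and schema are not described in the abstract.

```python
# Illustrative job state machine (states and transitions are invented).
ALLOWED = {
    "new":      {"queued"},
    "queued":   {"running"},
    "running":  {"complete", "failed"},
    "failed":   {"queued"},     # failed jobs may be requeued for retry
    "complete": set(),          # terminal state
}

class Job:
    def __init__(self, job_id):
        self.job_id = job_id
        self.state = "new"

    def advance(self, new_state):
        # Reject any transition not listed for the current state.
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        return self.state

job = Job("dataset-42")
for s in ("queued", "running", "failed", "queued", "running", "complete"):
    job.advance(s)
print(job.state)   # complete
```

    Centralizing the transition table is what lets heterogeneous analysis tools plug into the pipeline: a worker only needs to request a transition, not understand the whole workflow.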

  2. High-throughput optical screening of cellular mechanotransduction

    NASA Astrophysics Data System (ADS)

    Compton, Jonathan L.; Luo, Justin C.; Ma, Huan; Botvinick, Elliot; Venugopalan, Vasan

    2014-09-01

    We introduce an optical platform for rapid, high-throughput screening of exogenous molecules that affect cellular mechanotransduction. Our method initiates mechanotransduction in adherent cells using single laser-microbeam generated microcavitation bubbles without requiring flow chambers or microfluidics. These microcavitation bubbles expose adherent cells to a microtsunami, a transient microscale burst of hydrodynamic shear stress, which stimulates cells over areas approaching 1 mm². We demonstrate microtsunami-initiated mechanosignalling in primary human endothelial cells. This observed signalling is consistent with G-protein-coupled receptor stimulation, resulting in Ca2+ release by the endoplasmic reticulum. Moreover, we demonstrate the dose-dependent modulation of microtsunami-induced Ca2+ signalling by introducing a known inhibitor to this pathway. The imaging of Ca2+ signalling and its modulation by exogenous molecules demonstrates the capacity to initiate and assess cellular mechanosignalling in real time. We utilize this capability to screen the effects of a set of small molecules on cellular mechanotransduction in 96-well plates using standard imaging cytometry.

  3. Functional approach to high-throughput plant growth analysis

    PubMed Central

    2013-01-01

    Method: Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which produces a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, by taking the complex leaf overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module, GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results: Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA to cfq mutant plants under fluctuating light reveals a correlation between low photosynthetic rates and small plant area (compared to wild type), which raises the hypothesis that knocking out cfq changes the sensitivity of energy distribution under fluctuating light conditions to repress leaf growth. Availability: HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437
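
    The growth-modeling step (GMA) can be illustrated with a generic nonlinear fit. HPGA's exact growth model is not specified in the abstract, so the sketch assumes a logistic curve and recovers its parameters with a coarse grid search on synthetic, noiseless area measurements.

```python
# Fitting an assumed logistic growth model to synthetic plant-area data.
import numpy as np

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, rate r, inflection time t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.linspace(0, 20, 50)                      # days (arbitrary units)
area = logistic(t, K=12.0, r=0.6, t0=10.0)      # synthetic, noiseless "area"

# Coarse grid search for (K, r, t0) minimizing squared error.
best, best_sse = None, np.inf
for K in np.arange(8.0, 16.0, 0.5):
    for r in np.arange(0.2, 1.2, 0.1):
        for t0 in np.arange(6.0, 14.0, 0.5):
            sse = np.sum((area - logistic(t, K, r, t0)) ** 2)
            if sse < best_sse:
                best, best_sse = (K, r, t0), sse
print(best)   # recovers approximately (12.0, 0.6, 10.0)
```

    Real pipelines would use a proper nonlinear least-squares optimizer and noisy replicate data; the grid search just makes the model-fitting idea concrete.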

  4. Dimensioning storage and computing clusters for efficient high throughput computing

    NASA Astrophysics Data System (ADS)

    Accion, E.; Bria, A.; Bernabeu, G.; Caubet, M.; Delfino, M.; Espinal, X.; Merino, G.; Lopez, F.; Martinez, F.; Planas, E.

    2012-12-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and total volume of data continue to increase. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte-scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking, to prevent bottlenecks, overloads and undesired slowness that lead to lost CPU cycles and batch-job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
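
    The dimensioning argument reduces to finding which subsystem caps deliverable job throughput. A minimal sketch, with invented capacity numbers:

```python
# Back-of-envelope HTC dimensioning: throughput is capped by the weakest
# of CPU, disk and network capacity. All figures below are made up.

def max_jobs_per_hour(cores, cpu_h_per_job, disk_gbps, net_gbps, gb_read_per_job):
    cpu_cap = cores / cpu_h_per_job                  # jobs/h from CPU alone
    disk_cap = disk_gbps * 3600 / gb_read_per_job    # jobs/h the disks can feed
    net_cap = net_gbps * 3600 / gb_read_per_job      # jobs/h the network can feed
    caps = {"cpu": cpu_cap, "disk": disk_cap, "network": net_cap}
    bottleneck = min(caps, key=caps.get)
    return caps[bottleneck], bottleneck

rate, bottleneck = max_jobs_per_hour(
    cores=4000, cpu_h_per_job=2.0,    # 2000 jobs/h of CPU capacity
    disk_gbps=5.0, net_gbps=2.5,      # aggregate GB/s of each service
    gb_read_per_job=10.0)
print(f"{rate:.0f} jobs/h, limited by {bottleneck}")
```

    Here the internal network (900 jobs/h) undercuts both CPU (2000 jobs/h) and disk (1800 jobs/h), so adding cores would be wasted until the network is upgraded.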

  5. High-throughput screening of chemical effects on ...

    EPA Pesticide Factsheets

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples on steroidogenesis via HPLC-MS/MS quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a three-stage screening strategy. The first stage established the maximum tolerated concentration (MTC; >70% viability) per sample. The second stage quantified changes in hormone levels at the MTC, while the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were pre-stimulated with 10 µM forskolin for 48 h to induce steroidogenesis, followed by chemical treatment for 48 h. Of the 2,060 chemical samples evaluated, 524 samples were selected for six-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into five distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition.

  6. High-Throughput Preparation of New Photoactive Nanocomposites.

    PubMed

    Conterosito, Eleonora; Benesperi, Iacopo; Toson, Valentina; Saccone, Davide; Barbero, Nadia; Palin, Luca; Barolo, Claudia; Gianotti, Valentina; Milanesio, Marco

    2016-06-08

    New low-cost photoactive hybrid materials, based on organic luminescent molecules inserted into hydrotalcite (layered double hydroxides; LDH), were produced by exploiting the high-throughput liquid-assisted grinding (LAG) method. These materials are conceived for applications in dye-sensitized solar cells (DSSCs) as co-absorbers and in silicon photovoltaic (PV) panels to improve their efficiency, as they are able to emit where PV modules show maximum efficiency. A molecule that shows a large Stokes' shift was designed, synthesized, and intercalated into LDH. Two dyes already used in DSSCs were also intercalated to produce two new nanocomposites. LDH intercalation improves the stability of the organic dyes and allows their direct use in polymer melt blending. The prepared nanocomposites absorb sunlight from the UV to the visible and emit from the blue to the near-IR, and thus can be exploited for light-energy management. Finally, one nanocomposite was dispersed by melt blending into a poly(methyl methacrylate)-block-poly(n-butyl acrylate) copolymer to obtain a photoactive film. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  8. High Throughput Interrogation of Behavioral Transitions in C. elegans

    NASA Astrophysics Data System (ADS)

    Liu, Mochi; Shaevitz, Joshua; Leifer, Andrew

    We present a high-throughput method to probe transformations from neural activity to behavior in Caenorhabditis elegans to better understand how organisms change behavioral states. We optogenetically deliver white-noise stimuli to target sensory neurons or interneurons while simultaneously recording the movement of a population of worms. Using all the postural movement data collected, we computationally classify stereotyped behaviors in C. elegans by clustering based on the spectral properties of the instantaneous posture (Berman et al., 2014). Transitions between these behavioral clusters indicate discrete behavioral changes. To study the neural correlates dictating these transitions, we perform model-driven experiments and employ Linear-Nonlinear-Poisson cascades that take the white-noise stimulus as the input. The parameters of these models are fitted by reverse correlation from our measurements. The parameterized models of behavioral transitions predict the worm's response to novel stimuli and reveal the internal computations the animal makes before carrying out behavioral decisions. Preliminary results describe the neural-behavioral transformation between activity in mechanosensory neurons and reversal behavior.
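
    The reverse-correlation fit described above can be sketched for a simulated Linear-Nonlinear-Poisson neuron: with a Gaussian white-noise stimulus, the spike-triggered average recovers the linear filter up to scale. The filter shape, nonlinearity gain, and firing rates below are invented for illustration.

```python
# Simulated LNP neuron + reverse correlation (spike-triggered average, STA).
import numpy as np

rng = np.random.default_rng(1)
T, L = 50_000, 15
stim = rng.standard_normal(T)                    # Gaussian white-noise stimulus

t = np.arange(L)
true_filt = np.exp(-t / 4.0) * np.sin(t / 2.0)   # ground-truth filter weights

windows = np.lib.stride_tricks.sliding_window_view(stim, L)  # (T-L+1, L)
drive = windows @ true_filt                      # linear stage
rate = 0.1 * np.exp(0.5 * drive)                 # exponential nonlinearity
counts = rng.poisson(rate)                       # Poisson spike counts

sta = counts @ windows / counts.sum()            # reverse correlation

corr = np.corrcoef(sta, true_filt)[0, 1]
print(f"STA vs true filter correlation: {corr:.2f}")
```

    The same logic extends from spike counts to discrete behavioral-transition events: averaging the stimulus windows that precede each transition estimates the linear stage of the cascade.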

  9. Use of High Throughput Screening Data in IARC Monograph ...

    EPA Pesticide Factsheets

    Purpose: Evaluation of carcinogenic mechanisms serves a critical role in IARC monograph evaluations, and can lead to “upgrade” or “downgrade” of the carcinogenicity conclusions based on human and animal evidence alone. Three recent IARC monograph Working Groups (110, 112, and 113) pioneered analysis of high throughput in vitro screening data from the U.S. Environmental Protection Agency’s ToxCast program in evaluations of carcinogenic mechanisms. Methods: For monograph 110, ToxCast assay data across multiple nuclear receptors were used to test the hypothesis that PFOA acts exclusively through the PPAR family of receptors, with activity profiles compared to several prototypical nuclear receptor-activating compounds. For monographs 112 and 113, ToxCast assays were systematically evaluated and used as an additional data stream in the overall evaluation of the mechanistic evidence. Specifically, ToxCast assays were mapped to 10 “key characteristics of carcinogens” recently identified by an IARC expert group, and chemicals’ bioactivity profiles were evaluated both in absolute terms (number of relevant assays positive for bioactivity) and relative terms (ranking with respect to other compounds evaluated by IARC, using the ToxPi methodology). Results: PFOA activates multiple nuclear receptors in addition to the PPAR family in the ToxCast assays. ToxCast assays offered substantial coverage for 5 of the 10 “key characteristics”.

  10. High-throughput label-free image cytometry and image-based classification of live Euglena gracilis

    PubMed Central

    Lei, Cheng; Ito, Takuro; Ugawa, Masashi; Nozawa, Taisuke; Iwata, Osamu; Maki, Masanori; Okada, Genki; Kobayashi, Hirofumi; Sun, Xinlei; Tiamsak, Pimsiri; Tsumura, Norimichi; Suzuki, Kengo; Di Carlo, Dino; Ozeki, Yasuyuki; Goda, Keisuke

    2016-01-01

    We demonstrate high-throughput label-free single-cell image cytometry and image-based classification of Euglena gracilis (a microalgal species) under different culture conditions. We perform it with our high-throughput optofluidic image cytometer composed of a time-stretch microscope with 780-nm resolution and 75-Hz line rate, and an inertial-focusing microfluidic device. By analyzing a large number of single-cell images from the image cytometer, we identify differences in morphological and intracellular phenotypes between E. gracilis cell groups and statistically classify them under various culture conditions including nitrogen deficiency for lipid induction. Our method holds promise for real-time evaluation of culture techniques for E. gracilis and possibly other microalgae in a non-invasive manner. PMID:27446699

  11. High-throughput measurements of thermochromic behavior in V(1-x)Nb(x)O(2) combinatorial thin film libraries.

    PubMed

    Barron, S C; Gorham, J M; Patel, M P; Green, M L

    2014-10-13

    We describe a high-throughput characterization of near-infrared thermochromism in V1-xNbxO2 combinatorial thin film libraries. The oxide thin film library was prepared with a VO2 crystal structure and a continuous gradient in composition with Nb concentrations in the range of less than 1% to 45%. The thermochromic phase transition from monoclinic to tetragonal was characterized by the accompanying change in near-infrared reflectance. With increasing Nb substitution, the transition temperature was depressed from 65 to 35 °C, as desirable for smart window applications. However, the magnitude of the reflectance change across the thermochromic transition was also reduced with increasing Nb film content. Data collection, handling, and analysis supporting thermochromic characterization were fully automated to achieve high throughput. Using this system, in 14 h, temperature-dependent infrared reflectances were measured at 165 arbitrary locations on a thin film combinatorial library; these measurements were analyzed for thermochromic transitions in minutes.
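
    A simple way to automate the thermochromic characterization described above is to locate the transition as the temperature of steepest change in near-IR reflectance. The sigmoid test data below are synthetic, centered at the 65 °C transition quoted for the undoped film.

```python
# Extract a transition temperature from a reflectance-vs-temperature curve.
import numpy as np

def transition_temp(temps, reflectance):
    """Temperature at the extremum of dR/dT (steepest reflectance change)."""
    dRdT = np.gradient(reflectance, temps)
    return temps[np.argmax(np.abs(dRdT))]

temps = np.linspace(20.0, 90.0, 141)                       # 0.5 °C steps
refl = 0.6 - 0.3 / (1.0 + np.exp(-(temps - 65.0) / 2.0))   # synthetic R(T)
print(f"transition at {transition_temp(temps, refl):.1f} C")
```

    Running this per library location turns each measured reflectance curve into a single transition temperature, which is what makes fully automated mapping of 165 compositions feasible.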

  12. Review of high-throughput techniques for detecting solid phase Transformation from material libraries produced by combinatorial methods

    NASA Technical Reports Server (NTRS)

    Lee, Jonathan A.

    2005-01-01

    High-throughput measurement techniques are reviewed for solid phase transformation from materials produced by combinatorial methods, which are highly efficient concepts to fabricate a large variety of material libraries with different compositional gradients on a single wafer. Combinatorial methods hold high potential for reducing the time and costs associated with the development of new materials, as compared to time-consuming and labor-intensive conventional methods that test large batches of material, one composition at a time. These high-throughput techniques can be automated to rapidly capture and analyze data, using the entire material library on a single wafer, thereby accelerating the pace of materials discovery and knowledge generation for solid phase transformations. The review covers experimental techniques that are applicable to inorganic materials such as shape memory alloys, graded materials, metal hydrides, ferric materials, semiconductors and industrial alloys.

  13. A simple dual online ultra-high pressure liquid chromatography system (sDO-UHPLC) for high throughput proteome analysis.

    PubMed

    Lee, Hangyeore; Mun, Dong-Gi; Bae, Jingi; Kim, Hokeun; Oh, Se Yeon; Park, Young Soo; Lee, Jae-Hyuk; Lee, Sang-Won

    2015-08-21

    We report a new and simple design of a fully automated dual-online ultra-high pressure liquid chromatography system. The system employs only two nano-volume switching valves (a two-position four port valve and a two-position ten port valve) that direct solvent flows from two binary nano-pumps for parallel operation of two analytical columns and two solid phase extraction (SPE) columns. Despite the simple design, the sDO-UHPLC offers many advantageous features that include high duty cycle, back flushing sample injection for fast and narrow zone sample injection, online desalting, high separation resolution and high intra/inter-column reproducibility. This system was applied to analyze proteome samples not only in high throughput deep proteome profiling experiments but also in high throughput MRM experiments.

  15. Analysis of high-throughput RNAi screening data in identifying genes mediating sensitivity to chemotherapeutic drugs: statistical approaches and perspectives.

    PubMed

    Ye, Fei; Bauer, Joshua A; Pietenpol, Jennifer A; Shyr, Yu

    2012-01-01

    High-throughput RNA interference (RNAi) screens have been used to find genes that, when silenced, result in sensitivity to certain chemotherapy drugs. Researchers therefore can further identify drug-sensitive targets and novel drug combinations that sensitize cancer cells to chemotherapeutic drugs. Considerable uncertainty exists about the efficiency and accuracy of statistical approaches used for RNAi hit selection in drug sensitivity studies. Researchers require statistical methods suitable for analyzing high-throughput RNAi screening data that will reduce false-positive and false-negative rates. In this study, we carried out a simulation study to evaluate four types of statistical approaches (fold-change/ratio, parametric tests/statistics, sensitivity index, and linear models) with different scenarios of RNAi screenings for drug sensitivity studies. With the simulated datasets, the linear model resulted in significantly lower false-negative and false-positive rates. Based on the results of the simulation study, we then make recommendations of statistical analysis methods for high-throughput RNAi screening data in different scenarios. We assessed promising methods using real data from a loss-of-function RNAi screen to identify hits that modulate paclitaxel sensitivity in breast cancer cells. High-confidence hits with specific inhibitors were further analyzed for their ability to inhibit breast cancer cell growth. Our analysis identified a number of gene targets with inhibitors known to enhance paclitaxel sensitivity, suggesting other genes identified may merit further investigation. RNAi screening can identify druggable targets and novel drug combinations that can sensitize cancer cells to chemotherapeutic drugs. However, applying an inappropriate statistical method or model to the RNAi screening data will result in decreased power to detect the true hits and increase false positive and false negative rates, leading researchers to draw incorrect conclusions. 
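
    Two of the scoring approaches compared above can be sketched on a simulated screen: a fold-change cutoff and a per-gene t-statistic (the simplest parametric score). Effect sizes, noise levels, and thresholds are invented; note that a t-statistic from only three replicates is unstable, one reason pooled models fare better in such simulation studies.

```python
# Simulated RNAi screen: fold-change vs. per-gene t-statistic hit calling.
import numpy as np

rng = np.random.default_rng(7)
n_genes, n_rep = 1000, 3
effects = np.zeros(n_genes)
true_hits = np.arange(20)              # first 20 siRNAs truly sensitize
effects[true_hits] = -1.5              # log2 viability drop under drug

# Replicate log2 ratios (siRNA + drug vs. drug alone), noise sd = 0.5
data = effects[:, None] + rng.standard_normal((n_genes, n_rep)) * 0.5

fold = data.mean(axis=1)                                    # fold-change score
tstat = fold / (data.std(axis=1, ddof=1) / np.sqrt(n_rep))  # per-gene t

fc_hits = set(np.flatnonzero(fold < -1.0))                  # cutoff call
recovered = len(fc_hits & set(true_hits))
print(f"fold-change calls: {len(fc_hits)}, true hits recovered: {recovered}")
```

    On this synthetic screen the fold-change cutoff recovers most true sensitizers with few false positives; with real plate data, plate and batch effects are what make the linear-model formulation preferable.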

  16. A novel high-throughput automated chip-based nanoelectrospray tandem mass spectrometric method for PAMPA sample analysis.

    PubMed

    Balimane, Praveen V; Pace, Ellen; Chong, Saeho; Zhu, Mingshe; Jemal, Mohammed; Pelt, Colleen K Van

    2005-09-01

    Parallel artificial membrane permeability assay (PAMPA) has recently gained popularity as a novel, high-throughput assay capable of rapidly screening compounds for their permeability characteristics in early drug discovery. The analytical techniques typically used for PAMPA sample analysis are HPLC-UV, LC/MS or, more recently, the UV-plate reader. The LC techniques, though sturdy and accurate, are often labor- and time-intensive and are not ideal for high throughput. On the other hand, the UV-plate reader technique is amenable to high throughput but is not sensitive enough to detect the lower concentrations that are often encountered in early drug discovery work. This article investigates a novel analytical method, a chip-based automated nanoelectrospray mass spectrometric method, for its ability to rapidly analyze PAMPA permeability samples. The utility and advantages of this novel analytical method are demonstrated by comparing PAMPA permeability values obtained from nanoelectrospray to those from conventional analytical methods. Ten marketed drugs having a broad range of structural space, physico-chemical properties and extent of intestinal absorption were selected as test compounds for this investigation. PAMPA permeability and recovery experiments were conducted with model compounds followed by analysis by UV-plate reader and UV-HPLC as well as the automated nanoelectrospray technique (nanoESI-MS/MS). There was a very good correlation (r² > 0.9) between the results obtained using nanoelectrospray and the other analytical techniques tested. Moreover, the nanoelectrospray approach presented several advantages over the standard techniques, such as higher sensitivity and the ability to detect individual compounds in cassette studies, making it an attractive high-throughput analytical technique. Thus, it has been demonstrated that nanoelectrospray analysis provides a highly efficient and accurate analytical methodology to analyze PAMPA samples generated in early drug discovery.

  17. Identification and Characterization of miRNA Transcriptome in Potato by High-Throughput Sequencing

    PubMed Central

    Zhang, Runxuan; Marshall, David; Bryan, Glenn J.; Hornyik, Csaba

    2013-01-01

    MicroRNAs (miRNAs) represent a class of short, non-coding, endogenous RNAs which play important roles in the post-transcriptional regulation of gene expression. While the diverse functions of miRNAs in model plants have been well studied, the impact of miRNAs on crop plant biology is poorly understood. Here we used high-throughput sequencing and bioinformatics analysis to analyze miRNAs in the tuber-bearing crop potato (Solanum tuberosum). Small RNAs were analysed from leaf and stolon tissues. Twenty-eight conserved miRNA families were found, and potato-specific miRNAs were identified and validated by RNA gel blot hybridization. The size, origin and predicted targets of conserved and potato-specific miRNAs are described. The large number of miRNAs and the complex population of small RNAs in potato suggest important roles for these non-coding RNAs in diverse physiological and metabolic pathways. PMID:23437348

  18. A novel platform for automated high-throughput fluxome profiling of metabolic variants.

    PubMed

    Heux, Stéphanie; Poinot, Juliette; Massou, Stéphane; Sokol, Serguei; Portais, Jean-Charles

    2014-09-01

    Advances in metabolic engineering are enabling the creation of a large number of cell factories. However, high-throughput platforms do not yet exist for rapidly analyzing the metabolic network of the engineered cells. To fill this gap, we developed an integrated solution for fluxome profiling of large sets of biological systems and conditions. This platform combines a robotic system for (13)C-labelling experiments and sampling of labelled material with NMR-based isotopic fingerprinting and automated data interpretation. As a proof-of-concept, this workflow was applied to discriminate between Escherichia coli mutants with gradual expression of glucose-6-phosphate dehydrogenase. Metabolic variants were clearly discriminated, and the pathways that support metabolic flexibility under modulation of a single enzyme were elucidated. By directly connecting the data flow between cell cultivation and flux quantification, considerable advances in throughput, robustness, release of resources and screening capacity were achieved. This will undoubtedly facilitate the development of efficient cell factories.

  19. Generating high accuracy peptide binding data in high throughput with yeast surface display and SORTCERY

    PubMed Central

    Reich, Lothar “Luther”; Dutta, Sanjib; Keating, Amy E.

    2016-01-01

    Library methods are widely used to study protein-protein interactions, and high-throughput screening or selection followed by sequencing can identify a large number of peptide ligands for a protein target. In this chapter we describe a procedure called "SORTCERY" that can rank the affinities of library members for a target with high accuracy. SORTCERY follows a three-step protocol. First, fluorescence activated cell sorting (FACS) is used to sort a library of yeast displayed peptide ligands according to their affinities for a target. Second, all sorted pools are deep sequenced. Third, the resulting data are analyzed to create a ranking. We demonstrate an application of SORTCERY to the problem of ranking peptide ligands for the anti-apoptotic regulator Bcl-xL. PMID:27094295

  20. HTSvis: a web app for exploratory data analysis and visualization of arrayed high-throughput screens.

    PubMed

    Scheeder, Christian; Heigwer, Florian; Boutros, Michael

    2017-09-15

    Arrayed high-throughput screens (HTS) cover a broad range of applications using RNAi or small molecules as perturbations, and specialized software packages for statistical analysis have become available. However, exploratory data analysis and integration of screening results have remained challenging due to the size of the data sets and the lack of user-friendly tools for interpretation and visualization of screening results. Here we present HTSvis, a web application to interactively visualize raw data, perform quality control and assess screening results from single- to multi-channel measurements such as image-based screens. Per-well aggregated raw and analyzed data of various assay types and scales can be loaded in a generic tabular format. HTSvis is distributed as an open-source R package, downloadable from https://github.com/boutroslab/HTSvis, and can also be accessed at http://htsvis.dkfz.de. Contact: m.boutros@dkfz.de. Supplementary data are available at Bioinformatics online.
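
    The "generic tabular format" of per-well aggregated data maps naturally onto a plate-shaped matrix for heat-map style quality control. Below is a minimal Python (pandas) sketch of that idea, not HTSvis's actual R implementation; the column names `row`, `column`, and `value` are illustrative assumptions:

```python
import pandas as pd

def plate_heatmap_table(df, value="value"):
    """Pivot per-well tabular screen data (assumed columns: 'row', 'column',
    and a measurement column) into a plate-shaped table of robust z-scores,
    a common first QC view for arrayed screens."""
    med = df[value].median()
    mad = (df[value] - med).abs().median()  # median absolute deviation
    df = df.assign(zscore=(df[value] - med) / (1.4826 * mad))
    return df.pivot(index="row", columns="column", values="zscore")
```

    Rendering the returned matrix as a heat map makes row, column, or edge effects visible at a glance.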

  1. High-throughput microarray detection of olfactory receptor gene expression in the mouse

    PubMed Central

    Zhang, Xinmin; Rogers, Matthew; Tian, Huikai; Zhang, Xiaohong; Zou, Dong-Jing; Liu, Jian; Ma, Minghong; Shepherd, Gordon M.; Firestein, Stuart J.

    2004-01-01

    The large number of olfactory receptor genes necessitates high throughput methods to analyze their expression patterns. We have therefore designed a high-density oligonucleotide array containing all known mouse olfactory receptor (OR) and V1R vomeronasal receptor genes. This custom array detected a large number of receptor genes, demonstrating specific expression in the olfactory sensory epithelium for ≈800 OR genes previously designated as ORs based solely on genomic sequences. The array also enabled us to monitor the spatial and temporal distribution of gene expression for the entire OR family. Interestingly, OR genes showing spatially segregated expression patterns were also segregated on the chromosomes. This correlation between genomic location and spatial expression provides unique insights about the regulation of this large family of genes. PMID:15377787

  2. Differential Dynamic Microscopy: A High-Throughput Method for Characterizing the Motility of Microorganisms

    PubMed Central

    Martinez, Vincent A.; Besseling, Rut; Croze, Ottavio A.; Tailleur, Julien; Reufer, Mathias; Schwarz-Linek, Jana; Wilson, Laurence G.; Bees, Martin A.; Poon, Wilson C.K.

    2012-01-01

    We present a fast, high-throughput method for characterizing the motility of microorganisms in three dimensions based on standard imaging microscopy. Instead of tracking individual cells, we analyze the spatiotemporal fluctuations of the intensity in the sample from time-lapse images and obtain the intermediate scattering function of the system. We demonstrate our method on two different types of microorganisms: the bacterium Escherichia coli (both smooth swimming and wild type) and the biflagellate alga Chlamydomonas reinhardtii. We validate the methodology using computer simulations and particle tracking. From the intermediate scattering function, we are able to extract the swimming speed distribution, fraction of motile cells, and diffusivity for E. coli, and the swimming speed distribution, and amplitude and frequency of the oscillatory dynamics for C. reinhardtii. In both cases, the motility parameters were averaged over ∼10⁴ cells and obtained in a few minutes. PMID:23083706
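
    The core differential dynamic microscopy computation, analyzing intensity fluctuations rather than tracks, amounts to Fourier-transforming frame differences and azimuthally averaging the resulting power spectrum. A generic sketch in Python (not the authors' code; function names are illustrative):

```python
import numpy as np

def image_structure_function(frames, lag):
    """DDM image structure function D(q) at one lag time: the time-averaged
    power spectrum of frame-difference images (frames: T x H x W array)."""
    diffs = frames[lag:] - frames[:-lag]
    power = np.mean(np.abs(np.fft.fft2(diffs, axes=(-2, -1))) ** 2, axis=0)
    return np.fft.fftshift(power)

def radial_average(power):
    """Azimuthally average a 2-D power spectrum into a 1-D function of |q|."""
    ny, nx = power.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)
    sums = np.bincount(r.ravel(), weights=power.ravel())
    counts = np.bincount(r.ravel())
    return sums / counts
```

    Repeating this over many lag times and fitting a model to D(q, τ) yields the intermediate scattering function from which the motility parameters are extracted.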

  3. High-Throughput Phenotyping and QTL Mapping Reveals the Genetic Architecture of Maize Plant Growth.

    PubMed

    Zhang, Xuehai; Huang, Chenglong; Wu, Di; Qiao, Feng; Li, Wenqiang; Duan, Lingfeng; Wang, Ke; Xiao, Yingjie; Chen, Guoxing; Liu, Qian; Xiong, Lizhong; Yang, Wanneng; Yan, Jianbing

    2017-03-01

    With increasing demand for novel traits in crop breeding, the plant research community faces the challenge of quantitatively analyzing the structure and function of large numbers of plants. A clear goal of high-throughput phenotyping is to bridge the gap between genomics and phenomics. In this study, we quantified 106 traits from a maize (Zea mays) recombinant inbred line population (n = 167) across 16 developmental stages using an automated phenotyping platform. Quantitative trait locus (QTL) mapping with a high-density genetic linkage map, including 2,496 recombinant bins, was used to uncover the genetic basis of these complex agronomic traits, and 988 QTLs were identified across all investigated traits, including three QTL hotspots. Biomass accumulation and final yield were predicted using a combination of dissected traits in the early growth stage. These results reveal the dynamic genetic architecture of maize plant growth and enhance ideotype-based maize breeding and prediction.

  4. Advancing the High Throughput Identification of Liver Fibrosis Protein Signatures Using Multiplexed Ion Mobility Spectrometry*

    PubMed Central

    Baker, Erin Shammel; Burnum-Johnson, Kristin E.; Jacobs, Jon M.; Diamond, Deborah L.; Brown, Roslyn N.; Ibrahim, Yehia M.; Orton, Daniel J.; Piehowski, Paul D.; Purdy, David E.; Moore, Ronald J.; Danielson, William F.; Monroe, Matthew E.; Crowell, Kevin L.; Slysz, Gordon W.; Gritsenko, Marina A.; Sandoval, John D.; LaMarche, Brian L.; Matzke, Melissa M.; Webb-Robertson, Bobbie-Jo M.; Simons, Brenna C.; McMahon, Brian J.; Bhattacharya, Renuka; Perkins, James D.; Carithers, Robert L.; Strom, Susan; Self, Steven G.; Katze, Michael G.; Anderson, Gordon A.; Smith, Richard D.

    2014-01-01

    Rapid diagnosis of disease states using less invasive, safer, and more clinically acceptable approaches than presently employed is a crucial direction for the field of medicine. While MS-based proteomics approaches have attempted to meet these objectives, challenges such as the enormous dynamic range of protein concentrations in clinically relevant biofluid samples coupled with the need to address human biodiversity have slowed their employment. Herein, we report on the use of a new instrumental platform that addresses these challenges by coupling technical advances in rapid gas phase multiplexed ion mobility spectrometry separations with liquid chromatography and MS to dramatically increase measurement sensitivity and throughput, further enabling future high throughput MS-based clinical applications. An initial application of the liquid chromatography - ion mobility spectrometry-MS platform analyzing blood serum samples from 60 postliver transplant patients with recurrent fibrosis progression and 60 nontransplant patients illustrates its potential utility for disease characterization. PMID:24403597

  5. Two-step protocol for preparing adherent cells for high-throughput flow cytometry.

    PubMed

    Kaur, Mandeep; Esau, Luke

    2015-09-01

    We have developed a simple, cost-effective, and labor-efficient two-step protocol for preparing adherent cells for high-throughput flow cytometry. Adherent cells were grown on microplates, detached with 2.9 mM EDTA (pH 6.14) added directly to wells containing cell culture medium, stained, and then analyzed on a flow cytometer. This protocol bypasses washing, centrifugation, and transfer between plates, reducing the cell loss that occurs in standard multistep protocols. The method has been validated using six adherent cell lines, four commercially available dyes, and two antibodies; the results have been confirmed using two different flow cytometry (FC) instruments. Our approach has been used for estimating apoptosis, mitochondrial membrane potential, reactive oxygen species, and autophagy in response to exposure to pure compounds as well as plant and bacterial extracts.

  6. Characterization of DNA-protein interactions using high-throughput sequencing data from pulldown experiments

    NASA Astrophysics Data System (ADS)

    Moreland, Blythe; Oman, Kenji; Curfman, John; Yan, Pearlly; Bundschuh, Ralf

    Methyl-binding domain (MBD) protein pulldown experiments have been a valuable tool in measuring the levels of methylated CpG dinucleotides. Due to the frequent use of this technique, high-throughput sequencing data sets are available that allow a detailed quantitative characterization of the underlying interaction between methylated DNA and MBD proteins. Analyzing such data sets, we first found that two such proteins cannot bind closer to each other than 2 bp, consistent with structural models of the DNA-protein interaction. Second, the large amount of sequencing data allowed us to find rather weak but nevertheless clearly statistically significant sequence preferences for several bases around the required CpG. These results demonstrate that pulldown sequencing is a high-precision tool in characterizing DNA-protein interactions. This material is based upon work supported by the National Science Foundation under Grant No. DMR-1410172.
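
    The flanking-base preference analysis described above can be approximated by a goodness-of-fit test on base counts at a fixed offset from the required CpG. A hedged sketch (illustrative names; a uniform background composition is assumed here, whereas a real analysis would use the genomic background):

```python
from collections import Counter
from scipy.stats import chisquare

def base_preference(reads, offset, background=(0.25, 0.25, 0.25, 0.25)):
    """Chi-square goodness-of-fit test: does the base at `offset` relative
    to the first CpG in each read deviate from the background composition?"""
    counts = Counter()
    for read in reads:
        i = read.find("CG")
        if i >= 0 and 0 <= i + offset < len(read):
            counts[read[i + offset]] += 1
    observed = [counts[b] for b in "ACGT"]
    n = sum(observed)
    expected = [p * n for p in background]
    return chisquare(observed, expected)
```

    With the sequencing depth of a pulldown data set, even weak preferences of a few percent become statistically significant, which is what makes the method a high-precision probe of the DNA-protein interaction.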

  7. Peptoid Library Agar Diffusion (PLAD) Assay for the High-Throughput Identification of Antimicrobial Peptoids.

    PubMed

    Fisher, Kevin J; Turkett, Jeremy A; Corson, Ashley E; Bicker, Kevin L

    2016-06-13

    Rapid emergence of antimicrobial resistant organisms necessitates equally rapid methods for the development of new antimicrobial compounds. Of recent interest have been mimics of antimicrobial peptides known as antimicrobial peptoids, which exhibit similar potency to the former but with improved proteolytic stability. Presented herein is a high-throughput method to screen libraries of antimicrobial peptoids immobilized on beads embedded into solid media. Termed the peptoid library agar diffusion (PLAD) assay, this assay allows for individual chemical manipulation of two identical peptoid strands. One strand can be released to diffuse out from a solid support bead and interact with the microorganism during screening. The other strand can be cleaved after screening from beads showing strong antimicrobial activity and analyzed by mass spectrometry to deconvolute the structure of the peptoid. This method was applied to a small library of peptoids to identify an antimicrobial peptoid with modest efficacy against the ESKAPE pathogens.

  8. Improved microbiological diagnostic due to utilization of a high-throughput homogenizer for routine tissue processing.

    PubMed

    Redanz, Sylvio; Podbielski, Andreas; Warnke, Philipp

    2015-07-01

    Tissue specimens are valuable materials for microbiological diagnostics and require swift and accurate processing. Established processing methods are complex, labor-intensive, difficult to standardize, and prone to incorporating contaminants. To improve analyses of tissue samples in routine microbiological diagnostics by simplifying, accelerating, and standardizing processing, as well as increasing the microbial yield, the performance of the Precellys 24 high-throughput tissue homogenizer was evaluated. Tissue samples were artificially inoculated with Staphylococcus aureus, Escherichia coli, and Candida albicans in 3 different ways on the surface and within the material. Microbial yield from homogenized samples was compared to the direct plating method. Further, as proof of principle, routine tissue samples from knee and hip endoprosthesis infections were analyzed. The process of tissue homogenization with the Precellys 24 homogenizer is easy and fast to perform and allows for a high degree of standardization. Microbial yield after homogenization was significantly higher than with the conventional plating technique.

  9. Detection of dysregulated protein-association networks by high-throughput proteomics predicts cancer vulnerabilities.

    PubMed

    Lapek, John D; Greninger, Patricia; Morris, Robert; Amzallag, Arnaud; Pruteanu-Malinici, Iulian; Benes, Cyril H; Haas, Wilhelm

    2017-09-11

    The formation of protein complexes and the co-regulation of the cellular concentrations of proteins are essential mechanisms for cellular signaling and for maintaining homeostasis. Here we use isobaric-labeling multiplexed proteomics to analyze protein co-regulation and show that this allows the identification of protein-protein associations with high accuracy. We apply this 'interactome mapping by high-throughput quantitative proteome analysis' (IMAHP) method to a panel of 41 breast cancer cell lines and show that deviations of the observed protein co-regulations in specific cell lines from the consensus network affect cellular fitness. Furthermore, these aberrant interactions serve as biomarkers that predict the drug sensitivity of cell lines in screens across 195 drugs. We expect that IMAHP can be broadly used to gain insight into how changing landscapes of protein-protein associations affect the phenotype of biological systems.
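
    At its simplest, co-regulation-based association mapping reduces to correlating protein abundance profiles across the cell-line panel and keeping strongly correlated pairs. A minimal sketch (not the IMAHP implementation; the threshold and names are illustrative):

```python
import numpy as np

def coregulation_network(abundance, names, threshold=0.8):
    """Infer candidate protein-protein associations from co-regulation:
    abundance is a (proteins x cell lines) matrix; pairs whose abundance
    profiles correlate at or above `threshold` are returned."""
    corr = np.corrcoef(abundance)
    pairs = []
    n = len(names)
    for i in range(n):
        for j in range(i + 1, n):
            if corr[i, j] >= threshold:
                pairs.append((names[i], names[j], corr[i, j]))
    return pairs
```

    In the study's framework, cell lines whose pairwise correlations deviate from the consensus network built this way are the ones flagged for fitness effects and drug-sensitivity prediction.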

  10. High Throughput Cryogenic And Room Temperature Testing Of Focal Plane Components

    NASA Astrophysics Data System (ADS)

    Voynick, Stanley

    1988-04-01

    To increase production efficiency in the manufacture of infrared focal plane components, test techniques were refined to enhance testing throughput and accuracy. The result is an integrated package of high performance hardware and software tools which performs well in high throughput production environments. The test system is also very versatile. It has been used for readout (multiplexer) device characterization, room temperature automated wafer probing, and focal plane array (FPA) testing. Tests have been performed using electrical and radiometric optical stimulus. An integrated, convenient software package was developed and is used to acquire, reduce, analyze, display, and archive test data. The test software supports fully automated operation for the production environment, as well as menu-driven operation for R&D, characterization and setup purposes. Trade-offs between handling techniques in cryogenic production testing were investigated. "Batch processing" is preferred over "continuous flow", primarily due to considerations of contamination of the cryogenic environment.

  11. A high-throughput assay format for determination of nitrate reductase and nitrite reductase enzyme activities

    SciTech Connect

    McNally, N.; Liu, Xiang Yang; Choudary, P.V.

    1997-01-01

    The authors describe a microplate-based high-throughput procedure for rapid assay of the enzyme activities of nitrate reductase and nitrite reductase, using extremely small volumes of reagents. The new procedure offers the advantages of rapidity, small sample size (nanoliter volumes), low cost, and a dramatic increase in the number of samples that can be analyzed simultaneously. Additional advantages can be gained by using microplate reader application software packages that permit assigning a group type to the wells, recording data to exportable files, and using either kinetic or endpoint reading modes. The assay can also be used independently for detecting nitrite residues/contamination in environmental/food samples. 10 refs., 2 figs.

  12. High-throughput STR analysis for DNA database using direct PCR.

    PubMed

    Sim, Jeong Eun; Park, Su Jeong; Lee, Han Chul; Kim, Se-Yong; Kim, Jong Yeol; Lee, Seung Hwan

    2013-07-01

    Since the Korean criminal DNA database was launched in 2010, we have focused on establishing an automated DNA database profiling system that analyzes short tandem repeat loci in a high-throughput and cost-effective manner. We established a DNA database profiling system without DNA purification using a direct PCR buffer system. The quality of direct PCR procedures was compared with that of conventional PCR system under their respective optimized conditions. The results revealed not only perfect concordance but also an excellent PCR success rate, good electropherogram quality, and an optimal intra/inter-loci peak height ratio. In particular, the proportion of DNA extraction required due to direct PCR failure could be minimized to <3%. In conclusion, the newly developed direct PCR system can be adopted for automated DNA database profiling systems to replace or supplement conventional PCR system in a time- and cost-saving manner.

  13. New Tools For Understanding Microbial Diversity Using High-throughput Sequence Data

    NASA Astrophysics Data System (ADS)

    Knight, R.; Hamady, M.; Liu, Z.; Lozupone, C.

    2007-12-01

    High-throughput sequencing techniques such as 454 are straining the limits of tools traditionally used to build trees, choose OTUs, and perform other essential sequencing tasks. We have developed a workflow for phylogenetic analysis of large-scale sequence data sets that combines existing tools, such as the Arb phylogeny package and the NAST multiple sequence alignment tool, with new methods for choosing and clustering OTUs and for performing phylogenetic community analysis with UniFrac. This talk discusses the cyberinfrastructure we are developing to support the human microbiome project, and the application of these workflows to analyze very large data sets that contrast the gut microbiota with a range of physical environments. These tools will ultimately help to define core and peripheral microbiomes in a range of environments, and will allow us to understand the physical and biotic factors that contribute most to differences in microbial diversity.
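
    One of the workflow steps mentioned above, choosing and clustering OTUs, can be illustrated with a naive greedy centroid scheme. This is only a toy sketch of the idea (real OTU pickers use far more efficient alignment-based heuristics; equal-length, pre-aligned sequences are assumed here):

```python
def greedy_otu_clusters(seqs, threshold=0.97):
    """Greedy OTU picking: each sequence joins the first cluster whose seed
    it matches at >= threshold identity, otherwise it seeds a new cluster."""
    def identity(a, b):
        matches = sum(x == y for x, y in zip(a, b))
        return matches / max(len(a), len(b))
    seeds, clusters = [], []
    for s in seqs:
        for k, seed in enumerate(seeds):
            if identity(s, seed) >= threshold:
                clusters[k].append(s)
                break
        else:
            seeds.append(s)
            clusters.append([s])
    return clusters
```

    The resulting clusters (OTUs) are what then feed into tree building and phylogenetic community comparison with tools like UniFrac.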

  14. High-throughput microfluidic line scan imaging for cytological characterization

    NASA Astrophysics Data System (ADS)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult due to motion blur and data loss during frame readout causing discontinuity of data acquisition as cells move at relatively high speeds through the chamber. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidics chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X oil immersion, 1.4 NA objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidics chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell which may be subsequently mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable to cell counting by flow cytometry, provided efficient examination including counting, classification, and differentiation of saliva, blood, and cultured human cancer cells.
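
    The tight control of line rate against fluid velocity comes down to making a moving object advance exactly one object-plane pixel per exposed line. A sketch of that relationship (the parameter values in the comment are illustrative, not the paper's exact configuration):

```python
def required_line_rate(flow_velocity_um_s, pixel_pitch_um, magnification):
    """Line rate (Hz) at which a moving object advances one object-plane
    pixel per line, so the reconstructed image is free of stretch/squash
    distortion along the flow axis."""
    object_plane_pixel = pixel_pitch_um / magnification  # µm per pixel at the sample
    return flow_velocity_um_s / object_plane_pixel

# e.g. a 5 µm sensor pitch behind a 40X objective gives 0.125 µm object-plane
# pixels, so a 5 mm/s flow calls for a 40 kHz line rate
```

    Running the camera faster than this stretches cells along the flow axis and slower compresses them, which is why the authors validate the matching with fluorescent microspheres.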

  15. Systematic error detection in experimental high-throughput screening

    PubMed Central

    2011-01-01

    Background High-throughput screening (HTS) is a key part of the drug discovery process during which thousands of chemical compounds are screened and their activity levels measured in order to identify potential drug candidates (i.e., hits). Many technical, procedural or environmental factors can cause systematic measurement error or inequalities in the conditions in which the measurements are taken. Such systematic error has the potential to critically affect the hit selection process. Several error correction methods and software have been developed to address this issue in the context of experimental HTS [1-7]. Despite their power to reduce the impact of systematic error when applied to error perturbed datasets, those methods also have one disadvantage - they introduce a bias when applied to data not containing any systematic error [6]. Hence, we need first to assess the presence of systematic error in a given HTS assay and then carry out a systematic error correction method if and only if the presence of systematic error has been confirmed by statistical tests. Results We tested three statistical procedures to assess the presence of systematic error in experimental HTS data, including the χ2 goodness-of-fit test, Student's t-test and Kolmogorov-Smirnov test [8] preceded by the Discrete Fourier Transform (DFT) method [9]. We applied these procedures to raw HTS measurements, first, and to estimated hit distribution surfaces, second. The three competing tests were applied to analyse simulated datasets containing different types of systematic error, and to a real HTS dataset. Their accuracy was compared under various error conditions. Conclusions A successful assessment of the presence of systematic error in experimental HTS assays is possible when the appropriate statistical methodology is used. Namely, the t-test should be carried out by researchers to determine whether systematic error is present in their HTS data prior to applying any error correction method

  16. Systematic error detection in experimental high-throughput screening.

    PubMed

    Dragiev, Plamen; Nadon, Robert; Makarenkov, Vladimir

    2011-01-19

    High-throughput screening (HTS) is a key part of the drug discovery process during which thousands of chemical compounds are screened and their activity levels measured in order to identify potential drug candidates (i.e., hits). Many technical, procedural or environmental factors can cause systematic measurement error or inequalities in the conditions in which the measurements are taken. Such systematic error has the potential to critically affect the hit selection process. Several error correction methods and software have been developed to address this issue in the context of experimental HTS [1-7]. Despite their power to reduce the impact of systematic error when applied to error perturbed datasets, those methods also have one disadvantage - they introduce a bias when applied to data not containing any systematic error [6]. Hence, we need first to assess the presence of systematic error in a given HTS assay and then carry out a systematic error correction method if and only if the presence of systematic error has been confirmed by statistical tests. We tested three statistical procedures to assess the presence of systematic error in experimental HTS data, including the χ2 goodness-of-fit test, Student's t-test and the Kolmogorov-Smirnov test [8] preceded by the Discrete Fourier Transform (DFT) method [9]. We applied these procedures to raw HTS measurements, first, and to estimated hit distribution surfaces, second. The three competing tests were applied to analyse simulated datasets containing different types of systematic error, and to a real HTS dataset. Their accuracy was compared under various error conditions. A successful assessment of the presence of systematic error in experimental HTS assays is possible when the appropriate statistical methodology is used. Namely, the t-test should be carried out by researchers to determine whether systematic error is present in their HTS data prior to applying any error correction method. 
This important step can significantly
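
    The recommended pre-check, testing for systematic error before applying any correction, can be sketched as a per-row t-test against the rest of the plate. This is an illustrative simplification of the paper's procedure (which also evaluates χ² and Kolmogorov-Smirnov tests, and operates on hit distribution surfaces as well as raw measurements):

```python
import numpy as np
from scipy.stats import ttest_ind

def rows_with_systematic_error(plate, alpha=0.01):
    """Flag plate rows whose measurements differ systematically from the
    remainder of the plate (Welch two-sample t-test). Correction methods
    should only be applied if such rows (or columns) are actually found."""
    flagged = []
    for i in range(plate.shape[0]):
        row = plate[i]
        rest = np.delete(plate, i, axis=0).ravel()
        stat, p = ttest_ind(row, rest, equal_var=False)
        if p < alpha:
            flagged.append((i, p))
    return flagged
```

    The same loop applied over columns catches column-wise effects; an empty result suggests correction should be skipped, since correcting error-free data introduces bias.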

  17. Using In Vitro High-Throughput Screening Data for Predicting ...

    EPA Pesticide Factsheets

    Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values and risk screening values. We aim to use computational toxicology and quantitative high throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. By coupling qHTS data with adverse outcome pathways (AOPs) we can use ontologies to make predictions about potential hazards and to identify those assays which are sufficient to infer these same hazards. Once those assays are identified, we can use bootstrap natural spline-based metaregression to integrate the evidence across multiple replicates or assays (if a combination of assays are together necessary to be sufficient). In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene (B[k]F) may induce DNA damage and steatosis using qHTS data and two separate AOPs. We also demonstrate how bootstrap natural spline-based metaregression can be used to integrate the data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an internal point of departure of 0.751 µM and risk-specific concentrations of 0.378 µM for both 1:1,000 and 1:10,000 additive risk for B[k]F-induced DNA damage based on the p53 assay. Based on the available evidence, we
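
    The bootstrap-plus-spline integration step can be sketched as resampling the pooled replicate measurements and refitting a spline to each resample, yielding a median curve with a confidence band. This is a simplified stand-in for the natural-spline metaregression described above (a generic smoothing spline is used, and the grid size and bootstrap count are illustrative):

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def bootstrap_response_curve(log_conc, responses, n_boot=200, seed=0):
    """Bootstrap replicate concentration-response points and fit a smoothing
    spline to each resample; return a grid, the median fitted curve, and a
    95% band, from which a point of departure could then be read off."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(log_conc.min(), log_conc.max(), 100)
    fits = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(log_conc), len(log_conc))
        order = np.argsort(log_conc[idx])
        x, y = log_conc[idx][order], responses[idx][order]
        # UnivariateSpline needs strictly increasing x: break resampled ties
        x = x + np.linspace(0, 1e-9, len(x))
        fits.append(UnivariateSpline(x, y, k=3)(grid))
    fits = np.array(fits)
    return grid, np.median(fits, axis=0), np.percentile(fits, [2.5, 97.5], axis=0)
```

    A risk-specific concentration corresponds to the grid point where the fitted band crosses the chosen additive-risk response level.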

  18. High-throughput DNA extraction of forensic adhesive tapes.

    PubMed

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has since its introduction in the early 2000s become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples.

  19. High-throughput phenotyping of root growth dynamics.

    PubMed

    Yazdanbakhsh, Nima; Fisahn, Joachim

    2012-01-01

    Plant organ phenotyping by noninvasive video imaging techniques provides a powerful tool to assess physiological traits, circadian and diurnal rhythms, and biomass production. In particular, growth of individual plant organs is known to exhibit a high plasticity and occurs as a result of the interaction between various endogenous and environmental processes. Thus, any investigation aiming to unravel mechanisms that determine plant or organ growth has to accurately control and document the environmental growth conditions. Here we describe challenges in establishing a recently developed plant root monitoring platform (PlaRoM) specially suited for noninvasive high-throughput plant growth analysis, with particular emphasis on the detailed documentation of capture time as well as light and temperature conditions. Furthermore, we discuss the experimental procedure for measuring root elongation kinetics and key points that must be considered in such measurements. PlaRoM consists of a robotized imaging platform enclosed in a custom designed phytochamber and a root extension profiling software application. This platform has been developed for multi-parallel recordings of root growth phenotypes of up to 50 individual seedlings over several days, with high spatial and temporal resolution. Two Petri dishes are mounted on a vertical sample stage in a custom designed phytochamber that provides exact temperature control. A computer-controlled positioning unit moves these Petri dishes in small increments and enables continuous screening of the surface under a binocular microscope. Detection of the root tip is achieved by applying thresholds on image pixel data and verifying the neighbourhood for each dark pixel. The growth parameters are visualized as position-over-time or growth-rate-over-time graphs and averaged over consecutive days, light-dark periods and 24 h day periods. This setup enables the investigation of root extension profiles of different genotypes in various growth
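
    The threshold-plus-neighbourhood root-tip detection can be sketched as follows. This is an illustrative reconstruction, not the PlaRoM code; the threshold value and the two-dark-neighbour criterion are assumptions:

```python
import numpy as np

def find_root_tip(image, threshold=60):
    """Locate the root tip as the lowest dark pixel that has at least two
    dark 8-neighbours (neighbourhood check rejects isolated noise pixels).
    Returns (row, col) or None if no candidate is found."""
    dark = image < threshold
    best = None
    ys, xs = np.nonzero(dark)
    for y, x in zip(ys, xs):
        nb = dark[max(y - 1, 0):y + 2, max(x - 1, 0):x + 2]
        if nb.sum() - 1 >= 2:  # dark neighbours besides the pixel itself
            if best is None or y > best[0]:
                best = (y, x)
    return best
```

    Tracking the returned tip coordinate across consecutive images yields the position-over-time and growth-rate-over-time curves the platform reports.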

  20. High-throughput neuroimaging-genetics computational infrastructure

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D.; Franco, Joseph; Toga, Arthur W.

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation, and reproducibility of findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize

  1. High-throughput process development: II. Membrane chromatography.

    PubMed

    Rathore, Anurag S; Muthukumar, Sampath

    2014-01-01

    Membrane chromatography is gradually emerging as an alternative to conventional column chromatography. It alleviates some of the major disadvantages associated with the latter, including high pressure drop across the column bed and dependence on intra-particle diffusion for the transport of solute molecules to their binding sites within the pores of separation media. In the last decade, it has emerged as a method of choice for final polishing of biopharmaceuticals, in particular monoclonal antibody products. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing high-throughput process development (HTPD) of a membrane chromatography step. It describes operation of a commercially available device (AcroPrep™ Advance filter plate with Mustang S membrane from Pall Corporation). This device is available in 96-well format with 7 μL membrane in each well. We discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion exchange chromatography of Granulocyte Colony Stimulating Factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement between the two datasets is indeed very good (regression coefficient 0.99). We think that this protocol will be of significant value to those involved in performing high-throughput process development of membrane chromatography.

  2. Scanning fluorescence detector for high-throughput DNA genotyping

    NASA Astrophysics Data System (ADS)

    Rusch, Terry L.; Petsinger, Jeremy; Christensen, Carl; Vaske, David A.; Brumley, Robert L., Jr.; Luckey, John A.; Weber, James L.

    1996-04-01

    A new scanning fluorescence detector (SCAFUD) was developed for high-throughput genotyping of short tandem repeat polymorphisms (STRPs). Fluorescent dyes are incorporated into relatively short DNA fragments via the polymerase chain reaction (PCR) and are separated by electrophoresis in short, wide polyacrylamide gels (144 lanes with well-to-read distances of 14 cm). Excitation light from an argon laser with primary lines at 488 and 514 nm is introduced into the gel through a fiber optic cable, dichroic mirror, and 40× microscope objective. Emitted fluorescent light is collected confocally through a second fiber. The confocal head is translated across the bottom of the gel at 0.5 Hz. The detection unit utilizes dichroic mirrors and band-pass filters to direct light with 10-20 nm bandwidths to four photomultiplier tubes (PMTs). PMT signals are independently amplified with variable gain and then sampled at a rate of 2500 points per scan using a computer-based A/D board. LabView software (National Instruments) is used for instrument operation. Currently, three fluorescent dyes (Fam, Hex and Rox) are simultaneously detected with peak detection wavelengths of 543, 567, and 613 nm, respectively. The detection limit for fluorescein-labeled primers is about 100 attomoles. Planned SCAFUD upgrades include rearrangement of laser head geometry, use of additional excitation lasers for simultaneous detection of more dyes, and the use of detector arrays instead of individual PMTs. Extensive software has been written for automatic analysis of SCAFUD images. The software enables background subtraction, band identification, multiple-dye signal resolution, lane finding, band sizing and allele calling. Whole genome screens are currently underway to search for loci influencing such complex diseases as diabetes, asthma, and hypertension. Seven production SCAFUDs are currently in operation. Genotyping output for the coming year is projected to be about one million total genotypes (DNA

  3. Applications of Biophysics in High-Throughput Screening Hit Validation.

    PubMed

    Genick, Christine Clougherty; Barlier, Danielle; Monna, Dominique; Brunner, Reto; Bé, Céline; Scheufler, Clemens; Ottl, Johannes

    2014-06-01

    For approximately a decade, biophysical methods have been used to validate positive hits selected from high-throughput screening (HTS) campaigns with the goal of verifying binding interactions using label-free assays. By applying label-free readouts, screen artifacts created by compound interference and fluorescence are discovered, enabling further characterization of the hits for their target specificity and selectivity. The use of several biophysical methods to extract this type of high-content information is required to prevent the promotion of false positives to the next level of hit validation and to select the best candidates for further chemical optimization. The typical technologies applied in this arena include dynamic light scattering, turbidimetry, resonance waveguide, surface plasmon resonance, differential scanning fluorimetry, mass spectrometry, and others. Each technology can provide different types of information to enable the characterization of the binding interaction. Thus, these technologies can be incorporated in a hit-validation strategy not only according to the profile of chemical matter that is desired by the medicinal chemists, but also in a manner that is in agreement with the target protein's amenability to the screening format. Here, we present the results of screening strategies using biophysics with the objective of evaluating the approaches, discussing the advantages and challenges, and summarizing the benefits in reference to lead discovery. In summary, the biophysics screens presented here demonstrated various hit rates from a list of ~2000 preselected, IC50-validated hits from HTS (an IC50 is the inhibitor concentration at which 50% inhibition of activity is observed). There are several lessons learned from these biophysical screens, which will be discussed in this article.

  4. A Robotic Platform for Quantitative High-Throughput Screening

    PubMed Central

    Michael, Sam; Auld, Douglas; Klumpp, Carleen; Jadhav, Ajit; Zheng, Wei; Thorne, Natasha; Austin, Christopher P.; Inglese, James

    2008-01-01

    High-throughput screening (HTS) is increasingly being adopted in academic institutions, where the decoupling of screening and drug development has led to unique challenges, as well as novel uses of instrumentation, assay formulations, and software tools. Advances in technology have made automated unattended screening in the 1,536-well plate format broadly accessible and have further facilitated the exploration of new technologies and approaches to screening. A case in point is our recently developed quantitative HTS (qHTS) paradigm, which tests each library compound at multiple concentrations to construct concentration-response curves (CRCs) generating a comprehensive data set for each assay. The practical implementation of qHTS for cell-based and biochemical assays across libraries of > 100,000 compounds (e.g., between 700,000 and 2,000,000 sample wells tested) requires maximal efficiency and miniaturization and the ability to easily accommodate many different assay formats and screening protocols. Here, we describe the design and utilization of a fully integrated and automated screening system for qHTS at the National Institutes of Health's Chemical Genomics Center. We report system productivity, reliability, and flexibility, as well as modifications made to increase throughput, add additional capabilities, and address limitations. The combination of this system and qHTS has led to the generation of over 6 million CRCs from > 120 assays in the last 3 years and is a technology that can be widely implemented to increase efficiency of screening and lead generation. PMID:19035846
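    The central object of the qHTS paradigm, the concentration-response curve, is commonly modelled with a four-parameter Hill equation. The Python sketch below generates a hypothetical seven-point titration and recovers the EC50 by a crude grid search; a production pipeline would use nonlinear least squares, and all concentrations and parameter values here are invented for illustration.

```python
import numpy as np

def hill(c, bottom, top, ec50, n):
    """Four-parameter Hill (concentration-response) model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / c) ** n)

# hypothetical 7-point titration (µM) with a known EC50 of 0.5 µM
conc = np.array([0.001, 0.005, 0.025, 0.125, 0.625, 3.125, 15.625])
resp = hill(conc, 0.0, 100.0, 0.5, 1.2)

# crude grid search for EC50 with the other parameters held fixed,
# standing in for a proper nonlinear least-squares fit
grid = np.logspace(-3, 2, 500)
sse = [np.sum((resp - hill(conc, 0.0, 100.0, g, 1.2)) ** 2) for g in grid]
print(round(grid[int(np.argmin(sse))], 2))  # → 0.5
```

    Fitting one curve per compound per assay is what turns a qHTS run over >100,000 compounds into millions of CRCs.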

  5. Emerging metrology for high-throughput nanomaterial genotoxicology.

    PubMed

    Nelson, Bryant C; Wright, Christa W; Ibuki, Yuko; Moreno-Villanueva, Maria; Karlsson, Hanna L; Hendriks, Giel; Sims, Christopher M; Singh, Neenu; Doak, Shareen H

    2017-01-01

    The rapid development of the engineered nanomaterial (ENM) manufacturing industry has accelerated the incorporation of ENMs into a wide variety of consumer products across the globe. Unintentionally or not, some of these ENMs may be introduced into the environment or come into contact with humans or other organisms resulting in unexpected biological effects. It is thus prudent to have rapid and robust analytical metrology in place that can be used to critically assess and/or predict the cytotoxicity, as well as the potential genotoxicity of these ENMs. Many of the traditional genotoxicity test methods [e.g. unscheduled DNA synthesis assay, bacterial reverse mutation (Ames) test, etc.] for determining the DNA damaging potential of chemical and biological compounds are not suitable for the evaluation of ENMs, due to a variety of methodological issues ranging from potential assay interferences to problems centered on low sample throughput. Recently, a number of sensitive, high-throughput genotoxicity assays/platforms (CometChip assay, flow cytometry/micronucleus assay, flow cytometry/γ-H2AX assay, automated 'Fluorimetric Detection of Alkaline DNA Unwinding' (FADU) assay, ToxTracker reporter assay) have been developed, based on substantial modifications and enhancements of traditional genotoxicity assays. These new assays have been used for the rapid measurement of DNA damage (strand breaks), chromosomal damage (micronuclei) and for detecting upregulated DNA damage signalling pathways resulting from ENM exposures. In this critical review, we describe and discuss the fundamental measurement principles and measurement endpoints of these new assays, as well as the modes of operation, analytical metrics and potential interferences, as applicable to ENM exposures. An unbiased discussion of the major technical advantages and limitations of each assay for evaluating and predicting the genotoxic potential of ENMs is also provided.

  6. Mining Chemical Activity Status from High-Throughput Screening Assays

    PubMed Central

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B.

    2015-01-01

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and frequently small proportion of active compounds relative to the inactive ones. We developed a method, DRAMOTE, to predict activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modification of a minority oversampling technique. To demonstrate that DRAMOTE is performing better than the other methods, we performed a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA approved drugs that have high probability to interact with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE has performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare. PMID:26658480
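    The abstract does not spell out DRAMOTE's exact modification of minority oversampling, but the base technique it builds on (SMOTE-style interpolation between a minority sample and one of its nearest minority neighbours) can be sketched as follows. The data, the choice of k, and the sampling scheme below are illustrative assumptions, not DRAMOTE itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def oversample_minority(X_min, n_new, k=3):
    """SMOTE-style oversampling: synthesize new minority points by
    interpolating between each sampled minority point and one of its
    k nearest minority-class neighbours."""
    synth = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]          # skip the point itself
        j = rng.choice(nbrs)
        lam = rng.random()                     # interpolation factor in [0, 1)
        synth.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(synth)

X_active = rng.normal(0, 1, size=(10, 4))      # few "active" compounds
X_new = oversample_minority(X_active, n_new=40)
print(X_new.shape)  # → (40, 4)
```

    Balancing the classes this way counteracts the small proportion of actives that the abstract identifies as the core modelling difficulty.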

  7. Using In Vitro High-Throughput Screening Data for Predicting ...

    EPA Pesticide Factsheets

    Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values and risk screening values. We aim to use computational toxicology and quantitative high throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. By coupling qHTS data with adverse outcome pathways (AOPs) we can use ontologies to make predictions about potential hazards and to identify those assays which are sufficient to infer these same hazards. Once those assays are identified, we can use bootstrap natural spline-based metaregression to integrate the evidence across multiple replicates or assays (if a combination of assays are together necessary to be sufficient). In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene (B[k]F) may induce DNA damage and steatosis using qHTS data and two separate AOPs. We also demonstrate how bootstrap natural spline-based metaregression can be used to integrate the data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an internal point of departure of 0.751 µM and risk-specific concentrations of 0.378 µM for both 1:1,000 and 1:10,000 additive risk for B[k]F-induced DNA damage based on the p53 assay. Based on the available evidence, we
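    As a rough illustration of deriving a point of departure from replicated concentration-response data, the Python sketch below bootstraps over hypothetical replicates and interpolates the concentration at a fixed response threshold. Log-linear interpolation stands in for the natural-spline metaregression, and every number here is invented; nothing is taken from the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)

conc = np.array([0.01, 0.1, 1.0, 10.0])          # µM, hypothetical series
# five replicate fold-change responses per concentration (invented)
reps = np.array([[1.00, 1.10, 0.90, 1.00, 1.05],
                 [1.20, 1.30, 1.10, 1.25, 1.20],
                 [1.80, 1.70, 1.90, 1.75, 1.85],
                 [2.60, 2.50, 2.70, 2.55, 2.65]])

def pod(curve, thresh=1.5):
    """Concentration (µM) where the curve crosses `thresh`, via
    log-linear interpolation (a stand-in for the spline fit)."""
    return 10 ** np.interp(thresh, curve, np.log10(conc))

# bootstrap replicates within each concentration, then re-estimate
pods = []
for _ in range(200):
    idx = rng.integers(0, 5, size=(4, 5))
    means = np.take_along_axis(reps, idx, axis=1).mean(axis=1)
    pods.append(pod(means))
lo, hi = np.percentile(pods, [2.5, 97.5])
print(0.1 < np.median(pods) < 1.0)  # → True
```

    The bootstrap distribution of PoD estimates is what supports the kind of risk-specific concentration statements made in the abstract.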

  8. Evaluation of sequencing approaches for high-throughput ...

    EPA Pesticide Factsheets

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platforms for potential application to high-throughput screening: 1. TempO-Seq utilizing custom designed paired probes per gene; 2. Targeted sequencing (TSQ) utilizing Illumina’s TruSeq RNA Access Library Prep Kit containing tiled exon-specific probe sets; 3. Low coverage whole transcriptome sequencing (LSQ) using Illumina’s TruSeq Stranded mRNA Kit. Each platform was required to cover the ~20,000 genes of the full transcriptome, operate directly with cell lysates, and be automatable with 384-well plates. Technical reproducibility was assessed using MAQC control RNA samples A and B, while functional utility for chemical screening was evaluated using six treatments at a single concentration after 6 hr in MCF7 breast cancer cells: 10 µM chlorpromazine, 10 µM ciclopirox, 10 µM genistein, 100 nM sirolimus, 1 µM tanespimycin, and 1 µM trichostatin A. All RNA samples and chemical treatments were run with 5 technical replicates. The three platforms achieved different read depths, with the TempO-Seq having ~34M mapped reads per sample, while TSQ and LSQ averaged 20M and 11M aligned reads per sample, respectively. Inter-replicate correlation averaged ≥0.95 for raw log2 expression values i
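    The reported inter-replicate correlation can be reproduced in spirit with a small simulation: log2 expression values that share a per-gene effect plus modest technical noise yield high pairwise Pearson correlations across replicates. The distributions and noise level below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(4)

# hypothetical log2 expression: ~20,000 genes sharing a per-gene
# effect, observed in 5 technical replicates with small noise
genes = rng.normal(5.0, 2.0, size=20_000)
reps = genes[:, None] + rng.normal(0.0, 0.3, size=(20_000, 5))

# mean pairwise Pearson correlation across the 5 replicates
corr = np.corrcoef(reps.T)
pairs = corr[np.triu_indices(5, k=1)]
print(pairs.mean() >= 0.95)  # → True
```

    With a gene-effect variance of 4 and a noise variance of 0.09, the expected correlation is roughly 4/4.09 ≈ 0.98, consistent with the ≥0.95 figure the abstract reports.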

  9. High-Throughput Next-Generation Sequencing of Polioviruses.

    PubMed

    Montmayeur, Anna M; Ng, Terry Fei Fan; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A; Oberste, M Steven; Burns, Cara C

    2017-02-01

    The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially for those in vaccine-derived polioviruses (VDPV), circulating VDPV, or immunodeficiency-related VDPV. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will aid in facilitating global poliovirus surveillance.
  11. A bioimage informatics platform for high-throughput embryo phenotyping.

    PubMed

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2016-10-14

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest.

  12. High-throughput Protein Purification and Quality Assessment for Crystallization

    PubMed Central

    Kim, Youngchang; Babnigg, Gyorgy; Jedrzejczak, Robert; Eschenfeldt, William H.; Li, Hui; Maltseva, Natalia; Hatzos-Skintges, Catherine; Gu, Minyi; Makowska-Grzyska, Magdalena; Wu, Ruiying; An, Hao; Chhor, Gekleng; Joachimiak, Andrzej

    2012-01-01

    The ultimate goal of structural biology is to understand the structural basis of proteins in cellular processes. In structural biology, the most critical issue is the availability of high-quality samples. “Structural biology-grade” proteins must be generated in the quantity and quality suitable for structure determination using X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The purification procedures must reproducibly yield homogeneous proteins or their derivatives containing marker atom(s) in milligram quantities. The choice of protein purification and handling procedures plays a critical role in obtaining high-quality protein samples. With structural genomics emphasizing a genome-based approach in understanding protein structure and function, a number of unique structures covering most of the protein folding space have been determined and new technologies with high efficiency have been developed. At the Midwest Center for Structural Genomics (MCSG), we have developed semi-automated protocols for high-throughput parallel protein expression and purification. A protein, expressed as a fusion with a cleavable affinity tag, is purified in two consecutive immobilized metal affinity chromatography (IMAC) steps: (i) the first step is an IMAC coupled with buffer-exchange, or size exclusion chromatography (IMAC-I), followed by the cleavage of the affinity tag using the highly specific Tobacco Etch Virus (TEV) protease; (ii) the second step is IMAC and buffer exchange (IMAC-II) to remove the cleaved tag and tagged TEV protease. These protocols have been implemented on multidimensional chromatography workstations and, as we have shown, many proteins can be successfully produced at large scale. All methods and protocols used for purification, some developed by MCSG, others adopted and integrated into the MCSG purification pipeline and more recently the Center for Structural Genomics of Infectious Diseases (CSGID) purification pipeline, are

  13. High throughput single molecule detection for monitoring biochemical reactions

    PubMed Central

    Okagbare, Paul I.; Soper, Steven A.

    2009-01-01

    The design, performance and application of a novel optical system for high throughput single molecule detection (SMD) configured in a continuous flow format using microfluidics is reported. The system consisted of a microfabricated polymer-based multi-channel fluidic network situated within the optical path of a laser source (λex = 660 nm) with photon transduction accomplished using an electron-multiplying charge coupled device (EMCCD) operated in a frame transfer mode that allowed tracking single molecules as they passed through a large field-of-view (FoV) illumination zone. The microfluidic device consisted of 30 microchannels possessing dimensions of 30 μm (width) × 20 μm (depth) with a 25 mm pitch. Individual molecules were electrokinetically driven through the fluidic network and excited within the wide-field illumination area with the resulting fluorescence collected via an objective and imaged onto the EMCCD camera. The detection system demonstrated sufficient sensitivity to detect single DNA molecules labeled with a fluorescent tag (AlexaFluor 660) identified through their characteristic emission wavelength and the burst of photons produced during their transit through the excitation volume. In its present configuration and fluidic architecture, the sample processing throughput was ∼4.02 × 10⁵ molecules s⁻¹, but could be increased dramatically through the use of narrower channels and a smaller pitch. The system was further evaluated using a single molecule-based fluorescence quenching assay for measuring the population differences between duplexed and single-stranded DNA molecules as a function of temperature for determining the duplex melting temperature, Tm. PMID:19082181
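    Single-molecule counting of the kind described (a burst of photons as each labelled molecule transits the excitation volume) can be sketched as threshold-and-run-length detection on a frame-by-frame photon-count trace. The trace, threshold, and burst model below are entirely synthetic assumptions, not parameters of the reported instrument.

```python
import numpy as np

rng = np.random.default_rng(3)

# synthetic photon-count trace: Poisson background, plus three
# square "bursts" where a labelled molecule crosses the beam
trace = rng.poisson(2.0, size=2000).astype(float)
for start in (300, 900, 1500):
    trace[start:start + 10] += 30.0

def count_bursts(trace, thresh=12, min_len=3):
    """Count runs of at least `min_len` consecutive frames above
    `thresh` -- a minimal burst search over an EMCCD frame series."""
    above = trace > thresh
    edges = np.diff(above.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if above[0]:
        starts = np.r_[0, starts]
    if above[-1]:
        ends = np.r_[ends, len(trace)]
    return int(np.sum((ends - starts) >= min_len))

print(count_bursts(trace))  # → 3
```

    Requiring a minimum run length rejects single-frame background fluctuations, which is the same reason burst detection rather than a bare threshold is used in practice.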

  15. High-throughput process development for recombinant protein purification.

    PubMed

    Rege, Kaushal; Pepsin, Mike; Falcon, Brandy; Steele, Landon; Heng, Meng

    2006-03-05

    screening of a wide variety of actual bioprocess media and conditions and represents a novel approach to the high-throughput process development of recombinant proteins.

  16. Missing call bias in high-throughput genotyping.

    PubMed

    Fu, Wenqing; Wang, Yi; Wang, Ying; Li, Rui; Lin, Rong; Jin, Li

    2009-03-13

    The advent of high-throughput and cost-effective genotyping platforms made genome-wide association (GWA) studies a reality. While the primary focus has been on reducing genotyping error, the problems associated with missing calls have been largely overlooked. To probe the effect of missing calls on GWA studies, we demonstrated experimentally the prevalence and severity of the problem of missing call bias (MCB) in four genotyping technologies (Affymetrix 500 K SNP array, SNPstream, TaqMan, and Illumina Beadlab). Subsequently, we showed theoretically that MCB leads to biased conclusions in the subsequent analyses, including estimation of allele/genotype frequencies, the measurement of HWE, and association tests under various modes of inheritance. We showed that MCB usually leads to power loss in association tests, and that this loss is greater than what would result from an equivalent unbiased reduction of sample size. We also compared the bias in allele frequency estimation and in association tests introduced by MCB with that introduced by genotyping errors. Our results illustrated that in most cases, the bias can be greatly reduced by increasing the call-rate at the cost of a higher genotyping error rate. The commonly used 'no-call' procedure for observations of borderline quality should be modified. If the objective is to minimize the bias, the cut-off for call-rate and that for genotyping error rate should be properly coupled in GWA studies. We suggested that the ongoing QC cut-off for call-rate should be increased, while the cut-off for genotyping error rate can be relaxed accordingly.
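    The core MCB effect (missingness that depends on genotype biasing the allele-frequency estimate) is easy to demonstrate in simulation. The allele frequency and missingness rates below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(2)

# SNP under HWE with true minor-allele frequency p = 0.3
p, n = 0.3, 100_000
geno = rng.binomial(2, p, size=n)        # minor-allele copies per sample

# differential missingness: heterozygote clusters sit nearer the
# calling boundary, so they are "no-called" ten times more often
miss_rate = np.where(geno == 1, 0.10, 0.01)
called = rng.random(n) >= miss_rate

p_hat = geno[called].mean() / 2
print(p_hat < p)  # → True: the naive estimate is biased downward
```

    Dropping heterozygotes preferentially shifts the called sample toward homozygotes, so the estimated frequency lands below 0.3 (about 0.292 in expectation here), which is the bias the abstract argues is then propagated into HWE tests and association statistics.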

  17. High-throughput screening method for lipases/esterases.

    PubMed

    Mateos-Díaz, Eduardo; Rodríguez, Jorge Alberto; de Los Ángeles Camacho-Ruiz, María; Mateos-Díaz, Juan Carlos

    2012-01-01

High-throughput screening (HTS) methods for lipases and esterases generally use synthetic chromogenic substrates (e.g., p-nitrophenyl, resorufin, and umbelliferyl esters), which may be misleading since these are not their natural substrates (e.g., partially soluble or insoluble triglycerides). In previous work, we showed that soluble nonchromogenic substrates and p-nitrophenol (as a pH indicator) can be used to quantify hydrolysis and estimate the substrate selectivity of lipases and esterases from several sources. However, to implement a spectrophotometric HTS method using partially soluble or insoluble triglycerides, it is necessary to find conditions that allow quantitative detection of enzymatic activity. In this work, we used Triton X-100, CHAPS, and N-lauroyl sarcosine as emulsifiers, β-cyclodextrin as a fatty acid captor, and two substrate concentrations, 1 mM tributyrin (TC4) and 5 mM trioctanoin (TC8), to improve the test conditions. To demonstrate the utility of this method, we screened 12 enzymes (commercial preparations and culture broth extracts) for the hydrolysis of TC4 and TC8, both classical substrates for lipases and esterases (esterases may hydrolyze only TC4). Subsequent pH-stat experiments confirmed the substrate preferences of the hydrolases tested. We have shown that this method is very useful for screening large numbers of lipases (hydrolysis of TC4 and TC8) or esterases (hydrolysis of TC4 only) from wild isolates or from variants generated by directed evolution, using nonchromogenic triglycerides directly in the test.

  18. Evaluation of sequencing approaches for high-throughput ...

    EPA Pesticide Factsheets

Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platforms for potential application to high-throughput screening: 1. TempO-Seq utilizing custom-designed paired probes per gene; 2. Targeted sequencing (TSQ) utilizing Illumina’s TruSeq RNA Access Library Prep Kit containing tiled exon-specific probe sets; 3. Low coverage whole transcriptome sequencing (LSQ) using Illumina’s TruSeq Stranded mRNA Kit. Each platform was required to cover the ~20,000 genes of the full transcriptome, operate directly with cell lysates, and be automatable with 384-well plates. Technical reproducibility was assessed using MAQC control RNA samples A and B, while functional utility for chemical screening was evaluated using six treatments at a single concentration after 6 hr in MCF7 breast cancer cells: 10 µM chlorpromazine, 10 µM ciclopirox, 10 µM genistein, 100 nM sirolimus, 1 µM tanespimycin, and 1 µM trichostatin A. All RNA samples and chemical treatments were run with 5 technical replicates. The three platforms achieved different read depths, with the TempO-Seq having ~34M mapped reads per sample, while TSQ and LSQ averaged 20M and 11M aligned reads per sample, respectively. Inter-replicate correlation averaged ≥0.95 for raw log2 expression values i
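The inter-replicate correlation metric reported above is a plain Pearson correlation of log2-transformed expression values. A self-contained sketch with hypothetical per-gene counts for two technical replicates (not data from the evaluation):

```python
from math import log2, sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical raw counts for the same genes in two technical replicates.
rep1 = [120, 3400, 15, 980, 56000, 230]
rep2 = [135, 3100, 22, 1050, 52000, 260]

# Correlate on the log2 scale (a +1 pseudocount guards against log2(0)).
r = pearson([log2(c + 1) for c in rep1], [log2(c + 1) for c in rep2])
```

Working on the log2 scale keeps highly expressed genes from dominating the statistic, which is why the ≥0.95 threshold above is quoted for log2 expression values rather than raw counts.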

  19. High-throughput mode liquid microjunction surface sampling probe.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos; King, Richard C

    2009-08-15

    A simple and automated spot sampling operation mode for a liquid microjunction surface sampling probe/electrospray ionization mass spectrometry (LMJ-SSP/ESI-MS) system is reported. Prior manual and automated spot sampling methods with this probe relied on a careful, relatively slow alignment of the probe and surface distance (<20 microm spacing) to form the probe-to-surface liquid microjunction critical to successful surface sampling. Moreover, sampling multiple spots required retraction of the surface from the probe and a repeat of this careful probe-to-surface distance alignment at the next sampling position. With the method described here, the probe was not positioned as close to the surface, the exact probe-to-surface positioning was found to be less critical (spanning distances from about 100-300 microm), and this distance was not altered during the sampling of an entire array of sample spots. With the probe positioned within the appropriate distance from the surface, the liquid microjunction was formed by letting the liquid from the sampling end of the probe extend out from the probe to the surface. This was accomplished by reducing the self-aspiration liquid flow rate of the probe to a value less than the volume flow rate pumped into the probe. When the self-aspiration rate of the probe was subsequently increased, analytes on the surface that dissolved at the liquid microjunction were aspirated back into the probe with the liquid that created the liquid microjunction and electrosprayed. Presented here are the basics of this new sampling mode, as well as data that illustrate the potential analytical capabilities of the device to conduct high-throughput quantitative analysis.

  20. High-throughput screening of solid-state catalyst libraries

    NASA Astrophysics Data System (ADS)

    Senkan, Selim M.

    1998-07-01

Combinatorial synthesis methods allow the rapid preparation and processing of large libraries of solid-state materials. The use of these methods, together with the appropriate screening techniques, has recently led to the discovery of materials with promising superconducting, magnetoresistive, luminescent and dielectric properties. Solid-state catalysts, which play an increasingly important role in the chemical and oil industries, represent another class of material amenable to combinatorial synthesis. Yet typically, catalyst discovery still involves inefficient trial-and-error processes, because catalytic activity is inherently difficult to screen. In contrast to superconductivity, magnetoresistivity and dielectric properties, which can be tested by contact probes, or luminescence, which can be observed directly, the assessment of catalytic activity requires the unambiguous detection of a specific product molecule above a small catalyst site on a large library. Screening by in situ infrared thermography and microprobe sampling mass spectrometry have been suggested, but the first method, while probing activity, provides no information on reaction products, whereas the second is difficult to implement because it requires the transport of minute gas samples from each library site to the detection system. Here I describe the use of laser-induced resonance-enhanced multiphoton ionization for sensitive, selective and high-throughput screening of a library of solid-state catalysts that activate the dehydrogenation of cyclohexane to benzene. I show that benzene, the product molecule, can be selectively photoionized in the vicinity of the catalytic sites, and that the detection of the resultant photoions by an array of microelectrodes provides information on the activity of individual sites. Adaptation of this technique for the screening of other catalytic reactions and larger libraries with smaller site size seems feasible, thus opening up the possibility of exploiting

  1. Hydrogel Droplet Microfluidics for High-Throughput Single Molecule/Cell Analysis.

    PubMed

    Zhu, Zhi; Yang, Chaoyong James

    2017-01-17

Heterogeneity among individual molecules and cells poses significant challenges to traditional bulk assays, whose assumption of average behavior loses important biological information about heterogeneity and can result in misleading interpretations. Single molecule/cell analysis has become an important and emerging field in biological and biomedical research, offering insights into heterogeneity within large populations at high resolution. Compared with ensemble bulk methods, single molecule/cell analysis explores information on the time trajectories, conformational states, and interactions of individual molecules/cells, all key factors in the study of chemical and biological reaction pathways. Various powerful techniques have been developed for single molecule/cell analysis, including flow cytometry, atomic force microscopy, optical and magnetic tweezers, and single-molecule fluorescence spectroscopy. However, some of these suffer from low throughput, analyzing single molecules/cells one at a time. Flow cytometry is a widely used high-throughput technique for single cell analysis but lacks the ability to study intercellular interactions and to control the local environment. Droplet microfluidics has become attractive for single molecule/cell manipulation because single molecules/cells can be individually encased in monodisperse microdroplets, allowing high-throughput analysis and manipulation with precise control of the local environment. Moreover, hydrogels, cross-linked polymer networks that swell in the presence of water, have been introduced into droplet microfluidic systems as hydrogel droplet microfluidics. By replacing the aqueous phase with a monomer or polymer solution, hydrogel droplets can be generated on microfluidic chips for encapsulation of single molecules/cells according to the Poisson distribution. The sol-gel transition property endows the hydrogel droplets with new functionalities and diversified applications in single
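Encapsulation "according to the Poisson distribution", as mentioned above, sets a hard trade-off between droplet occupancy and single-cell purity. A small sketch (the loading density λ = 0.1 is an illustrative assumption, a common dilute-loading regime):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """P(k cells in a droplet) when cells are loaded at mean occupancy lam."""
    return lam ** k * exp(-lam) / factorial(k)

lam = 0.1                        # dilute loading: ~1 cell per 10 droplets
p0 = poisson_pmf(0, lam)         # fraction of empty droplets (~0.905)
p1 = poisson_pmf(1, lam)         # fraction of single-cell droplets (~0.090)
p_multi = 1 - p0 - p1            # droplets carrying more than one cell

# Of the occupied droplets, how many hold exactly one cell?
single_purity = p1 / (1 - p0)
```

At λ = 0.1, over 95% of occupied droplets contain exactly one cell, at the cost of roughly nine in ten droplets being empty; this is why dilute loading is the standard compromise for single-cell work.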

  2. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  3. High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes

    USDA-ARS?s Scientific Manuscript database

High-throughput SNP genotyping provides a rapid way of developing a resourceful set of markers for delineating genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...

  4. High throughput transmission optical projection tomography using low cost graphics processing unit.

    PubMed

    Vinegoni, Claudio; Fexon, Lyuba; Feruglio, Paolo Fumene; Pivovarov, Misha; Figueiredo, Jose-Luiz; Nahrendorf, Matthias; Pozzo, Antonio; Sbarbati, Andrea; Weissleder, Ralph

    2009-12-07

We implemented a graphics processing unit (GPU) to achieve real-time data processing for high-throughput transmission optical projection tomography imaging. The GPU implementation yielded a 300-fold performance enhancement over a CPU workstation implementation, enabling on-the-fly reconstructions for high-throughput imaging.

  5. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    EPA Science Inventory

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  7. Development and Optimization of a Novel 384-Well Anti-Malarial Imaging Assay Validated for High-Throughput Screening

    PubMed Central

    Duffy, Sandra; Avery, Vicky M.

    2012-01-01

With the increasing occurrence of drug resistance in the malaria parasite, Plasmodium falciparum, there is a great need for new and novel anti-malarial drugs. We have developed a 384-well, high-throughput imaging assay for the detection of new anti-malarial compounds, which was initially validated by screening a marine natural product library, and subsequently used to screen more than 3 million data points from a variety of compound sources. Founded on an existing fluorescence-based P. falciparum growth inhibition assay, the DNA-intercalating dye 4′,6-diamidino-2-phenylindole (DAPI) was used to monitor changes in parasite number. Fluorescent images were acquired on the PerkinElmer Opera High Throughput confocal imaging system and analyzed with a spot detection algorithm using the Acapella data processing software. Further optimization of this assay sought to increase throughput, assay stability, and compatibility with our high-throughput screening equipment platforms. The assay typically yielded Z'-factor values of 0.5–0.6, with signal-to-noise ratios of 12. PMID:22232455
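The Z'-factor quoted above is a standard plate-quality statistic, Z' = 1 - 3(σ_pos + σ_neg)/|μ_pos - μ_neg|, where values between 0.5 and 1 indicate a screening-quality assay. A minimal sketch with hypothetical control-well readings (not data from this assay):

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor from positive- and negative-control well readings:
    1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# Hypothetical fluorescence readings from plate controls.
growth_ctrl = [1000, 1040, 980, 1010, 995]   # uninhibited parasite growth
kill_ctrl   = [110, 95, 120, 105, 100]       # fully inhibited wells

zp = z_prime(growth_ctrl, kill_ctrl)
```

For these illustrative readings the statistic lands near 0.89; a plate yielding 0.5-0.6, as reported above, still leaves a comfortable separation band between controls for reliable hit calling.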

  8. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

Because RNA-protein interactions play a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time), including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved, and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240

  9. High throughput research and evaporation rate modeling for solvent screening for ethylcellulose barrier membranes in pharmaceutical applications.

    PubMed

    Schoener, Cody A; Curtis-Fisk, Jaime L; Rogers, True L; Tate, Michael P

    2016-10-01

Ethylcellulose is commonly dissolved in a solvent or formed into an aqueous dispersion and sprayed onto various dosage forms to form a barrier membrane that provides controlled release in pharmaceutical formulations. Given the variety of solvents used in the pharmaceutical industry and the influence solvent can have on film formation and film strength, it is critical to understand how solvent choice affects these parameters. The objective was to systematically study a variety of solvent blends and how they influence ethylcellulose film formation, physical and mechanical film properties, and solution properties such as clarity and viscosity. Using high throughput capabilities and evaporation rate modeling, thirty-one different solvent blends composed of ethanol, isopropanol, acetone, methanol, and/or water were formulated, analyzed for viscosity and clarity, and narrowed down to four solvent blends. Brookfield viscosity, film casting, mechanical film testing, and water permeation measurements were also completed. High throughput analysis identified isopropanol/water, ethanol, ethanol/water, and methanol/acetone/water as solvent blends with unique clarity and viscosity values. Evaporation rate modeling further rank-ordered these candidates from excellent to poor interaction with ethylcellulose. Isopropanol/water was identified as the most suitable solvent blend for ethylcellulose due to azeotrope formation during evaporation, which resulted in a solvent-rich phase allowing the ethylcellulose polymer chains to remain maximally extended during film formation. Consequently, the films with the highest clarity and greatest ductility were formed. Pairing high throughput capabilities with evaporation rate modeling allowed strong predictions of solvent interaction with ethylcellulose and of mechanical film properties.

  10. Development and optimization of a novel 384-well anti-malarial imaging assay validated for high-throughput screening.

    PubMed

    Duffy, Sandra; Avery, Vicky M

    2012-01-01

With the increasing occurrence of drug resistance in the malaria parasite, Plasmodium falciparum, there is a great need for new and novel anti-malarial drugs. We have developed a 384-well, high-throughput imaging assay for the detection of new anti-malarial compounds, which was initially validated by screening a marine natural product library, and subsequently used to screen more than 3 million data points from a variety of compound sources. Founded on an existing fluorescence-based P. falciparum growth inhibition assay, the DNA-intercalating dye 4',6-diamidino-2-phenylindole (DAPI) was used to monitor changes in parasite number. Fluorescent images were acquired on the PerkinElmer Opera High Throughput confocal imaging system and analyzed with a spot detection algorithm using the Acapella data processing software. Further optimization of this assay sought to increase throughput, assay stability, and compatibility with our high-throughput screening equipment platforms. The assay typically yielded Z'-factor values of 0.5-0.6, with signal-to-noise ratios of 12.

  11. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    NASA Astrophysics Data System (ADS)

    Wen, Xianfei; Yang, Haori

    2015-06-01

A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces the signal-to-noise ratio and degrades energy resolution. In this work, a pulse pile-up recovery algorithm based on template matching was shown to be an effective approach to achieving high-throughput gamma-ray spectroscopy. First, the algorithm is discussed in detail. Second, it was successfully used to process simulated piled-up pulses from a scintillator detector. Third, it was applied to high-rate data from a NaI detector, a silicon drift detector, and a HPGe detector. The promising results demonstrate the capability of this algorithm to achieve a high throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods.
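The essence of template-matching pile-up recovery can be sketched as a least-squares fit: given the known pulse template and the arrival times of two overlapping pulses, the individual amplitudes (which carry the energy information) fall out of a 2x2 linear system. The code below is a minimal noiseless illustration, not the paper's actual algorithm; the template shape and arrival times are made up:

```python
def fit_two_pulses(signal, template, t1, t2):
    """Least-squares amplitudes of two template pulses at known arrival
    times t1 and t2, recovered from a piled-up waveform by solving the
    2x2 normal equations."""
    n = len(signal)
    def shifted(t):
        return [template[i - t] if 0 <= i - t < len(template) else 0.0
                for i in range(n)]
    s1, s2 = shifted(t1), shifted(t2)
    a = sum(x * x for x in s1)
    b = sum(x * y for x, y in zip(s1, s2))
    d = sum(y * y for y in s2)
    e = sum(x * y for x, y in zip(s1, signal))
    f = sum(x * y for x, y in zip(s2, signal))
    det = a * d - b * b
    return ((e * d - b * f) / det, (a * f - b * e) / det)

# Hypothetical pulse template and a synthetic piled-up waveform.
template = [0.0, 0.5, 1.0, 0.7, 0.4, 0.2, 0.1]
pileup = [0.0] * 20
for i, v in enumerate(template):
    pileup[3 + i] += 2.0 * v   # pulse of amplitude 2.0 arriving at t=3
    pileup[6 + i] += 1.5 * v   # overlapping pulse of amplitude 1.5 at t=6

a1, a2 = fit_two_pulses(pileup, template, 3, 6)
```

With no noise the fit recovers the amplitudes 2.0 and 1.5 exactly; a real implementation must also locate the arrival times and cope with noise, which is where the pulse-shape template earns its keep.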

  12. The Utilization of Formalin Fixed-Paraffin-Embedded Specimens in High Throughput Genomic Studies

    PubMed Central

    Zhang, Pan

    2017-01-01

High throughput genomic assays empower us to study the entire human genome in a short time at reasonable cost. Formalin-fixed, paraffin-embedded (FFPE) tissue processing remains the most economical approach for longitudinal tissue specimen storage. Therefore, the ability to apply high throughput genomic assays to FFPE specimens can expand clinical assays and discovery. Many studies have measured the accuracy and repeatability of data generated from FFPE specimens using high throughput genomic assays. Together, these studies demonstrate feasibility and provide crucial guidance for future studies using FFPE specimens. Here, we summarize the findings of these studies and discuss the limitations of high throughput data generated from FFPE specimens across several platforms, including microarray, high throughput sequencing, and NanoString. PMID:28246590

  13. Demonstration of submersible high-throughput microfluidic immunosensors for underwater explosives detection.

    PubMed

    Adams, André A; Charles, Paul T; Deschamps, Jeffrey R; Kusterbeck, Anne W

    2011-11-15

Significant security threats posed by highly energetic nitroaromatic compounds in aquatic environments, together with the demilitarization and pending cleanup of areas previously used for munitions manufacture and storage, create a need for less expensive, faster, and more sensitive systems capable of analyzing groundwater and seawater samples for trace levels of explosive materials. Presented here is an inexpensive high throughput microfluidic immunosensor (HTMI) platform intended for the rapid, highly selective quantitation of nitroaromatic compounds in the field. Immunoaffinity and fluorescence detection schemes were implemented in tandem on a novel microfluidic device containing 39 parallel microchannels, 500 μm tall, 250 μm wide, and 2.54 cm long, with covalently tethered antibodies, engineered for high-throughput, high-volume sample processing. The devices were produced via a combination of high-precision micromilling and hot embossing. Mass transfer limitations found in conventional microsystems were minimized here by surface-area-to-volume ratios exceeding those of conventional microdevices and capillaries. Until now, these assays were limited to maximum total volume flow rates of ~1 mL/min, due in part to kinetics and the high head pressures of single microchannels. In the design demonstrated here, highly parallelized microchannels afforded up to a 100-fold increase in total volume flow rate while maintaining favorable kinetic constraints for efficient antigen-antibody interaction. The assay employed total volume throughput of up to 6 mL/min while yielding signal-to-noise ratios of >15 in all cases. In addition to samples being processed up to 60 times faster than in conventional displacement-based immunoassays, the current system was capable of quantitating 0.01 ng/mL TNT samples without offline preconcentration, thereby demonstrating the ability to improve sensitivity by as much as 2 orders of magnitude

  14. High-throughput optofluidic profiling of Euglena gracilis with morphological and chemical specificity

    NASA Astrophysics Data System (ADS)

    Guo, Baoshan; Lei, Cheng; Ito, Takuro; Jiang, Yiyue; Ozeki, Yasuyuki; Goda, Keisuke

    2016-11-01

The world is faced with environmental problems and an energy crisis due to the combustion and depletion of fossil fuels. The development of reliable, sustainable, and economical sources of alternative fuels is an important but challenging goal for the world. As an alternative to liquid fossil fuels, algal biofuel is expected to play a key role in alleviating global warming, since algae absorb atmospheric CO2 via photosynthesis. Among the various algae considered for fuel production, Euglena gracilis is an attractive microalgal species, as it is known to produce wax ester (good for biodiesel and aviation fuel) within lipid droplets. To date, while there exist many techniques for inducing microalgal cells to produce and accumulate lipid with high efficiency, few analytical methods are available for characterizing a population of such lipid-accumulated microalgae, including E. gracilis, with high throughput, high accuracy, and single-cell resolution simultaneously. Here we demonstrate a high-throughput optofluidic Euglena gracilis profiler, consisting of an optical time-stretch microscope and a fluorescence analyzer on top of an inertial-focusing microfluidic device, that can detect fluorescence from lipid droplets in the cell body and provide images of E. gracilis cells simultaneously at a high throughput of 10,000 cells/s. With the multi-dimensional information acquired by the system, we classify nitrogen-sufficient (ordinary) and nitrogen-deficient (lipid-accumulated) E. gracilis cells with a low false positive rate of 1.0%. This method holds promise for evaluating the efficiency of lipid-inducing techniques for biofuel production, and is also applicable to identifying biomedical samples such as blood cells and cancer cells.

  15. PTMScout, a Web resource for analysis of high throughput post-translational proteomics studies.

    PubMed

    Naegle, Kristen M; Gymrek, Melissa; Joughin, Brian A; Wagner, Joel P; Welsch, Roy E; Yaffe, Michael B; Lauffenburger, Douglas A; White, Forest M

    2010-11-01

    The rate of discovery of post-translational modification (PTM) sites is increasing rapidly and is significantly outpacing our biological understanding of the function and regulation of those modifications. To help meet this challenge, we have created PTMScout, a web-based interface for viewing, manipulating, and analyzing high throughput experimental measurements of PTMs in an effort to facilitate biological understanding of protein modifications in signaling networks. PTMScout is constructed around a custom database of PTM experiments and contains information from external protein and post-translational resources, including gene ontology annotations, Pfam domains, and Scansite predictions of kinase and phosphopeptide binding domain interactions. PTMScout functionality comprises data set comparison tools, data set summary views, and tools for protein assignments of peptides identified by mass spectrometry. Analysis tools in PTMScout focus on informed subset selection via common criteria and on automated hypothesis generation through subset labeling derived from identification of statistically significant enrichment of other annotations in the experiment. Subset selection can be applied through the PTMScout flexible query interface available for quantitative data measurements and data annotations as well as an interface for importing data set groupings by external means, such as unsupervised learning. We exemplify the various functions of PTMScout in application to data sets that contain relative quantitative measurements as well as data sets lacking quantitative measurements, producing a set of interesting biological hypotheses. PTMScout is designed to be a widely accessible tool, enabling generation of multiple types of biological hypotheses from high throughput PTM experiments and advancing functional assignment of novel PTM sites. PTMScout is available at http://ptmscout.mit.edu.
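The "statistically significant enrichment of other annotations" step described above can be illustrated with a one-sided hypergeometric test, a common choice for annotation-enrichment analyses (the counts below are hypothetical, and this is a generic sketch rather than PTMScout's exact statistic):

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """One-sided p-value that a subset of n proteins contains >= k members
    of an annotation that appears K times among N total proteins."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# Hypothetical numbers: 1000 phosphoproteins in the experiment, 50 of them
# annotated 'kinase'; a co-regulated subset of 20 peptides contains 5 kinases.
p = hypergeom_enrichment_p(N=1000, K=50, n=20, k=5)
```

Under these made-up counts, only about one kinase would be expected by chance in a subset of 20, so observing five yields a small p-value and would flag the annotation as enriched, the kind of automated hypothesis generation the abstract describes.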

  16. High-throughput real-time x-ray microtomography at the Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    De Carlo, Francesco; Albee, Paul B.; Chu, Yong S.; Mancini, Derrick C.; Tieman, Brian; Wang, Steve Y.

    2002-01-01

It is now possible for large volumes of synchrotron-radiation-generated micro-tomography data to be produced at gigabyte-per-minute rates, especially when using currently available CCD cameras at a high-brightness source, such as the Advanced Photon Source (APS). Recent improvements in the speed of our detectors and stages, combined with increased photon flux supplied by a newly installed double multilayer monochromator, allow us to achieve these data rates on a bending magnet beamline. Previously, most x-ray microtomography experiments have produced data at comparatively lower rates, and often the data were analyzed after the experiment. The time needed to generate complete data sets meant putting off analysis to the completion of a run, thus preventing the user from evaluating the usefulness of a data set and consequently impairing decision making during data acquisition as to how to proceed. Thus, the ability to provide to a tomography user a fully reconstructed data set in a few minutes is one of the major problems to be solved when dealing with high-throughput x-ray tomography. This is due to the complexity of the data analysis, which involves data preprocessing, sinogram generation, 3D reconstruction, and rendering. At the APS, we have developed systems and techniques to address this issue. We present a method that uses a cluster-based, parallel-computing system based on the Message Passing Interface (MPI) standard. Among the advantages of this approach are the portability, ease-of-use, and low cost of the system. The combination of high-speed, online analysis with high-throughput acquisition allows us to acquire and reconstruct a 512x512x512-voxel sample with a few-micron resolution in less than ten minutes.
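The cluster-based, parallel analysis described above exploits the fact that tomographic slices reconstruct independently, so they can be farmed out to workers. The sketch below mimics that division of labor in miniature, using Python threads in place of MPI ranks and a placeholder per-slice step (the real pipeline performs preprocessing, sinogram generation, and filtered back-projection); all data here are synthetic:

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct_slice(sinogram_row):
    """Placeholder per-slice reconstruction step; the real code would run
    filtered back-projection on this slice's sinogram."""
    return sum(sinogram_row)

# Hypothetical stack of sinogram rows, one per tomographic slice.
sinograms = [[i + j for j in range(8)] for i in range(64)]

# Distribute slices across workers, mirroring MPI ranks each handling a
# subset of slices; results come back in slice order.
with ThreadPoolExecutor(max_workers=4) as pool:
    volume = list(pool.map(reconstruct_slice, sinograms))
```

Because slices share no state, this pattern scales almost linearly with worker count, which is what makes near-real-time reconstruction of a 512x512x512 volume feasible on a modest cluster.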

  17. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping

    PubMed Central

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits, such as length, width, two-dimensional area, volume, projected skin, surface area, and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. Five-fold cross-validation correlations between manual and GiNA measurements of length and width in cranberry fruits were 0.97 and 0.92, respectively. Likewise, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to the total anthocyanin content (TAcy) of the same fruits measured with the industry-standard methodology. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables. PMID:27529547
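Size traits like the length, width, and area mentioned above reduce, once the fruit has been segmented from the background, to simple pixel arithmetic on a binary mask. A minimal sketch (a toy mask and measurement routine, not GiNA's actual implementation, which handles calibration, color, and multiple objects):

```python
def measure_object(mask):
    """Area (pixel count) and bounding-box length/width of a single
    segmented object in a binary mask, akin to basic size traits."""
    pixels = [(r, c) for r, row in enumerate(mask)
              for c, v in enumerate(row) if v]
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    return {
        "area": len(pixels),
        "length": max(rows) - min(rows) + 1,   # vertical extent in pixels
        "width": max(cols) - min(cols) + 1,    # horizontal extent in pixels
    }

# Hypothetical 5x6 binary mask of one segmented berry.
mask = [
    [0, 0, 1, 1, 0, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 1, 1, 1, 1, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]
traits = measure_object(mask)
```

Converting pixel measurements to physical units then only requires a scale reference in the image, which is one reason conventional digital-camera photos suffice for this kind of phenotyping.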

  18. Mass spectrometric techniques for label-free high-throughput screening in drug discovery.

    PubMed

    Roddy, Thomas P; Horvath, Christopher R; Stout, Steven J; Kenney, Kristin L; Ho, Pei-I; Zhang, Ji-Hu; Vickers, Chad; Kaushik, Virendar; Hubbard, Brian; Wang, Y Karen

    2007-11-01

High-throughput screening (HTS) is an important tool for finding active compounds to initiate medicinal chemistry programs in pharmaceutical discovery research. Traditional HTS methods rely on fluorescent or radiolabeled reagents and/or coupling assays to permit quantitation of enzymatic target inhibition or activation. Mass spectrometry-based high-throughput screening (MS-HTS) is an alternative that is not susceptible to the limitations imposed by labeling and coupling enzymes. MS-HTS offers a selective and sensitive analytical method for unlabeled substrates and products. Furthermore, method development times are reduced without the need to incorporate labels or coupling assays. MS-HTS also permits screening of targets that are difficult or impossible to screen by other techniques. For example, enzymes that are challenging to purify can lead to the nonspecific detection of structurally similar components of the impure enzyme preparation or the matrix of membranous enzymes. The high selectivity of tandem mass spectrometry (MS/MS) enables these screens to proceed with low levels of background noise and to sensitively discover interesting hits even with relatively weak activity. In this article, we describe three techniques that we have adapted for large-scale (approximately 175,000 sample) compound library screening, including four-way parallel multiplexed electrospray liquid chromatography tandem mass spectrometry (MUX-LC/MS/MS), four-way parallel staggered gradient liquid chromatography tandem mass spectrometry (LC/MS/MS), and eight-way staggered flow injection MS/MS following 384-well plate solid-phase extraction (SPE). These methods are capable of analyzing a 384-well plate in 37 min, with typical analysis times of less than 2 h. The quality of the MS-HTS approach is demonstrated herein with screening data from two large-scale screens.

  19. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select data points and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  20. High-throughput analysis of T-DNA location and structure using sequence capture

    DOE PAGES

    Inagaki, Soichi; Henry, Isabelle M.; Lieberman, Meric C.; ...

    2015-10-07

    Agrobacterium-mediated transformation of plants with T-DNA is used both to introduce transgenes and for mutagenesis. Conventional approaches used to identify the genomic location and the structure of the inserted T-DNA are laborious, and high-throughput methods using next-generation sequencing are being developed to address these problems. Here, we present a cost-effective approach that uses sequence capture targeted to the T-DNA borders to select genomic DNA fragments containing T-DNA—genome junctions, followed by Illumina sequencing to determine the location and junction structure of T-DNA insertions. Multiple probes can be mixed so that transgenic lines transformed with different T-DNA types can be processed simultaneously, using a simple, index-based pooling approach. We also developed a simple bioinformatic tool to find sequence read pairs that span the junction between the genome and T-DNA or any foreign DNA. We analyzed 29 transgenic lines of Arabidopsis thaliana, each containing inserts from 4 different T-DNA vectors. We determined the location of T-DNA insertions in 22 lines, 4 of which carried multiple insertion sites. Additionally, our analysis uncovered a high frequency of unconventional and complex T-DNA insertions, highlighting the need for high-throughput methods for T-DNA localization and structural characterization. Transgene insertion events have to be fully characterized prior to use as commercial products. Our method greatly facilitates the first step of this characterization of transgenic plants by providing an efficient screen for the selection of promising lines.
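The core idea behind junction detection from read pairs can be sketched briefly. The snippet below is a hypothetical illustration, not the authors' tool: it flags read pairs in which one mate aligns to the plant genome and the other to the T-DNA (or other foreign) sequence, marking a putative insertion junction. Reference naming and the input format are assumptions for the example.

```python
# Hypothetical sketch: flag read pairs spanning a genome/T-DNA junction.
# Input: read name -> set of reference names the two mates aligned to.

def junction_pairs(alignments):
    """Return read names whose mates land on both a genomic reference
    and a T-DNA reference, i.e. candidate insertion junctions."""
    def is_tdna(ref):
        # Assumed naming convention for foreign-DNA references.
        return ref.startswith("T-DNA")

    hits = []
    for name, refs in alignments.items():
        spans_tdna = any(is_tdna(r) for r in refs)
        spans_genome = any(not is_tdna(r) for r in refs)
        if spans_tdna and spans_genome:
            hits.append(name)
    return sorted(hits)

example = {
    "pair_a": {"Chr1", "T-DNA_vector1"},  # one mate genomic, one in T-DNA
    "pair_b": {"Chr2"},                   # both mates genomic: ignore
    "pair_c": {"T-DNA_vector1"},          # both mates in T-DNA: ignore
}
print(junction_pairs(example))  # → ['pair_a']
```

In practice the mate references would come from a paired-end aligner run against a combined genome-plus-vector reference; the pooling by index described above means each line's reads can be demultiplexed and screened independently.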