Science.gov

Sample records for high throughput proteome-wide

  1. Proteome-wide mapping of the Drosophila acetylome demonstrates a high degree of conservation of lysine acetylation.

    PubMed

    Weinert, Brian T; Wagner, Sebastian A; Horn, Heiko; Henriksen, Peter; Liu, Wenshe R; Olsen, Jesper V; Jensen, Lars J; Choudhary, Chunaram

    2011-07-26

    Posttranslational modification of proteins by acetylation and phosphorylation regulates most cellular processes in living organisms. Surprisingly, the evolutionary conservation of phosphorylated serine and threonine residues is only marginally higher than that of unmodified serines and threonines. With high-resolution mass spectrometry, we identified 1981 lysine acetylation sites in the proteome of Drosophila melanogaster. We used data sets of experimentally identified acetylation and phosphorylation sites in Drosophila and humans to analyze the evolutionary conservation of these modification sites between flies and humans. Site-level conservation analysis revealed that acetylation sites are highly conserved, significantly more so than phosphorylation sites. Furthermore, comparison of lysine conservation in Drosophila and humans with that in nematodes and zebrafish revealed that acetylated lysines were significantly more conserved than were nonacetylated lysines. Bioinformatics analysis using Gene Ontology terms suggested that the proteins with conserved acetylation control cellular processes such as protein translation, protein folding, DNA packaging, and mitochondrial metabolism. We found that acetylation of ubiquitin-conjugating E2 enzymes was evolutionarily conserved, and mutation of a conserved acetylation site impaired the function of the human E2 enzyme UBE2D3. This systems-level analysis of comparative posttranslational modification showed that acetylation is an anciently conserved modification and suggests that phosphorylation sites may have evolved faster than acetylation sites.
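The site-level conservation comparison described above reduces to a fraction-of-conserved-sites calculation. A minimal sketch, using invented alignment positions; the study's actual pipeline maps experimentally identified sites through fly-human orthology alignments:

```python
# Hedged sketch: comparing conservation of acetylated vs. nonacetylated
# lysines. All positions below are hypothetical, for illustration only.

def conservation_fraction(sites, conserved):
    """Fraction of sites whose aligned position is conserved."""
    hits = sum(1 for s in sites if s in conserved)
    return hits / len(sites) if sites else 0.0

# Hypothetical aligned lysine positions in a fly protein
acetylated = {12, 47, 88, 130}
nonacetylated = {5, 23, 61, 99, 154}
conserved_in_human = {12, 47, 88, 23}  # lysine retained in the human ortholog

print(conservation_fraction(acetylated, conserved_in_human))     # 0.75
print(conservation_fraction(nonacetylated, conserved_in_human))  # 0.2
```

Applied genome-wide with real alignments, a significantly higher fraction for the acetylated set is the signal the study reports.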

  2. High throughput screening informatics.

    PubMed

    Ling, Xuefeng Bruce

    2008-03-01

    High throughput screening (HTS), an industrial effort to leverage developments in the areas of modern robotics, data analysis and control software, liquid handling devices, and sensitive detectors, has played a pivotal role in the drug discovery process, allowing researchers to efficiently screen millions of compounds to identify tractable small molecule modulators of a given biological process or disease state and advance them into high quality leads. As HTS throughput has significantly increased the volume, complexity, and information content of datasets, lead discovery research demands a clear corporate strategy for scientific computing and subsequent establishment of robust enterprise-wide (usually global) informatics platforms, which enable complicated HTS work flows, facilitate HTS data mining, and drive effective decision-making. The purpose of this review is, from the data analysis and handling perspective, to examine key elements in HTS operations and some essential data-related activities supporting or interfacing the screening process, and outline properties that various enabling software should have. Additionally, some general advice for corporate managers with system procurement responsibilities is offered.

  3. High-throughput proteomics

    NASA Astrophysics Data System (ADS)

    Lesley, Scott A.; Nasoff, Marc; Kreusch, Andreas; Spraggon, Glen

    2001-04-01

    Proteomics has become a major focus as researchers attempt to understand the vast amount of genomic information. Protein complexity makes identifying and understanding gene function inherently difficult. The challenge of studying proteins in a global way is driving the development of new technologies for systematic and comprehensive analysis of protein structure and function. We are addressing this challenge through instrumentation and approaches to rapidly express, purify, crystallize, and mutate large numbers of human gene products. Our approach applies the principles of HTS technologies commonly used in pharmaceutical development. Genes are cloned, expressed, and purified in parallel to achieve a throughput potential of hundreds per day. Our instrumentation allows us to produce tens of milligrams of protein from 96 separate clones simultaneously. Purified protein is used for several applications including a high-throughput crystallographic screening approach for structure determination using automated image analysis. To further understand protein function, we are integrating a mutagenesis and screening approach. By combining these key technologies, we hope to provide a fundamental basis for understanding gene function at the protein level.

  4. Exploiting the multiplexing capabilities of tandem mass tags for high-throughput estimation of cellular protein abundances by mass spectrometry.

    PubMed

    Ahrné, Erik; Martinez-Segura, Amalia; Syed, Afzal Pasha; Vina-Vilaseca, Arnau; Gruber, Andreas J; Marguerat, Samuel; Schmidt, Alexander

    2015-09-01

The generation of dynamic models of biological processes critically depends on the determination of precise cellular concentrations of biomolecules. Measurements of system-wide absolute protein levels are particularly valuable information in systems biology. Recently, mass spectrometry based proteomics approaches have been developed to estimate protein concentrations on a proteome-wide scale. However, for very complex proteomes, fractionation steps are required, increasing sample numbers and instrument analysis time. As a result, the number of full proteomes that can be routinely analyzed is limited. Here we combined absolute quantification strategies with the multiplexing capabilities of isobaric tandem mass tags to determine cellular protein abundances on a high-throughput, proteome-wide scale even for highly complex biological systems, such as a whole human cell line. We generated two independent data sets to demonstrate the power of the approach regarding sample throughput, dynamic range, quantitative precision and accuracy as well as proteome coverage in comparison to existing mass spectrometry based strategies.

  5. High throughput optical scanner

    SciTech Connect

    Basiji, David A.; van den Engh, Gerrit J.

    2001-01-01

    A scanning apparatus is provided to obtain automated, rapid and sensitive scanning of substrate fluorescence, optical density or phosphorescence. The scanner uses a constant path length optical train, which enables the combination of a moving beam for high speed scanning with phase-sensitive detection for noise reduction, comprising a light source, a scanning mirror to receive light from the light source and sweep it across a steering mirror, a steering mirror to receive light from the scanning mirror and reflect it to the substrate, whereby it is swept across the substrate along a scan arc, and a photodetector to receive emitted or scattered light from the substrate, wherein the optical path length from the light source to the photodetector is substantially constant throughout the sweep across the substrate. The optical train can further include a waveguide or mirror to collect emitted or scattered light from the substrate and direct it to the photodetector. For phase-sensitive detection the light source is intensity modulated and the detector is connected to phase-sensitive detection electronics. A scanner using a substrate translator is also provided. For two dimensional imaging the substrate is translated in one dimension while the scanning mirror scans the beam in a second dimension. For a high throughput scanner, stacks of substrates are loaded onto a conveyor belt from a tray feeder.

  6. High Throughput Transcriptomics @ USEPA (Toxicology ...

    EPA Pesticide Factsheets

The ideal chemical testing approach will provide complete coverage of all relevant toxicological responses. It should be sensitive and specific. It should identify the mechanism/mode-of-action (with dose dependence). It should identify responses relevant to the species of interest. Responses should ideally be translated into tissue-, organ-, and organism-level effects. It must be economical and scalable. Using a High Throughput Transcriptomics platform within US EPA provides broader coverage of biological activity space and toxicological MOAs and helps fill the toxicological data gap. Slide presentation at the 2016 ToxForum on using High Throughput Transcriptomics at US EPA for broader coverage of biological activity space and toxicological MOAs.

  7. High Throughput Testing

    NASA Astrophysics Data System (ADS)

    Marshall, A. H.; Buchness, R. K.; Schmunk, D. F.; Vasel, L. F.

    1982-06-01

Several IFPA programs with large quantities of deliverable devices required Rockwell to solve the many problems associated with high-volume production. The facility contains processing, assembly, and cryogenic testing equipment.

  8. High-throughput discovery metabolomics.

    PubMed

    Fuhrer, Tobias; Zamboni, Nicola

    2015-02-01

Non-targeted metabolomics by mass spectrometry has established itself as the method of choice for investigating metabolic phenotypes in basic and applied research. Compared to other omics, metabolomics provides broad scope and yet direct information on the integrated cellular response with low demand in material and sample preparation. These features render non-targeted metabolomics ideally suited for large scale screens and discovery. Here we review the achievements and potential in high-throughput, non-targeted metabolomics. We found that routine and precise analysis of thousands of small molecular features in thousands of complex samples per day and instrument is already reality, and ongoing developments in microfluidics and integrated interfaces will likely further boost throughput in the next few years. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  10. High-Throughput Sequencing Technologies

    PubMed Central

    Reuter, Jason A.; Spacek, Damek; Snyder, Michael P.

    2015-01-01

    Summary The human genome sequence has profoundly altered our understanding of biology, human diversity and disease. The path from the first draft sequence to our nascent era of personal genomes and genomic medicine has been made possible only because of the extraordinary advancements in DNA sequencing technologies over the past ten years. Here, we discuss commonly used high-throughput sequencing platforms, the growing array of sequencing assays developed around them as well as the challenges facing current sequencing platforms and their clinical application. PMID:26000844

  11. High throughput protein production screening

    DOEpatents

    Beernink, Peter T.; Coleman, Matthew A.; Segelke, Brent W.

    2009-09-08

Methods, compositions, and kits for the cell-free production and analysis of proteins are provided. The invention allows for the production of proteins from prokaryotic sequences or eukaryotic sequences, including human cDNAs using PCR and IVT methods and detecting the proteins through fluorescence or immunoblot techniques. This invention can be used to identify optimized PCR and IVT conditions, codon usages and mutations. The methods are readily automated and can be used for high throughput analysis of protein expression levels, interactions, and functional states.

  12. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    PubMed

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of the user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
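The core operation described above, comparing a user-entered gene list against many published datasets, is essentially a batch set intersection. A minimal sketch with invented dataset names and gene symbols (not drawn from the actual CrossCheck database):

```python
# Hedged sketch of cross-referencing a screen hit list against published
# datasets. Dataset names and gene symbols below are illustrative only.

user_hits = {"TP53", "RIPK1", "ATG7", "MTOR"}

published_datasets = {
    "CRISPR_screen_A": {"TP53", "MTOR", "KRAS"},
    "RNAi_screen_B": {"ATG7", "BECN1"},
    "interactome_C": {"EGFR", "GRB2"},
}

# Keep only datasets that share at least one gene with the user's list
overlaps = {
    name: sorted(user_hits & genes)
    for name, genes in published_datasets.items()
    if user_hits & genes
}
print(overlaps)
# {'CRISPR_screen_A': ['MTOR', 'TP53'], 'RNAi_screen_B': ['ATG7']}
```

A production tool layers indexing and statistics (e.g. overlap enrichment) on top, but the data model is this simple.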

  13. High Throughput Plasma Water Treatment

    NASA Astrophysics Data System (ADS)

    Mujovic, Selman; Foster, John

    2016-10-01

The troublesome emergence of new classes of micro-pollutants, such as pharmaceuticals and endocrine disruptors, poses challenges for conventional water treatment systems. In an effort to address these contaminants and to support water reuse in drought-stricken regions, new technologies must be introduced. The interaction of water with plasma rapidly mineralizes organics by inducing advanced oxidation in addition to other chemical, physical and radiative processes. The primary barrier to the implementation of plasma-based water treatment is process volume scale up. In this work, we investigate a potentially scalable, high throughput plasma water reactor that utilizes a packed bed dielectric barrier-like geometry to maximize the plasma-water interface. Here, the water serves as the dielectric medium. High-speed imaging and emission spectroscopy are used to characterize the reactor discharges. Changes in methylene blue concentration and basic water parameters are mapped as a function of plasma treatment time. Experimental results are compared to electrostatic and plasma chemistry computations, which will provide insight into the reactor's operation so that efficiency can be assessed. Supported by NSF (CBET 1336375).

  14. Selection of recombinant anti-SH3 domain antibodies by high-throughput phage display

    PubMed Central

    Huang, Haiming; Economopoulos, Nicolas O; Liu, Bernard A; Uetrecht, Andrea; Gu, Jun; Jarvik, Nick; Nadeem, Vincent; Pawson, Tony; Moffat, Jason; Miersch, Shane; Sidhu, Sachdev S

    2015-01-01

    Antibodies are indispensable tools in biochemical research and play an expanding role as therapeutics. While hybridoma technology is the dominant method for antibody production, phage display is an emerging technology. Here, we developed and employed a high-throughput pipeline that enables selection of antibodies against hundreds of antigens in parallel. Binding selections using a phage-displayed synthetic antigen-binding fragment (Fab) library against 110 human SH3 domains yielded hundreds of Fabs targeting 58 antigens. Affinity assays demonstrated that representative Fabs bind tightly and specifically to their targets. Furthermore, we developed an efficient affinity maturation strategy adaptable to high-throughput, which increased affinity dramatically but did not compromise specificity. Finally, we tested Fabs in common cell biology applications and confirmed recognition of the full-length antigen in immunoprecipitation, immunoblotting and immunofluorescence assays. In summary, we have established a rapid and robust high-throughput methodology that can be applied to generate highly functional and renewable antibodies targeting protein domains on a proteome-wide scale. PMID:26332758

  15. Selection of recombinant anti-SH3 domain antibodies by high-throughput phage display.

    PubMed

    Huang, Haiming; Economopoulos, Nicolas O; Liu, Bernard A; Uetrecht, Andrea; Gu, Jun; Jarvik, Nick; Nadeem, Vincent; Pawson, Tony; Moffat, Jason; Miersch, Shane; Sidhu, Sachdev S

    2015-11-01

    Antibodies are indispensable tools in biochemical research and play an expanding role as therapeutics. While hybridoma technology is the dominant method for antibody production, phage display is an emerging technology. Here, we developed and employed a high-throughput pipeline that enables selection of antibodies against hundreds of antigens in parallel. Binding selections using a phage-displayed synthetic antigen-binding fragment (Fab) library against 110 human SH3 domains yielded hundreds of Fabs targeting 58 antigens. Affinity assays demonstrated that representative Fabs bind tightly and specifically to their targets. Furthermore, we developed an efficient affinity maturation strategy adaptable to high-throughput, which increased affinity dramatically but did not compromise specificity. Finally, we tested Fabs in common cell biology applications and confirmed recognition of the full-length antigen in immunoprecipitation, immunoblotting and immunofluorescence assays. In summary, we have established a rapid and robust high-throughput methodology that can be applied to generate highly functional and renewable antibodies targeting protein domains on a proteome-wide scale. © 2015 The Protein Society.

  16. High throughput vacuum chemical epitaxy

    NASA Astrophysics Data System (ADS)

    Fraas, L. M.; Malocsay, E.; Sundaram, V.; Baird, R. W.; Mao, B. Y.; Lee, G. Y.

    1990-10-01

We have developed a vacuum chemical epitaxy (VCE) reactor which avoids the use of arsine and allows multiple wafers to be coated at one time. Our vacuum chemical epitaxy reactor closely resembles a molecular beam epitaxy system in that wafers are loaded into a stainless steel vacuum chamber through a load chamber. Also as in MBE, arsenic vapors are supplied as reactant by heating solid arsenic sources, thereby avoiding the use of arsine. However, in our VCE reactor, a large number of wafers are coated at one time in a vacuum system by the substitution of Group III alkyl sources for the elemental metal sources traditionally used in MBE. Higher wafer throughput results because in VCE, the metal-alkyl sources for Ga, Al, and dopants can be mixed at room temperature and distributed uniformly through a large-area injector to multiple substrates as a homogeneous array of mixed-element molecular beams. The VCE reactor that we have built and that we shall describe here uniformly deposits films on 7 inch diameter substrate platters. Each platter contains seven 2 inch or three 3 inch diameter wafers. The load chamber contains up to nine platters. The vacuum chamber is equipped with two VCE growth zones and two arsenic ovens, one per growth zone. Finally, each oven has a 1 kg arsenic capacity. As of this writing, mirror-smooth GaAs films have been grown at up to 4 μm/h growth rate on multiple wafers with good thickness uniformity. The background doping is p-type with a typical hole concentration and mobility of 1 × 10^16 cm^-3 and 350 cm^2/V·s. This background doping level is low enough for the fabrication of MESFETs, solar cells, and photocathodes as well as other types of devices. We have fabricated MESFET devices using VCE-grown epi wafers with peak extrinsic transconductance as high as 210 mS/mm for a threshold voltage of -3 V and a 0.6 μm gate length. We have also recently grown AlGaAs epi layers with up to 80% aluminum using TEAl as the aluminum alkyl source. The Al

  17. Practical High-Throughput Experimentation for Chemists

    PubMed Central

    2017-01-01

Large arrays of hypothesis-driven, rationally designed experiments are powerful tools for solving complex chemical problems. Conceptual and practical aspects of chemical high-throughput experimentation are discussed. A case study is presented in the application of high-throughput experimentation to a key synthetic step in a drug discovery program and its subsequent optimization for the first large-scale synthesis of a drug candidate. PMID:28626518

  18. High-Throughput Nonlinear Optical Microscopy

    PubMed Central

    So, Peter T.C.; Yew, Elijah Y.S.; Rowlands, Christopher

    2013-01-01

    High-resolution microscopy methods based on different nonlinear optical (NLO) contrast mechanisms are finding numerous applications in biology and medicine. While the basic implementations of these microscopy methods are relatively mature, an important direction of continuing technological innovation lies in improving the throughput of these systems. Throughput improvement is expected to be important for studying fast kinetic processes, for enabling clinical diagnosis and treatment, and for extending the field of image informatics. This review will provide an overview of the fundamental limitations on NLO microscopy throughput. We will further cover several important classes of high-throughput NLO microscope designs with discussions on their strengths and weaknesses and their key biomedical applications. Finally, this review will close with a perspective of potential future technological improvements in this field. PMID:24359736

  19. Quantifying protein–protein interactions in high throughput using protein domain microarrays

    PubMed Central

    Kaushansky, Alexis; Allen, John E; Gordus, Andrew; Stiffler, Michael A; Karp, Ethan S; Chang, Bryan H; MacBeath, Gavin

    2011-01-01

    Protein microarrays provide an efficient way to identify and quantify protein–protein interactions in high throughput. One drawback of this technique is that proteins show a broad range of physicochemical properties and are often difficult to produce recombinantly. To circumvent these problems, we have focused on families of protein interaction domains. Here we provide protocols for constructing microarrays of protein interaction domains in individual wells of 96-well microtiter plates, and for quantifying domain–peptide interactions in high throughput using fluorescently labeled synthetic peptides. As specific examples, we will describe the construction of microarrays of virtually every human Src homology 2 (SH2) and phosphotyrosine binding (PTB) domain, as well as microarrays of mouse PDZ domains, all produced recombinantly in Escherichia coli. For domains that mediate high-affinity interactions, such as SH2 and PTB domains, equilibrium dissociation constants (KDs) for their peptide ligands can be measured directly on arrays by obtaining saturation binding curves. For weaker binding domains, such as PDZ domains, arrays are best used to identify candidate interactions, which are then retested and quantified by fluorescence polarization. Overall, protein domain microarrays provide the ability to rapidly identify and quantify protein–ligand interactions with minimal sample consumption. Because entire domain families can be interrogated simultaneously, they provide a powerful way to assess binding selectivity on a proteome-wide scale and provide an unbiased perspective on the connectivity of protein–protein interaction networks. PMID:20360771
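The KD measurement described above fits a saturation binding curve to the standard one-site model F = Fmax·[L]/(KD + [L]). A dependency-free sketch using a crude grid-search least-squares fit on synthetic data; a real analysis would use a proper nonlinear fit (e.g. scipy.optimize.curve_fit), and all numbers here are invented:

```python
# Hedged sketch: estimating KD from an array saturation binding curve.

def one_site(L, Fmax, KD):
    """One-site specific binding model: F = Fmax * L / (KD + L)."""
    return Fmax * L / (KD + L)

def fit_kd(ligand_nM, signal, kd_grid, fmax_grid):
    """Least-squares grid search over (Fmax, KD); crude but dependency-free."""
    best = None
    for Fmax in fmax_grid:
        for KD in kd_grid:
            sse = sum((one_site(L, Fmax, KD) - y) ** 2
                      for L, y in zip(ligand_nM, signal))
            if best is None or sse < best[0]:
                best = (sse, Fmax, KD)
    return best[1], best[2]

# Synthetic data generated from Fmax = 100, KD = 50 nM
L = [5, 10, 25, 50, 100, 250, 500]
y = [one_site(x, 100.0, 50.0) for x in L]

fmax, kd = fit_kd(L, y, kd_grid=range(10, 101, 5), fmax_grid=range(80, 121, 5))
print(fmax, kd)  # 100 50
```

The same fit applied per spot across a domain array yields the KD matrix the protocol describes; weaker binders that never saturate are the cases the authors route to fluorescence polarization instead.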

  20. High-Throughput Contact Flow Lithography.

    PubMed

    Le Goff, Gaelle C; Lee, Jiseok; Gupta, Ankur; Hill, William Adam; Doyle, Patrick S

    2015-10-01

    High-throughput fabrication of graphically encoded hydrogel microparticles is achieved by combining flow contact lithography in a multichannel microfluidic device and a high capacity 25 mm LED UV source. Production rates of chemically homogeneous particles are improved by two orders of magnitude. Additionally, the custom-built contact lithography instrument provides an affordable solution for patterning complex microstructures on surfaces.

  1. High-throughput computing in the sciences.

    PubMed

    Morgan, Mark; Grimshaw, Andrew

    2009-01-01

While it is true that the modern computer is many orders of magnitude faster than that of yesteryear, this tremendous growth in CPU clock rates is now over. Unfortunately, however, the growth in demand for computational power has not abated; whereas researchers a decade ago could simply wait for computers to get faster, today the only solution to the growing need for more powerful computational resources lies in the exploitation of parallelism. Software parallelization falls generally into two broad categories--"true parallel" and high-throughput computing. This chapter focuses on the latter of these two types of parallelism. With high-throughput computing, users can run many copies of their software at the same time across many different computers. This technique for achieving parallelism is powerful in its ability to provide high degrees of parallelism, yet simple in its conceptual implementation. This chapter covers various patterns of high-throughput computing usage and the skills and techniques necessary to take full advantage of them. By utilizing numerous examples and sample codes and scripts, we hope to provide the reader not only with a deeper understanding of the principles behind high-throughput computing, but also with a set of tools and references that will prove invaluable as she explores software parallelism with her own software applications and research.
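The pattern described above, many independent copies of the same code run concurrently and gathered centrally, can be sketched on a single machine. Real high-throughput computing systems (HTCondor, grid middleware) distribute the copies across many computers, but the shape is the same; the `simulate` function here is a hypothetical stand-in for a scientific code:

```python
# Hedged sketch of the high-throughput computing pattern: N independent
# runs, no communication between them, results gathered at the end.
from concurrent.futures import ThreadPoolExecutor

def simulate(seed):
    """Stand-in for one independent run of a scientific code (a toy LCG)."""
    x = seed
    for _ in range(10_000):
        x = (1103515245 * x + 12345) % (2 ** 31)
    return seed, x % 100

# Farm out 8 independent runs; each is a separate "copy of the software".
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(simulate, range(8)))

print(len(results))  # 8
```

Because the runs share nothing, throughput scales simply with the number of workers, which is exactly why this style of parallelism is called "embarrassingly parallel".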

  2. High Throughput Determination of Critical Human Dosing ...

    EPA Pesticide Factsheets

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data into predicted human equivalent doses that can be linked with biologically relevant exposure scenarios. Thus, HTTK provides essential data for risk prioritization for thousands of chemicals that lack TK data. One critical HTTK parameter that can be measured in vitro is the unbound fraction of a chemical in plasma (Fub). However, for chemicals that bind strongly to plasma, Fub is below the limits of detection (LOD) for high throughput analytical chemistry, and therefore cannot be quantified. A novel method for quantifying Fub was implemented for 85 strategically selected chemicals: measurement of Fub was attempted at 10%, 30%, and 100% of physiological plasma concentrations using rapid equilibrium dialysis assays. Varying plasma concentrations instead of chemical concentrations makes high throughput analytical methodology more likely to be successful. Assays at 100% plasma concentration were unsuccessful for 34 chemicals. For 12 of these 34 chemicals, Fub could be quantified at 10% and/or 30% plasma concentrations; these results imply that the assay failure at 100% plasma concentration was caused by plasma protein binding for these chemicals. Assay failure for the remaining 22 chemicals may
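Reverse dosimetry as described above amounts, in its simplest linear form, to rescaling an in vitro bioactive concentration by the steady-state plasma concentration produced per unit dose. A minimal sketch with illustrative numbers; real HTTK calculations (e.g. in EPA's httk package) incorporate Fub, clearance, and population variability:

```python
# Hedged sketch of linear reverse dosimetry (RTK). All values are invented.

def oral_equivalent_dose(ac50_uM, css_uM_per_mg_kg_day):
    """Dose (mg/kg/day) predicted to produce a steady-state plasma
    concentration equal to the in vitro AC50, assuming linear TK."""
    return ac50_uM / css_uM_per_mg_kg_day

# Example: in vitro AC50 of 3 uM; 1 mg/kg/day gives a steady-state Css of 1.5 uM
print(oral_equivalent_dose(3.0, 1.5))  # 2.0 mg/kg/day
```

This is why Fub matters: a strongly protein-bound chemical (low Fub) has a different free Css per unit dose, shifting the predicted equivalent dose, which is what motivates the dilution scheme in the abstract above.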

  3. High-throughput in vivo vertebrate screening

    PubMed Central

    Pardo-Martin, Carlos; Chang, Tsung-Yao; Koo, Bryan Kyo; Gilleland, Cody L.; Wasserman, Steven C.; Yanik, Mehmet Fatih

    2010-01-01

    We demonstrate a high-throughput platform for cellular-resolution in vivo pharmaceutical and genetic screens on zebrafish larvae. The system automatically loads animals from reservoirs or multiwell plates, and positions and orients them for high-speed confocal imaging and laser manipulation of both superficial and deep organs within 19 seconds without damage. We show small-scale test screening of retinal axon guidance mutants and neuronal regeneration assays in combination with femtosecond laser microsurgery. PMID:20639868

  4. High throughput-per-footprint inertial focusing.

    PubMed

    Ciftlik, Ata Tuna; Ettori, Maxime; Gijs, Martin A M

    2013-08-26

    Matching the scale of microfluidic flow systems with that of microelectronic chips for realizing monolithically integrated systems still needs to be accomplished. However, this is appealing only if such re-scaling does not compromise the fluidic throughput. This is related to the fact that the cost of microelectronic circuits primarily depends on the layout footprint, while the performance of many microfluidic systems, like flow cytometers, is measured by the throughput. The simple operation of inertial particle focusing makes it a promising technique for use in such integrated flow cytometer applications, however, microfluidic footprints demonstrated so far preclude monolithic integration. Here, the scaling limits of throughput-per-footprint (TPFP) in using inertial focusing are explored by studying the interplay between theory, the effect of channel Reynolds numbers up to 1500 on focusing, the entry length for the laminar flow to develop, and pressure resistance of the microchannels. Inertial particle focusing is demonstrated with a TPFP up to 0.3 L/(min cm²) in high aspect-ratio rectangular microfluidic channels that are readily fabricated with a post-CMOS integratable process, suggesting at least a 100-fold improvement compared to previously demonstrated techniques. Not only can this be an enabling technology for realizing cost-effective monolithically integrated flow cytometry devices, but the methodology represented here can also open perspectives for miniaturization of many biomedical microfluidic applications requiring monolithic integration with microelectronics without compromising the throughput. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
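The channel Reynolds numbers quoted above follow from the standard definition Re = ρUDh/μ, with the hydraulic diameter of a rectangular channel Dh = 2wh/(w + h). A quick sketch with illustrative channel dimensions (not the paper's actual geometry):

```python
# Hedged sketch: channel Reynolds number for a rectangular microchannel.
# Dimensions and flow speed below are illustrative, not from the paper.

def hydraulic_diameter(width, height):
    """Dh = 2*w*h / (w + h) for a rectangular duct (SI units: m)."""
    return 2 * width * height / (width + height)

def reynolds(rho, velocity, d_hydraulic, mu):
    """Re = rho * U * Dh / mu (dimensionless)."""
    return rho * velocity * d_hydraulic / mu

# Water (rho = 1000 kg/m^3, mu = 1e-3 Pa*s) in a hypothetical
# 40 um x 200 um high-aspect-ratio channel at 10 m/s
dh = hydraulic_diameter(40e-6, 200e-6)   # ~6.67e-5 m
re = reynolds(1000.0, 10.0, dh, 1.0e-3)
print(round(re))  # 667
```

Pushing Re toward the 1500 studied in the paper at fixed geometry means higher velocity, hence the pressure-resistance constraint the abstract mentions.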

  5. High-throughput neuro-imaging informatics.

    PubMed

    Miller, Michael I; Faria, Andreia V; Oishi, Kenichi; Mori, Susumu

    2013-01-01

This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high-dimensional neuroinformatic representation index containing O(1000-10,000) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of imaging data with non-image information from personal medical records for diagnosis and prognosis.

  6. Economic consequences of high throughput maskless lithography

    NASA Astrophysics Data System (ADS)

    Hartley, John G.; Govindaraju, Lakshmi

    2005-11-01

    Many people in the semiconductor industry bemoan the high costs of masks and view mask cost as one of the significant barriers to bringing new chip designs to market. All that is needed is a viable maskless technology and the problem will go away. Numerous sites around the world are working on maskless lithography but inevitably, the question asked is "Wouldn't a one wafer per hour maskless tool make a really good mask writer?" Of course, the answer is yes; the hesitation you hear in the answer isn't based on technology concerns, it's financial. The industry needs maskless lithography because mask costs are too high. Mask costs are too high because mask pattern generators (PGs) are slow and expensive. If mask PGs become much faster, mask costs go down, the maskless market goes away, and the PG supplier is faced with an even smaller tool demand from the mask shops. Technical success becomes financial suicide - or does it? In this paper we present the results of a model that examines some of the consequences of introducing high-throughput maskless pattern generation. Specific features in the model include tool throughput for masks and wafers, market segmentation by node for masks and wafers, and mask cost as an entry barrier to new chip designs. How does the availability of low-cost masks and maskless tools affect the industry's tool makeup, and what is the ultimate potential market for high-throughput maskless pattern generators?
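    The crossover at the heart of this trade-off can be illustrated with a toy amortization model (all dollar figures are hypothetical, not taken from the paper's model): a mask set is a fixed cost per design amortized over wafer volume, while a maskless tool charges a flat, higher cost per wafer.

```python
def mask_cost_per_wafer(mask_set_cost, wafers, per_wafer_litho):
    """Effective per-wafer cost when the mask set is amortized over volume."""
    return mask_set_cost / wafers + per_wafer_litho

def crossover_volume(mask_set_cost, per_wafer_litho, maskless_per_wafer):
    """Wafer volume above which masks become cheaper than maskless writing."""
    return mask_set_cost / (maskless_per_wafer - per_wafer_litho)

# Hypothetical: $1M mask set, $50/wafer with masks, $450/wafer maskless
print(crossover_volume(1_000_000, 50, 450))  # -> 2500.0
```

    Below the crossover volume, low-volume designs favor the maskless tool, which is why cheap mask writing and the maskless market compete for the same customers.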

  7. High-throughput neuro-imaging informatics

    PubMed Central

    Miller, Michael I.; Faria, Andreia V.; Oishi, Kenichi; Mori, Susumu

    2013-01-01

    This paper describes neuroinformatics technologies at 1 mm anatomical scale based on high-throughput 3D functional and structural imaging technologies of the human brain. The core is an abstract pipeline for converting functional and structural imagery into their high-dimensional neuroinformatic representation index containing O(1000–10,000) discriminating dimensions. The pipeline is based on advanced image analysis coupled to digital knowledge representations in the form of dense atlases of the human brain at gross anatomical scale. We demonstrate the integration of these high-dimensional representations with machine learning methods, which have become the mainstay of other fields of science including genomics as well as social networks. Such high-throughput facilities have the potential to alter the way medical images are stored and utilized in radiological workflows. The neuroinformatics pipeline is used to examine cross-sectional and personalized analyses of neuropsychiatric illnesses in clinical applications as well as longitudinal studies. We demonstrate the use of high-throughput machine learning methods for supporting (i) cross-sectional image analysis to evaluate the health status of individual subjects with respect to the population data, and (ii) integration of image information and non-image personal medical record information for diagnosis and prognosis. PMID:24381556

  8. Lessons we learned from high-throughput and top-down systems biology analyses about glioma stem cells.

    PubMed

    Mock, Andreas; Chiblak, Sara; Herold-Mende, Christel

    2014-01-01

    A growing body of evidence suggests that glioma stem cells (GSCs) account for tumor initiation, therapy resistance, and the subsequent regrowth of gliomas. Thus, continuous efforts have been undertaken to further characterize this subpopulation of less differentiated tumor cells. Although we are able to enrich GSCs, we still lack a comprehensive understanding of GSC phenotypes and behavior. The advent of high-throughput technologies raised hope that incorporation of these newly developed platforms would help to tackle such questions. Since then a couple of comparative genome-, transcriptome- and proteome-wide studies on GSCs have been conducted giving new insights in GSC biology. However, lessons had to be learned in designing high-throughput experiments and some of the resulting conclusions fell short of expectations because they were performed on only a few GSC lines or at one molecular level instead of an integrative poly-omics approach. Despite these shortcomings, our knowledge of GSC biology has markedly expanded due to a number of survival-associated biomarkers as well as glioma-relevant signaling pathways and therapeutic targets being identified. In this article we review recent findings obtained by comparative high-throughput analyses of GSCs. We further summarize fundamental concepts of systems biology as well as its applications for glioma stem cell research.

  9. Microfabricated high-throughput electronic particle detector.

    PubMed

    Wood, D K; Requa, M V; Cleland, A N

    2007-10-01

    We describe the design, fabrication, and use of a radio frequency reflectometer integrated with a microfluidic system, applied to the very high-throughput measurement of micron-scale particles passing in a microfluidic channel through the sensor region. The device operates as a microfabricated Coulter counter [U.S. Patent No. 2656508 (1953)], similar to a design we have described previously, but here with significantly improved electrode geometry as well as electronic tuning of the reflectometer; together these two improvements yield more than a factor-of-10 gain in signal to noise and in the diametric discrimination of single particles. We demonstrate the high-throughput discrimination of polystyrene beads with diameters in the 4-10 μm range, achieving diametric resolutions comparable to the intrinsic spread of diameters in the bead distribution, at rates in excess of 15 × 10^6 beads/h.
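    The diametric discrimination described above rests on the Coulter principle: for a particle small relative to the sensing aperture, the pulse amplitude scales roughly with particle volume, signal ≈ k·d³. A minimal sizing sketch with hypothetical calibration values (the pulse amplitudes below are made up, not measurements from the paper):

```python
def calibrate_k(signal, diameter_um):
    """Fix the volume-scaling constant k from a known calibration bead."""
    return signal / diameter_um ** 3

def diameter_from_pulse(signal, k):
    """Invert signal ~ k * d^3 to recover particle diameter."""
    return (signal / k) ** (1.0 / 3.0)

# Calibrate with a hypothetical 10 um bead giving a pulse of 1.0 (a.u.)
k = calibrate_k(1.0, 10.0)
# A pulse of 0.064 a.u. then corresponds to a 4 um bead
print(round(diameter_from_pulse(0.064, k), 2))  # -> 4.0
```

    The cubic scaling is why improved signal-to-noise translates directly into finer diameter resolution: a small amplitude error shrinks threefold when mapped back to diameter.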

  10. High-throughput TILLING for functional genomics.

    PubMed

    Till, Bradley J; Colbert, Trenton; Tompa, Rachel; Enns, Linda C; Codomo, Christine A; Johnson, Jessica E; Reynolds, Steven H; Henikoff, Jorja G; Greene, Elizabeth A; Steine, Michael N; Comai, Luca; Henikoff, Steven

    2003-01-01

    Targeting-induced local lesions in genomes (TILLING) is a general strategy for identifying induced point mutations that can be applied to almost any organism. Here, we describe the basic methodology for high-throughput TILLING. Gene segments are amplified using fluorescently tagged primers, and products are denatured and reannealed to form heteroduplexes between the mutated sequence and its wild-type counterpart. These heteroduplexes are substrates for cleavage by the endonuclease CEL I. Following cleavage, products are analyzed on denaturing polyacrylamide gels using the LI-COR DNA analyzer system. High-throughput TILLING has been adopted by the Arabidopsis TILLING Project (ATP) to provide allelic series of point mutations for the general Arabidopsis community.
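    The gel readout in this protocol has a simple arithmetic check that can be sketched in code (the sequences below are invented for illustration): CEL I cleaves the heteroduplex at the mismatch, so a mutation at position p in an amplicon of length L yields two end-labeled fragments of sizes p+1 and L-(p+1), one per dye channel, and a genuine mutation is confirmed when the two fragment sizes sum to the amplicon length.

```python
def find_mismatch(wild_type, mutant):
    """Return 0-based positions where the two sequences differ."""
    return [i for i, (a, b) in enumerate(zip(wild_type, mutant)) if a != b]

def cleavage_fragments(amplicon_len, mismatch_pos):
    """Expected (forward-label, reverse-label) fragment sizes after CEL I cut."""
    forward = mismatch_pos + 1
    return forward, amplicon_len - forward

wt = "ATGGCTTACGATCCGTA"
mu = "ATGGCTTACGATCAGTA"          # hypothetical single point mutation (C->A)
pos = find_mismatch(wt, mu)[0]
print(pos, cleavage_fragments(len(wt), pos))  # -> 13 (14, 3)
```

    In the real assay the mutation position is unknown; it is read off the gel from the fragment sizes, using exactly this sum-to-length relationship in reverse.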

  11. High-throughput TILLING for Arabidopsis.

    PubMed

    Till, Bradley J; Colbert, Trenton; Codomo, Christine; Enns, Linda; Johnson, Jessica; Reynolds, Steven H; Henikoff, Jorja G; Greene, Elizabeth A; Steine, Michael N; Comai, Luca; Henikoff, Steven

    2006-01-01

    Targeting induced local lesions in genomes (TILLING) is a general strategy for identifying induced point mutations that can be applied to almost any organism. In this chapter, we describe the basic methodology for high-throughput TILLING. Gene segments are amplified using fluorescently tagged primers, and products are denatured and reannealed to form heteroduplexes between the mutated sequence and its wild-type counterpart. These heteroduplexes are substrates for cleavage by the endonuclease CEL I. Following cleavage, products are analyzed on denaturing polyacrylamide gels using the LI-COR DNA analyzer system. High-throughput TILLING has been adopted by the Arabidopsis TILLING Project (ATP) to provide allelic series of point mutations for the general Arabidopsis community.

  12. High Throughput Determination of Tetramine in Drinking ...

    EPA Pesticide Factsheets

    The sampling and analytical procedure (SAP) presented herein describes a method for the high-throughput determination of tetramethylene disulfotetramine (tetramine) in drinking water by solid-phase extraction and isotope dilution gas chromatography/mass spectrometry. This method, which will be included in the SAM, is expected to provide the Water Laboratory Alliance, as part of EPA's Environmental Response Laboratory Network, with a more reliable and faster means of analyte collection and measurement.
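    The isotope dilution quantitation step reduces to a ratio calculation: the native analyte's response is ratioed to a co-extracted isotopically labeled internal standard, which cancels losses during solid-phase extraction. A sketch with hypothetical peak areas and spike level (the relative response factor is assumed to be 1):

```python
def isotope_dilution_conc(area_native, area_labeled, spike_ng_per_l, rrf=1.0):
    """Concentration = (native/labeled area ratio) * spiked standard conc / RRF."""
    return (area_native / area_labeled) * spike_ng_per_l / rrf

# Hypothetical run: native peak 5200 counts, labeled peak 10400 counts,
# 50 ng/L labeled spike -> 25 ng/L tetramine in the sample.
print(isotope_dilution_conc(5200, 10400, 50.0))  # -> 25.0
```

    Because both species suffer the same extraction losses and matrix effects, the ratio, and hence the computed concentration, is largely insensitive to recovery.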

  13. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  14. High Throughput Screening For Hazard and Risk of Environmental Contaminants

    EPA Science Inventory

    High throughput toxicity testing provides detailed mechanistic information on the concentration response of environmental contaminants in numerous potential toxicity pathways. High throughput screening (HTS) has several key advantages: (1) expense orders of magnitude less than an...

  15. High Throughput PBTK: Open-Source Data and Tools for ...

    EPA Pesticide Factsheets

    Presentation on High Throughput PBTK at the PBK Modelling in Risk Assessment meeting in Ispra, Italy

  16. Rapid fabrication of glass/PDMS hybrid µIMER for high throughput membrane proteomics.

    PubMed

    Pereira-Medrano, Ana G; Forster, Simon; Fowler, Gregory J S; McArthur, Sally L; Wright, Phillip C

    2010-12-21

    Mass spectrometry (MS) based proteomics has brought a radical approach to systems biology, offering a platform to study complex biological functions. However, key proteomic technical challenges remain, mainly the inability to characterise the complete proteome of a cell due to the thousands of diverse, complex proteins expressed at an extremely wide concentration range. Currently, high throughput and efficient techniques to unambiguously identify and quantify proteins on a proteome-wide scale are in demand. Miniaturised analytical systems placed upstream of MS help us to attain these goals. One time-consuming step in traditional techniques is the in-solution digestion of proteins (4-20 h). This also has other drawbacks, including enzyme autoproteolysis, low efficiency, and manual operation. Furthermore, the identification of α-helical membrane proteins has remained a challenge due to their high hydrophobicity and lack of trypsin cleavage targets in transmembrane helices. We demonstrate a new rapidly produced glass/PDMS micro Immobilised Enzyme Reactor (µIMER) with enzymes covalently immobilised onto polyacrylic acid plasma-modified surfaces for the purpose of rapidly (as low as 30 s) generating peptides suitable for MS analysis. This µIMER also allows, for the first time, rapid digestion of insoluble proteins. Membrane protein identification through this method was achieved after just 4 min digestion time, up to 9-fold faster than either dual-stage in-solution digestion approaches or other commonly used bacterial membrane proteomic workflows.

  17. Automated High Throughput Drug Target Crystallography

    SciTech Connect

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform, supported by a powerful informatics infrastructure, forms the basis for optimal workflow implementation and for the data mining and analysis tools needed to generate new leads from experimental protein drug target structures.

  18. Development of proteome-wide binding reagents for research and diagnostics.

    PubMed

    Taussig, Michael J; Schmidt, Ronny; Cook, Elizabeth A; Stoevesandt, Oda

    2013-12-01

    Alongside MS, antibodies and other specific protein-binding molecules have a special place in proteomics as affinity reagents in a toolbox of applications for determining protein location, quantitative distribution and function (affinity proteomics). The realisation that the range of research antibodies available, while apparently vast, is nevertheless still very incomplete and frequently of uncertain quality has stimulated projects with the objective of raising comprehensive, proteome-wide sets of protein binders. With progress in automation and throughput, a remarkable number of recent publications refer to the practical possibility of selecting binders to every protein encoded in the genome. Here we review the requirements of a pipeline for the production of protein binders for the human proteome, including target prioritisation, antigen design, 'next generation' methods, databases and the approaches taken by ongoing projects in Europe and the USA. While the task of generating affinity reagents for all human proteins is complex and demanding, the benefits of well-characterised and quality-controlled pan-proteome binder resources for biomedical research, industry and life sciences in general would be enormous and would justify the effort. Given the technical, personnel and financial resources needed to fulfil this aim, expansion of current efforts may best be addressed through large-scale international collaboration. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. High-throughput electrophysiology with Xenopus oocytes

    PubMed Central

    Papke, Roger L.; Smith-Maxwell, Cathy

    2010-01-01

    Voltage-clamp techniques are typically used to study plasma membrane proteins, such as ion channels and transporters, that control bioelectrical signals. Many of these proteins have been cloned and can now be studied as potential targets for drug development. The two approaches most commonly used for heterologous expression of cloned ion channels and transporters involve either transfection of the genes into small cells grown in tissue culture or the injection of the genetic material into larger cells. The standard large cells used for the expression of cloned cDNA or synthetic RNA are the egg progenitor cells (oocytes) of the African frog, Xenopus laevis. Until recently, cellular electrophysiology was performed manually, one cell at a time by a single operator. However, automated methods of high-throughput electrophysiology have been developed that permit data acquisition and analysis from multiple cells in parallel. These methods are breaking a bottleneck in drug discovery, useful in some cases for primary screening as well as for thorough characterization of new drugs. Increasing the throughput of high-quality functional data greatly augments the efficiency of academic research and pharmaceutical drug development. Some examples of studies that benefit most from high-throughput electrophysiology include pharmaceutical screening of targeted compound libraries, secondary screening of identified compounds for subtype selectivity, screening mutants of ligand-gated channels for changes in receptor function, scanning mutagenesis of protein segments, and mutant-cycle analysis. We describe here the main features and potential applications of OpusXpress, an efficient commercially available system for automated recording from Xenopus oocytes. We show some types of data that have been gathered by this system and review realized and potential applications. PMID:19149490
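    Automated oocyte recordings of the kind described above are conventionally summarized by fitting a Hill (concentration-response) equation to peak currents. The sketch below just evaluates that standard equation; the EC50 and Hill slope are hypothetical, not parameters from any study cited here.

```python
def hill(conc, ec50, n_hill, top=1.0, bottom=0.0):
    """Normalized Hill concentration-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n_hill)

# At the EC50 the normalized response is, by definition, halfway to top
print(round(hill(10.0, ec50=10.0, n_hill=1.5), 2))  # -> 0.5
```

    In practice the parameters run the other way: responses at several agonist concentrations are measured and EC50, slope, and maximum are obtained by nonlinear least-squares fitting of this curve.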

  20. Clustering of High Throughput Gene Expression Data

    PubMed Central

    Pirim, Harun; Ekşioğlu, Burak; Perkins, Andy; Yüceer, Çetin

    2012-01-01

    High throughput biological data need to be processed, analyzed, and interpreted to address problems in life sciences. Bioinformatics, computational biology, and systems biology deal with biological problems using computational methods. Clustering is one of the methods used to gain insight into biological processes, particularly at the genomics level, and it can be applied in many areas of biological data analysis. This paper presents a review of the current clustering algorithms designed especially for analyzing gene expression data. It is also intended to introduce one of the main problems in bioinformatics - clustering gene expression data - to the operations research community. PMID:23144527
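    As a concrete instance of the class of algorithms this review surveys, here is a minimal k-means sketch in pure Python, clustering toy "expression profiles" (rows = genes, columns = conditions; all values invented). Real gene-expression clustering uses richer distance measures and cluster validation, as the review discusses.

```python
def kmeans(points, centroids, iters=10):
    """Plain k-means: alternate nearest-centroid assignment and mean update."""
    for _ in range(iters):
        # assign each point to its nearest centroid (squared Euclidean distance)
        clusters = [[] for _ in centroids]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # move each centroid to the mean of its assigned points
        centroids = [
            tuple(sum(x) / len(cl) for x in zip(*cl)) if cl else c
            for cl, c in zip(clusters, centroids)
        ]
    return centroids, clusters

genes = [(0.1, 0.2), (0.0, 0.3), (5.0, 5.1), (4.8, 5.3)]
centroids, clusters = kmeans(genes, centroids=[(0.0, 0.0), (5.0, 5.0)])
print([len(c) for c in clusters])  # -> [2, 2]
```

    The two low-expression and two high-expression profiles separate cleanly; with noisy real data, the choice of k, the distance measure, and the initialization all become the hard part.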

  1. High throughput screening technologies for ion channels

    PubMed Central

    Yu, Hai-bo; Li, Min; Wang, Wei-ping; Wang, Xiao-liang

    2016-01-01

    Ion channels are involved in a variety of fundamental physiological processes, and their malfunction causes numerous human diseases. Therefore, ion channels represent a class of attractive drug targets and a class of important off-targets for in vitro pharmacological profiling. In the past decades, the rapid progress in developing functional assays and instrumentation has enabled high throughput screening (HTS) campaigns on an expanding list of channel types. Chronologically, HTS methods for ion channels include the ligand binding assay, flux-based assay, fluorescence-based assay, and automated electrophysiological assay. In this review we summarize the current HTS technologies for different ion channel classes and their applications. PMID:26657056

  2. High throughput chemical munitions treatment system

    DOEpatents

    Haroldsen, Brent L [Manteca, CA; Stofleth, Jerome H [Albuquerque, NM; Didlake, Jr., John E.; Wu, Benjamin C-P [San Ramon, CA

    2011-11-01

    A new High-Throughput Explosive Destruction System is disclosed. The system comprises two side-by-side detonation containment vessels that feed into a single agent treatment vessel, each detonation vessel comprising first and second halves and a surrounding ventilation facility. The detonation containment vessels are designed to separate into two half-shells, wherein one shell can be moved axially away from the fixed second half for ease of access and loading. The vessels are closed by means of a surrounding, clam-shell-type locking seal mechanism.

  3. High-Throughput Methods for Electron Crystallography

    PubMed Central

    Stokes, David L.; Ubarretxena-Belandia, Iban; Gonen, Tamir; Engel, Andreas

    2013-01-01

    Membrane proteins play a tremendously important role in cell physiology and serve as a target for an increasing number of drugs. Structural information is key to understanding their function and for developing new strategies for combating disease. However, the complex physical chemistry associated with membrane proteins has made them more difficult to study than their soluble cousins. Electron crystallography has historically been a successful method for solving membrane protein structures and has the advantage of providing the natural environment of a lipid membrane. Specifically, when membrane proteins form two-dimensional arrays within a lipid bilayer, images and diffraction can be recorded by electron microscopy. The corresponding data can be combined to produce a three-dimensional reconstruction which, under favorable conditions, can extend to atomic resolution. Like X-ray crystallography, the quality of the structures is very much dependent on the order and size of the crystals. However, unlike X-ray crystallography, high-throughput methods for screening crystallization trials for electron crystallography are not in general use. In this chapter, we describe two alternative and potentially complementary methods for high-throughput screening of membrane protein crystallization within the lipid bilayer. The first method relies on the conventional use of dialysis for removing detergent and thus reconstituting the bilayer; an array of dialysis wells in the standard 96-well format allows the use of a liquid-handling robot and greatly increases throughput. The second method relies on detergent complexation by cyclodextrin; a specialized pipetting robot has been designed not only to titrate cyclodextrin, but to use light scattering to monitor the reconstitution process. In addition, the use of liquid-handling robots for making negatively stained grids and methods for automatically imaging samples in the electron microscope are described. PMID:23132066

  4. Proteome-wide covalent ligand discovery in native biological systems

    PubMed Central

    Backus, Keriann M.; Correia, Bruno E.; Lum, Kenneth M.; Forli, Stefano; Horning, Benjamin D.; González-Páez, Gonzalo E.; Chatterjee, Sandip; Lanning, Bryan R.; Teijaro, John R.; Olson, Arthur J.; Wolan, Dennis W.; Cravatt, Benjamin F.

    2016-01-01

    Small molecules are powerful tools for investigating protein function and can serve as leads for new therapeutics. Most human proteins, however, lack small-molecule ligands, and entire protein classes are considered “undruggable” 1,2. Fragment-based ligand discovery (FBLD) can identify small-molecule probes for proteins that have proven difficult to target using high-throughput screening of complex compound libraries 1,3. Although reversibly binding ligands are commonly pursued, covalent fragments provide an alternative route to small-molecule probes 4–10, including those that can access regions of proteins that are difficult to access through binding affinity alone 5,10,11. In this manuscript, we report a quantitative analysis of cysteine-reactive small-molecule fragments screened against thousands of proteins. Covalent ligands were identified for >700 cysteines found in both druggable proteins and proteins deficient in chemical probes, including transcription factors, adaptor/scaffolding proteins, and uncharacterized proteins. Among the atypical ligand-protein interactions discovered were compounds that react preferentially with pro- (inactive) caspases. We used these ligands to distinguish extrinsic apoptosis pathways in human cell lines versus primary human T-cells, showing that the former is largely mediated by caspase-8 while the latter depends on both caspase-8 and −10. Fragment-based covalent ligand discovery provides a greatly expanded portrait of the ligandable proteome and furnishes compounds that can illuminate protein functions in native biological systems. PMID:27309814

  5. Preliminary High-Throughput Metagenome Assembly

    SciTech Connect

    Dusheyko, Serge; Furman, Craig; Pangilinan, Jasmyn; Shapiro, Harris; Tu, Hank

    2007-03-26

    Metagenome data sets present a qualitatively different assembly problem than traditional single-organism whole-genome shotgun (WGS) assembly. The unique aspects of such projects include the presence of a potentially large number of distinct organisms and their representation in the data set at widely different fractions. In addition, multiple closely related strains could be present, which would be difficult to assemble separately. Failure to take these issues into account can result in poor assemblies that either jumble together different strains or fail to yield useful results. The DOE Joint Genome Institute has sequenced a number of metagenomic projects and plans to considerably increase this number in the coming year. As a result, the JGI has a need for high-throughput tools and techniques for handling metagenome projects. We present the techniques developed to handle metagenome assemblies in a high-throughput environment. This includes a streamlined assembly wrapper, based on the JGI's in-house WGS assembler, Jazz. It also includes the selection of sensible defaults targeted for metagenome data sets, as well as quality control automation for cleaning up the raw results. While analysis is ongoing, we will discuss preliminary assessments of the quality of the assembly results (http://fames.jgi-psf.org).
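    The abstract's point about widely different representation fractions can be made quantitative with a Lander-Waterman-style coverage estimate: each community member's expected coverage scales with its abundance fraction, c_i = f_i · N · L / G_i. The read counts, read length, and genome sizes below are hypothetical.

```python
def expected_coverage(fraction, n_reads, read_len, genome_size):
    """Expected fold-coverage of one community member in a shotgun data set."""
    return fraction * n_reads * read_len / genome_size

# Hypothetical: 100k reads of 700 bp over a two-organism community,
# both with 4 Mb genomes, at 90% / 10% abundance.
for name, f, g in [("dominant", 0.9, 4e6), ("rare", 0.1, 4e6)]:
    print(name, round(expected_coverage(f, 1e5, 700, g), 2))
```

    With identical genomes, the rare member here lands near 1.75x coverage while the dominant one exceeds 15x, which is why a single set of assembler defaults tends to fragment low-abundance organisms.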

  6. Incorporating High-Throughput Exposure Predictions with ...

    EPA Pesticide Factsheets

    We previously integrated dosimetry and exposure with high-throughput screening (HTS) to enhance the utility of ToxCast™ HTS data by translating in vitro bioactivity concentrations to the oral equivalent doses (OEDs) required to achieve these levels internally. These OEDs were compared against regulatory exposure estimates, providing an activity-to-exposure ratio (AER) useful for a risk-based ranking strategy. As ToxCast™ efforts expand (i.e., Phase II) beyond food-use pesticides towards a wider chemical domain that lacks exposure and toxicity information, prediction tools become increasingly important. In this study, in vitro hepatic clearance and plasma protein binding were measured to estimate OEDs for a subset of Phase II chemicals. OEDs were compared against high-throughput (HT) exposure predictions generated using probabilistic modeling and Bayesian approaches developed by the U.S. EPA ExpoCast™ program. This approach incorporated chemical-specific use and national production volume data with biomonitoring data to inform the exposure predictions. This HT exposure modeling approach provided predictions for all Phase II chemicals assessed in this study, whereas estimates from regulatory sources were available for only 7% of chemicals. Of the 163 chemicals assessed in this study, three chemicals possessed AERs <1 and 13 possessed AERs <100. Diverse bioactivities across a range of assays and concentrations were also noted across the wider chemical space…
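    The AER ranking logic described above can be sketched with hypothetical numbers: AER = OED at which in vitro bioactivity occurs divided by the predicted exposure, and chemicals with the smallest margin between bioactivity and exposure are prioritized. The chemical names and values below are invented for illustration.

```python
def aer(oed_mg_kg_day, exposure_mg_kg_day):
    """Activity-to-exposure ratio: margin between bioactive dose and exposure."""
    return oed_mg_kg_day / exposure_mg_kg_day

chemicals = {          # hypothetical (OED, predicted exposure) in mg/kg/day
    "chem_A": (0.5, 0.01),
    "chem_B": (10.0, 0.0001),
    "chem_C": (0.05, 0.1),
}
# Rank by ascending AER: smallest margin (highest concern) first
ranked = sorted(chemicals, key=lambda c: aer(*chemicals[c]))
print(ranked)  # -> ['chem_C', 'chem_A', 'chem_B']
```

    An AER below 1 means predicted exposure already exceeds the bioactive dose, which is why the abstract singles out the chemicals with AER <1 and <100.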

  7. Modeling Steroidogenesis Disruption Using High-Throughput ...

    EPA Pesticide Factsheets

    Environmental chemicals can elicit endocrine disruption by altering steroid hormone biosynthesis and metabolism (steroidogenesis) causing adverse reproductive and developmental effects. Historically, a lack of assays resulted in few chemicals having been evaluated for effects on steroidogenesis. The steroidogenic pathway is a series of hydroxylation and dehydrogenation steps carried out by CYP450 and hydroxysteroid dehydrogenase enzymes, yet the only enzyme in the pathway for which a high-throughput screening (HTS) assay has been developed is aromatase (CYP19A1), responsible for the aromatization of androgens to estrogens. Recently, the ToxCast HTS program adapted the OECD validated H295R steroidogenesis assay using human adrenocortical carcinoma cells into a high-throughput model to quantitatively assess the concentration-dependent (0.003-100 µM) effects of chemicals on 10 steroid hormones including progestagens, androgens, estrogens and glucocorticoids. These results, in combination with two CYP19A1 inhibition assays, comprise a large dataset amenable to clustering approaches supporting the identification and characterization of putative mechanisms of action (pMOA) for steroidogenesis disruption. In total, 514 chemicals were tested in all CYP19A1 and steroidogenesis assays. 216 chemicals were identified as CYP19A1 inhibitors in at least one CYP19A1 assay. 208 of these chemicals also altered hormone levels in the H295R assay, suggesting 96% sensitivity in the

  8. High-throughput rod-induced electrospinning

    NASA Astrophysics Data System (ADS)

    Wu, Dezhi; Xiao, Zhiming; Teh, Kwok Siong; Han, Zhibin; Luo, Guoxi; Shi, Chuan; Sun, Daoheng; Zhao, Jinbao; Lin, Liwei

    2016-09-01

    A high-throughput electrospinning process, driven directly from a flat polymer solution surface by a moving insulating rod, has been proposed and demonstrated. Rods made of either phenolic resin or paper, with a diameter of 1-3 cm and a resistance of about 100-500 MΩ, have been successfully utilized in the process. The rod is placed approximately 10 mm above the flat polymer solution surface and moved at 0.005-0.4 m s^-1; this causes the solution to generate multiple liquid jets under an applied voltage of 15-60 kV in a tip-less electrospinning process. The local electric field induced by the rod boosts electrohydrodynamic instability to generate Taylor cones and liquid jets. Experimentally, it is found that a large rod diameter and a small solution-to-rod distance enhance the local electric field and thereby reduce the magnitude of the required applied voltage. In a prototype setup with poly(ethylene oxide) polymer solution, an area of 5 cm × 10 cm, and an applied voltage of 60 kV, the maximum throughput of nanofibers is recorded as approximately 144 g m^-2 h^-1.

  9. High Throughput Screening Tools for Thermoelectric Materials

    NASA Astrophysics Data System (ADS)

    Wong-Ng, W.; Yan, Y.; Otani, M.; Martin, J.; Talley, K. R.; Barron, S.; Carroll, D. L.; Hewitt, C.; Joress, H.; Thomas, E. L.; Green, M. L.; Tang, X. F.

    2015-06-01

    A suite of complementary high-throughput screening systems for combinatorial films was developed at the National Institute of Standards and Technology to facilitate the search for efficient thermoelectric materials. These custom-designed capabilities include a facility for combinatorial thin-film synthesis and a suite of tools for screening the Seebeck coefficient, electrical resistance (electrical resistivity), and thermal effusivity (thermal conductivity) of these films. The Seebeck coefficient and resistance are measured via custom-built automated apparatus at both ambient and high temperatures. Thermal effusivity is measured using a frequency-domain thermoreflectance technique. This paper discusses applications of these tools to representative thermoelectric materials, including combinatorial composition-spread films, conventional films, single crystals, and ribbons.
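    The screening quantities named above reduce to simple relations: the Seebeck coefficient S = -ΔV/ΔT (slope of thermovoltage versus applied temperature difference) and the thermoelectric power factor PF = S²σ. The measurement values below are assumed, not NIST data.

```python
def seebeck_uv_per_k(delta_t_k, delta_v_uv):
    """Least-squares slope of -V vs. dT through the origin, in uV/K."""
    num = sum(t * (-v) for t, v in zip(delta_t_k, delta_v_uv))
    den = sum(t * t for t in delta_t_k)
    return num / den

def power_factor(s_uv_per_k, sigma_s_per_m):
    """PF = S^2 * sigma in W/(m K^2); S converted from uV/K to V/K."""
    return (s_uv_per_k * 1e-6) ** 2 * sigma_s_per_m

# Hypothetical p-type film: thermovoltage at three temperature differences
dT = [1.0, 2.0, 3.0]          # K
dV = [-200.0, -400.0, -600.0]  # uV
s = seebeck_uv_per_k(dT, dV)
print(s, round(power_factor(s, 1.0e5), 6))  # -> 200.0 0.004
```

    Screening a composition-spread film amounts to repeating this pair of calculations at every point of the library and mapping S, σ, and PF across composition.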

  10. High-Throughput Analysis of Enzyme Activities

    SciTech Connect

    Lu, Guoxin

    2007-01-01

    High-throughput screening (HTS) techniques are now applied in many research fields. Robotic microarray printing and automated microtiter-plate handling allow HTS to be performed in both heterogeneous and homogeneous formats, with minimal sample required for each assay element. In this dissertation, new HTS techniques for enzyme activity analysis were developed. First, patterns of immobilized enzyme on nylon screen were detected by a multiplexed capillary system. The imaging resolution is limited by the outer diameter of the capillaries; to obtain finer images, capillaries with smaller outer diameters can be used to form the imaging probe. Application of capillary electrophoresis allows separation of the product from the substrate in the reaction mixture, so the product does not need to have optical properties different from those of the substrate. UV absorption detection allows almost universal detection of organic molecules; thus, no modification of either the substrate or the product molecules is necessary. This technique has the potential to be used in screening local distribution variations of specific biomolecules in a tissue or in screening multiple immobilized catalysts. Another high-throughput screening technique was developed by directly monitoring the light intensity of the immobilized-catalyst surface using a scientific charge-coupled device (CCD). Briefly, the surface of an enzyme microarray is focused onto the CCD using an objective lens. By carefully choosing the detection wavelength, generation of product on an enzyme spot can be seen by the CCD, and analyzing the change in light intensity over time on an enzyme spot gives the reaction rate. The same microarray can be used many times; thus, high-throughput kinetic studies of hundreds of catalytic reactions are made possible. Finally, we studied the fluorescence emission spectra of ADP and obtained the detection limits for ADP under three different…

  11. Origin and evolution of high throughput screening

    PubMed Central

    Pereira, D A; Williams, J A

    2007-01-01

    This article reviews the origin and evolution of high throughput screening (HTS) through the experience of an individual pharmaceutical company, revealing some of the mysteries of the early stages of drug discovery to the wider pharmacology audience. HTS in this company (Pfizer, Groton, USA) had its origin in natural products screening in 1986, by substituting fermentation broths with dimethyl sulphoxide solutions of synthetic compounds, using 96-well plates and reduced assay volumes of 50-100 μl. A nominal 30 mM source compound concentration provided high μM assay concentrations. Starting at 800 compounds each week, the process reached a steady state of 7200 compounds per week by 1989. Screening in the Applied Biotechnology and Screening Group was centralized, with screens operating in lock-step to maximize efficiency. Initial screens were full files run in triplicate. Autoradiography and image analysis were introduced for 125I receptor ligand screens. Reverse transcriptase (RT) coupled with quantitative PCR and multiplexing addressed several targets in a single assay. By 1992 HTS produced 'hits' as starting matter for approximately 40% of the Discovery portfolio. In 1995, the HTS methodology was expanded to include ADMET targets. ADME targets required each compound to be physically detected, leading to the development of automated high throughput LC-MS. In 1996, 90 compounds/week were screened in microsomal, protein binding and serum stability assays. Subsequently, the mutagenic Ames assay was adapted to a 96-well plate liquid assay and novel algorithms permitted automated image analysis of the micronucleus assay. By 1999 ADME HTS was fully integrated into the discovery cycle. PMID:17603542

  12. A high-throughput radiometric kinase assay

    PubMed Central

    Duong-Ly, Krisna C.; Peterson, Jeffrey R.

    2016-01-01

    Aberrant kinase signaling has been implicated in a number of diseases. While kinases have become attractive drug targets, only a small fraction of human protein kinases have validated inhibitors. Screening libraries of compounds against a kinase or kinases of interest is routinely performed during kinase inhibitor development to identify promising scaffolds for a particular target and to identify kinase targets for compounds of interest. Screening of more focused compound libraries may also be conducted in the later stages of inhibitor development to improve potency and optimize selectivity. The dot blot kinase assay is a robust, high-throughput kinase assay that can be used to screen a number of small molecule compounds against one kinase of interest or several kinases. Here, a protocol for a dot blot kinase assay used for measuring insulin receptor kinase activity is presented. This protocol can be readily adapted for use with other protein kinases. PMID:26501904

  13. High throughput assays for analyzing transcription factors.

    PubMed

    Li, Xianqiang; Jiang, Xin; Yaoi, Takuro

    2006-06-01

    Transcription factors are a group of proteins that modulate the expression of genes involved in many biological processes, such as cell growth and differentiation. Alterations in transcription factor function are associated with many human diseases, and therefore these proteins are attractive potential drug targets. A key issue in the development of such therapeutics is the generation of effective tools that can be used for high throughput discovery of the critical transcription factors involved in human diseases, and the measurement of their activities in a variety of disease or compound-treated samples. Here, a number of innovative arrays and 96-well format assays for profiling and measuring the activities of transcription factors will be discussed.

  14. High-throughput hyperdimensional vertebrate phenotyping

    PubMed Central

    Pardo-Martin, Carlos; Allalou, Amin; Medina, Jaime; Eimon, Peter M.; Wählby, Carolina; Yanik, Mehmet Fatih

    2013-01-01

    Most gene mutations and biologically active molecules cause complex responses in animals that cannot be predicted by cell culture models. Yet animal studies remain too slow and their analyses are often limited to only a few readouts. Here we demonstrate high-throughput optical projection tomography with micrometer resolution and hyperdimensional screening of entire vertebrates in tens of seconds using a simple fluidic system. Hundreds of independent morphological features and complex phenotypes are automatically captured in three dimensions with unprecedented speed and detail in semi-transparent zebrafish larvae. By clustering quantitative phenotypic signatures, we can detect and classify even subtle alterations in many biological processes simultaneously. We term our approach hyperdimensional in vivo phenotyping (HIP). To illustrate the power of HIP, we have analyzed the effects of several classes of teratogens on cartilage formation using 200 independent morphological measurements and identified similarities and differences that correlate well with their known mechanisms of action in mammals. PMID:23403568
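
    Classifying compounds by their phenotypic signatures can be illustrated with a nearest-reference comparison (a simplified stand-in for the clustering described above, using made-up four-feature signatures):

    ```python
    import numpy as np

    def nearest_reference(signature, references):
        """Assign a phenotypic signature to the nearest reference profile
        by Euclidean distance (a crude stand-in for signature clustering)."""
        distances = np.linalg.norm(references - signature, axis=1)
        return int(np.argmin(distances))

    # Hypothetical 4-feature signatures for two known teratogen classes.
    refs = np.array([[1.0, 0.2, 0.0, 0.8],    # class 0 reference profile
                     [0.1, 0.9, 0.7, 0.0]])   # class 1 reference profile
    unknown = np.array([0.9, 0.3, 0.1, 0.7])
    cls = nearest_reference(unknown, refs)    # closest to class 0
    ```

    With hundreds of correlated morphological features, a real analysis would normalize each feature and typically use hierarchical or model-based clustering rather than a single nearest-neighbor assignment.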

  15. A high-throughput neutron spectrometer

    NASA Astrophysics Data System (ADS)

    Stampfl, Anton; Noakes, Terry; Bartsch, Friedl; Bertinshaw, Joel; Veliscek-Carolan, Jessica; Nateghi, Ebrahim; Raeside, Tyler; Yethiraj, Mohana; Danilkin, Sergey; Kearley, Gordon

    2010-03-01

    A cross-disciplinary high-throughput neutron spectrometer is currently under construction at OPAL, ANSTO's open-pool light-water research reactor. The spectrometer is based on the design of a Be-filter spectrometer (FANS) operating at the National Institute of Standards and Technology research reactor in the USA. The ANSTO filter spectrometer will be switched in and out with another instrument, the triple-axis spectrometer Taipan. Thus two distinct types of neutron spectrometer will be accessible: one specialised for phonon dispersion analysis and the other, the filter spectrometer, designed specifically to measure the vibrational density of states. A summary of the design will be given along with a detailed ray-tracing analysis, and some preliminary results from the spectrometer will be presented.

  16. Sequential stopping for high-throughput experiments.

    PubMed

    Rossell, David; Müller, Peter

    2013-01-01

    In high-throughput experiments, the sample size is typically chosen informally. Most formal sample-size calculations depend critically on prior knowledge. We propose a sequential strategy that, by updating knowledge when new data are available, depends less critically on prior assumptions. Experiments are stopped or continued based on the potential benefits in obtaining additional data. The underlying decision-theoretic framework guarantees the design to proceed in a coherent fashion. We propose intuitively appealing, easy-to-implement utility functions. As in most sequential design problems, an exact solution is prohibitive. We propose a simulation-based approximation that uses decision boundaries. We apply the method to RNA-seq, microarray, and reverse-phase protein array studies and show its potential advantages. The approach has been added to the Bioconductor package gaga.
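
    The underlying decision-theoretic idea, continue sampling only while the expected benefit of another batch exceeds its cost, can be caricatured in a few lines (a toy rule with invented numbers; this is not the boundary-based approximation or the gaga package itself):

    ```python
    def should_stop(expected_new_hits, batch_cost, hit_value=1.0):
        """Toy decision rule: stop when the expected value of discoveries
        in the next batch no longer exceeds the cost of running it."""
        return expected_new_hits * hit_value <= batch_cost

    # Simulate diminishing returns: each batch finds half as many new hits.
    expected_new = 40.0
    batches_run = 0
    while not should_stop(expected_new, batch_cost=5.0):
        batches_run += 1
        expected_new *= 0.5  # hit pool is depleted as sampling continues

    # The experiment stops after 3 batches, once expected hits fall to 5.
    ```

    The paper's contribution is precisely in replacing the invented "expected hits" number with a posterior quantity updated from the accumulating data, and in approximating the intractable sequential decision problem with simulation-derived stopping boundaries.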

  17. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Aniruddha

    2006-01-01

    We have shown that by covalently modifying a subpopulation, less than or equal to 1%, of a macromolecule with a fluorescent probe, the labeled material will add to a growing crystal as a microheterogeneous growth unit. Labeling procedures can be readily incorporated into the final stages of purification, and the presence of the probe at low concentrations does not affect the X-ray data quality or the crystallization behavior. The presence of the trace fluorescent label gives a number of advantages when used with high throughput crystallizations. The covalently attached probe will concentrate in the crystal relative to the solution, and under fluorescent illumination crystals show up as bright objects against a dark background. Non-protein structures, such as salt crystals, will not incorporate the probe and will not show up under fluorescent illumination. Brightly fluorescent crystals are readily found against less bright precipitated phases, which under white light illumination may obscure the crystals. Automated image analysis to find crystals should be greatly facilitated, without having to first define crystallization drop boundaries, since only the protein, or structures containing it, shows up. Fluorescence intensity is a faster search parameter, whether visually or by automated methods, than looking for crystalline features. We are now testing the use of high fluorescence intensity regions, in the absence of clear crystalline features or "hits", as a means for determining potential lead conditions. A working hypothesis is that kinetics leading to non-structured phases may overwhelm and trap more slowly formed ordered assemblies, which subsequently show up as regions of brighter fluorescence intensity. Preliminary experiments with test proteins have resulted in the extraction of a number of crystallization conditions from screening outcomes based solely on the presence of bright fluorescent regions. Subsequent experiments will test this approach using a wider

  18. High-Throughput Screening in Primary Neurons

    PubMed Central

    Sharma, Punita; Ando, D. Michael; Daub, Aaron; Kaye, Julia A.; Finkbeiner, Steven

    2013-01-01

    Despite years of incremental progress in our understanding of diseases such as Alzheimer's disease (AD), Parkinson's disease (PD), Huntington's disease (HD), and amyotrophic lateral sclerosis (ALS), there are still no disease-modifying therapeutics. The discrepancy between the number of lead compounds and approved drugs may partially be a result of the methods used to generate the leads, and highlights the need for new technology to obtain more detailed and physiologically relevant information on cellular processes in normal and diseased states. Our high-throughput screening (HTS) system in a primary neuron model can help address this unmet need. HTS allows scientists to assay thousands of conditions in a short period of time, which can reveal completely new aspects of biology and identify potential therapeutics in the span of a few months when conventional methods could take years or fail altogether. HTS in primary neurons combines the advantages of HTS with the biological relevance of intact, fully differentiated neurons, which can capture the critical cellular events or homeostatic states that make neurons uniquely susceptible to disease-associated proteins. We detail the methodology of our primary neuron HTS assay workflow from sample preparation to data reporting. We also discuss our adaptation of our HTS system into high-content screening (HCS), a type of HTS that uses multichannel fluorescence images to capture biological events in situ and is uniquely suited to study dynamic processes in living cells. PMID:22341232

  19. High Throughput Determination of Critical Human Dosing Parameters (SOT)

    EPA Science Inventory

    High throughput toxicokinetics (HTTK) is a rapid approach that uses in vitro data to estimate TK for hundreds of environmental chemicals. Reverse dosimetry (i.e., reverse toxicokinetics or RTK) based on HTTK data converts high throughput in vitro toxicity screening (HTS) data int...
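
    The reverse-dosimetry conversion at the heart of HTTK can be sketched under a linear-kinetics assumption (hypothetical numbers; real HTTK models derive the steady-state concentration from in vitro clearance and plasma-binding measurements):

    ```python
    def oral_equivalent_dose(ac50_uM, css_uM_per_mgkgday):
        """Reverse dosimetry under linear kinetics: the external daily dose
        (mg/kg/day) whose steady-state plasma concentration (Css) matches
        the in vitro bioactive concentration (AC50)."""
        return ac50_uM / css_uM_per_mgkgday

    # Hypothetical chemical: bioactive at 3 uM in vitro; a 1 mg/kg/day dose
    # is predicted to give a 1.5 uM steady-state plasma concentration.
    oed = oral_equivalent_dose(3.0, 1.5)  # 2.0 mg/kg/day
    ```

    Comparing this oral equivalent dose against exposure estimates then gives a margin-of-exposure style prioritization metric.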

  20. AOPs and Biomarkers: Bridging High Throughput Screening ...

    EPA Pesticide Factsheets

    As high throughput screening (HTS) plays a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models designed to quantify potential adverse effects based on HTS data will benefit from additional data sources that connect the magnitude of perturbation from the in vitro system to a level of concern at the organism or population level. The adverse outcome pathway (AOP) concept provides an ideal framework for combining these complementary data. Recent international efforts under the auspices of the Organization for Economic Co-operation and Development (OECD) have resulted in an AOP wiki designed to house formal descriptions of AOPs suitable for use in regulatory decision making. Recent efforts have built upon this to include an ontology describing the AOP with linkages to biological pathways, physiological terminology, and taxonomic applicability domains. Incorporation of an AOP network tool developed by the U.S. Army Corps of Engineers also allows consideration of cumulative risk from chemical and non-chemical stressors. Biomarkers are an important complement to formal AOP descriptions, particularly when dealing with susceptible subpopulations or lifestages in human health risk assessment. To address the issue of nonchemical stressors that may modify effects of criteria air pollutants, a novel method was used to integrate blood gene expression data with hema

  1. High-Throughput Enzyme Kinetics Using Microarrays

    SciTech Connect

    Guoxin Lu; Edward S. Yeung

    2007-11-01

    We report a microanalytical method to study enzyme kinetics. The technique involves immobilizing horseradish peroxidase on a poly-L-lysine (PLL)-coated glass slide in a microarray format, followed by applying substrate solution onto the enzyme microarray. Enzyme molecules are immobilized on the PLL-coated glass slide through electrostatic interactions, and no further modification of the enzyme or glass slide is needed. In situ detection of the products generated on the enzyme spots is made possible by monitoring the light intensity of each spot using a scientific-grade charge-coupled device (CCD). Reactions of substrate solutions of various types and concentrations can be carried out sequentially on one enzyme microarray. To account for the loss of enzyme from washing in between runs, a standard substrate solution is used for calibration. Substantially reduced amounts of substrate solution are consumed for each reaction on each enzyme spot. The Michaelis constant K{sub m} obtained by using this method is comparable to the result for homogeneous solutions. Absorbance detection allows universal monitoring, and no chemical modification of the substrate is needed. High-throughput studies of native enzyme kinetics for multiple enzymes are therefore possible in a simple, rapid, and low-cost manner.
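
    The Michaelis constant extracted from such spot-intensity rate data can be estimated, for example, with the classical Lineweaver-Burk linearization (a sketch with idealized noise-free rates, not the authors' data):

    ```python
    import numpy as np

    def fit_km_vmax(s, v):
        """Estimate Km and Vmax from the Lineweaver-Burk linearization:
        1/v = (Km/Vmax) * (1/s) + 1/Vmax."""
        slope, intercept = np.polyfit(1.0 / s, 1.0 / v, 1)
        vmax = 1.0 / intercept
        km = slope * vmax
        return km, vmax

    # Idealized noise-free rates for Km = 2 mM, Vmax = 100 a.u./s.
    s = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])  # substrate, mM
    v = 100.0 * s / (2.0 + s)                       # Michaelis-Menten rates
    km, vmax = fit_km_vmax(s, v)                    # recovers Km ~ 2, Vmax ~ 100
    ```

    With noisy data a direct nonlinear fit of the Michaelis-Menten equation is preferred, since the double-reciprocal transform inflates errors at low substrate concentrations; the linearized form simply keeps the sketch dependency-light.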

  3. Uncertainty Quantification in High Throughput Screening ...

    EPA Pesticide Factsheets

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
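
    The bootstrap idea is straightforward: resample the concentration-response points with replacement, refit each resample, and take percentiles of the refit parameters. A minimal sketch (hypothetical Hill-curve data and a grid-search fit standing in for the actual ToxCast fitting pipeline):

    ```python
    import numpy as np

    def hill(conc, top, ac50):
        """Two-parameter Hill curve with unit slope."""
        return top * conc / (ac50 + conc)

    def fit_ac50(conc, resp, top=100.0):
        """Grid-search AC50 minimizing squared error (a stand-in for the
        nonlinear model fitting used on concentration-response data)."""
        grid = np.logspace(-2, 2, 400)
        errs = [np.sum((resp - hill(conc, top, a)) ** 2) for a in grid]
        return grid[int(np.argmin(errs))]

    rng = np.random.default_rng(42)
    conc = np.logspace(-2, 2, 9)
    resp = hill(conc, 100.0, 1.0) + rng.normal(0.0, 3.0, size=conc.size)

    # Bootstrap: resample points with replacement, refit, take percentiles.
    boots = []
    for _ in range(200):
        idx = rng.integers(0, conc.size, size=conc.size)
        boots.append(fit_ac50(conc[idx], resp[idx]))
    lo, hi = np.percentile(boots, [2.5, 97.5])  # ~95% interval on the AC50
    ```

    The width of the (lo, hi) interval, and how often resampled fits cross the hit-call criteria, is exactly the kind of uncertainty that then propagates into downstream prioritization models.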

  4. New High Throughput Methods to Estimate Chemical ...

    EPA Pesticide Factsheets

    EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing and screening chemicals. A recent report by the National Research Council of the National Academies, Exposure Science in the 21st Century: A Vision and a Strategy (NRC 2012) laid out a number of applications in chemical evaluation of both toxicity and risk in critical need of quantitative exposure predictions, including screening and prioritization of chemicals for targeted toxicity testing, focused exposure assessments or monitoring studies, and quantification of population vulnerability. Despite these significant needs, for the majority of chemicals (e.g. non-pesticide environmental compounds) there are no or limited estimates of exposure. For example, exposure estimates exist for only 7% of the ToxCast Phase II chemical list. In addition, the data required for generating exposure estimates for large numbers of chemicals is severely lacking (Egeghy et al. 2012). This SAP reviewed the use of EPA's ExpoCast model to rapidly estimate potential chemical exposures for prioritization and screening purposes. The focus was on bounded chemical exposure values for people and the environment for the Endocrine Disruptor Screening Program (EDSP) Universe of Chemicals. In addition to exposure, the SAP

  5. High-throughput Crystallography for Structural Genomics

    PubMed Central

    Joachimiak, Andrzej

    2009-01-01

    Protein X-ray crystallography recently celebrated its 50th anniversary. The structures of myoglobin and hemoglobin determined by Kendrew and Perutz provided the first glimpses into the complex protein architecture and chemistry. Since then, the field of structural molecular biology has experienced extraordinary progress, and now over 53,000 protein structures have been deposited into the Protein Data Bank. In the past decade many advances in macromolecular crystallography have been driven by world-wide structural genomics efforts. This was made possible because of third-generation synchrotron sources, structure phasing approaches using anomalous signal and cryo-crystallography. Complementary progress in molecular biology, proteomics, hardware and software for crystallographic data collection, structure determination and refinement, computer science, databases, robotics and automation improved and accelerated many processes. These advancements provide the robust foundation for structural molecular biology and assure strong contribution to science in the future. In this report we focus mainly on reviewing structural genomics high-throughput X-ray crystallography technologies and their impact. PMID:19765976

  7. Techniques for analysis and purification in high-throughput chemistry.

    PubMed

    Hughes, I; Hunter, D

    2001-06-01

    The success of combinatorial chemistry, and the increased emphasis on single well-characterised compounds of high purity, has had a significant impact on analytical and purification technologies. The requirement for ever-increasing throughput has led to the automation and parallelisation of these techniques. Advances have also been made in developing faster methods to augment throughput further.

  8. High Throughput Bent-Pipe Processor Demonstrator

    NASA Astrophysics Data System (ADS)

    Tabacco, P.; Vernucci, A.; Russo, L.; Cangini, P.; Botticchio, T.; Angeletti, P.

    2008-08-01

    The work described in this article is a study initiative sponsored by ESA/ESTEC that responds to the crucial need for new satellite payloads capable of rapid progress in handling large amounts of data at a price competitive with terrestrial telecommunications. Given the quite limited band allocated to space communications at Ka band, reusing the same band across a large number of beams is mandatory; beam-forming is therefore the right technological answer. Technological progress, mainly in the digital domain, also helps greatly in increasing satellite capacity, and the next satellite payload targets are set in the throughput range of 50 Gbps. Although implementing a wideband transparent processor for a high-capacity communication payload is a very challenging task, the Space Engineering team, in the frame of this ESA study, proposed an intermediate development step: a scalable unit able to demonstrate both the capacity and flexibility objectives for different types of wideband beamforming antenna designs. To this end, the article describes the features of the wideband hardware (analog and digital) platform purposely developed by Space Engineering under this ESA/ESTEC contract ("WDBFN" contract), with some preliminary system test results. The same platform and part of the associated software will be used in the development and demonstration of the real payload digital front-end mux and demux algorithms, as well as the beamforming and on-board channel switching in the frequency domain.
At the time of writing, although new FPGAs and new ADC and DAC converters have become available as choices for wideband system implementation, the two hardware platforms developed by Space Engineering, namely the WDBFN ADC and DAC boards, still represent the most performant units in terms of analog bandwidth, processing capability (FPGA module density), SERDES (SERializer/DESerializer) external link density, integration form

  9. High-throughput techniques for compound characterization and purification.

    PubMed

    Kyranos, J N; Cai, H; Zhang, B; Goetzinger, W K

    2001-11-01

    A new paradigm in drug discovery is the synthesis of structurally diverse collections of compounds, so-called libraries, followed by high-throughput biological screening. High-throughput characterization and purification techniques are required to provide high-quality compounds and reliable biological data, which has led to the development of faster methods, system automation and parallel approaches. This review summarizes recent advances in support of analytical characterization and preparative purification technologies. Notably, mass spectrometry (MS) and supercritical fluid chromatography (SFC) are among the areas where new developments have had a major impact on defining these high-throughput applications.

  10. A Proteome-wide Fission Yeast Interactome Reveals Network Evolution Principles from Yeasts to Human.

    PubMed

    Vo, Tommy V; Das, Jishnu; Meyer, Michael J; Cordero, Nicolas A; Akturk, Nurten; Wei, Xiaomu; Fair, Benjamin J; Degatano, Andrew G; Fragoza, Robert; Liu, Lisa G; Matsuyama, Akihisa; Trickey, Michelle; Horibata, Sachi; Grimson, Andrew; Yamano, Hiroyuki; Yoshida, Minoru; Roth, Frederick P; Pleiss, Jeffrey A; Xia, Yu; Yu, Haiyuan

    2016-01-14

    Here, we present FissionNet, a proteome-wide binary protein interactome for S. pombe, comprising 2,278 high-quality interactions, of which ~50% were previously not reported in any species. FissionNet unravels previously unreported interactions implicated in processes such as gene silencing and pre-mRNA splicing. We developed a rigorous network comparison framework that accounts for assay sensitivity and specificity, revealing extensive species-specific network rewiring between fission yeast, budding yeast, and human. Surprisingly, although genes are better conserved between the yeasts, S. pombe interactions are significantly better conserved in human than in S. cerevisiae. Our framework also reveals that different modes of gene duplication influence the extent to which paralogous proteins are functionally repurposed. Finally, cross-species interactome mapping demonstrates that coevolution of interacting proteins is remarkably prevalent, a result with important implications for studying human disease in model organisms. Overall, FissionNet is a valuable resource for understanding protein functions and their evolution.
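
    The cross-species comparison rests on the interolog notion: a source-species interaction counts as conserved if the orthologous pair also interacts in the target species. A toy sketch with hypothetical protein identifiers:

    ```python
    def conserved_fraction(interactions, ortholog_map, target_interactions):
        """Fraction of source-species interactions whose orthologous pair
        also interacts in the target species (interolog conservation).
        Pairs lacking orthologs are excluded, reflecting the need to
        restrict the comparison to what is testable in both species."""
        hits = total = 0
        for a, b in interactions:
            if a in ortholog_map and b in ortholog_map:
                total += 1
                pair = frozenset((ortholog_map[a], ortholog_map[b]))
                hits += pair in target_interactions
        return hits / total if total else 0.0

    # Toy example with hypothetical protein identifiers.
    pombe_ints = [("spA", "spB"), ("spB", "spC"), ("spC", "spD")]
    to_human = {"spA": "hA", "spB": "hB", "spC": "hC"}  # spD: no ortholog
    human_ints = {frozenset(("hA", "hB"))}
    frac = conserved_fraction(pombe_ints, to_human, human_ints)  # 0.5
    ```

    The paper's framework goes further by correcting such raw fractions for the sensitivity and specificity of each interaction assay, without which cross-species comparisons are systematically biased.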

  11. Proteome-wide survey of the autoimmune target repertoire in autoimmune polyendocrine syndrome type 1.

    PubMed

    Landegren, Nils; Sharon, Donald; Freyhult, Eva; Hallgren, Åsa; Eriksson, Daniel; Edqvist, Per-Henrik; Bensing, Sophie; Wahlberg, Jeanette; Nelson, Lawrence M; Gustafsson, Jan; Husebye, Eystein S; Anderson, Mark S; Snyder, Michael; Kämpe, Olle

    2016-02-01

    Autoimmune polyendocrine syndrome type 1 (APS1) is a monogenic disorder that features multiple autoimmune disease manifestations. It is caused by mutations in the Autoimmune regulator (AIRE) gene, which promote thymic display of thousands of peripheral tissue antigens in a process critical for establishing central immune tolerance. We here used proteome arrays to perform a comprehensive study of autoimmune targets in APS1. Interrogation of established autoantigens revealed highly reliable detection of autoantibodies, and by exploring the full panel of more than 9000 proteins we further identified MAGEB2 and PDILT as novel major autoantigens in APS1. Our proteome-wide assessment revealed a marked enrichment for tissue-specific immune targets, mirroring AIRE's selectiveness for this category of genes. Our findings also suggest that only a very limited portion of the proteome becomes targeted by the immune system in APS1, which contrasts with the broad defect of thymic presentation associated with AIRE deficiency and raises novel questions about what other factors are needed for tolerance to break down.
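
    Enrichment of tissue-specific proteins among detected autoantigens is the kind of claim typically backed by a hypergeometric test; a small self-contained sketch (toy counts, not the study's data):

    ```python
    from math import comb

    def hypergeom_p(k, K, n, N):
        """Upper-tail hypergeometric probability P(X >= k): the chance of
        drawing at least k special items in n draws without replacement
        from a population of N items, K of which are special."""
        return sum(comb(K, i) * comb(N - K, n - i)
                   for i in range(k, min(K, n) + 1)) / comb(N, n)

    # Toy counts: 10 proteins on the array, 4 of them tissue-specific;
    # all 3 detected autoantigens turn out to be tissue-specific.
    p = hypergeom_p(k=3, K=4, n=3, N=10)  # = 4/120, about 0.033
    ```

    At proteome scale (over 9000 array proteins) the same calculation, applied to the observed overlap, quantifies how unlikely the tissue-specific skew would be under random targeting.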

  12. Proteome-wide analysis of functional divergence in bacteria: exploring a host of ecological adaptations.

    PubMed

    Caffrey, Brian E; Williams, Tom A; Jiang, Xiaowei; Toft, Christina; Hokamp, Karsten; Fares, Mario A

    2012-01-01

    Functional divergence is the process by which new genes and functions originate through the modification of existing ones. Both genetic and environmental factors influence the evolution of new functions, including gene duplication or changes in the ecological requirements of an organism. Novel functions emerge at the expense of ancestral ones and are generally accompanied by changes in the selective forces at constrained protein regions. We present software capable of analyzing whole proteomes, identifying putative amino acid replacements leading to functional change in each protein and performing statistical tests on all tabulated data. We apply this method to 750 complete bacterial proteomes to identify high-level patterns of functional divergence and link these patterns to ecological adaptations. Proteome-wide analyses of functional divergence in bacteria with different ecologies reveal a separation between proteins involved in information processing (Ribosome biogenesis etc.) and those which are dependent on the environment (energy metabolism, defense etc.). We show that the evolution of pathogenic and symbiotic bacteria is constrained by their association with the host, and also identify unusual events of functional divergence even in well-studied bacteria such as Escherichia coli. We present a description of the roles of phylogeny and ecology in functional divergence at the level of entire proteomes in bacteria.
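
    One simple heuristic for the kind of functional divergence surveyed here flags alignment columns that are conserved within each clade but fixed for different residues between clades (so-called type-II divergence). A toy sketch, not the authors' software:

    ```python
    def divergent_sites(clade1, clade2, threshold=0.9):
        """Flag alignment columns whose majority residue reaches the given
        frequency threshold within each clade but differs between clades:
        a simple type-II functional divergence heuristic."""
        def consensus(column):
            residue = max(set(column), key=column.count)
            return residue, column.count(residue) / len(column)

        flagged = []
        for i in range(len(clade1[0])):
            r1, f1 = consensus([seq[i] for seq in clade1])
            r2, f2 = consensus([seq[i] for seq in clade2])
            if f1 >= threshold and f2 >= threshold and r1 != r2:
                flagged.append(i)
        return flagged

    # Toy alignment: position 1 is fixed for K in clade 1 but R in clade 2.
    sites = divergent_sites(["MKLV", "MKLV", "MKLV"],
                            ["MRLV", "MRLV", "MRLV"])  # -> [1]
    ```

    Real analyses replace this frequency cutoff with likelihood-based tests of rate and profile shifts on a phylogeny, but the column-wise logic is the same.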

  13. A proteome-wide fission yeast interactome reveals network evolution principles from yeasts to human

    PubMed Central

    Vo, Tommy V.; Das, Jishnu; Meyer, Michael J.; Cordero, Nicolas A.; Akturk, Nurten; Wei, Xiaomu; Fair, Benjamin J.; Degatano, Andrew G.; Fragoza, Robert; Liu, Lisa G.; Matsuyama, Akihisa; Trickey, Michelle; Horibata, Sachi; Grimson, Andrew; Yamano, Hiroyuki; Yoshida, Minoru; Roth, Frederick P.; Pleiss, Jeffrey A.; Xia, Yu; Yu, Haiyuan

    2015-01-01

    Here, we present FissionNet, a proteome-wide binary protein interactome for S. pombe, comprising 2,278 high-quality interactions, of which ~50% were previously not reported in any species. FissionNet unravels previously unreported interactions implicated in processes such as gene silencing and pre-mRNA splicing. We developed a rigorous network comparison framework that accounts for assay sensitivity and specificity, revealing extensive species-specific network rewiring between fission yeast, budding yeast, and human. Surprisingly, although genes are better conserved between the yeasts, S. pombe interactions are significantly better conserved in human than in S. cerevisiae. Our framework also reveals that different modes of gene duplication influence the extent to which paralogous proteins are functionally repurposed. Finally, cross-species interactome mapping demonstrates that coevolution of interacting proteins is remarkably prevalent, a result with important implications for studying human disease in model organisms. Overall, FissionNet is a valuable resource for understanding protein functions and their evolution. PMID:26771498

  14. Applications of ambient mass spectrometry in high-throughput screening.

    PubMed

    Li, Li-Ping; Feng, Bao-Sheng; Yang, Jian-Wang; Chang, Cui-Lan; Bai, Yu; Liu, Hu-Wei

    2013-06-07

    The development of rapid screening and identification techniques is of great importance for drug discovery, doping control, forensic identification, food safety and quality control. Ambient mass spectrometry (AMS) allows rapid and direct analysis of various samples in open air with little sample preparation, and its applications in high-throughput screening (HTS) have progressed rapidly. During the past decade, various ambient ionization techniques have been developed and applied in HTS. This review discusses typical applications of AMS techniques, including DESI (desorption electrospray ionization), DART (direct analysis in real time) and EESI (extractive electrospray ionization), in high-throughput screening.

  15. Development of A High Throughput Method Incorporating Traditional Analytical Devices

    PubMed Central

    White, C. C.; Embree, E.; Byrd, W. E; Patel, A. R.

    2004-01-01

    A high-throughput system and companion informatics system have been developed and implemented. High throughput is defined as the ability to autonomously evaluate large numbers of samples, while an informatics system provides the software control of the physical devices, in addition to the organization and storage of the generated electronic data. This high-throughput system includes both an ultraviolet-visible spectrometer (UV-Vis) and a Fourier transform infrared spectrometer (FTIR) integrated with a multi-sample positioning table. The method is designed to quantify changes in polymeric materials caused by controlled temperature, humidity and high-flux UV exposures. The integration of the software control of these analytical instruments within a single computer system is presented. Challenges in extending the system to additional analytical devices are discussed. PMID:27366626

  16. Extended length microchannels for high density high throughput electrophoresis systems

    DOEpatents

    Davidson, James C.; Balch, Joseph W.

    2000-01-01

    High throughput electrophoresis systems which provide extended well-to-read distances on smaller substrates, thus compacting the overall systems. The electrophoresis systems utilize a high-density array of microchannels with extended read lengths. The microchannel geometries can be used individually or in combination to increase the effective length of a separation channel while minimally impacting the packing density of channels. One embodiment uses sinusoidal microchannels, while another uses plural microchannels interconnected by a via. The extended-channel systems can be applied to virtually any type of channel-confined chromatography.

  17. High-Throughput Pharmacokinetics for Environmental Chemicals (SOT)

    EPA Science Inventory

    High throughput screening (HTS) promises to allow prioritization of thousands of environmental chemicals with little or no in vivo information. For bioactivity identified by HTS, toxicokinetic (TK) models are essential to predict exposure thresholds below which no significant bio...

  18. Evaluating Rapid Models for High-Throughput Exposure Forecasting (SOT)

    EPA Science Inventory

    High throughput exposure screening models can provide quantitative predictions for thousands of chemicals; however these predictions must be systematically evaluated for predictive ability. Without the capability to make quantitative, albeit uncertain, forecasts of exposure, the ...

  19. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  3. HIGH THROUGHPUT ASSESSMENTS OF CONVENTIONAL AND ALTERNATIVE COMPOUNDS

    EPA Science Inventory

    High throughput approaches for quantifying chemical hazard, exposure, and sustainability have the potential to dramatically impact the pace and nature of risk assessments. Integrated evaluation strategies developed at the US EPA incorporate inherency,bioactivity,bioavailability, ...

  4. MIPHENO: Data normalization for high throughput metabolic analysis.

    EPA Science Inventory

    High throughput methodologies such as microarrays, mass spectrometry and plate-based small molecule screens are increasingly used to facilitate discoveries from gene function to drug candidate identification. These large-scale experiments are typically carried out over the course...

  7. High resolution hyperspectral imaging with a high throughput virtual slit

    NASA Astrophysics Data System (ADS)

    Gooding, Edward A.; Gunn, Thomas; Cenko, Andrew T.; Hajian, Arsen R.

    2016-05-01

    Hyperspectral imaging (HSI) device users often require both high spectral resolution, on the order of 1 nm, and high light-gathering power. A wide entrance slit assures reasonable étendue but degrades spectral resolution. Spectrometers built using High Throughput Virtual Slit™ (HTVS) technology optimize both parameters simultaneously. Two remote sensing use cases that require high spectral resolution are discussed. The first is detection of atmospheric gases with intrinsically narrow absorption lines, such as hydrocarbon vapors or combustion exhaust species like NOx and CO2; detecting exhaust gas species with high precision has become increasingly important in light of recent events in the automobile industry. The second is distinguishing reflected daylight from emission spectra in the visible and NIR (VNIR) regions, which is most easily accomplished using the Fraunhofer absorption lines in solar spectra. While ground reflectance spectral features in the VNIR are generally quite broad, the Fraunhofer lines are narrow and provide a signature of intrinsic vs. extrinsic illumination. The High Throughput Virtual Slit enables higher spectral resolution than is achievable with conventional spectrometers by manipulating the beam profile in pupil space. By reshaping the instrument pupil with reflective optics, HTVS-equipped instruments create a tall, narrow image profile at the exit focal plane, typically delivering five times or better the spectral resolution achievable with a conventional design.

  8. High-throughput quantification of hydroxyproline for determination of collagen.

    PubMed

    Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan

    2011-10-15

    An accurate and high-throughput assay for collagen is essential for collagen research and the development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification, but the sample preparation time required for acid hydrolysis and neutralization prior to assay limits the current method. This work describes conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline.

  9. Toward high throughput optical metamaterial assemblies.

    PubMed

    Fontana, Jake; Ratna, Banahalli R

    2015-11-01

    Optical metamaterials have unique engineered optical properties. These properties arise from the careful organization of plasmonic elements. Transitioning these properties from laboratory experiments to functional materials may lead to disruptive technologies for controlling light. A significant issue impeding the realization of optical metamaterial devices is the need for robust and efficient assembly strategies to govern the order of the nanometer-sized elements while enabling macroscopic throughput. This mini-review critically highlights recent approaches and challenges in creating these artificial materials. As the ability to assemble optical metamaterials improves, new unforeseen opportunities may arise for revolutionary optical devices.

  10. Mapping Proteome-wide Targets of Glyphosate in Mice.

    PubMed

    Ford, Breanna; Bateman, Leslie A; Gutierrez-Palominos, Leilani; Park, Robin; Nomura, Daniel K

    2017-02-16

    Glyphosate, the active ingredient in the herbicide Roundup, is one of the most widely used pesticides in agriculture and home garden use. Whether glyphosate causes any mammalian toxicity remains highly controversial. While many studies have associated glyphosate with numerous adverse health effects, the mechanisms underlying glyphosate toxicity in mammals remain poorly understood. Here, we used activity-based protein profiling to map glyphosate targets in mice. We show that glyphosate at high doses can be metabolized in vivo to reactive metabolites such as glyoxylate and react with cysteines across many proteins in mouse liver. We show that glyoxylate inhibits liver fatty acid oxidation enzymes and glyphosate treatment in mice increases the levels of triglycerides and cholesteryl esters, likely resulting from diversion of fatty acids away from oxidation and toward other lipid pathways. Our study highlights the utility of using chemoproteomics to identify novel toxicological mechanisms of environmental chemicals such as glyphosate.

  11. Improving data quality and preserving HCD-generated reporter ions with EThcD for isobaric tag-based quantitative proteomics and proteome-wide PTM studies.

    PubMed

    Yu, Qing; Shi, Xudong; Feng, Yu; Kent, K Craig; Li, Lingjun

    2017-05-22

    Mass spectrometry (MS)-based isobaric labeling has undergone rapid development in recent years due to its capability for high throughput quantitation. Apart from its originally designed use with collision-induced dissociation (CID) and higher-energy collisional dissociation (HCD), the isobaric tagging technique can also be used with electron-transfer dissociation (ETD), which provides complementarity to CID and is preferred for sequencing peptides with post-translational modifications (PTMs). However, ETD suffers from long reaction times, reduced duty cycle and bias against peptides with lower charge states. In addition, the common fragmentation mechanism in ETD results in altered reporter ion production, decreased multiplexing capability, and even loss of quantitation capability for some isobaric tags, including custom-designed dimethyl leucine (DiLeu) tags. Here, we demonstrate a novel electron-transfer/higher-energy collision dissociation (EThcD) approach that preserves the original reporter ion channels, mitigates bias against lower charge states, improves sensitivity, and significantly improves data quality for quantitative proteomics and proteome-wide PTM studies. Systematic optimization was performed to achieve a balance between data quality and sensitivity. We provide a direct comparison of EThcD with ETD and HCD for DiLeu- and TMT-labeled HEK cell lysate and IMAC-enriched phosphopeptides. Results demonstrate improved data quality and phosphorylation localization accuracy while preserving sufficient reporter ion production. Biological studies were performed to investigate phosphorylation changes in a mouse vascular smooth muscle cell line treated under four different conditions. Overall, EThcD exhibits superior performance compared to conventional ETD and offers distinct advantages compared to HCD in isobaric labeling-based quantitative proteomics and quantitative PTM studies.

  12. Proteome-Wide Search Reveals Unexpected RNA-Binding Proteins in Saccharomyces cerevisiae

    PubMed Central

    Salzman, Julia; Brown, Patrick O.

    2010-01-01

    The vast landscape of RNA-protein interactions at the heart of post-transcriptional regulation remains largely unexplored. Indeed it is likely that, even in yeast, a substantial fraction of the regulatory RNA-binding proteins (RBPs) remain to be discovered. Systematic experimental methods can play a key role in discovering these RBPs - most of the known yeast RBPs lack RNA-binding domains that might enable this activity to be predicted. We describe here a proteome-wide approach to identify RNA-protein interactions based on in vitro binding of RNA samples to yeast protein microarrays that represent over 80% of the yeast proteome. We used this procedure to screen for novel RBPs and RNA-protein interactions. A complementary mass spectrometry technique also identified proteins that associate with yeast mRNAs. Both the protein microarray and mass spectrometry methods successfully identify previously annotated RBPs, suggesting that other proteins identified in these assays might be novel RBPs. Of 35 putative novel RBPs identified by either or both of these methods, 12, including 75% of the eight most highly-ranked candidates, reproducibly associated with specific cellular RNAs. Surprisingly, most of the 12 newly discovered RBPs were enzymes. Functional characteristics of the RNA targets of some of the novel RBPs suggest coordinated post-transcriptional regulation of subunits of protein complexes and a possible link between mRNA trafficking and vesicle transport. Our results suggest that many more RBPs still remain to be identified and provide a set of candidates for further investigation. PMID:20844764

  13. Experimental Design for Combinatorial and High Throughput Materials Development

    NASA Astrophysics Data System (ADS)

    Cawse, James N.

    2002-12-01

    In the past decade, combinatorial and high throughput experimental methods have revolutionized the pharmaceutical industry, allowing researchers to conduct more experiments in a week than was previously possible in a year. Now high throughput experimentation is rapidly spreading from its origins in the pharmaceutical world to larger industrial research establishments such as GE and DuPont, and even to smaller companies and universities. Consequently, researchers need to know the kinds of problems, desired outcomes, and appropriate patterns for these new strategies. Editor James Cawse's far-reaching study identifies and applies, with specific examples, these important new principles and techniques. Experimental Design for Combinatorial and High Throughput Materials Development progresses from methods that are now standard, such as gradient arrays, to mathematical developments that are breaking new ground. The former will be particularly useful to researchers entering the field, while the latter should inspire and challenge advanced practitioners. The book's contents are contributed by leading researchers in their respective fields. Chapters include: High Throughput Synthetic Approaches for the Investigation of Inorganic Phase Space; Combinatorial Mapping of Polymer Blends Phase Behavior; Split-Plot Designs; Artificial Neural Networks in Catalyst Development; and The Monte Carlo Approach to Library Design and Redesign. The book also contains over 200 useful charts and drawings. Industrial chemists, chemical engineers, materials scientists, and physicists working in combinatorial and high throughput chemistry will find James Cawse's study to be an invaluable resource.

  14. Implementation of high throughput experimentation techniques for kinetic reaction testing.

    PubMed

    Nagy, Anton J

    2012-02-01

    Successful implementation of high throughput experimentation (HTE) tools has resulted in their increased acceptance as essential tools in chemical, petrochemical and polymer R&D laboratories. This article provides a number of concrete examples of HTE systems that have been designed and successfully implemented in studies focused on deriving reaction kinetic data. The implementation of HTE tools for kinetic studies of both catalytic and non-catalytic systems results in significantly faster acquisition of the high-quality kinetic modeling data required to quantitatively predict the behavior of complex, multistep reactions.

  15. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth; Achari, Amiruddha

    2005-01-01

    cost optics, further increasing throughput at synchrotrons. Preliminary experiments show that the presence of the fluorescent probe does not affect the nucleation process or the quality of the X-ray data obtained.

  16. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth

    2005-01-01

    , further increasing throughput at synchrotrons. This presentation will focus on the methodology for fluorescent labeling, the crystallization results, and the effects of the trace labeling on the crystal quality.

  17. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Minamitani, Elizabeth Forsythe; Pusey, Marc L.

    2004-01-01

    using relatively low cost optics, further increasing throughput at synchrotrons. This presentation will focus on the methodology for fluorescent labeling, the crystallization results, and the effects of the trace labeling on the crystal quality.

  18. Fluorescent Approaches to High Throughput Crystallography

    NASA Technical Reports Server (NTRS)

    Pusey, Marc L.; Forsythe, Elizabeth

    2004-01-01

    cost optics, further increasing throughput at synchrotrons. This presentation will focus on the methodology for fluorescent labeling, the crystallization results, and the effects of the trace labeling on the crystal quality.

  19. The high throughput biomedicine unit at the institute for molecular medicine Finland: high throughput screening meets precision medicine.

    PubMed

    Pietiainen, Vilja; Saarela, Jani; von Schantz, Carina; Turunen, Laura; Ostling, Paivi; Wennerberg, Krister

    2014-05-01

    The High Throughput Biomedicine (HTB) unit at the Institute for Molecular Medicine Finland (FIMM) was established in 2010 to serve as a national and international academic screening unit providing access to state-of-the-art instrumentation for chemical and RNAi-based high throughput screening. The initial focus of the unit was multiwell plate-based chemical screening and high-content microarray-based siRNA screening. However, over the first four years of operation, the unit has moved to a more flexible service platform where both chemical and siRNA screens are performed at different scales, primarily in multiwell plate-based assays with a wide range of readout possibilities and a focus on ultraminiaturization to allow affordable screening for academic users. In addition to high throughput screening, the equipment of the unit is also used to support miniaturized, multiplexed and high throughput applications for other types of research such as genomics, sequencing and biobanking operations. Importantly, with the translational research goals at FIMM, an increasing part of the operations at the HTB unit is focused on high throughput systems biology platforms for functional profiling of patient cells in personalized and precision medicine projects.

  20. Combinatorial and high-throughput screening approaches for strain engineering.

    PubMed

    Liu, Wenshan; Jiang, Rongrong

    2015-03-01

    Microbes have long been used in the industry to produce valuable biochemicals. Combinatorial engineering approaches, new strain engineering tools derived from inverse metabolic engineering, have started to attract attention in recent years, including genome shuffling, error-prone DNA polymerase, global transcription machinery engineering (gTME), random knockout/overexpression libraries, ribosome engineering, multiplex automated genome engineering (MAGE), customized optimization of metabolic pathways by combinatorial transcriptional engineering (COMPACTER), and library construction of "tunable intergenic regions" (TIGR). Since combinatorial approaches and high-throughput screening methods are fundamentally interconnected, color/fluorescence-based, growth-based, and biosensor-based high-throughput screening methods have been reviewed. We believe that with the help of metabolic engineering tools and new combinatorial approaches, plus effective high-throughput screening methods, researchers will be able to achieve better results on improving microorganism performance under stress or enhancing biochemical yield.

  1. A System for Performing High Throughput Assays of Synaptic Function

    PubMed Central

    Hempel, Chris M.; Sivula, Michael; Levenson, Jonathan M.; Rose, David M.; Li, Bing; Sirianni, Ana C.; Xia, Eva; Ryan, Timothy A.; Gerber, David J.; Cottrell, Jeffrey R.

    2011-01-01

    Unbiased, high-throughput screening has proven invaluable for dissecting complex biological processes. Application of this general approach to synaptic function would have a major impact on neuroscience research and drug discovery. However, existing techniques for studying synaptic physiology are labor intensive and low-throughput. Here, we describe a new high-throughput technology for performing assays of synaptic function in primary neurons cultured in microtiter plates. We show that this system can perform 96 synaptic vesicle cycling assays in parallel with high sensitivity, precision, uniformity, and reproducibility and can detect modulators of presynaptic function. By screening libraries of pharmacologically defined compounds on rat forebrain cultures, we have used this system to identify novel effects of compounds on specific aspects of presynaptic function. As a system for unbiased compound as well as genomic screening, this technology has significant applications for basic neuroscience research and for the discovery of novel, mechanism-based treatments for central nervous system disorders. PMID:21998743

  2. Perspective: Data infrastructure for high throughput materials discovery

    NASA Astrophysics Data System (ADS)

    Pfeif, E. A.; Kroenlein, K.

    2016-05-01

    Computational capability has enabled materials design to evolve from trial-and-error towards more informed methodologies that require large amounts of data. Expert-designed tools and their underlying databases facilitate modern-day high throughput computational methods. Standard data formats and communication standards increase the impact of traditional data, and applying these technologies to a high throughput experimental design provides dense, targeted materials data that are valuable for material discovery. Integrated computational materials engineering requires both experimentally and computationally derived data. Harvesting these comprehensively requires different methods of varying degrees of automation to accommodate variety and volume. Issues of data quality persist independent of type.

  3. Screening and synthesis: high throughput technologies applied to parasitology.

    PubMed

    Morgan, R E; Westwood, N J

    2004-01-01

    High throughput technologies continue to develop in response to the challenges set by the genome projects. This article discusses how the techniques of both high throughput screening (HTS) and synthesis can influence research in parasitology. Examples of the use of targeted and phenotype-based HTS using unbiased compound collections are provided. The important issue of identifying the protein target(s) of bioactive compounds is discussed from the synthetic chemist's perspective. This article concludes by reviewing recent examples of successful target identification studies in parasitology.

  4. Advances in high throughput DNA sequence data compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz

    2016-06-01

    Advances in high throughput sequencing technologies and reduction in cost of sequencing have led to exponential growth in high throughput DNA sequence data. This growth has posed challenges such as storage, retrieval, and transmission of sequencing data. Data compression is used to cope with these challenges. Various methods have been developed to compress genomic and sequencing data. In this article, we present a comprehensive review of compression methods for genome and reads compression. Algorithms are categorized as referential or reference free. Experimental results and comparative analysis of various methods for data compression are presented. Finally, key challenges and research directions in DNA sequence data compression are highlighted.
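
    As a toy illustration of the reference-free end of the spectrum this review covers, DNA over the plain ACGT alphabet can be packed at 2 bits per base, a 4× reduction before any entropy coding. The encoding below is a minimal sketch, not any of the reviewed tools:

    ```python
    # Minimal reference-free read compression sketch: 2 bits per base,
    # four bases per byte. Reads are assumed to contain only A/C/G/T
    # (real tools also handle N calls and quality scores).
    BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
    BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

    def pack(read: str) -> bytes:
        """Encode an ACGT string at 2 bits per base."""
        out = bytearray()
        for i in range(0, len(read), 4):
            group = read[i:i + 4]
            byte = 0
            for base in group:
                byte = (byte << 2) | BASE_TO_BITS[base]
            byte <<= 2 * (4 - len(group))  # left-align a short final group
            out.append(byte)
        return bytes(out)

    def unpack(data: bytes, n_bases: int) -> str:
        """Decode n_bases bases from the packed form."""
        bases = []
        for byte in data:
            for shift in (6, 4, 2, 0):
                bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
        return "".join(bases[:n_bases])

    read = "ACGTACGTTG"
    packed = pack(read)
    assert unpack(packed, len(read)) == read
    assert len(packed) == 3  # 10 bases fit in 3 bytes instead of 10
    ```

    Referential methods build on the same idea but encode reads as positions and differences against a reference genome, followed by entropy coding.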

  5. Droplet microfluidics for high-throughput biological assays.

    PubMed

    Guo, Mira T; Rotem, Assaf; Heyman, John A; Weitz, David A

    2012-06-21

    Droplet microfluidics offers significant advantages for performing high-throughput screens and sensitive assays. Droplets allow sample volumes to be significantly reduced, leading to concomitant reductions in cost. Manipulation and measurement at kilohertz speeds enable up to 10^8 samples to be screened in one day. Compartmentalization in droplets increases assay sensitivity by increasing the effective concentration of rare species and decreasing the time required to reach detection thresholds. Droplet microfluidics combines these powerful features to enable currently inaccessible high-throughput screening applications, including single-cell and single-molecule assays.
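
    The throughput figure above follows from simple arithmetic; the interrogation rates in this sketch are illustrative assumptions, not values taken from the paper:

    ```python
    # Back-of-the-envelope check of droplet screening throughput:
    # droplets interrogated serially at kilohertz rates over one day.
    SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 s

    def droplets_per_day(rate_hz: int) -> int:
        """Droplets processed in one day at a given interrogation rate."""
        return rate_hz * SECONDS_PER_DAY

    for rate in (1_000, 2_000, 10_000):  # illustrative kHz-scale rates
        print(f"{rate:>6} Hz -> {droplets_per_day(rate):.3e} droplets/day")
    ```

    At ~1 kHz a single device already processes ~8.6 × 10^7 droplets per day, so kilohertz-range operation reaches the ~10^8 scale quoted in the abstract.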

  6. Insights to transcriptional networks by using high throughput RNAi strategies.

    PubMed

    Mattila, Jaakko; Puig, Oscar

    2010-01-01

    RNA interference (RNAi) is a powerful method to unravel the role of a given gene in eukaryotic cells. The development of high throughput assay platforms such as fluorescence plate readers and high throughput microscopy has allowed the design of genome-wide RNAi screens to systematically discern members of regulatory networks around various cellular processes. Here we summarize the different strategies employed in RNAi screens to reveal regulators of transcriptional networks, focusing on experimental approaches designed to uncover regulatory interactions that modulate transcription factor activity.

  7. High-throughput screening for modulators of cellular contractile force

    PubMed Central

    Park, Chan Young; Zhou, Enhua H.; Tambe, Dhananjay; Chen, Bohao; Lavoie, Tera; Dowell, Maria; Simeonov, Anton; Maloney, David J.; Marinkovic, Aleksandar; Tschumperlin, Daniel J.; Burger, Stephanie; Frykenberg, Matthew; Butler, James P.; Stamer, W. Daniel; Johnson, Mark; Solway, Julian; Fredberg, Jeffrey J.

    2015-01-01

    When cellular contractile forces are central to pathophysiology, these forces comprise a logical target of therapy. Nevertheless, existing high-throughput screens are limited to upstream signalling intermediates with poorly defined relationships to such a physiological endpoint. Using cellular force as the target, here we report a new screening technology and demonstrate its applications using human airway smooth muscle cells in the context of asthma and Schlemm's canal endothelial cells in the context of glaucoma. This approach identified several drug candidates for both asthma and glaucoma. We attained rates of 1000 compounds per screening day, thus establishing a force-based cellular platform for high-throughput drug discovery. PMID:25953078

  8. A high-throughput label-free nanoparticle analyser

    NASA Astrophysics Data System (ADS)

    Fraikin, Jean-Luc; Teesalu, Tambet; McKenney, Christopher M.; Ruoslahti, Erkki; Cleland, Andrew N.

    2011-05-01

    Synthetic nanoparticles and genetically modified viruses are used in a range of applications, but high-throughput analytical tools for the physical characterization of these objects are needed. Here we present a microfluidic analyser that detects individual nanoparticles and characterizes complex, unlabelled nanoparticle suspensions. We demonstrate the detection, concentration analysis and sizing of individual synthetic nanoparticles in a multicomponent mixture with sufficient throughput to analyse 500,000 particles per second. We also report the rapid size and titre analysis of unlabelled bacteriophage T7 in both salt solution and mouse blood plasma, using just ~1 × 10^-6 l of analyte. Unexpectedly, in the native blood plasma we discover a large background of naturally occurring nanoparticles with a power-law size distribution. The high-throughput detection capability, scalable fabrication and simple electronics of this instrument make it well suited for diverse applications.

  9. Automatic Spot Identification for High Throughput Microarray Analysis

    PubMed Central

    Wu, Eunice; Su, Yan A.; Billings, Eric; Brooks, Bernard R.; Wu, Xiongwu

    2013-01-01

    High throughput microarray analysis has great potential in scientific research, disease diagnosis, and drug discovery. A major hurdle toward high throughput microarray analysis is the time and effort needed to accurately locate gene spots in microarray images. An automatic microarray image processor will allow accurate and efficient determination of spot locations and sizes so that gene expression information can be reliably extracted in a high throughput manner. Current microarray image processing tools require intensive manual operations in addition to the input of grid parameters to correctly and accurately identify gene spots. This work developed a method, herein called auto-spot, to automate the spot identification process. Through a series of correlation and convolution operations, as well as pixel manipulations, this method makes spot identification an automatic and accurate process. Testing with real microarray images has demonstrated that this method is capable of automatically extracting subgrids from microarray images and determining spot locations and sizes within each subgrid, regardless of variations in array patterns and background noises. With this method, we are one step closer to the goal of high throughput microarray analysis. PMID:24298393
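
    A stripped-down version of the correlation step can be sketched as follows: match the image against a zero-mean disk template, then keep thresholded local maxima of the score as candidate spot centres. The kernel size, threshold and synthetic test image are illustrative assumptions, not the auto-spot implementation:

    ```python
    import numpy as np

    def find_spots(image: np.ndarray, radius: int = 2, thresh: float = 0.5):
        """Correlate the image with a zero-mean disk template and return
        coordinates of thresholded local maxima (candidate spot centres)."""
        # Disk template matching the expected spot size; zero mean so a
        # flat background scores ~0.
        y, x = np.mgrid[-radius:radius + 1, -radius:radius + 1]
        template = (x**2 + y**2 <= radius**2).astype(float)
        template -= template.mean()

        h, w = image.shape
        k = 2 * radius + 1
        score = np.zeros((h - k + 1, w - k + 1))
        for i in range(score.shape[0]):      # explicit correlation loop,
            for j in range(score.shape[1]):  # fine for small toy images
                score[i, j] = np.sum(image[i:i + k, j:j + k] * template)

        spots = []
        for i in range(1, score.shape[0] - 1):
            for j in range(1, score.shape[1] - 1):
                s = score[i, j]
                if s > thresh and s == score[i - 1:i + 2, j - 1:j + 2].max():
                    spots.append((i + radius, j + radius))  # image coords
        return spots

    # Synthetic 20x20 "microarray" with two 3x3 spots on a dark background.
    img = np.zeros((20, 20))
    img[4:7, 4:7] = 1.0
    img[13:16, 11:14] = 1.0
    print(find_spots(img))
    ```

    The published method additionally extracts subgrids and estimates per-spot sizes; a practical implementation would also replace the explicit loops with FFT-based convolution.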

  10. Evaluation of sequencing approaches for high-throughput toxicogenomics (SOT)

    EPA Science Inventory

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platfo...

  11. Environmental Impact on Vascular Development Predicted by High Throughput Screening

    EPA Science Inventory

    Understanding health risks to embryonic development from exposure to environmental chemicals is a significant challenge given the diverse chemical landscape and paucity of data for most of these compounds. High throughput screening (HTS) in EPA’s ToxCast™ project provides vast d...

  12. High Throughput Exposure Estimation Using NHANES Data (SOT)

    EPA Science Inventory

    In the ExpoCast project, high throughput (HT) exposure models enable rapid screening of large numbers of chemicals for exposure potential. Evaluation of these models requires empirical exposure data and due to the paucity of human metabolism/exposure data such evaluations includ...

  14. High-throughput production of two disulphide-bridge toxins.

    PubMed

    Upert, Grégory; Mourier, Gilles; Pastor, Alexandra; Verdenaud, Marion; Alili, Doria; Servent, Denis; Gilles, Nicolas

    2014-08-07

    A quick and efficient production method compatible with high-throughput screening was developed using 36 toxins belonging to four different families of two-disulphide-bridge toxins. The final toxins were characterized by HPLC co-elution, circular dichroism (CD), and pharmacological studies.

  15. High Throughput Assays and Exposure Science (ISES annual meeting)

    EPA Science Inventory

    High throughput screening (HTS) data characterizing chemical-induced biological activity has been generated for thousands of environmentally-relevant chemicals by the US inter-agency Tox21 and the US EPA ToxCast programs. For a limited set of chemicals, bioactive concentrations r...

  16. Fully Bayesian Analysis of High-throughput Targeted Metabolomics Assays

    EPA Science Inventory

    High-throughput metabolomic assays that allow simultaneous targeted screening of hundreds of metabolites have recently become available in kit form. Such assays provide a window into understanding changes to biochemical pathways due to chemical exposure or disease, and are usefu...

  20. New High Throughput Methods to Estimate Chemical Exposure

    EPA Science Inventory

    EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...

  1. A Functional High-Throughput Assay of Myelination in Vitro

    DTIC Science & Technology

    2014-07-01

    …potential therapies for myelin disorders such as multiple sclerosis. Tissues engineered from human induced pluripotent stem (iPS) cells may be effective at… …or remyelination would substantially speed the development and testing of potential therapies for myelin disorders such as multiple sclerosis. Keywords: human induced pluripotent stem cells, hydrogels, 3D culture, electrophysiology, high-throughput assay.

  2. High-throughput screening, predictive modeling and computational embryology

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to profile thousands of chemical compounds for biological activity and potential toxicity. EPA’s ToxCast™ project, and the broader Tox21 consortium, in addition to projects worldwide,...

  3. HTTK: R Package for High-Throughput Toxicokinetics

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening programs such as ToxCast and Tox21; these chemicals are tested in part because most of them have limited or no data on hazard, exposure, or toxicokinetics. Toxicokinetic models aid in predicting tissue concent...

  4. High-throughput screening, predictive modeling and computational embryology - Abstract

    EPA Science Inventory

    High-throughput screening (HTS) studies are providing a rich source of data that can be applied to chemical profiling to address sensitivity and specificity of molecular targets, biological pathways, cellular and developmental processes. EPA’s ToxCast project is testing 960 uniq...

  6. Accounting For Uncertainty in The Application Of High Throughput Datasets

    EPA Science Inventory

    The use of high throughput screening (HTS) datasets will need to adequately account for uncertainties in the data generation process and propagate these uncertainties through to ultimate use. Uncertainty arises at multiple levels in the construction of predictors using in vitro ...

  8. 20170612 - Fun with High Throughput Toxicokinetics (CalEPA webinar)

    EPA Science Inventory

    Thousands of chemicals have been profiled by high-throughput screening (HTS) programs such as ToxCast and Tox21. These chemicals are tested in part because there are limited or no data on hazard, exposure, or toxicokinetics (TK). TK models aid in predicting tissue concentrations ...

  10. High Throughput Sequence Analysis for Disease Resistance in Maize

    USDA-ARS?s Scientific Manuscript database

    Preliminary results of a computational analysis of high throughput sequencing data from Zea mays and the fungus Aspergillus are reported. The Illumina Genome Analyzer was used to sequence RNA samples from two strains of Z. mays (Va35 and Mp313) collected over a time course as well as several specie...

  11. High-Throughput Toxicity Testing: New Strategies for ...

    EPA Pesticide Factsheets

    In recent years, the food industry has made progress in improving safety testing methods focused on microbial contaminants in order to promote food safety. However, food industry toxicologists must also assess the safety of food-relevant chemicals including pesticides, direct additives, and food contact substances. With the rapidly growing use of new food additives, as well as innovation in food contact substance development, an interest in exploring the use of high-throughput chemical safety testing approaches has emerged. Currently, the field of toxicology is undergoing a paradigm shift in how chemical hazards can be evaluated. Since there are tens of thousands of chemicals in use, many of which have little to no hazard information, and there are limited resources (namely time and money) for testing these chemicals, it is necessary to prioritize which chemicals require further safety testing to better protect human health. Advances in biochemistry and computational toxicology have paved the way for animal-free (in vitro) high-throughput screening, which can characterize chemical interactions with highly specific biological processes. Screening approaches are not novel; in fact, quantitative high-throughput screening (qHTS) methods that incorporate dose-response evaluation have been widely used in the pharmaceutical industry. For toxicological evaluation and prioritization, it is the throughput as well as the cost- and time-efficient nature of qHTS that makes it…

  13. Evaluating and Refining High Throughput Tools for Toxicokinetics

    EPA Science Inventory

    This poster summarizes efforts of the Chemical Safety for Sustainability's Rapid Exposure and Dosimetry (RED) team to facilitate the development and refinement of toxicokinetics (TK) tools to be used in conjunction with the high throughput toxicity testing data generated as a par...

  16. Parallel tools in HEVC for high-throughput processing

    NASA Astrophysics Data System (ADS)

    Zhou, Minhua; Sze, Vivienne; Budagavi, Madhukar

    2012-10-01

    HEVC (High Efficiency Video Coding) is the next-generation video coding standard being jointly developed by the ITU-T VCEG and ISO/IEC MPEG JCT-VC team. In addition to the high coding efficiency, which is expected to provide 50% more bit-rate reduction when compared to H.264/AVC, HEVC has built-in parallel processing tools to address bit-rate, pixel-rate and motion estimation (ME) throughput requirements. This paper describes how CABAC, which is also used in H.264/AVC, has been redesigned for improved throughput, and how parallel merge/skip and tiles, which are new tools introduced for HEVC, enable high-throughput processing. CABAC has data dependencies which make it difficult to parallelize and thus limit its throughput. The prediction error/residual, represented as quantized transform coefficients, accounts for the majority of the CABAC workload. Various improvements have been made to the context selection and scans in transform coefficient coding that enable CABAC in HEVC to potentially achieve higher throughput and increased coding gains relative to H.264/AVC. The merge/skip mode is a coding efficiency enhancement tool in HEVC; the parallel merge/skip breaks dependency between the regular and merge/skip ME, which provides flexibility for high throughput and high efficiency HEVC encoder designs. For ultra high definition (UHD) video, such as 4k×2k and 8k×4k resolutions, low-latency and real-time processing may be beyond the capability of a single core codec. Tiles are an effective tool which enables pixel-rate balancing among the cores to achieve parallel processing with a throughput scalable implementation of multi-core UHD video codec. With the evenly divided tiles, a multi-core video codec can be realized by simply replicating single core codec and adding a tile boundary processing core on top of that. These tools illustrate that accounting for implementation cost when designing video coding algorithms can enable higher processing speed and reduce…
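    The evenly divided tiles described above amount to partitioning the frame into a uniform grid, with each tile handed to a separate core. A minimal sketch of that partitioning (a hypothetical helper, not HEVC reference code; the frame size and tile counts are illustrative):

```python
def tile_grid(width, height, cols, rows):
    """Split a frame into an evenly divided tile grid (HEVC-style uniform
    tiles), returning (x, y, w, h) rectangles in raster order."""
    xs = [round(c * width / cols) for c in range(cols + 1)]
    ys = [round(r * height / rows) for r in range(rows + 1)]
    return [(xs[c], ys[r], xs[c + 1] - xs[c], ys[r + 1] - ys[r])
            for r in range(rows) for c in range(cols)]

# Four tiles across a 4k x 2k frame, one per core
tiles = tile_grid(4096, 2048, cols=2, rows=2)
print(tiles)  # [(0, 0, 2048, 1024), (2048, 0, 2048, 1024), ...]
```

    Because the tiles cover the frame exactly and do not overlap, pixel-rate is balanced across cores; a real codec would additionally handle tile-boundary filtering, as the abstract notes.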

  17. A high throughput droplet based electroporation system

    NASA Astrophysics Data System (ADS)

    Yoo, Byeongsun; Ahn, Myungmo; Im, Dojin; Kang, Inseok

    2014-11-01

    Delivery of exogenous genetic materials across the cell membrane is a powerful and popular research tool for bioengineering. Among conventional non-viral DNA delivery methods, electroporation (EP) is one of the most widely used technologies and is a standard lab procedure in molecular biology. We developed a novel digital microfluidic electroporation system which has higher efficiency of transgene expression and better cell viability than that of conventional EP techniques. We present the successful performance of digital EP system for transformation of various cell lines by investigating effects of the EP conditions such as electric pulse voltage, number, and duration on the cell viability and transfection efficiency in comparison with a conventional bulk EP system. Through the numerical analysis, we have also calculated the electric field distribution around the cells precisely to verify the effect of the electric field on the high efficiency of the digital EP system. Furthermore, the parallelization of the EP processes has been developed to increase the transformation productivity. This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (Grant Number: 2013R1A1A2011956).

  18. High-throughput heterogeneous catalyst research

    NASA Astrophysics Data System (ADS)

    Turner, Howard W.; Volpe, Anthony F., Jr.; Weinberg, W. H.

    2009-06-01

    With the discovery of abundant and low cost crude oil in the early 1900s came the need to create efficient conversion processes to produce low cost fuels and basic chemicals. Enormous investment over the last century has led to the development of a set of highly efficient catalytic processes which define the modern oil refinery and which produce most of the raw materials and fuels used in modern society. Process evolution and development has led to a refining infrastructure that is both dominated and enabled by modern heterogeneous catalyst technologies. Refineries and chemical manufacturers are currently under intense pressure to improve efficiency, adapt to increasingly disadvantaged feedstocks including biomass, lower their environmental footprint, and continue to deliver their products at low cost. This pressure creates a demand for new and more robust catalyst systems and processes that can accommodate them. Traditional methods of catalyst synthesis and testing are slow and inefficient, particularly in heterogeneous systems where the structure of the active sites is typically complex and the reaction mechanism is at best ill-defined. While theoretical modeling and a growing understanding of fundamental surface science help guide the chemist in designing and synthesizing targets, even in the most well understood areas of catalysis, the parameter space that one needs to explore experimentally is vast. The result is that the chemist using traditional methods must navigate a complex and unpredictable diversity space with a limited data set to make discoveries or to optimize known systems. We describe here a mature set of synthesis and screening technologies that together form a workflow that breaks this traditional paradigm and allows for rapid and efficient heterogeneous catalyst discovery and optimization. We exemplify the power of these new technologies by describing their use in the development and commercialization of a novel catalyst for the…

  19. High-throughput Titration of Luciferase-expressing Recombinant Viruses

    PubMed Central

    Garcia, Vanessa; Krishnan, Ramya; Davis, Colin; Batenchuk, Cory; Le Boeuf, Fabrice; Abdelbary, Hesham; Diallo, Jean-Simon

    2014-01-01

    Standard plaque assays to determine infectious viral titers can be time consuming, are not amenable to a high volume of samples, and cannot be done with viruses that do not form plaques. As an alternative to plaque assays, we have developed a high-throughput titration method that allows for the simultaneous titration of a high volume of samples in a single day. This approach involves infection of the samples with a Firefly luciferase tagged virus, transfer of the infected samples onto an appropriate permissive cell line, subsequent addition of luciferin, reading of plates in order to obtain luminescence readings, and finally the conversion from luminescence to viral titers. The assessment of cytotoxicity using a metabolic viability dye can be easily incorporated in the workflow in parallel and provide valuable information in the context of a drug screen. This technique provides a reliable, high-throughput method to determine viral titers as an alternative to a standard plaque assay. PMID:25285536
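    The final luminescence-to-titer conversion described above is typically performed against a standard curve built from a serially diluted, pre-titered virus stock. A hedged sketch of that conversion step (the dilution series and readings below are invented numbers, not data from the paper):

```python
import numpy as np

def fit_standard_curve(titers_pfu_per_ml, luminescence):
    """Fit log10(luminescence) as a linear function of log10(titer)
    using known serial-dilution standards."""
    slope, intercept = np.polyfit(np.log10(titers_pfu_per_ml),
                                  np.log10(luminescence), 1)
    return slope, intercept

def luminescence_to_titer(rlu, slope, intercept):
    """Invert the standard curve to estimate titer from a raw reading."""
    return 10 ** ((np.log10(rlu) - intercept) / slope)

# Hypothetical ten-fold dilution series of a titered virus stock
standards = np.array([1e4, 1e5, 1e6, 1e7, 1e8])           # PFU/ml
readings = np.array([2.1e2, 2.0e3, 2.2e4, 1.9e5, 2.0e6])  # RLU
m, b = fit_standard_curve(standards, readings)
print(round(np.log10(luminescence_to_titer(2.0e4, m, b)), 1))  # ≈ 6.0
```

    Working in log-log space keeps the fit linear across the wide dynamic range of a dilution series; readings outside the standards' range should be flagged rather than extrapolated.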

  1. Higher throughput high resolution multi-worm tracker

    NASA Astrophysics Data System (ADS)

    Javer, Avelino; Li, Kezhi; Gyenes, Bertalan; Brown, Andre; Behavioural Genomics Team

    2015-03-01

    We have developed a high throughput imaging system for tracking multiple nematode worms at high resolution. The tracker consists of 6 cameras mounted on a motorized gantry so that up to 48 plates (each with approximately 30 worms) can be imaged without user intervention. To deal with the high data rate of the cameras we use real time processing to find worms and only save the immediately surrounding pixels. The system is also equipped with automatic oxygen and carbon dioxide control for observing stimulus response behaviour. We will describe the design and performance of the new system, some of the challenges of truly high throughput behaviour recording, and report preliminary results on inter-individual variation in behaviour as well as a quantitative analysis of C. elegans response to hypoxia, oxygen reperfusion, and carbon dioxide. Funding provided by the Medical Research Council.
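    The real-time data reduction described above — find worms, then save only the immediately surrounding pixels — can be sketched as a threshold-label-crop loop. A simplified illustration (assumes dark worms on a light background and uses a toy frame; not the tracker's actual pipeline):

```python
import numpy as np
from scipy import ndimage

def crop_objects(frame, threshold, pad=2):
    """Threshold a frame, label connected dark objects (worms), and
    return only the padded pixel windows around each object, mimicking
    on-the-fly data reduction for high-frame-rate cameras."""
    mask = frame < threshold  # worms are darker than the agar
    labels, n = ndimage.label(mask)
    crops = []
    for sl in ndimage.find_objects(labels):
        y, x = sl
        y0, y1 = max(y.start - pad, 0), min(y.stop + pad, frame.shape[0])
        x0, x1 = max(x.start - pad, 0), min(x.stop + pad, frame.shape[1])
        crops.append(frame[y0:y1, x0:x1].copy())
    return crops

frame = np.full((40, 40), 200, dtype=np.uint8)
frame[5:8, 5:15] = 50     # one worm-like dark object
frame[25:27, 20:32] = 60  # another
print([c.shape for c in crop_objects(frame, threshold=100)])  # [(7, 14), (6, 16)]
```

    Saving only these windows instead of full frames is what makes the camera data rate tractable; the saved crops still support downstream posture and behaviour analysis.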

  2. High-throughput theoretical design of lithium battery materials

    NASA Astrophysics Data System (ADS)

    Shi-Gang, Ling; Jian, Gao; Rui-Juan, Xiao; Li-Quan, Chen

    2016-01-01

    The rapid evolution of high-throughput theoretical design schemes to discover new lithium battery materials is reviewed, including high-capacity cathodes, low-strain cathodes, anodes, solid state electrolytes, and electrolyte additives. With the development of efficient theoretical methods and inexpensive computers, high-throughput theoretical calculations have played an increasingly important role in the discovery of new materials. With the help of automatic simulation flow, many types of materials can be screened, optimized and designed from a structural database according to specific search criteria. In advanced cell technology, new materials for next-generation lithium batteries are of great significance for achieving improved performance; representative criteria include higher energy density, better safety, and faster charge/discharge speed. Project supported by the National Natural Science Foundation of China (Grant Nos. 11234013 and 51172274) and the National High Technology Research and Development Program of China (Grant No. 2015AA034201).

  3. A Multidisciplinary Approach to High Throughput Nuclear Magnetic Resonance Spectroscopy

    PubMed Central

    Pourmodheji, Hossein; Ghafar-Zadeh, Ebrahim; Magierowski, Sebastian

    2016-01-01

    Nuclear Magnetic Resonance (NMR) is a non-contact, powerful structure-elucidation technique for biochemical analysis. NMR spectroscopy is used extensively in a variety of life science applications including drug discovery. However, existing NMR technology is limited in that it cannot run a large number of experiments simultaneously in one unit. Recent advances in micro-fabrication technologies have attracted the attention of researchers to overcome these limitations and significantly accelerate the drug discovery process by developing the next generation of high-throughput NMR spectrometers using Complementary Metal Oxide Semiconductor (CMOS). In this paper, we examine this paradigm shift and explore new design strategies for the development of the next generation of high-throughput NMR spectrometers using CMOS technology. A CMOS NMR system consists of an array of high sensitivity micro-coils integrated with interfacing radio-frequency circuits on the same chip. Herein, we first discuss the key challenges and recent advances in the field of CMOS NMR technology, and then a new design strategy is put forward for the design and implementation of highly sensitive and high-throughput CMOS NMR spectrometers. We thereafter discuss the functionality and applicability of the proposed techniques by demonstrating the results. For microelectronic researchers starting to work in the field of CMOS NMR technology, this paper serves as a tutorial with comprehensive review of state-of-the-art technologies and their performance levels. Based on these levels, the CMOS NMR approach offers unique advantages for high resolution, time-sensitive and high-throughput biomolecular analysis required in a variety of life science applications including drug discovery. PMID:27294925

  5. Direct assembling methodologies for high-throughput bioscreening

    PubMed Central

    Rodríguez-Dévora, Jorge I.; Shi, Zhi-dong; Xu, Tao

    2012-01-01

    Over the last few decades, high-throughput (HT) bioscreening, a technique that allows rapid screening of biochemical compound libraries against biological targets, has been widely used in drug discovery, stem cell research, development of new biomaterials, and genomics research. To achieve these ambitions, scaffold-free (or direct) assembly of biological entities of interest has become critical. Appropriate assembling methodologies are required to build an efficient HT bioscreening platform. The development of contact and non-contact assembling systems as a practical solution has been driven by a variety of essential attributes of the bioscreening system, such as miniaturization, high throughput, and high precision. The present article reviews recent progress on these assembling technologies utilized for the construction of HT bioscreening platforms. PMID:22021162

  6. A high-throughput multiplex method adapted for GMO detection.

    PubMed

    Chaouachi, Maher; Chupeau, Gaëlle; Berard, Aurélie; McKhann, Heather; Romaniuk, Marcel; Giancola, Sandra; Laval, Valérie; Bertheau, Yves; Brunel, Dominique

    2008-12-24

    A high-throughput multiplex assay for the detection of genetically modified organisms (GMO) was developed on the basis of the existing SNPlex method designed for SNP genotyping. This SNPlex assay allows the simultaneous detection of up to 48 short DNA sequences (approximately 70 bp; "signature sequences") from taxa endogenous reference genes, from GMO constructions, screening targets, construct-specific, and event-specific targets, and finally from donor organisms. This assay avoids certain shortcomings of multiplex PCR-based methods already in widespread use for GMO detection. The assay demonstrated high specificity and sensitivity. The results suggest that this assay is reliable, flexible, and cost- and time-effective for high-throughput GMO detection.

  7. Microfluidics for High-Throughput Quantitative Studies of Early Development.

    PubMed

    Levario, Thomas J; Lim, Bomyi; Shvartsman, Stanislav Y; Lu, Hang

    2016-07-11

    Developmental biology has traditionally relied on qualitative analyses; recently, however, as in other fields of biology, researchers have become increasingly interested in acquiring quantitative knowledge about embryogenesis. Advances in fluorescence microscopy are enabling high-content imaging in live specimens. At the same time, microfluidics and automation technologies are increasing experimental throughput for studies of multicellular models of development. Furthermore, computer vision methods for processing and analyzing bioimage data are now leading the way toward quantitative biology. Here, we review advances in the areas of fluorescence microscopy, microfluidics, and data analysis that are instrumental to performing high-content, high-throughput studies in biology and specifically in development. We discuss a case study of how these techniques have allowed quantitative analysis and modeling of pattern formation in the Drosophila embryo.

  8. A high-throughput microRNA expression profiling system.

    PubMed

    Guo, Yanwen; Mastriano, Stephen; Lu, Jun

    2014-01-01

    As small noncoding RNAs, microRNAs (miRNAs) regulate diverse biological functions, including physiological and pathological processes. The expression and deregulation of miRNA levels contain rich information with diagnostic and prognostic relevance and can reflect pharmacological responses. The increasing interest in miRNA-related research demands global miRNA expression profiling on large numbers of samples. We describe here a robust protocol that supports high-throughput sample labeling and detection on hundreds of samples simultaneously. This method employs 96-well-based miRNA capturing from total RNA samples and on-site biochemical reactions, coupled with bead-based detection in 96-well format for hundreds of miRNAs per sample. With low-cost, high-throughput, high detection specificity, and flexibility to profile both small and large numbers of samples, this protocol can be adapted in a wide range of laboratory settings.

  9. High-throughput sequence alignment using Graphics Processing Units.

    PubMed

    Schatz, Michael C; Trapnell, Cole; Delcher, Arthur L; Varshney, Amitabh

    2007-12-10

    The recent availability of new, less expensive high-throughput DNA sequencing technologies has yielded a dramatic increase in the volume of sequence data that must be analyzed. These data are being generated for several purposes, including genotyping, genome resequencing, metagenomics, and de novo genome assembly projects. Sequence alignment programs such as MUMmer have proven essential for analysis of these data, but researchers will need ever faster, high-throughput alignment tools running on inexpensive hardware to keep up with new sequence technologies. This paper describes MUMmerGPU, an open-source high-throughput parallel pairwise local sequence alignment program that runs on commodity Graphics Processing Units (GPUs) in common workstations. MUMmerGPU uses the new Compute Unified Device Architecture (CUDA) from nVidia to align multiple query sequences against a single reference sequence stored as a suffix tree. By processing the queries in parallel on the highly parallel graphics card, MUMmerGPU achieves more than a 10-fold speedup over a serial CPU version of the sequence alignment kernel, and outperforms the exact alignment component of MUMmer on a high end CPU by 3.5-fold in total application time when aligning reads from recent sequencing projects using Solexa/Illumina, 454, and Sanger sequencing technologies. MUMmerGPU is a low cost, ultra-fast sequence alignment program designed to handle the increasing volume of data produced by new, high-throughput sequencing technologies. MUMmerGPU demonstrates that even memory-intensive applications can run significantly faster on the relatively low-cost GPU than on the CPU.
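    MUMmerGPU queries reads against a suffix tree of the reference on the GPU. As a CPU-side illustration of the exact-match query it accelerates, here is a hypothetical k-mer index sketch (a dictionary stands in for the suffix tree, and only exact placements are reported):

```python
from collections import defaultdict

def build_index(reference, k):
    """Index every k-mer of the reference by position — a simple stand-in
    for MUMmerGPU's suffix tree, supporting exact-match queries only."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def align_reads(reads, reference, k):
    """Report all exact placements of each read (read length >= k),
    seeding on the first k-mer and verifying the full read."""
    index = build_index(reference, k)
    hits = {}
    for read in reads:
        hits[read] = [p for p in index.get(read[:k], ())
                      if reference[p:p + len(read)] == read]
    return hits

ref = "ACGTACGTGGTACGTT"
print(align_reads(["ACGT", "GGTA", "TTTT"], ref, k=4))
# {'ACGT': [0, 4, 11], 'GGTA': [8], 'TTTT': []}
```

    The GPU version wins by running thousands of such independent queries in parallel on the graphics card, which is exactly the workload pattern short-read sequencing produces.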

  10. Novel High-throughput Approach for Purification of Infectious Virions

    PubMed Central

    James, Kevin T.; Cooney, Brad; Agopsowicz, Kate; Trevors, Mary Ann; Mohamed, Adil; Stoltz, Don; Hitt, Mary; Shmulevitz, Maya

    2016-01-01

    Viruses are extensively studied as pathogens and exploited as molecular tools and therapeutic agents. Existing methods to purify viruses such as gradient ultracentrifugation or chromatography have limitations, for example, demands for technical expertise or specialized equipment, high time consumption, and restricted capacity. Our laboratory explores mutations in oncolytic reovirus that could improve oncolytic activity, and makes routine use of numerous virus variants, genome reassortants, and reverse engineered mutants. Our research pace was limited by the lack of high-throughput virus purification methods that efficiently remove confounding cellular contaminants such as cytokines and proteases. To overcome this shortcoming, we evaluated a commercially available resin (Capto Core 700) that captures molecules smaller than 700 kDa. Capto Core 700 chromatography produced virion purity and infectivity indistinguishable from CsCl density gradient ultracentrifugation as determined by electron microscopy, gel electrophoresis analysis and plaque titration. Capto Core 700 resin was then effectively adapted to a rapid in-slurry pull-out approach for high-throughput purification of reovirus and adenovirus. The in-slurry purification approach offered substantially increased virus purity over crude cell lysates, media, or high-spin preparations and would be especially useful for high-throughput virus screening applications where density gradient ultracentrifugation is not feasible. PMID:27827454

  11. High-Throughput Intracellular Antimicrobial Susceptibility Testing of Legionella pneumophila

    PubMed Central

    Chiaraviglio, Lucius

    2015-01-01

    Legionella pneumophila is a Gram-negative opportunistic human pathogen that causes a severe pneumonia known as Legionnaires' disease. Notably, in the human host, the organism is believed to replicate solely within an intracellular compartment, predominantly within pulmonary macrophages. Consequently, successful therapy is predicated on antimicrobials penetrating into this intracellular growth niche. However, standard antimicrobial susceptibility testing methods test solely for extracellular growth inhibition. Here, we make use of a high-throughput assay to characterize intracellular growth inhibition activity of known antimicrobials. For select antimicrobials, high-resolution dose-response analysis was then performed to characterize and compare activity levels in both macrophage infection and axenic growth assays. Results support the superiority of several classes of nonpolar antimicrobials in abrogating intracellular growth. Importantly, our assay results show excellent correlations with prior clinical observations of antimicrobial efficacy. Furthermore, we also show the applicability of high-throughput automation to two- and three-dimensional synergy testing. High-resolution isocontour isobolograms provide in vitro support for specific combination antimicrobial therapy. Taken together, findings suggest that high-throughput screening technology may be successfully applied to identify and characterize antimicrobials that target bacterial pathogens that make use of an intracellular growth niche. PMID:26392509
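
    The two-drug synergy testing described above is conventionally scored with the fractional inhibitory concentration (FIC) index from checkerboard data. A minimal sketch, assuming the commonly cited FICI formula and cutoffs (exact thresholds vary by convention; the numbers below are illustrative):

```python
def fic_index(conc_a, mic_a, conc_b, mic_b):
    """Fractional inhibitory concentration index for a two-drug combination:
    FICI = Ca/MICa + Cb/MICb at the lowest inhibitory combination."""
    return conc_a / mic_a + conc_b / mic_b

def interpret(fici):
    # Commonly cited cutoffs; conventions differ slightly between studies.
    if fici <= 0.5:
        return "synergy"
    if fici <= 4.0:
        return "indifference"
    return "antagonism"

# Example: each drug at a quarter of its standalone MIC still inhibits growth.
fici = fic_index(0.25, 1.0, 0.25, 1.0)
print(fici, interpret(fici))  # 0.5 synergy
```

    An isobologram is essentially the contour of inhibitory (Ca, Cb) pairs; a contour bowing below the additivity line corresponds to FICI < 1.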

  12. Rapid Methods for High-Throughput Detection of Sulfoxides

    PubMed Central

    Shainsky, Janna; Derry, Netta-Lee; Leichtmann-Bardoogo, Yael; Wood, Thomas K.; Fishman, Ayelet

    2009-01-01

    Enantiopure sulfoxides are prevalent in drugs and are useful chiral auxiliaries in organic synthesis. The biocatalytic enantioselective oxidation of prochiral sulfides is a direct and economical approach for the synthesis of optically pure sulfoxides. The selection of suitable biocatalysts requires rapid and reliable high-throughput screening methods. Here we present four different methods for detecting sulfoxides produced via whole-cell biocatalysis, three of which were exploited for high-throughput screening. Fluorescence detection based on the acid activation of omeprazole was utilized for high-throughput screening of mutant libraries of toluene monooxygenases, but no active variants have been discovered yet. The second method is based on the reduction of sulfoxides to sulfides, with the coupled release and measurement of iodine. The availability of solvent-resistant microtiter plates enabled us to modify the method to a high-throughput format. The third method, selective inhibition of horse liver alcohol dehydrogenase, was used to rapidly screen highly active and/or enantioselective variants at position V106 of toluene ortho-monooxygenase in a saturation mutagenesis library, using methyl-p-tolyl sulfide as the substrate. A success rate of 89% (i.e., 11% false positives) was obtained, and two new mutants were selected. The fourth method is based on the colorimetric detection of adrenochrome, a back-titration procedure which measures the concentration of the periodate-sensitive sulfide. Due to low sensitivity during whole-cell screening, this method was found to be useful only for determining the presence or absence of sulfoxide in the reaction. The methods described in the present work are simple and inexpensive and do not require special equipment. PMID:19465532

  13. Human transcriptome array for high-throughput clinical studies.

    PubMed

    Xu, Weihong; Seok, Junhee; Mindrinos, Michael N; Schweitzer, Anthony C; Jiang, Hui; Wilhelmy, Julie; Clark, Tyson A; Kapur, Karen; Xing, Yi; Faham, Malek; Storey, John D; Moldawer, Lyle L; Maier, Ronald V; Tompkins, Ronald G; Wong, Wing Hung; Davis, Ronald W; Xiao, Wenzhong

    2011-03-01

    A 6.9 million-feature oligonucleotide array of the human transcriptome [Glue Grant human transcriptome (GG-H array)] has been developed for high-throughput and cost-effective analyses in clinical studies. This array allows comprehensive examination of gene expression and genome-wide identification of alternative splicing as well as detection of coding SNPs and noncoding transcripts. The performance of the array was examined and compared with mRNA sequencing (RNA-Seq) results over multiple independent replicates of liver and muscle samples. Compared with RNA-Seq of 46 million uniquely mappable reads per replicate, the GG-H array is highly reproducible in estimating gene and exon abundance. Although both platforms detect similar expression changes at the gene level, the GG-H array is more sensitive at the exon level. Deeper sequencing is required to adequately cover low-abundance transcripts. The array has been implemented in a multicenter clinical program and has generated high-quality, reproducible data. Considering the clinical trial requirements of cost, sample availability, and throughput, the GG-H array has a wide range of applications. An emerging approach for large-scale clinical genomic studies is to first use RNA-Seq to the sufficient depth for the discovery of transcriptome elements relevant to the disease process followed by high-throughput and reliable screening of these elements on thousands of patient samples using custom-designed arrays.

  14. Fluorescent biosensors for high throughput screening of protein kinase inhibitors.

    PubMed

    Prével, Camille; Pellerano, Morgan; Van, Thi Nhu Ngoc; Morris, May C

    2014-02-01

    High throughput screening assays aim to identify small molecules that interfere with protein function, activity, or conformation, which can serve as effective tools for chemical biology studies of targets involved in physiological processes or pathways of interest or disease models, as well as templates for development of therapeutics in medicinal chemistry. Fluorescent biosensors constitute attractive and powerful tools for drug discovery programs, from high throughput screening assays, to postscreen characterization of hits, optimization of lead compounds, and preclinical evaluation of candidate drugs. They provide a means of screening for inhibitors that selectively target enzymatic activity, conformation, and/or function in vitro. Moreover, fluorescent biosensors constitute useful tools for cell- and image-based, multiplex and multiparametric, high-content screening. Application of fluorescence-based sensors to screen large and complex libraries of compounds in vitro, in cell-based formats or whole organisms requires several levels of optimization to establish robust and reproducible assays. In this review, we describe the different fluorescent biosensor technologies which have been applied to high throughput screens, and discuss the prerequisite criteria underlying their successful application. Special emphasis is placed on protein kinase biosensors, since these enzymes constitute one of the most important classes of therapeutic targets in drug discovery.

  15. High throughput electrophysiology: new perspectives for ion channel drug discovery.

    PubMed

    Willumsen, Niels J; Bech, Morten; Olesen, Søren-Peter; Jensen, Bo Skaaning; Korsgaard, Mads P G; Christophersen, Palle

    2003-01-01

    Proper function of ion channels is crucial for all living cells. Ion channel dysfunction may lead to a number of diseases, so-called channelopathies, and a number of common diseases, including epilepsy, arrhythmia, and type II diabetes, are primarily treated by drugs that modulate ion channels. A cornerstone of current drug discovery is the high-throughput screening assay, which allows examination of the activity of specific ion channels, though only to a limited extent. Conventional patch clamp remains the sole technique with sufficiently high time resolution and sensitivity required for precise and direct characterization of ion channel properties. However, patch clamp is a slow, labor-intensive, and thus expensive, technique. New techniques combining the reliability and high information content of patch clamping with the virtues of high throughput philosophy are emerging and predicted to make a number of ion channel targets accessible for drug screening. Specifically, genuine HTS parallel processing techniques based on arrays of planar silicon chips are being developed, but also lower throughput sequential techniques may be of value in compound screening, lead optimization, and safety screening. The introduction of new powerful HTS electrophysiological techniques is predicted to cause a revolution in ion channel drug discovery.

  16. Computational analysis of high-throughput flow cytometry data

    PubMed Central

    Robinson, J Paul; Rajwa, Bartek; Patsekin, Valery; Davisson, Vincent Jo

    2015-01-01

    Introduction Flow cytometry has been around for over 40 years, but only recently has the opportunity arisen to move into the high-throughput domain. The technology is now available and is highly competitive with imaging tools under the right conditions. Flow cytometry has, however, been a technology that has focused on its unique ability to study single cells, and appropriate analytical tools are readily available to handle this traditional role of the technology. Areas covered Expansion of flow cytometry to a high-throughput (HT) and high-content technology requires advances in both hardware and analytical tools. The authors discuss the historical perspective of flow cytometry operation, how the field has changed, and what the key changes have been. The authors provide a background and compelling arguments for moving toward HT flow, where there are many innovative opportunities. With alternative approaches now available for flow cytometry, there will be a considerable number of new applications. These opportunities show strong capability for drug screening and functional studies with cells in suspension. Expert opinion There is no doubt that HT flow is a rich technology awaiting acceptance by the pharmaceutical community. It can provide a powerful phenotypic analytical toolset that has the capacity to change many current approaches to HT screening. The previous restrictions on the technology, based on its reduced capacity for sample throughput, are no longer a major issue. Overcoming this barrier has transformed a mature technology into one that can focus on systems biology questions not previously considered possible. PMID:22708834

  17. MEGARes: an antimicrobial resistance database for high throughput sequencing

    PubMed Central

    Lakin, Steven M.; Dean, Chris; Noyes, Noelle R.; Dettenwanger, Adam; Ross, Anne Spencer; Doster, Enrique; Rovira, Pablo; Abdo, Zaid; Jones, Kenneth L.; Ruiz, Jaime; Belk, Keith E.; Morley, Paul S.; Boucher, Christina

    2017-01-01

    Antimicrobial resistance has become an imminent concern for public health. As methods for detection and characterization of antimicrobial resistance move from targeted culture and polymerase chain reaction to high throughput metagenomics, appropriate resources for the analysis of large-scale data are required. Currently, antimicrobial resistance databases are tailored to smaller-scale, functional profiling of genes using highly descriptive annotations. Such characteristics do not facilitate the analysis of large-scale, ecological sequence datasets such as those produced with the use of metagenomics for surveillance. In order to overcome these limitations, we present MEGARes (https://megares.meglab.org), a hand-curated antimicrobial resistance database and annotation structure that provides a foundation for the development of high throughput acyclical classifiers and hierarchical statistical analysis of big data. MEGARes can be browsed as a stand-alone resource through the website or can be easily integrated into sequence analysis pipelines through download. Also via the website, we provide documentation for AmrPlusPlus, a user-friendly Galaxy pipeline for the analysis of high throughput sequencing data that is pre-packaged for use with the MEGARes database. PMID:27899569

  18. MEGARes: an antimicrobial resistance database for high throughput sequencing.

    PubMed

    Lakin, Steven M; Dean, Chris; Noyes, Noelle R; Dettenwanger, Adam; Ross, Anne Spencer; Doster, Enrique; Rovira, Pablo; Abdo, Zaid; Jones, Kenneth L; Ruiz, Jaime; Belk, Keith E; Morley, Paul S; Boucher, Christina

    2017-01-04

    Antimicrobial resistance has become an imminent concern for public health. As methods for detection and characterization of antimicrobial resistance move from targeted culture and polymerase chain reaction to high throughput metagenomics, appropriate resources for the analysis of large-scale data are required. Currently, antimicrobial resistance databases are tailored to smaller-scale, functional profiling of genes using highly descriptive annotations. Such characteristics do not facilitate the analysis of large-scale, ecological sequence datasets such as those produced with the use of metagenomics for surveillance. In order to overcome these limitations, we present MEGARes (https://megares.meglab.org), a hand-curated antimicrobial resistance database and annotation structure that provides a foundation for the development of high throughput acyclical classifiers and hierarchical statistical analysis of big data. MEGARes can be browsed as a stand-alone resource through the website or can be easily integrated into sequence analysis pipelines through download. Also via the website, we provide documentation for AmrPlusPlus, a user-friendly Galaxy pipeline for the analysis of high throughput sequencing data that is pre-packaged for use with the MEGARes database.
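
    The acyclical annotation structure described above lends itself to simple hierarchical aggregation: each gene carries nested labels (in MEGARes, roughly class → mechanism → group), so per-gene hit counts can be rolled up at any level. A minimal sketch, with made-up gene names and counts (the labels are illustrative, not real MEGARes entries):

```python
from collections import Counter

# Hypothetical MEGARes-style annotations: each gene is tagged with
# (class, mechanism, group) levels. All entries here are illustrative.
annotations = {
    "gene1": ("betalactams", "Class_A_betalactamase", "TEM"),
    "gene2": ("betalactams", "Class_A_betalactamase", "SHV"),
    "gene3": ("tetracyclines", "ribosomal_protection", "TETM"),
}
hits = {"gene1": 12, "gene2": 3, "gene3": 7}  # e.g. aligned-read counts

def aggregate(level):
    """Sum read counts at one level of the hierarchy (0=class, 1=mechanism, 2=group)."""
    totals = Counter()
    for gene, count in hits.items():
        totals[annotations[gene][level]] += count
    return dict(totals)

print(aggregate(0))  # {'betalactams': 15, 'tetracyclines': 7}
```

    Because the hierarchy is acyclic, every read contributes to exactly one node per level, so counts at coarser levels are consistent sums of the finer ones, which is what makes hierarchical statistical analysis of large surveillance datasets tractable.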

  19. High-throughput patterning of photonic structures with tunable periodicity

    PubMed Central

    Kempa, Thomas J.; Bediako, D. Kwabena; Kim, Sun-Kyung; Park, Hong-Gyu; Nocera, Daniel G.

    2015-01-01

    A patterning method termed “RIPPLE” (reactive interface patterning promoted by lithographic electrochemistry) is applied to the fabrication of arrays of dielectric and metallic optical elements. This method uses cyclic voltammetry to impart patterns onto the working electrode of a standard three-electrode electrochemical setup. Using this technique and a template stripping process, periodic arrays of Ag circular Bragg gratings are patterned in a high-throughput fashion over large substrate areas. By varying the scan rate of the cyclically applied voltage ramps, the periodicity of the gratings can be tuned in situ over micrometer and submicrometer length scales. Characterization of the periodic arrays of periodic gratings identified point-like and annular scattering modes at different planes above the structured surface. Facile, reliable, and rapid patterning techniques like RIPPLE may enable the high-throughput and low-cost fabrication of photonic elements and metasurfaces for energy conversion and sensing applications. PMID:25870280

  20. High-throughput physical organic chemistry--Hammett parameter evaluation.

    PubMed

    Portal, Christophe F; Bradley, Mark

    2006-07-15

    High-throughput analysis techniques were developed to allow the rapid assessment of a range of Hammett parameters utilizing positive electrospray mass spectrometry (ESI+ -MS) as the sole quantitative tool, with the core of the approach being a so-called "analytical construct". Hammett substituent parameters were determined for a range of meta- and para-substituted anilines by high-throughput (HT) assessment of relative reaction rates for a competitive amide bond formation reaction, with up to five parameters determined in a single pot reaction. Sensitivity of the reaction to substituent effects (expressed by Hammett's rho parameter) was determined in the first instance, with HT Hammett sigma substituent parameter assessment then carried out successfully for over 30 anilines, with excellent correlation observed between the HT ESI+ -MS method of determination and literature values.
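
    The analysis rests on the standard Hammett relation, log10(kX/kH) = ρσ: once ρ is fixed from reference substituents, σ for a new aniline follows directly from its measured relative rate. A minimal sketch with illustrative numbers (not values from the study):

```python
import math

def sigma_from_relative_rate(k_x, k_h, rho):
    """Hammett relation: log10(kX/kH) = rho * sigma  =>  sigma = log10(kX/kH) / rho."""
    return math.log10(k_x / k_h) / rho

# Illustrative numbers: a substituent that doubles the rate under rho = -3.0
# (a negative rho means electron-donating groups accelerate the reaction).
sigma = sigma_from_relative_rate(2.0, 1.0, -3.0)
print(round(sigma, 3))  # -0.1
```

    In the competitive MS format, the ratio of product ion intensities stands in for kX/kH, which is why a single quantitative readout per pot suffices.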

  1. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  2. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.
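
    The core measurement such tools automate is conceptually simple: threshold the plate image and measure each connected blob's pixel area. A minimal sketch on a toy grayscale grid (pure Python, 4-connectivity flood fill; this is a generic illustration of the measurement, not Spotsizer's actual algorithm):

```python
def colony_areas(img, threshold):
    """Threshold a grayscale grid and return the pixel area of each
    connected colony, found by iterative flood fill (4-connectivity)."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if img[y][x] >= threshold and not seen[y][x]:
                stack, area = [(y, x)], 0
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    area += 1
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and img[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                areas.append(area)
    return sorted(areas)

toy = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 8, 8],
]
print(colony_areas(toy, 5))  # [3, 3]
```

    A production tool additionally has to register the colony grid, correct uneven plate illumination, and separate touching colonies, which is where most of the real engineering effort lies.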

  3. High-throughput screening in the C. elegans nervous system.

    PubMed

    Kinser, Holly E; Pincus, Zachary

    2016-06-03

    The nematode Caenorhabditis elegans is widely used as a model organism in the field of neurobiology. The wiring of the C. elegans nervous system has been entirely mapped, and the animal's optical transparency allows for in vivo observation of neuronal activity. The nematode is also small in size, self-fertilizing, and inexpensive to cultivate and maintain, greatly lending to its utility as a whole-animal model for high-throughput screening (HTS) in the nervous system. However, the use of this organism in large-scale screens presents unique technical challenges, including reversible immobilization of the animal, parallel single-animal culture and containment, automation of laser surgery, and high-throughput image acquisition and phenotyping. These obstacles require significant modification of existing techniques and the creation of new C. elegans-based HTS platforms. In this review, we outline these challenges in detail and survey the novel technologies and methods that have been developed to address them.

  4. High throughput screening of starch structures using carbohydrate microarrays

    PubMed Central

    Tanackovic, Vanja; Rydahl, Maja Gro; Pedersen, Henriette Lodberg; Motawia, Mohammed Saddik; Shaik, Shahnoor Sultana; Mikkelsen, Maria Dalgaard; Krunic, Susanne Langgaard; Fangel, Jonatan Ulrik; Willats, William George Tycho; Blennow, Andreas

    2016-01-01

    In this study we introduce the starch-recognising carbohydrate binding module family 20 (CBM20) from Aspergillus niger for screening biological variations in starch molecular structure using high throughput carbohydrate microarray technology. Defined linear, branched and phosphorylated maltooligosaccharides, pure starch samples including a variety of different structures with variations in the amylopectin branching pattern, amylose content and phosphate content, enzymatically modified starches and glycogen were included. Using this technique, different important structures, including amylose content and branching degrees, could be differentiated in a high throughput fashion. The screening method was validated using transgenic barley grain analysed during development and subjected to germination. Extremely branched or extremely linear structures were typically detected more weakly than normal starch structures. The method offers the potential for rapidly analysing resistant and slowly digested dietary starches. PMID:27468930

  5. Spotsizer: High-throughput quantitative analysis of microbial growth

    PubMed Central

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  6. Applications of high-throughput DNA sequencing to benign hematology

    PubMed Central

    Gallagher, Patrick G.

    2013-01-01

    The development of novel technologies for high-throughput DNA sequencing is having a major impact on our ability to measure and define normal and pathologic variation in humans. This review discusses advances in DNA sequencing that have been applied to benign hematologic disorders, including those affecting the red blood cell, the neutrophil, and other white blood cell lineages. Relevant examples of how these approaches have been used for disease diagnosis, gene discovery, and studying complex traits are provided. High-throughput DNA sequencing technology holds significant promise for impacting clinical care. This includes development of improved disease detection and diagnosis, better understanding of disease progression and stratification of risk of disease-specific complications, and development of improved therapeutic strategies, particularly patient-specific pharmacogenomics-based therapy, with monitoring of therapy by genomic biomarkers. PMID:24021670

  7. Sensitivity study of reliable, high-throughput resolution metrics for photoresists

    SciTech Connect

    Anderson, Christopher N.; Naulleau, Patrick P.

    2007-07-30

    The resolution of chemically amplified resists is becoming an increasing concern, especially for lithography in the extreme ultraviolet (EUV) regime. Large-scale screening and performance-based down-selection is currently underway to identify resist platforms that can support shrinking feature sizes. Resist screening efforts, however, are hampered by the absence of reliable resolution metrics that can objectively quantify resist resolution in a high-throughput fashion. Here we examine two high-throughput metrics for resist resolution determination. After summarizing their details and justifying their utility, we characterize the sensitivity of both metrics to two of the main experimental uncertainties associated with lithographic exposure tools, namely: limited focus control and limited knowledge of optical aberrations. For an implementation at EUV wavelengths, we report aberration and focus limited error bars in extracted resolution of ≈1.25 nm RMS for both metrics, making them attractive candidates for future screening and down-selection efforts.

  8. Proteome-wide reactivity profiling identifies diverse carbamate chemotypes tuned for serine hydrolase inhibition.

    PubMed

    Chang, Jae Won; Cognetta, Armand B; Niphakis, Micah J; Cravatt, Benjamin F

    2013-07-19

    Serine hydrolases are one of the largest and most diverse enzyme classes in Nature. Inhibitors of serine hydrolases are used to treat many diseases, including obesity, diabetes, cognitive dementia, and bacterial and viral infections. Nonetheless, the majority of the 200+ serine hydrolases in mammals still lack selective inhibitors for their functional characterization. We and others have shown that activated carbamates, through covalent reaction with the conserved serine nucleophile of serine hydrolases, can serve as useful inhibitors for members of this enzyme family. The extent to which carbamates, however, cross-react with other protein classes remains mostly unexplored. Here, we address this problem by investigating the proteome-wide reactivity of a diverse set of activated carbamates in vitro and in vivo, using a combination of competitive and click chemistry (CC)-activity-based protein profiling (ABPP). We identify multiple classes of carbamates, including O-aryl, O-hexafluoroisopropyl (HFIP), and O-N-hydroxysuccinimidyl (NHS) carbamates that react selectively with serine hydrolases across entire mouse tissue proteomes in vivo. We exploit the proteome-wide specificity of HFIP carbamates to create in situ imaging probes for the endocannabinoid hydrolases monoacylglycerol lipase (MAGL) and α-β hydrolase-6 (ABHD6). These findings, taken together, designate the carbamate as a privileged reactive group for serine hydrolases that can accommodate diverse structural modifications to produce inhibitors that display exceptional potency and selectivity across the mammalian proteome.

  9. Proteome-wide reactivity profiling identifies diverse carbamate chemotypes tuned for serine hydrolase inhibition

    PubMed Central

    Chang, Jae Won; Cognetta, Armand B.; Niphakis, Micah J.; Cravatt, Benjamin F.

    2013-01-01

    Serine hydrolases are one of the largest and most diverse enzyme classes in Nature. Inhibitors of serine hydrolases are used to treat many diseases, including obesity, diabetes, cognitive dementia, and bacterial and viral infections. Nonetheless, the majority of the 200+ serine hydrolases in mammals still lack selective inhibitors for their functional characterization. We and others have shown that activated carbamates, through covalent reaction with the conserved serine nucleophile of serine hydrolases, can serve as useful inhibitors for members of this enzyme family. The extent to which carbamates, however, cross-react with other protein classes remains mostly unexplored. Here, we address this problem by investigating the proteome-wide reactivity of a diverse set of activated carbamates in vitro and in vivo using a combination of competitive and click chemistry (CC)-activity-based protein profiling (ABPP). We identify multiple classes of carbamates, including O-aryl, O-hexafluoroisopropyl (HFIP), and O-N-hydroxysuccinimidyl (NHS) carbamates that react selectively with serine hydrolases across entire mouse tissue proteomes in vivo. We exploit the proteome-wide specificity of HFIP carbamates to create in situ imaging probes for the endocannabinoid hydrolases monoacylglycerol lipase (MAGL) and alpha-beta hydrolase-6 (ABHD6). These findings, taken together, designate the carbamate as a privileged reactive group for serine hydrolases that can accommodate diverse structural modifications to produce inhibitors that display exceptional potency and selectivity across the mammalian proteome. PMID:23701408

  10. A roadmap to evaluate the proteome-wide selectivity of covalent kinase inhibitors

    PubMed Central

    Dix, Melissa M.; Douhan, John; Gilbert, Adam M.; Hett, Erik C.; Johnson, Theodore O.; Joslyn, Chris; Kath, John C.; Niessen, Sherry; Roberts, Lee R.; Schnute, Mark E.; Wang, Chu; Hulce, Jonathan J.; Wei, Baoxian; Whiteley, Laurence O.; Hayward, Matthew M.; Cravatt, Benjamin F.

    2014-01-01

    Kinases are principal components of signal transduction pathways and the focus of intense basic and drug discovery research. Irreversible inhibitors that covalently modify non-catalytic cysteines in kinase active-sites have emerged as valuable probes and approved drugs. Many protein classes, however, possess functional cysteines and therefore understanding the proteome-wide selectivity of covalent kinase inhibitors is imperative. Here, we accomplish this objective using activity-based protein profiling coupled with quantitative mass spectrometry to globally map the targets, both specific and non-specific, of covalent kinase inhibitors in human cells. Many of the specific off-targets represent non-kinase proteins that, interestingly, possess conserved, active-site cysteines. We define windows of selectivity for covalent kinase inhibitors and show that, when these windows are exceeded, rampant proteome-wide reactivity and kinase target-independent cell death conjointly occur. Our findings, taken together, provide an experimental roadmap to illuminate opportunities and surmount challenges for the development of covalent kinase inhibitors. PMID:25038787

  11. High-throughput single-molecule optofluidic analysis

    PubMed Central

    Kim, Soohong; Streets, Aaron M; Lin, Ron R; Quake, Stephen R; Weiss, Shimon; Majumdar, Devdoot S

    2011-01-01

    We describe a high-throughput, automated single-molecule measurement system equipped with microfluidics. The microfluidic mixing device has integrated valves and pumps to accurately accomplish titration of biomolecules with picoliter resolution. We demonstrate that the approach enables rapid sampling of a biomolecule's conformational landscape and of enzymatic activity, in the form of transcription by Escherichia coli RNA polymerase, as a function of the chemical environment. PMID:21297618

  12. Generating barcoded libraries for multiplex high-throughput sequencing.

    PubMed

    Knapp, Michael; Stiller, Mathias; Meyer, Matthias

    2012-01-01

    Molecular barcoding is an essential tool to use the high throughput of next generation sequencing platforms optimally in studies involving more than one sample. Various barcoding strategies allow for the incorporation of short recognition sequences (barcodes) into sequencing libraries, either by ligation or polymerase chain reaction (PCR). Here, we present two approaches optimized for generating barcoded sequencing libraries from low copy number extracts and amplification products typical of ancient DNA studies.
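
    The barcode-assignment step described above can be illustrated with a minimal demultiplexing sketch. The barcodes, sample names, and reads below are hypothetical, and real pipelines additionally tolerate sequencing errors by allowing mismatches between observed and expected barcodes:

```python
# Hypothetical 7 nt barcodes mapping to sample names (exact match only).
BARCODES = {"ACGTACG": "sample_1", "TGCATGC": "sample_2"}
BARCODE_LEN = 7

def demultiplex(reads):
    """Group reads by their leading barcode; unmatched reads go to 'unassigned'."""
    bins = {name: [] for name in BARCODES.values()}
    bins["unassigned"] = []
    for read in reads:
        tag = read[:BARCODE_LEN]
        sample = BARCODES.get(tag, "unassigned")
        # strip the barcode before downstream analysis; keep unmatched reads intact
        bins[sample].append(read[BARCODE_LEN:] if sample != "unassigned" else read)
    return bins

bins = demultiplex(["ACGTACGTTTT", "TGCATGCAAAA", "GGGGGGGGGGG"])
```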

  13. A fully automated robotic system for high throughput fermentation.

    PubMed

    Zimmermann, Hartmut F; Rieth, Jochen

    2007-03-01

    High throughput robotic systems have been used since the 1990s to carry out biochemical assays in microtiter plates. However, before such systems can be applied to industrial fermentation process development, some important specific demands must be taken into account: sufficient oxygen supply, optimal growth temperature, minimized sample evaporation, avoidance of contamination, and simple but reliable process monitoring. A fully automated solution in which all these aspects have been taken into account is presented.

  14. High throughput screening operations at the University of Kansas.

    PubMed

    Roy, Anuradha

    2014-05-01

    The High Throughput Screening Laboratory at the University of Kansas plays a critical role in advancing academic interest in the identification of chemical probes as tools to better understand the biological and biochemical basis of new therapeutic targets. The HTS laboratory has an open service policy and collaborates with internal and external academic groups as well as for-profit organizations to execute projects requiring HTS-compatible assay development and screening of chemical libraries for target validation, probe selection, hit identification, and lead optimization.

  15. Web-based visual analysis for high-throughput genomics.

    PubMed

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy.
Finally, the visualizations we have created using the framework are useful tools for high-throughput

  16. FLASH assembly of TALENs for high-throughput genome editing.

    PubMed

    Reyon, Deepak; Tsai, Shengdar Q; Khayter, Cyd; Foden, Jennifer A; Sander, Jeffry D; Joung, J Keith

    2012-05-01

    Engineered transcription activator–like effector nucleases (TALENs) have shown promise as facile and broadly applicable genome editing tools. However, no publicly available high-throughput method for constructing TALENs has been published, and large-scale assessments of the success rate and targeting range of the technology remain lacking. Here we describe the fast ligation-based automatable solid-phase high-throughput (FLASH) system, a rapid and cost-effective method for large-scale assembly of TALENs. We tested 48 FLASH-assembled TALEN pairs in a human cell–based EGFP reporter system and found that all 48 possessed efficient gene-modification activities. We also used FLASH to assemble TALENs for 96 endogenous human genes implicated in cancer and/or epigenetic regulation and found that 84 pairs were able to efficiently introduce targeted alterations. Our results establish the robustness of TALEN technology and demonstrate that FLASH facilitates high-throughput genome editing at a scale not currently possible with other genome modification technologies.

  17. Evaluation of High-Throughput Chemical Exposure Models ...

    EPA Pesticide Factsheets

    The U.S. EPA, under its ExpoCast program, is developing high-throughput near-field modeling methods to estimate human chemical exposure and to provide real-world context to high-throughput screening (HTS) hazard data. These novel modeling methods include reverse methods to infer parent chemical exposures from biomonitoring measurements and forward models to predict multi-pathway exposures from chemical use information and/or residential media concentrations. Here, both forward and reverse modeling methods are used to characterize the relationship between matched near-field environmental (air and dust) and biomarker measurements. Indoor air, house dust, and urine samples from 120 females (aged 60 to 80 years) were analyzed. In the measured data, 78% of the residential media measurements (across 80 chemicals) and 54% of the urine measurements (across 21 chemicals) were censored, i.e., below the limit of quantification (LOQ). Because of the degree of censoring, we applied a Bayesian approach to impute censored values for 69 chemicals having at least 15% of measurements above LOQ. This resulted in 10 chemicals (5 phthalates, 5 pesticides) with matched air, dust, and urine metabolite measurements. The population medians of indoor air and dust concentrations were compared to population median exposures inferred from urine metabolite concentrations using a high-throughput reverse-dosimetry approach. Median air and dust concentrations were found to be correl
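
    The imputation of left-censored measurements can be illustrated with a toy stand-in for the Bayesian step described above (the values and LOQ below are hypothetical, and this single-imputation sketch is far simpler than the published approach): fit a lognormal to the measurements above the LOQ, then draw the censored values from that distribution conditional on lying below the LOQ.

```python
import math
import random

def impute_censored(observed, n_censored, loq, seed=0):
    """Toy left-censored imputation: fit a lognormal to values above the LOQ,
    then rejection-sample censored values conditional on being below the LOQ."""
    logs = [math.log(v) for v in observed]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / (len(logs) - 1)
    sigma = math.sqrt(var)
    rng = random.Random(seed)           # fixed seed for reproducibility
    draws = []
    while len(draws) < n_censored:
        v = rng.lognormvariate(mu, sigma)
        if v < loq:                     # keep only draws below the LOQ
            draws.append(v)
    return draws

observed = [2.0, 3.5, 5.0, 8.0]         # hypothetical measurements above the LOQ
imputed = impute_censored(observed, n_censored=3, loq=1.0)
```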

  18. High Throughput Assays for Exposure Science (NIEHS OHAT ...

    EPA Pesticide Factsheets

    High throughput screening (HTS) data that characterize chemically induced biological activity have been generated for thousands of chemicals by the US interagency Tox21 and the US EPA ToxCast programs. In many cases there are no data available for comparing bioactivity from HTS with relevant human exposures. The EPA’s ExpoCast program is developing high-throughput approaches to generate the needed exposure estimates using existing databases and new, high-throughput measurements. The exposure pathway (i.e., the route of a chemical from manufacture to human intake) significantly impacts the level of exposure. The presence, concentration, and formulation of chemicals in consumer products and articles of commerce (e.g., clothing) can therefore provide critical information for estimating risk. We have found that only limited data are available on the chemical constituents (e.g., flame retardants, plasticizers) of most articles of commerce. Furthermore, the presence of some chemicals in otherwise well characterized products may be due to product packaging. We are analyzing samples of consumer products using comprehensive two-dimensional gas chromatography time-of-flight mass spectrometry (GCxGC-TOF/MS), which is suited for forensic investigation of chemicals in complex matrices (including toys, cleaners, and food). In parallel, we are working to create a reference library of retention times and spectral information for the entire Tox21 chemical library. In an examination of five p

  19. A microdroplet dilutor for high-throughput screening

    NASA Astrophysics Data System (ADS)

    Niu, Xize; Gielen, Fabrice; Edel, Joshua B.; Demello, Andrew J.

    2011-06-01

    Pipetting and dilution are universal processes used in chemical and biological laboratories to assay and experiment. In microfluidics such operations are equally in demand, but difficult to implement. Recently, droplet-based microfluidics has emerged as an exciting new platform for high-throughput experimentation. However, it is challenging to vary the concentration of droplets rapidly and controllably. To this end, we developed a dilution module for high-throughput screening using droplet-based microfluidics. Briefly, a nanolitre-sized sample droplet of defined concentration is trapped within a microfluidic chamber. Through a process of droplet merging, mixing and re-splitting, this droplet is combined with a series of smaller buffer droplets to generate a sequence of output droplets that define a digital concentration gradient. Importantly, the formed droplets can be merged with other reagent droplets to enable rapid chemical and biological screens. As a proof of concept, we used the dilutor to perform a high-throughput homogeneous DNA-binding assay using only nanolitres of sample.

  20. A microdroplet dilutor for high-throughput screening.

    PubMed

    Niu, Xize; Gielen, Fabrice; Edel, Joshua B; deMello, Andrew J

    2011-06-01

    Pipetting and dilution are universal processes used in chemical and biological laboratories to assay and experiment. In microfluidics such operations are equally in demand, but difficult to implement. Recently, droplet-based microfluidics has emerged as an exciting new platform for high-throughput experimentation. However, it is challenging to vary the concentration of droplets rapidly and controllably. To this end, we developed a dilution module for high-throughput screening using droplet-based microfluidics. Briefly, a nanolitre-sized sample droplet of defined concentration is trapped within a microfluidic chamber. Through a process of droplet merging, mixing and re-splitting, this droplet is combined with a series of smaller buffer droplets to generate a sequence of output droplets that define a digital concentration gradient. Importantly, the formed droplets can be merged with other reagent droplets to enable rapid chemical and biological screens. As a proof of concept, we used the dilutor to perform a high-throughput homogeneous DNA-binding assay using only nanolitres of sample.
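
    The digital concentration gradient described above can be sketched as a simple mass-balance calculation (the volumes and starting concentration below are hypothetical, not taken from the paper): each merge/mix/re-split cycle retains a fixed fraction of the solute in the trapped sample droplet.

```python
def dilution_series(c0, v_sample_nl, v_buffer_nl, n_steps):
    """Concentration of the trapped droplet after each merge/mix/re-split
    cycle: every cycle retains a fraction Vs / (Vs + Vb) of the solute."""
    retain = v_sample_nl / (v_sample_nl + v_buffer_nl)
    return [c0 * retain ** n for n in range(n_steps + 1)]

# hypothetical volumes: a 1 nL sample droplet merged with 0.25 nL buffer droplets,
# so each cycle multiplies the concentration by 1 / 1.25 = 0.8
series = dilution_series(100.0, 1.0, 0.25, 3)
```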

  1. High throughput biotechnology in traditional fermented food industry.

    PubMed

    Yang, Yong; Xu, Rong-man; Song, Jia; Wang, Wei-min

    2010-11-01

    Traditional fermented foods are not only staple foods in most developing countries but also key health foods in developed ones. As the health-promoting functions of these foods are gradually discovered, more and more high-throughput biotechnologies are being applied to advance this old yet evolving industry. As a result, knowledge of the microflora, manufacturing processes, and health functions of these foods has been pushed forward in both depth and breadth. The application and progress of high-throughput biotechnologies differ among the various traditional fermented food industries, which are reviewed and detailed here by category: fermented milk products (yogurt, cheese), fermented sausages, fermented vegetables (kimchi, sauerkraut), fermented cereals (sourdough), and fermented beans (tempeh, natto). With further support from high-throughput biotechnologies, the mid- and downstream processes of traditional fermented foods can be optimized, and the industrialization of local traditional fermented foods that are rich in functional factors but produced only in small quantities can be accelerated. The article presents some promising patents on the traditional fermented food industry.

  2. NCBI GEO: archive for high-throughput functional genomic data.

    PubMed

    Barrett, Tanya; Troup, Dennis B; Wilhite, Stephen E; Ledoux, Pierre; Rudnev, Dmitry; Evangelista, Carlos; Kim, Irene F; Soboleva, Alexandra; Tomashevsky, Maxim; Marshall, Kimberly A; Phillippy, Katherine H; Sherman, Patti M; Muertter, Rolf N; Edgar, Ron

    2009-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) is the largest public repository for high-throughput gene expression data. Additionally, GEO hosts other categories of high-throughput functional genomic data, including those that examine genome copy number variations, chromatin structure, methylation status and transcription factor binding. These data are generated by the research community using high-throughput technologies like microarrays and, more recently, next-generation sequencing. The database has a flexible infrastructure that can capture fully annotated raw and processed data, enabling compliance with major community-derived scientific reporting standards such as 'Minimum Information About a Microarray Experiment' (MIAME). In addition to serving as a centralized data storage hub, GEO offers many tools and features that allow users to effectively explore, analyze and download expression data from both gene-centric and experiment-centric perspectives. This article summarizes the GEO repository structure, content and operating procedures, as well as recently introduced data mining features. GEO is freely accessible at http://www.ncbi.nlm.nih.gov/geo/.

  3. Towards High-throughput Immunomics for Infectious Diseases: Use of Next-generation Peptide Microarrays for Rapid Discovery and Mapping of Antigenic Determinants*

    PubMed Central

    Carmona, Santiago J.; Nielsen, Morten; Schafer-Nielsen, Claus; Mucci, Juan; Altcheh, Jaime; Balouz, Virginia; Tekiel, Valeria; Frasch, Alberto C.; Campetella, Oscar; Buscaglia, Carlos A.; Agüero, Fernán

    2015-01-01

    Complete characterization of antibody specificities associated with natural infections is expected to provide a rich source of serologic biomarkers with potential applications in molecular diagnosis, follow-up of chemotherapeutic treatments, and prioritization of targets for vaccine development. Here, we developed a highly multiplexed platform based on next-generation high-density peptide microarrays to map these specificities in Chagas Disease, an exemplar of a human infectious disease caused by the protozoan Trypanosoma cruzi. We designed a high-density peptide microarray containing more than 175,000 overlapping 15mer peptides derived from T. cruzi proteins. Peptides were synthesized in situ on microarray slides, spanning the complete length of 457 parasite proteins with fully overlapped 15mers (1 residue shift). Screening of these slides with antibodies purified from infected patients and healthy donors demonstrated both a high technical reproducibility as well as epitope mapping consistency when compared with earlier low-throughput technologies. Using a conservative signal threshold to classify positive (reactive) peptides we identified 2,031 disease-specific peptides and 97 novel parasite antigens, effectively doubling the number of known antigens and providing a 10-fold increase in the number of fine mapped antigenic determinants for this disease. Finally, further analysis of the chip data showed that optimizing the amount of sequence overlap of displayed peptides can increase the protein space covered in a single chip by at least ∼threefold without sacrificing sensitivity. In conclusion, we show the power of high-density peptide chips for the discovery of pathogen-specific linear B-cell epitopes from clinical samples, thus setting the stage for high-throughput biomarker discovery screenings and proteome-wide studies of immune responses against pathogens. PMID:25922409
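
    The tiling scheme described above (fully overlapped 15mers with a 1-residue shift) can be sketched in a few lines. The protein sequence below is a made-up example, not a T. cruzi protein:

```python
def tile_peptides(protein, length=15, shift=1):
    """All overlapping peptides of the given length from a protein sequence,
    stepping by `shift` residues (1-residue shift = fully overlapped tiling)."""
    if len(protein) < length:
        return [protein]
    return [protein[i:i + length] for i in range(0, len(protein) - length + 1, shift)]

peps = tile_peptides("MKTAYIAKQRQISFVKSHFSRQ", length=15, shift=1)
```

    A 22-residue sequence yields 22 - 15 + 1 = 8 overlapping 15mers; increasing `shift` trades chip coverage against mapping resolution, as discussed in the abstract.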

  4. Discovery of novel targets with high throughput RNA interference screening.

    PubMed

    Kassner, Paul D

    2008-03-01

    High throughput technologies have the potential to affect all aspects of drug discovery. Considerable attention is paid to high throughput screening (HTS) for small molecule lead compounds. The identification of the targets that enter those HTS campaigns had been driven by basic research until the advent of genomics-level data acquisition such as sequencing and gene expression microarrays. Large-scale profiling approaches (e.g., microarrays, protein analysis by mass spectrometry, and metabolite profiling) can yield vast quantities of data and important information. However, these approaches usually require painstaking in silico analysis and low-throughput basic wet-lab research to identify the function of a gene and validate the gene product as a potential therapeutic drug target. Functional genomic screening offers the promise of direct identification of genes involved in phenotypes of interest. In this review, RNA interference (RNAi)-mediated loss-of-function screens are discussed, as well as their utility in target identification. Some of the genes identified in these screens should produce similar phenotypes if their gene products are antagonized with drugs. With a carefully chosen phenotype, an understanding of the biology of RNAi, and an appreciation of the limitations of RNAi screening, there is great potential for the discovery of new drug targets.

  5. High-throughput mass spectrometric cytochrome P450 inhibition screening.

    PubMed

    Lim, Kheng B; Ozbal, Can C; Kassel, Daniel B

    2013-01-01

    We describe here a high-throughput assay to support rapid evaluation of drug discovery compounds for possible drug-drug interaction (DDI). Each compound is evaluated for its DDI potential by incubating over a range of eight concentrations and against a panel of six cytochrome P450 (CYP) enzymes: 1A2, 2C8, 2C9, 2C19, 2D6, and 3A4. The method utilizes automated liquid handling for sample preparation, and online solid-phase extraction/tandem mass spectrometry (SPE/MS/MS) for sample analyses. The system is capable of generating two 96-well assay plates in 30 min, and completes the data acquisition and analysis of both plates in about 30 min. Many laboratories that perform CYP inhibition screening automate only part of the process, leaving a throughput bottleneck within the workflow. The protocols described in this chapter are aimed at streamlining the entire process from assay to data acquisition and processing by incorporating automation and utilizing high-precision instrumentation to maximize throughput and minimize bottlenecks.
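
    For readers unfamiliar with how an IC50 is read out of such an 8-point dose-response curve, here is a minimal sketch using log-linear interpolation at the 50%-activity crossing. The data are synthetic and the method is a simplification, not the authors' analysis software (which would typically fit a full sigmoidal model):

```python
import math

def ic50_by_interpolation(conc, activity):
    """Estimate IC50 as the concentration where activity crosses 50% of control,
    interpolating linearly in log-concentration between the bracketing points."""
    points = list(zip(conc, activity))
    for (c_lo, a_lo), (c_hi, a_hi) in zip(points, points[1:]):
        if a_lo >= 50.0 >= a_hi:  # activity falls through 50% in this interval
            frac = (a_lo - 50.0) / (a_lo - a_hi)
            log_c = math.log10(c_lo) + frac * (math.log10(c_hi) - math.log10(c_lo))
            return 10 ** log_c
    return None  # never crosses 50% over the tested range

conc = [0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0]   # uM, 8-point dilution
activity = [99, 97, 92, 75, 48, 25, 9, 3]             # % of control, synthetic
ic50 = ic50_by_interpolation(conc, activity)
```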

  6. A Microchip for High-throughput Axon Growth Drug Screening

    PubMed Central

    Kim, Hyun Soo; Jeong, Sehoon; Koo, Chiwan; Han, Arum; Park, Jaewon

    2016-01-01

    It has recently become clear that not only the presence of inhibitory molecules associated with myelin but also the reduced growth capability of the axons themselves limits mature central nervous system (CNS) axonal regeneration after injury. Conventional axon growth studies are typically conducted in multi-well cell culture plates, which make it very challenging to investigate localized drug effects and limit studies to low throughput. Unfortunately, there are currently no other in vitro tools that allow investigation of localized axonal responses to biomolecules in high throughput for screening potential drugs that might promote axonal growth. We have developed a compartmentalized neuron culture platform enabling localized biomolecular treatments, in parallel, of axons that are physically and fluidically isolated from their neuronal somata. The 24 axon compartments in the developed platform are designed to perform four sets of six different localized biomolecular treatments simultaneously on a single device. In addition, the novel microfluidic configuration allows the culture medium of all 24 axon compartments to be replenished together by a single aspiration step, making high-throughput drug screening a reality. PMID:27928514

  7. 20150325 - Application of High-Throughput In Vitro Assays for ...

    EPA Pesticide Factsheets

    Multiple drivers shape the types of human-health assessments performed on chemicals by the U.S. EPA, resulting in chemical assessments that are “fit-for-purpose,” ranging from prioritization for further testing to full risk assessments. Layered on top of the diverse assessment needs are the resource-intensive nature of the traditional toxicological studies used to test chemicals and the lack of toxicity information on many chemicals. To address these challenges, the Agency initiated the ToxCast program to screen thousands of chemicals across hundreds of high-throughput screening assays in concentration-response format. One of the findings of the project has been that the majority of chemicals interact with multiple biological targets within a narrow concentration range, and the extent of interaction increases rapidly near the concentration causing cytotoxicity. This means that application of high-throughput in vitro assays to chemical assessments will need to identify both the relative selectivity with which chemicals interact with biological targets and the concentration at which these interactions perturb signaling pathways. The integrated analyses will be used both to define a point-of-departure for comparison with human exposure estimates and to identify which chemicals may benefit from further studies in a mode-of-action or adverse outcome pathway framework. The application of new technologies in a risk-based, tiered manner provides flexibility in matching throughput and cos

  8. A High Throughput Mechanical Screening Device for Cartilage Tissue Engineering

    PubMed Central

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Greg R.; Cosgrove, Brian D.; Dodge, George R.; Mauck, Robert L.

    2014-01-01

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments of engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome given the long time required for testing of individual samples. High throughput screening is utilized widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying ‘hits’, or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, as well as scaling-up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. PMID:24275442

  9. High-throughput electrical characterization for robust overlay lithography control

    NASA Astrophysics Data System (ADS)

    Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.

    2017-03-01

    Realizing sensitive, high-throughput, and robust overlay measurement is a challenge at the current 14 nm node and in upcoming advanced nodes, with the transition to 300 mm and eventually 450 mm semiconductor manufacturing, where a slight deviation in overlay has a significant impact on reliability and yield1). The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE processes requires even tighter overlay specifications2). Here, we discuss the limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements, owing to sensitivity, throughput, and low contrast3). We demonstrate a new electrical measurement based technique in which resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified with a parabolic fitting model for resistance, in which the minimum and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from the fitting. Additionally, excellent correlation is found between overlay from electrical measurements and existing image- and diffraction-based techniques. We also discuss the challenges of integrating an electrical measurement based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for assessing overlay as well as process window and margins simultaneously using a robust, high-throughput electrical measurement approach.
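
    The parabolic-fit step can be sketched as follows. The programmed offsets and resistance readings below are synthetic, and the sketch assumes offsets symmetric about zero (which makes the least-squares normal equations decouple); the minimum of the fitted parabola gives the overlay error:

```python
def overlay_from_resistance(offsets, resistance):
    """Least-squares fit of R = a*d^2 + b*d + c, returning the vertex -b/(2a):
    the programmed offset at which resistance is minimal (the overlay error).
    Assumes offsets symmetric about 0, so odd moment sums vanish and the
    3x3 normal equations decouple into the closed forms below."""
    n = len(offsets)
    s2 = sum(d ** 2 for d in offsets)
    s4 = sum(d ** 4 for d in offsets)
    sy = sum(resistance)
    s1y = sum(d * r for d, r in zip(offsets, resistance))
    s2y = sum(d * d * r for d, r in zip(offsets, resistance))
    b = s1y / s2
    a = (n * s2y - s2 * sy) / (n * s4 - s2 ** 2)
    return -b / (2 * a)

# synthetic readings for a macro whose true overlay error is +4 nm
offsets = [-30.0, -20.0, -10.0, 0.0, 10.0, 20.0, 30.0]        # programmed offset, nm
resistance = [0.002 * (d - 4.0) ** 2 + 50.0 for d in offsets]  # ohm
overlay = overlay_from_resistance(offsets, resistance)
```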

  10. High throughput computing: a solution for scientific analysis

    USGS Publications Warehouse

    O'Donnell, M.

    2011-01-01

    handle job failures due to hardware, software, or network interruptions (obviating the need to manually resubmit the job after each stoppage); be affordable; and most importantly, allow us to complete very large, complex analyses that otherwise would not even be possible. In short, we envisioned a job-management system that would take advantage of unused FORT CPUs within a local area network (LAN) to effectively distribute and run highly complex analytical processes. What we found was a solution that uses High Throughput Computing (HTC) and High Performance Computing (HPC) systems to do exactly that (Figure 1).

  11. High Throughput WAN Data Transfer with Hadoop-based Storage

    NASA Astrophysics Data System (ADS)

    Amin, A.; Bockelman, B.; Letts, J.; Levshina, T.; Martin, T.; Pi, H.; Sfiligoi, I.; Thomas, M.; Wüerthwein, F.

    2011-12-01

    The Hadoop distributed file system (HDFS) has become increasingly popular in recent years as a key building block of integrated grid storage solutions in the field of scientific computing. Wide Area Network (WAN) data transfer is one of the important data operations by which large high energy physics experiments manage, share, and process petabyte-scale datasets in a highly distributed grid computing environment. In this paper, we present our experience with high throughput WAN data transfer using an HDFS-based Storage Element. Two protocols, GridFTP and fast data transfer (FDT), are used to characterize the network performance of WAN data transfer.

  12. Achieving High Throughput for Data Transfer over ATM Networks

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.; Townsend, Jeffrey N.

    1996-01-01

    File-transfer rates for ftp are often reported to be relatively slow, compared to the raw bandwidth available in emerging gigabit networks. While a major bottleneck is disk I/O, protocol issues impact performance as well. Ftp was developed and optimized for use over the TCP/IP protocol stack of the Internet. However, TCP has been shown to run inefficiently over ATM. In an effort to maximize network throughput, data-transfer protocols can be developed to run over UDP or directly over IP, rather than over TCP. If error-free transmission is required, techniques for achieving reliable transmission can be included as part of the transfer protocol. However, selected image-processing applications can tolerate a low level of errors in images that are transmitted over a network. In this paper we report on experimental work to develop a high-throughput protocol for unreliable data transfer over ATM networks. We attempt to maximize throughput by keeping the communications pipe full, but still keep packet loss under five percent. We use the Bay Area Gigabit Network Testbed as our experimental platform.

  13. Controlling high-throughput manufacturing at the nano-scale

    NASA Astrophysics Data System (ADS)

    Cooper, Khershed P.

    2013-09-01

    Interest in nano-scale manufacturing research and development is growing. The reason is to accelerate the translation of discoveries and inventions of nanoscience and nanotechnology into products that would benefit industry, economy and society. Ongoing research in nanomanufacturing is focused primarily on developing novel nanofabrication techniques for a variety of applications—materials, energy, electronics, photonics, biomedical, etc. Our goal is to foster the development of high-throughput methods of fabricating nano-enabled products. Large-area parallel processing and high-speed continuous processing are high-throughput means for mass production. An example of large-area processing is step-and-repeat nanoimprinting, by which nanostructures are reproduced again and again over a large area, such as a 12 in wafer. Roll-to-roll processing is an example of continuous processing, by which it is possible to print and imprint multi-level nanostructures and nanodevices on a moving flexible substrate. The big pay-off is high-volume production and low unit cost. However, the anticipated cost benefits can only be realized if the increased production rate is accompanied by high yields of high quality products. To ensure product quality, we need to design and construct manufacturing systems such that the processes can be closely monitored and controlled. One approach is to bring cyber-physical systems (CPS) concepts to nanomanufacturing. CPS involves the control of a physical system such as manufacturing through modeling, computation, communication and control. Such a closely coupled system will involve in-situ metrology and closed-loop control of the physical processes guided by physics-based models and driven by appropriate instrumentation, sensing and actuation. This paper will discuss these ideas in the context of controlling high-throughput manufacturing at the nano-scale.

  14. SSFinder: high throughput CRISPR-Cas target sites prediction tool.

    PubMed

    Upadhyay, Santosh Kumar; Sharma, Shailesh

    2014-01-01

    Clustered regularly interspaced short palindromic repeats (CRISPR) and CRISPR-associated protein (Cas) system facilitates targeted genome editing in organisms. Despite high demand of this system, finding a reliable tool for the determination of specific target sites in large genomic data remained challenging. Here, we report SSFinder, a python script to perform high throughput detection of specific target sites in large nucleotide datasets. The SSFinder is a user-friendly tool, compatible with Windows, Mac OS, and Linux operating systems, and freely available online.
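
    The core of such a target-site search can be sketched with a regular expression. This is a simplification, not a reproduction of SSFinder's actual matching rules: it looks for SpCas9-style sites (a 20 nt protospacer immediately followed by an NGG PAM) on the forward strand only, whereas a real tool also scans the reverse complement:

```python
import re

def find_cas9_sites(seq):
    """Return (position, protospacer) for every 20 nt protospacer followed by
    an NGG PAM on the forward strand. A zero-width lookahead is used so that
    overlapping sites are all reported."""
    return [(m.start(), seq[m.start():m.start() + 20])
            for m in re.finditer(r"(?=([ACGT]{20}[ACGT]GG))", seq)]

seq = "A" * 20 + "TGG"   # toy sequence: a 20 nt protospacer plus a TGG PAM
sites = find_cas9_sites(seq)
```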

  15. Elimination of redundant protein identifications in high throughput proteomics.

    PubMed

    Kearney, Robert; Blondeau, Francois; McPherson, Peter; Bell, Alex; Servant, Florence; Drapeau, Mathieu; de Grandpre, Sebastien; Jm Bergeron, John

    2005-01-01

    Tandem mass spectrometry followed by database search is the preferred method for protein identification in high-throughput proteomics. However, standard analysis methods give rise to highly redundant lists of proteins, with many proteins identified by the same sets of peptides. In essence, such a list contains all proteins that might be present in the sample. Here we present an algorithm that eliminates this redundancy and determines the minimum number of proteins needed to explain the peptides observed. We demonstrate that applying the algorithm yields a significantly smaller set of proteins and greatly reduces the number of "shared" peptides.
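Finding the minimum set of proteins that explains all observed peptides is an instance of the classic set-cover problem. A greedy approximation (a common heuristic for this task, not necessarily the authors' exact algorithm) can be sketched as:

```python
def minimal_protein_set(protein_to_peptides):
    """Greedy approximation of the minimal set of proteins that
    explains all observed peptides (parsimony protein inference)."""
    uncovered = set().union(*protein_to_peptides.values())
    chosen = []
    while uncovered:
        # pick the protein explaining the most still-unexplained peptides
        best = max(protein_to_peptides,
                   key=lambda p: len(protein_to_peptides[p] & uncovered))
        gained = protein_to_peptides[best] & uncovered
        if not gained:
            break
        chosen.append(best)
        uncovered -= gained
    return chosen

# P2 is identified only by peptides already explained by P1, so it is dropped
mapping = {
    "P1": {"pepA", "pepB", "pepC"},
    "P2": {"pepB", "pepC"},
    "P3": {"pepD"},
}
result = minimal_protein_set(mapping)
```

Greedy set cover is not guaranteed to be minimal, but it comes within a logarithmic factor of the optimum and runs quickly even on large identification lists.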

  16. Analysis of individual protein turnover in live animals on a proteome-wide scale.

    PubMed

    Reckow, Stefan; Webhofer, Christian

    2014-01-01

    Classical quantitative proteomics studies focus on the relative or absolute concentration of proteins at a given time. In contrast, the investigation of protein turnover reveals the dynamics leading to these states. Analyzing the balance between synthesis and degradation of individual proteins provides insights into the regulation of protein concentration and helps in understanding the underlying biological processes. Comparing the half-lives of proteins allows the detection of functional relationships and common regulation mechanisms. Moreover, comparing the turnover of individual brain and plasma proteins between control and treatment groups indicates turnover changes induced by the treatment. Here, we describe a procedure for determining turnover information for individual proteins in mice on a proteome-wide scale based on partial (15)N metabolic labeling. We outline the complete experimental workflow, from (15)N labeling of the animals through sample preparation and mass spectrometric measurement to the analysis of the data.

  17. Mapping Proteome-Wide Interactions of Reactive Chemicals Using Chemoproteomic Platforms

    PubMed Central

    Counihan, Jessica L.; Ford, Breanna; Nomura, Daniel K.

    2015-01-01

    A large number of pharmaceuticals, endogenous metabolites, and environmental chemicals act through covalent mechanisms with protein targets. Yet, the specific proteome-wide interactions of most of these reactive chemicals remain poorly defined. Deciphering the direct protein targets of reactive small molecules is critical for understanding their biological action, off-target effects, and potential toxicological liabilities, and for developing safer and more selective agents. Chemoproteomic technologies have arisen as a powerful strategy that enables the assessment of proteome-wide interactions of these irreversible agents directly in complex biological systems. We review here several chemoproteomic strategies that have facilitated our understanding of specific protein interactions of irreversibly acting pharmaceuticals, endogenous metabolites, and environmental electrophiles, revealing novel pharmacological, biological, and toxicological mechanisms. PMID:26647369

  18. High throughput nanoimprint lithography for semiconductor memory applications

    NASA Astrophysics Data System (ADS)

    Ye, Zhengmao; Zhang, Wei; Khusnatdinov, Niyaz; Stachowiak, Tim; Irving, J. W.; Longsine, Whitney; Traub, Matthew; Fletcher, Brian; Liu, Weijun

    2017-03-01

    Imprint lithography is a promising technology for replication of nano-scale features. For semiconductor device applications, Canon deposits a low-viscosity resist on a field-by-field basis using jetting technology. A patterned mask is lowered into the resist fluid, which then quickly flows into the relief patterns in the mask by capillary action. Following this filling step, the resist is crosslinked under UV radiation, and then the mask is removed, leaving a patterned resist on the substrate. There are two critical components to meeting throughput requirements for imprint lithography. Using a similar approach to what is already done for many deposition and etch processes, imprint stations can be clustered to enhance throughput. The FPA-1200NZ2C is a four-station cluster system designed for high-volume manufacturing. For a single station, throughput includes overhead, resist dispense, resist fill time (or spread time), exposure and separation. Resist exposure and mask/wafer separation are well-understood processing steps with typical durations on the order of 0.10 to 0.20 seconds. To achieve a total process throughput of 17 wafers per hour (wph) for a single station, it is necessary to complete the fluid fill step in 1.2 seconds. For a throughput of 20 wph, fill time must be reduced to only 1.1 seconds. Several parameters can impact resist filling. Key parameters include resist drop volume (smaller is better), system controls (which address drop spreading after jetting), Design for Imprint or DFI (to accelerate drop spreading) and material engineering (to promote wetting between the resist and the underlying adhesion layer). In addition, it is mandatory to maintain fast filling even for edge-field imprinting. In this paper, we address the improvements made in all of these parameters to enable a 1.20 second filling process for a device-like pattern, and we have demonstrated this capability for both full fields and edge fields.
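The throughput arithmetic behind the fill-time targets can be sketched as follows. Only the wph targets come from the abstract; the field count (84 fields per 300 mm wafer) and the ~1.3 s/field spent on dispense, exposure, separation and overhead are assumed for illustration:

```python
def max_fill_time(wph, fields_per_wafer, other_per_field_s):
    """Per-field fill-time budget (seconds) that still meets a
    target wafers-per-hour throughput on a single imprint station."""
    per_wafer_s = 3600.0 / wph
    per_field_s = per_wafer_s / fields_per_wafer
    return per_field_s - other_per_field_s

# Assumed: 84 imprint fields per wafer, ~1.3 s/field of non-fill steps
budget_17wph = max_fill_time(17, 84, 1.3)  # ~1.22 s, close to the 1.2 s cited
budget_20wph = max_fill_time(20, 84, 1.3)  # under 1 s, so every step must tighten
```

The point of the sketch is that a modest wph increase shrinks the per-field fill budget disproportionately, which is why drop volume, DFI and wetting engineering all matter.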

  19. Spectrum-to-Spectrum Searching Using a Proteome-wide Spectral Library*

    PubMed Central

    Yen, Chia-Yu; Houel, Stephane; Ahn, Natalie G.; Old, William M.

    2011-01-01

    The unambiguous assignment of tandem mass spectra (MS/MS) to peptide sequences remains a key unsolved problem in proteomics. Spectral library search strategies have emerged as a promising alternative for peptide identification, in which MS/MS spectra are directly compared against a reference library of confidently assigned spectra. Two problems relate to library size. First, reference spectral libraries are limited to the rediscovery of previously identified peptides and are not applicable to new peptides, because of their incomplete coverage of the human proteome. Second, problems arise when searching a spectral library the size of the entire human proteome. We observed that traditional dot product scoring methods do not scale well with spectral library size, showing reduced sensitivity when library size is increased. We show that this problem can be addressed by optimizing scoring metrics for spectrum-to-spectrum searches with large spectral libraries. MS/MS spectra for the 1.3 million predicted tryptic peptides in the human proteome are simulated using a kinetic fragmentation model (MassAnalyzer version 2.1) to create a proteome-wide simulated spectral library. Searches of the simulated library increase MS/MS assignments by 24% compared with Mascot when using probabilistic and rank-based scoring methods. The proteome-wide coverage of the simulated library leads to an 11% increase in unique peptide assignments compared with parallel searches of a reference spectral library. Further improvement is attained when reference spectra and simulated spectra are combined into a hybrid spectral library, yielding 52% more MS/MS assignments than Mascot searches. Our study demonstrates the advantages of using probabilistic and rank-based scores to improve the performance of spectrum-to-spectrum search strategies. PMID:21532008
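For contrast with the optimized scores, the traditional dot-product baseline that the authors show scales poorly can be sketched as a cosine similarity over square-root-transformed peak intensities (a common variant in spectral library searching; binning and tolerances are simplified here):

```python
import math

def normalized_dot_product(spec_a, spec_b):
    """Cosine similarity between two MS/MS spectra represented as
    {m/z bin: intensity} dictionaries. The square-root intensity
    transform damps the influence of dominant peaks."""
    a = {mz: math.sqrt(i) for mz, i in spec_a.items()}
    b = {mz: math.sqrt(i) for mz, i in spec_b.items()}
    dot = sum(a[mz] * b[mz] for mz in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# hypothetical query and library spectra sharing two fragment bins
query = {147: 100.0, 175: 40.0, 262: 10.0}
library = {147: 90.0, 175: 50.0, 300: 5.0}
score = normalized_dot_product(query, library)
```

Because every library entry competes for the best score, the expected maximum of such scores over random spectra creeps upward as the library grows, which is the scaling problem the probabilistic and rank-based scores address.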

  20. High-throughput PCR in silicon based microchamber array.

    PubMed

    Nagai, H; Murakami, Y; Yokoyama, K; Tamiya, E

    2001-12-01

    Highly integrated hybridization assays and capillary electrophoresis have improved the throughput of DNA analysis. The shift to high-throughput analysis requires a high-speed DNA amplification system, and several rapid PCR systems have been developed. In these thermal cyclers, temperature is controlled by more effective means than the large heating/cooling blocks that prevent rapid thermal cycling. In our research, high-speed PCR was performed using a silicon-based microchamber array and three heat blocks. The highly integrated microchamber array was fabricated by semiconductor microfabrication techniques. The temperature of the PCR microchamber was controlled by alternating between three heat blocks held at different temperatures. Silicon has excellent thermal conductivity, and the heat capacity of the miniaturized sample volume is small. Hence, the heating/cooling rate was rapid, approximately 16 degrees C/s, and 40 cycles of rapid PCR were completed in 18 min. The thermal cycle time was reduced to 1/10 of that of a commercial PCR instrument (Model 9600, PE Applied Biosystems; 3 h).
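A back-of-the-envelope check of the cycle time is consistent with the reported numbers. Only the 16 °C/s ramp rate and the 18 min total come from the abstract; the three-step 94/55/72 °C protocol and the ~7.4 s hold per step are assumed for illustration:

```python
def cycle_time_s(temps_c, ramp_rate_c_per_s, hold_s):
    """Approximate one PCR thermal cycle: ramp between the step
    temperatures (returning to the first) at a fixed rate, plus a
    fixed hold at each step."""
    swing = sum(abs(b - a) for a, b in zip(temps_c, temps_c[1:] + temps_c[:1]))
    return swing / ramp_rate_c_per_s + hold_s * len(temps_c)

# 94 -> 55 -> 72 -> 94 C traverses 78 C per cycle; at 16 C/s that is ~4.9 s
# of ramping, so holds dominate the cycle time
t = cycle_time_s([94, 55, 72], 16.0, 7.4)
total_min = 40 * t / 60  # ~18 min for 40 cycles
```

With a conventional block ramping an order of magnitude more slowly, the ramp term alone would add roughly a minute per cycle, which is where the 1/10 reduction comes from.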

  1. Proteome-wide post-translational modification statistics: frequency analysis and curation of the swiss-prot database

    PubMed Central

    Khoury, George A.; Baliban, Richard C.; Floudas, Christodoulos A.

    2011-01-01

    Post-translational modifications (PTMs) broadly contribute to the recent explosion of proteomic data and possess a complexity surpassing that of protein design. PTMs are the chemical modification of a protein after its translation, and have wide effects broadening its range of functionality. Based on previous estimates, it is widely believed that more than half of proteins are glycoproteins. Whereas mutations can only occur once per position, different forms of post-translational modifications may occur in tandem. With the number and abundances of modifications constantly being discovered, there is no method to readily assess their relative levels. Here we report the relative abundances of each PTM found experimentally and putatively, from high-quality, manually curated, proteome-wide data, and show that at best, less than one-fifth of proteins are glycosylated. We make available to the academic community a continuously updated resource (http://selene.princeton.edu/PTMCuration) containing the statistics so scientists can assess “how many” of each PTM exists. PMID:22034591

  2. High-throughput karyotyping of human pluripotent stem cells.

    PubMed

    Lund, Riikka J; Nikula, Tuomas; Rahkonen, Nelly; Närvä, Elisa; Baker, Duncan; Harrison, Neil; Andrews, Peter; Otonkoski, Timo; Lahesmaa, Riitta

    2012-11-01

    Genomic integrity of human pluripotent stem cell (hPSC) lines requires routine monitoring. We report here that a novel karyotyping assay, utilizing bead-bound bacterial artificial chromosome probes, provides a fast and easy tool for the detection of chromosomal abnormalities in hPSC lines. The analysis can be performed on low amounts of DNA isolated from whole cell pools, with a simple data analysis interface. The method enables routine screening of stem cell lines in a cost-efficient, high-throughput manner. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. High-throughput karyotyping of human pluripotent stem cells

    PubMed Central

    Lund, Riikka J.; Nikula, Tuomas; Rahkonen, Nelly; Närvä, Elisa; Baker, Duncan; Harrison, Neil; Andrews, Peter; Otonkoski, Timo; Lahesmaa, Riitta

    2012-01-01

    Genomic integrity of human pluripotent stem cell (hPSC) lines requires routine monitoring. We report here that a novel karyotyping assay, utilizing bead-bound bacterial artificial chromosome probes, provides a fast and easy tool for the detection of chromosomal abnormalities in hPSC lines. The analysis can be performed on low amounts of DNA isolated from whole cell pools, with a simple data analysis interface. The method enables routine screening of stem cell lines in a cost-efficient, high-throughput manner. PMID:22877823

  4. Developing soluble polymers for high-throughput synthetic chemistry.

    PubMed

    Spanka, Carsten; Wentworth, Paul; Janda, Kim D

    2002-05-01

    Soluble polymers have emerged as viable alternatives to resin supports across the broad spectrum of high-throughput organic chemistry. As the application of these supports becomes more widespread, issues such as broad-spectrum solubility and loading are becoming limiting factors, and new polymers are therefore required to overcome these limitations. This article details our group's approach to new soluble polymer supports and specifically focuses on parallel libraries of block copolymers, de novo poly(styrene-co-chloromethylstyrene), PEG-stealth stars, and substituted poly(norbornylene)s.

  5. Quick 96FASP for high throughput quantitative proteome analysis.

    PubMed

    Yu, Yanbao; Bekele, Shiferaw; Pieper, Rembert

    2017-08-23

    Filter aided sample preparation (FASP) is becoming a central method for proteomic sample cleanup and peptide generation prior to LC-MS analysis. We previously adapted this method to a 96-well filter plate and applied it to prepare protein digests from cell lysate and body fluid samples in a high-throughput quantitative manner. While the 96FASP approach is scalable and can handle multiple samples simultaneously, two key advantages compared to single FASP, it is also time-consuming: the centrifugation-based liquid transfer on the filter plate takes 3-5 times longer than on a single filter. To address this limitation, we now present a quick 96FASP (named q96FASP) approach that, relying on filter membranes with a large MWCO size (~30 kDa), significantly reduces centrifugation times. We show that q96FASP allows the generation of protein digests from whole cell lysates and body fluids in a quality similar to that of the single FASP method. Processing a sample in multiple wells in parallel, we observed excellent experimental repeatability using a label-free quantitation approach. We conclude that q96FASP promises to be a cost- and time-effective method for shotgun proteomics and will be particularly useful in large-scale biomarker discovery studies. High-throughput sample processing is of particular interest for quantitative proteomics. The previously developed 96FASP is high throughput and appealing; however, it is time-consuming in the context of centrifugation-based liquid transfer (~1.5 h per spin). This study presents a truly high-throughput sample preparation method based on a large cut-off 96-well filter plate, which shortens the spin time to ~20 min. To our knowledge, this is the first multi-well method that is entirely comparable with conventional FASP. This study thoroughly examined two types of filter plates and performed side-by-side comparisons with single FASP. Two types of samples, whole cell lysate of a UTI (urinary tract infection

  6. Orchestrating high-throughput genomic analysis with Bioconductor

    PubMed Central

    Huber, Wolfgang; Carey, Vincent J.; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S.; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D.; Irizarry, Rafael A.; Lawrence, Michael; Love, Michael I.; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K.; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K.; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-01-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors. PMID:25633503

  7. Adaptive Sampling for High Throughput Data Using Similarity Measures

    SciTech Connect

    Bulaevskaya, V.; Sales, A. P.

    2015-05-06

    The need for adaptive sampling arises in the context of high throughput data because the rates of data arrival are many orders of magnitude larger than the rates at which they can be analyzed. A very fast decision must therefore be made regarding the value of each incoming observation and its inclusion in the analysis. In this report we discuss one approach to adaptive sampling, based on the new data point’s similarity to the other data points being considered for inclusion. We present preliminary results for one real and one synthetic data set.
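One way to sketch the similarity-based decision rule (the report leaves the similarity measure open; a scalar absolute-difference threshold is assumed here purely for illustration):

```python
def adaptive_sample(stream, threshold, max_kept=1000):
    """Keep an incoming observation only if it is sufficiently
    dissimilar to everything already retained, so redundant points
    in a high-rate stream are skipped cheaply."""
    kept = []
    for x in stream:
        # fast per-observation decision: compare against retained points
        if all(abs(x - k) > threshold for k in kept):
            kept.append(x)
            if len(kept) >= max_kept:
                break
    return kept

# near-duplicates (0.05, 1.02) are discarded; distinct values survive
sample = adaptive_sample([0.0, 0.05, 1.0, 1.02, 2.5], threshold=0.1)
```

In practice the comparison would use a domain-appropriate similarity (e.g. a kernel or distance in feature space) and an indexed structure so each decision stays fast as the retained set grows.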

  8. Genomic outlier detection in high-throughput data analysis.

    PubMed

    Ghosh, Debashis

    2013-01-01

    In the analysis of high-throughput data, a very common goal is the detection of genes showing differential expression between two groups or classes. A recent finding from the scientific literature on prostate cancer demonstrates that by searching for a different pattern of differential expression, new candidate oncogenes might be found. In this chapter, we discuss this statistical problem, termed oncogene outlier detection, and a variety of proposed approaches to it. A statistical model for the multiclass situation is described, and links with multiple-testing concepts are established. Some new nonparametric procedures are described and compared to existing methods using simulation studies.
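A widely used statistic for this setting is COPA-style median/MAD standardization, shown here for illustration (the chapter's own nonparametric procedures are not reproduced): by taking a high quantile of the standardized tumor values, a gene overexpressed in only a subset of tumors still scores highly, unlike with a mean-based t-statistic.

```python
import statistics

def outlier_score(normal, cancer, quantile=0.9):
    """COPA-style outlier statistic: center and scale a gene's
    expression by the median and MAD of the normal group, then take
    a high quantile of the transformed cancer samples."""
    med = statistics.median(normal)
    mad = statistics.median(abs(x - med) for x in normal) or 1.0
    transformed = sorted((x - med) / mad for x in cancer)
    idx = int(quantile * (len(transformed) - 1))
    return transformed[idx]

# outlier pattern: only 2 of 5 tumors overexpress the gene
normal = [1.0, 1.1, 0.9, 1.0]
cancer = [1.0, 1.1, 5.0, 6.0, 1.0]
score = outlier_score(normal, cancer)
```

Median and MAD are used rather than mean and standard deviation so that the outliers being hunted do not inflate the scale they are measured against.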

  9. High-throughput expression in microplate format in Saccharomyces cerevisiae.

    PubMed

    Holz, Caterina; Lang, Christine

    2004-01-01

    We have developed a high-throughput technology that allows parallel expression, purification, and analysis of large numbers of cloned cDNAs in the yeast Saccharomyces cerevisiae. The technology is based on a vector for intracellular protein expression under control of the inducible CUP1 promoter, where the gene products are fused to specific peptide sequences. These N-terminal and C-terminal epitope tags allow the immunological identification and purification of the gene products independent of the protein produced. By introducing the method of recombinational cloning we avoid time-consuming re-cloning steps and enable the easy switching between different expression vectors and host systems.

  10. A High-Throughput Strategy for Dissecting Mammalian Genetic Interactions

    PubMed Central

    Stockman, Victoria B.; Ghamsari, Lila; Lasso, Gorka; Honig, Barry

    2016-01-01

    Comprehensive delineation of complex cellular networks requires high-throughput interrogation of genetic interactions. To address this challenge, we describe the development of a multiplex combinatorial strategy to assess pairwise genetic interactions using CRISPR-Cas9 genome editing and next-generation sequencing. We characterize the performance of combinatorial genome editing and analysis using different promoter and gRNA designs, and identify regions of the chimeric RNA that are compatible with next-generation sequencing preparation and quantification. This approach is an important step towards elucidating genetic networks relevant to human diseases and the development of more efficient Cas9-based therapeutics. PMID:27936040

  11. High-Throughput Sequencing: A Roadmap Toward Community Ecology

    PubMed Central

    Poisot, Timothée; Péquin, Bérangère; Gravel, Dominique

    2013-01-01

    High-throughput sequencing is becoming increasingly important in microbial ecology, yet it is surprisingly under-used to generate or test biogeographic hypotheses. In this contribution, we highlight how adding these methods to the ecologist's toolbox will allow the detection of new patterns and help our understanding of the structure and dynamics of diversity. Starting with a review of the ecological questions that can be addressed, we move on to the technical and analytical issues that will benefit from increased collaboration between different disciplines. PMID:23610649

  12. Live Cell Optical Sensing for High Throughput Applications

    NASA Astrophysics Data System (ADS)

    Fang, Ye

    Live cell optical sensing employs label-free optical biosensors to non-invasively measure stimulus-induced dynamic mass redistribution (DMR) in live cells within the sensing volume of the biosensor. The resultant DMR signal is an integrated cellular response and reflects cell signaling mediated through the cellular target(s) with which the stimulus intervenes. This article describes the uses of live cell optical sensing for probing cell biology and ligand pharmacology, with an emphasis on resonant waveguide grating biosensor cellular assays for high-throughput applications.

  13. Orchestrating high-throughput genomic analysis with Bioconductor.

    PubMed

    Huber, Wolfgang; Carey, Vincent J; Gentleman, Robert; Anders, Simon; Carlson, Marc; Carvalho, Benilton S; Bravo, Hector Corrada; Davis, Sean; Gatto, Laurent; Girke, Thomas; Gottardo, Raphael; Hahne, Florian; Hansen, Kasper D; Irizarry, Rafael A; Lawrence, Michael; Love, Michael I; MacDonald, James; Obenchain, Valerie; Oleś, Andrzej K; Pagès, Hervé; Reyes, Alejandro; Shannon, Paul; Smyth, Gordon K; Tenenbaum, Dan; Waldron, Levi; Morgan, Martin

    2015-02-01

    Bioconductor is an open-source, open-development software project for the analysis and comprehension of high-throughput data in genomics and molecular biology. The project aims to enable interdisciplinary research, collaboration and rapid development of scientific software. Based on the statistical programming language R, Bioconductor comprises 934 interoperable packages contributed by a large, diverse community of scientists. Packages cover a range of bioinformatic and statistical applications. They undergo formal initial review and continuous automated testing. We present an overview for prospective users and contributors.

  14. Computational Proteomics: High-throughput Analysis for Systems Biology

    SciTech Connect

    Cannon, William R.; Webb-Robertson, Bobbie-Jo M.

    2007-01-03

    High-throughput (HTP) proteomics is a rapidly developing field that offers global profiling of the proteins in a biological system. HTP technological advances are fueling a revolution in biology, enabling analyses at the scale of entire systems (e.g., whole cells, tumors, or environmental communities). However, simply identifying the proteins in a cell is insufficient for understanding the underlying complexity and operating mechanisms of the overall system. Systems-level investigations rely more and more on computational analyses, especially in fields such as proteomics that generate large-scale global data.

  15. Analysis of High-Throughput ELISA Microarray Data

    SciTech Connect

    White, Amanda M.; Daly, Don S.; Zangar, Richard C.

    2011-02-23

    Our research group develops analytical methods and software for the high-throughput analysis of quantitative enzyme-linked immunosorbent assay (ELISA) microarrays. ELISA microarrays differ from DNA microarrays in several fundamental aspects and most algorithms for analysis of DNA microarray data are not applicable to ELISA microarrays. In this review, we provide an overview of the steps involved in ELISA microarray data analysis and how the statistically sound algorithms we have developed provide an integrated software suite to address the needs of each data-processing step. The algorithms discussed are available in a set of open-source software tools (http://www.pnl.gov/statistics/ProMAT).

  16. High-throughput quantitative real-time PCR.

    PubMed

    Arany, Zoltan P

    2008-07-01

    Recent technical advances in quantitative real-time PCR (qRT-PCR) have allowed for extensive miniaturization, thereby rendering the technique amenable to high-throughput assays. Large numbers of different nucleic acids can now rapidly be measured quantitatively. Many investigations can benefit from this approach, including determination of gene expression in hundreds of samples, determination of hundreds of genes in a few samples, or even quantification of nucleic acids other than mRNA. A simple technique is described here to quantify 1880 transcripts of choice from any number of starting RNA samples.

  17. Plant chip for high-throughput phenotyping of Arabidopsis.

    PubMed

    Jiang, Huawei; Xu, Zhen; Aluru, Maneesha R; Dong, Liang

    2014-04-07

    We report on the development of a vertical and transparent microfluidic chip for high-throughput phenotyping of Arabidopsis thaliana plants. Multiple Arabidopsis seeds can be germinated and grown hydroponically over more than two weeks in the chip, thus enabling large-scale and quantitative monitoring of plant phenotypes. The novel vertical arrangement of this microfluidic device not only allows for normal gravitropic growth of the plants but also, more importantly, makes it convenient to continuously monitor phenotypic changes in plants at the whole organismal level, including seed germination and root and shoot growth (hypocotyls, cotyledons, and leaves), as well as at the cellular level. We also developed a hydrodynamic trapping method to automatically place single seeds into seed holding sites of the device and to avoid potential damage to seeds that might occur during manual loading. We demonstrated general utility of this microfluidic device by showing clear visible phenotypes of the immutans mutant of Arabidopsis, and we also showed changes occurring during plant-pathogen interactions at different developmental stages. Arabidopsis plants grown in the device maintained normal morphological and physiological behaviour, and distinct phenotypic variations consistent with a priori data were observed via high-resolution images taken in real time. Moreover, the timeline for different developmental stages for plants grown in this device was highly comparable to growth using a conventional agar plate method. This prototype plant chip technology is expected to lead to the establishment of a powerful experimental and cost-effective framework for high-throughput and precise plant phenotyping.

  18. High throughput inclusion body sizing: Nano particle tracking analysis.

    PubMed

    Reichelt, Wieland N; Kaineder, Andreas; Brillmann, Markus; Neutsch, Lukas; Taschauer, Alexander; Lohninger, Hans; Herwig, Christoph

    2017-06-01

    The expression of pharmaceutically relevant proteins in Escherichia coli frequently triggers inclusion body (IB) formation caused by protein aggregation. In the scientific literature, substantial effort has been devoted to the quantification of IB size. However, the particle-based methods used up to this point to analyze the physical properties of representative numbers of IBs lack sensitivity and/or orthogonal verification. Using high-pressure freezing and automated freeze substitution for transmission electron microscopy (TEM), the cytosolic inclusion body structure was preserved within the cells. TEM imaging in combination with manual grey-scale image segmentation allowed quantification of the relative area covered by the inclusion body within the cytosol. As a high-throughput method, nanoparticle tracking analysis (NTA) enables one to derive the diameter of inclusion bodies in cell homogenate from a measurement of their Brownian motion. The NTA analysis of fixated (glutaraldehyde) and non-fixated IBs suggests that high-pressure homogenization annihilates the native physiological shape of IBs. Nevertheless, the ratio of particle counts of non-fixated to fixated samples could potentially serve as a factor for particle stickiness. In this contribution, we establish image segmentation of TEM pictures as an orthogonal method to size biological particles in the cytosol of cells. More importantly, NTA has been established as a particle-based, fast and high-throughput method (1000-3000 particles), thus constituting a much more accurate and representative analysis than currently available methods. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
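The Brownian-motion-to-diameter step in NTA rests on the Stokes-Einstein relation. A sketch of the conversion (the temperature, viscosity, frame rate and mean-squared-displacement values below are assumed for illustration, not taken from the paper):

```python
import math

def hydrodynamic_diameter_nm(msd_um2, dt_s, temp_k=296.15,
                             viscosity_pa_s=0.00093):
    """Stokes-Einstein sizing as used in nanoparticle tracking analysis:
    the 2-D mean squared displacement of a tracked particle over time dt
    gives its diffusion coefficient D = MSD / (4*dt), and the diameter
    follows from d = kT / (3 * pi * eta * D)."""
    k_b = 1.380649e-23                       # Boltzmann constant, J/K
    d_m2_s = msd_um2 * 1e-12 / (4.0 * dt_s)  # diffusion coefficient, m^2/s
    diameter_m = k_b * temp_k / (3.0 * math.pi * viscosity_pa_s * d_m2_s)
    return diameter_m * 1e9

# assumed example: ~0.124 um^2 MSD per 1/30 s video frame in water at 23 C
d_nm = hydrodynamic_diameter_nm(msd_um2=0.1243, dt_s=1 / 30)  # ~500 nm
```

Note that NTA therefore reports a hydrodynamic diameter, which for fixated versus non-fixated IBs need not match the geometric size seen in TEM sections.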

  19. High throughput SNP detection system based on magnetic nanoparticles separation.

    PubMed

    Liu, Bin; Jia, Yingying; Ma, Man; Li, Zhiyang; Liu, Hongna; Li, Song; Deng, Yan; Zhang, Liming; Lu, Zhuoxuan; Wang, Wei; He, Nongyue

    2013-02-01

    Single-nucleotide polymorphisms (SNPs) are single-base variations in DNA sequence that can help to find gene associations for hereditary diseases, communicable diseases and so on. We developed a high-throughput SNP detection system based on magnetic nanoparticle (MNP) separation and dual-color hybridization or single-base extension. This system includes a magnetic separation unit for sample separation, three high-precision robot arms for pipetting and microtiter plate transfer, an accurate temperature control unit for PCR and DNA hybridization, and a highly accurate and sensitive optical signal detection unit for fluorescence detection. A SNP genotyping experiment at the cyclooxygenase-2 gene promoter region--65G > C polymorphism locus, for 48 samples from the northern Jiangsu area, was performed to verify that this system can simplify the manual operation of researchers, save time and improve efficiency in SNP genotyping experiments. The system can perform sample preparation, target sequence amplification, signal detection and data analysis automatically, and can be used in clinical molecular diagnosis, high-throughput fluorescence immunological detection and so on.

  20. A high throughput mechanical screening device for cartilage tissue engineering.

    PubMed

    Mohanraj, Bhavana; Hou, Chieh; Meloni, Gregory R; Cosgrove, Brian D; Dodge, George R; Mauck, Robert L

    2014-06-27

    Articular cartilage enables efficient and near-frictionless load transmission, but suffers from poor inherent healing capacity. As such, cartilage tissue engineering strategies have focused on mimicking both the compositional and mechanical properties of native tissue in order to provide effective repair materials for the treatment of damaged or degenerated joint surfaces. However, given the large number of design parameters available (e.g. cell sources, scaffold designs, and growth factors), it is difficult to conduct combinatorial experiments on engineered cartilage. This is particularly exacerbated when mechanical properties are a primary outcome, given the long time required to test individual samples. High-throughput screening is used widely in the pharmaceutical industry to rapidly and cost-effectively assess the effects of thousands of compounds for therapeutic discovery. Here we adapted this approach to develop a high-throughput mechanical screening (HTMS) system capable of measuring the mechanical properties of up to 48 materials simultaneously. The HTMS device was validated by testing various biomaterials and engineered cartilage constructs and by comparing the HTMS results to those derived from conventional single-sample compression tests. Further evaluation showed that the HTMS system was capable of distinguishing and identifying 'hits', or factors that influence the degree of tissue maturation. Future iterations of this device will focus on reducing data variability, increasing force sensitivity and range, and scaling up to even larger (96-well) formats. This HTMS device provides a novel tool for cartilage tissue engineering, freeing experimental design from the limitations of mechanical testing throughput. © 2013 Published by Elsevier Ltd.

  1. Probabilistic Assessment of High-Throughput Wireless Sensor Networks.

    PubMed

    Kim, Robin E; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F; Song, Junho

    2016-05-31

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies of WSS network reliability mostly used temporal signal indicators, composed of a small number of packets, to assess network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a large number of packets delivered within the communication, such an approach is not sufficient. Instead, in this study, a model that can probabilistically assess the long-term performance of the network is proposed. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. An empirical limit-state function is then determined and used to estimate the probability of network communication failure. Monte Carlo simulation is adopted and applied to a small-scale and a full-bridge wireless network. By performing the proposed analysis on complex sensor networks, an optimized sensor topology can be achieved.
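The failure-probability step can be sketched with crude Monte Carlo over a limit-state function g, estimating P(g(X) < 0). The paper's empirical limit state and communication-quality variables are not given in the abstract, so the Gaussian packet-reception-ratio model and the 0.9 threshold below are illustrative assumptions:

```python
import random

def failure_probability(limit_state, sample_rv, n=50_000, seed=1):
    """Crude Monte Carlo estimate of P(g(X) < 0): draw n realizations
    of the random variables and count limit-state violations."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample_rv(rng)) < 0)
    return failures / n

# Assumed limit state: communication "fails" when the packet reception
# ratio (PRR) drops below 0.9; PRR modeled as Gaussian for illustration.
g = lambda prr: prr - 0.9
draw = lambda rng: rng.gauss(0.95, 0.03)
p_f = failure_probability(g, draw)
```

For rarer failure events, plain Monte Carlo needs very large n; variance-reduction techniques such as importance sampling are the usual remedy.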

  2. Genotype-Frequency Estimation from High-Throughput Sequencing Data.

    PubMed

    Maruki, Takahiro; Lynch, Michael

    2015-10-01

    Rapidly improving high-throughput sequencing technologies provide unprecedented opportunities for carrying out population-genomic studies with various organisms. To take full advantage of these methods, it is essential to correctly estimate allele and genotype frequencies, and here we present a maximum-likelihood method that accomplishes these tasks. The proposed method fully accounts for uncertainties resulting from sequencing errors and biparental chromosome sampling and yields essentially unbiased estimates with minimal sampling variances at moderately high depths of coverage, regardless of the mating system and structure of the population. Moreover, we have developed statistical tests for examining the significance of polymorphisms and their genotypic deviations from Hardy-Weinberg equilibrium. We examine the performance of the proposed method by computer simulations and apply it to low-coverage human data generated by high-throughput sequencing. The results show that the proposed method improves our ability to carry out population-genomic analyses in important ways. The software package of the proposed method is freely available from https://github.com/Takahiro-Maruki/Package-GFE.
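
    The likelihood machinery such a method builds on can be sketched for a single biallelic site. This toy EM estimator is illustrative only (fixed error rate, invented read counts, no significance tests); the published method is far more complete, but the sketch shows how sequencing error enters the genotype likelihoods:

```python
from math import comb

def read_lik(n_ref, n_alt, p_ref):
    """Binomial likelihood of the read counts given the per-read
    probability p_ref of observing the reference allele."""
    return comb(n_ref + n_alt, n_ref) * p_ref**n_ref * (1 - p_ref)**n_alt

def em_genotype_freqs(counts, eps=0.01, iters=200):
    """EM estimate of genotype frequencies (AA, AB, BB) at one biallelic
    site from per-individual (ref, alt) read counts, with sequencing
    error rate eps."""
    ps = (1 - eps, 0.5, eps)   # P(ref read | AA), P(ref read | AB), P(ref read | BB)
    f = [1 / 3, 1 / 3, 1 / 3]  # initial genotype frequencies
    for _ in range(iters):
        totals = [0.0, 0.0, 0.0]
        for n_ref, n_alt in counts:
            w = [f[g] * read_lik(n_ref, n_alt, ps[g]) for g in range(3)]
            s = sum(w)
            for g in range(3):  # E-step: genotype posteriors per individual
                totals[g] += w[g] / s
        f = [t / len(counts) for t in totals]  # M-step: update frequencies
    return f

# Ten individuals at 8x coverage: 5 look homozygous ref, 3 heterozygous, 2 homozygous alt
counts = [(8, 0)] * 5 + [(4, 4)] * 3 + [(0, 8)] * 2
f_AA, f_AB, f_BB = em_genotype_freqs(counts)
```

    With clean 8x data the estimates converge near (0.5, 0.3, 0.2); at lower coverage the error term in the likelihood is what keeps the estimates essentially unbiased.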

  3. High-throughput fragment screening by affinity LC-MS.

    PubMed

    Duong-Thi, Minh-Dao; Bergström, Maria; Fex, Tomas; Isaksson, Roland; Ohlson, Sten

    2013-02-01

    Fragment screening, an emerging approach for hit finding in drug discovery, has recently been proven effective by its first approved drug, vemurafenib, for cancer treatment. Techniques such as nuclear magnetic resonance, surface plasmon resonance, and isothermal titration calorimetry, each with its own pros and cons, have been employed for screening fragment libraries. As an alternative approach, screening based on high-performance liquid chromatography separation has been developed. In this work, we present weak affinity LC/MS as a method to screen fragments under high-throughput conditions. Affinity-based capillary columns with immobilized thrombin were used to screen a collection of 590 compounds from a fragment library. The collection was divided into 11 mixtures (each containing 35 to 65 fragments) and screened by MS detection. The primary screening was performed in <4 h (corresponding to >3500 fragments per day). Thirty hits were defined, which subsequently entered a secondary screening using an active site-blocked thrombin column for confirmation of specificity. One hit showed selective binding to thrombin with an estimated dissociation constant (KD) in the 0.1 mM range. This study shows that affinity LC/MS is characterized by high throughput, ease of operation, and low consumption of target and fragments, and therefore it promises to be a valuable method for fragment screening.

  4. Application of computational and high-throughput in vitro ...

    EPA Pesticide Factsheets

    Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models. This talk will review a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for their potential to be endocrine disruptors as part of the Endocrine Disruptor Screening Program (EDSP). These methods all start with the use of in vitro assays, e.g. for activity against the estrogen and androgen receptors (ER and AR) and targets in the steroidogenesis and thyroid signaling pathways. Because all individual assays are subject to a variety of noise processes and technology-specific assay artefacts, we have developed methods to create consensus predictions from multiple assays against the same target. The goal of these models is both to robustly predict in vivo activity and to provide quantitative estimates of uncertainty. This talk will describe these models, and how they are validated against both in vitro and in vivo reference chemicals. The U.S. EPA has deemed the in vitro ER model results to be of high enough accuracy to be used as a substitute for the current EDSP Ti

  5. Application of Computational and High-Throughput in vitro ...

    EPA Pesticide Factsheets

    Abstract: There are tens of thousands of man-made chemicals to which humans are exposed, but only a fraction of these have the extensive in vivo toxicity data used in most traditional risk assessments. This lack of data, coupled with concerns about testing costs and animal use, are driving the development of new methods for assessing the risk of toxicity. These methods include the use of in vitro high-throughput screening assays and computational models. This talk will review a variety of high-throughput, non-animal methods being used at the U.S. EPA to screen chemicals for a variety of toxicity endpoints, with a focus on their potential to be endocrine disruptors as part of the Endocrine Disruptor Screening Program (EDSP). These methods all start with the use of in vitro assays, e.g. for activity against the estrogen and androgen receptors (ER and AR) and targets in the steroidogenesis and thyroid signaling pathways. Because all individual assays are subject to a variety of noise processes and technology-specific assay artefacts, we have developed methods to create consensus predictions from multiple assays against the same target. The goal of these models is both to robustly predict in vivo activity and to provide quantitative estimates of uncertainty. This talk will describe these models, and how they are validated against both in vitro and in vivo reference chemicals. The U.S. EPA has deemed the in vitro ER model results to be of high enough accuracy t

  6. Benchmarking Procedures for High-Throughput Context Specific Reconstruction Algorithms

    PubMed Central

    Pacheco, Maria P.; Pfau, Thomas; Sauter, Thomas

    2016-01-01

    Recent progress in high-throughput data acquisition has shifted the focus from data generation to processing and understanding of how to integrate collected information. Context-specific reconstruction based on generic genome-scale models like ReconX or HMR has the potential to become a diagnostic and treatment tool tailored to the analysis of specific individuals. The respective computational algorithms require a high level of predictive power, robustness and sensitivity. Although multiple context-specific reconstruction algorithms were published in the last 10 years, only a fraction of them are suitable for model building based on human high-throughput data. Among other reasons, this might be due to problems arising from the limitation to only one metabolic target function or from arbitrary thresholding. This review describes and analyses common validation methods used for testing model-building algorithms. Two major methods can be distinguished: consistency testing and comparison-based testing. The first is concerned with robustness against noise, e.g., missing data due to the impossibility of distinguishing between the signal and the background of non-specific binding of probes in a microarray experiment, and with whether distinct sets of input expressed genes corresponding to, e.g., different tissues yield distinct models. The latter covers methods comparing sets of functionalities, comparison with existing networks or additional databases. We test those methods on several available algorithms and deduce properties of these algorithms that can be compared with future developments. The set of tests performed can therefore serve as a benchmarking procedure for future algorithms. PMID:26834640
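
    The consistency-testing idea (do models rebuilt from noisy replicates of the same input remain similar?) reduces to comparing sets of retained reactions. A minimal sketch, with hypothetical reaction identifiers:

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of reaction identifiers."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def consistency_score(model_sets):
    """Mean pairwise Jaccard similarity across models rebuilt from noisy
    replicates of the same input data: values near 1 indicate an algorithm
    robust to noise, values near 0 an unstable one."""
    pairs = [(i, j) for i in range(len(model_sets))
             for j in range(i + 1, len(model_sets))]
    return sum(jaccard(model_sets[i], model_sets[j]) for i, j in pairs) / len(pairs)

# Three hypothetical liver models built from perturbed expression inputs
liver_runs = [{"R1", "R2", "R3", "R4"}, {"R1", "R2", "R3"}, {"R1", "R2", "R3", "R5"}]
score = consistency_score(liver_runs)
```

    Comparison-based testing would instead score each reaction set against a reference network or a database of known tissue functionalities rather than against sibling runs.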

  7. Probabilistic Assessment of High-Throughput Wireless Sensor Networks

    PubMed Central

    Kim, Robin E.; Mechitov, Kirill; Sim, Sung-Han; Spencer, Billie F.; Song, Junho

    2016-01-01

    Structural health monitoring (SHM) using wireless smart sensors (WSS) has the potential to provide rich information on the state of a structure. However, because of their distributed nature, maintaining highly robust and reliable networks can be challenging. Assessing WSS network communication quality before and after finalizing a deployment is critical to achieving a successful WSS network for SHM purposes. Early studies on WSS network reliability mostly used temporal signal indicators, composed of a smaller number of packets, to assess network reliability. However, because WSS networks for SHM purposes often require high data throughput, i.e., a large number of packets delivered within each communication session, such an approach is not sufficient. Instead, in this study, a model that can assess, probabilistically, the long-term performance of the network is proposed. The proposed model is based on readily available measured data sets that represent communication quality during high-throughput data transfer. Then, an empirical limit-state function is determined, which is further used to estimate the probability of network communication failure. Monte Carlo simulation is adopted in this paper and applied to a small-scale and a full-bridge wireless network. By performing the proposed analysis on complex sensor networks, an optimized sensor topology can be achieved. PMID:27258270

  8. Iterative ACORN as a high throughput tool in structural genomics.

    PubMed

    Selvanayagam, S; Velmurugan, D; Yamane, T

    2006-08-01

    High throughput macromolecular structure determination is essential in structural genomics, as the amount of available sequence information far exceeds the number of available 3D structures. ACORN, a freely available resource in the CCP4 suite of programs, is a comprehensive and efficient program for phasing in the determination of protein structures when atomic resolution data are available. ACORN with the automatic model-building program ARP/wARP and the refinement program REFMAC is a suitable combination for high throughput structural genomics. ACORN can also be run with secondary structural elements such as helices and sheets as inputs with high resolution data. In situations where ACORN phasing is not sufficient for building the protein model, the fragments (incomplete model/dummy atoms) can again be used as a starting input. Iterative ACORN has proved to work efficiently in the subsequent model-building stages in congerin (PDB ID: 1is3) and catalase (PDB ID: 1gwe), for which models are available.

  9. A high throughput respirometric assay for mitochondrial biogenesis and toxicity

    PubMed Central

    Beeson, Craig C.; Beeson, Gyda C.; Schnellmann, Rick G.

    2010-01-01

    Mitochondria are a common target of toxicity for drugs and other chemicals; such toxicity results in decreased aerobic metabolism and cell death. In contrast, mitochondrial biogenesis restores cell vitality, and there is a need for new agents to induce biogenesis. Current cell-based models of mitochondrial biogenesis or toxicity are inadequate because cultured cell lines are highly glycolytic with minimal aerobic metabolism and altered mitochondrial physiology. In addition, there are no high-throughput, real-time assays that assess mitochondrial function. We adapted primary cultures of renal proximal tubular cells (RPTC) that exhibit in vivo levels of aerobic metabolism, are not glycolytic, and retain higher levels of differentiated functions, and used the Seahorse Biosciences analyzer to measure mitochondrial function in real time in multi-well plates. Using uncoupled respiration as a marker of electron transport chain (ETC) integrity, the nephrotoxicants cisplatin, HgCl2 and gentamicin exhibited mitochondrial toxicity prior to decreases in basal respiration and cell death. Conversely, using FCCP-uncoupled respiration as a marker of maximal ETC activity, 1-(2,5-dimethoxy-4-iodophenyl)-2-aminopropane (DOI), SRT1720, resveratrol, daidzein, and metformin produced mitochondrial biogenesis in RPTC. The merger of the RPTC model and multi-well respirometry results in a single high throughput assay to measure mitochondrial biogenesis, toxicity, and nephrotoxic potential. PMID:20465991

  10. New high-throughput methods of investigating polymer electrolytes

    NASA Astrophysics Data System (ADS)

    Alcock, Hannah J.; White, Oliver C.; Jegelevicius, Grazvydas; Roberts, Matthew R.; Owen, John R.

    2011-03-01

    Polymer electrolyte films have been prepared by solution casting techniques from precursor solutions of a poly(vinylidene fluoride-co-hexafluoropropylene) (PVdF-HFP), lithium-bis(trifluoromethane) sulfonimide (LiTFSI), and propylene carbonate (PC). Arrays of graded composition were characterised by electrochemical impedance spectroscopy (EIS), differential scanning calorimetry (DSC) and X-ray diffraction (XRD) using high throughput techniques. Impedance analysis showed the resistance of the films as a function of LiTFSI, PC and polymer content. The ternary plot of conductivity shows an area that combines solid-like mechanical stability with high conductivity, 1 × 10−5 S cm−1 at the weight composition 0.55/0.15/0.30 PVdF-HFP/LiTFSI/PC, increasing with PC content. In regions with less than a 50 wt% fraction of PVdF-HFP the films were too soft to give meaningful results by this method. The DSC measurements on solvent-free, salt-doped polymers show a reduced crystallinity, and high throughput XRD patterns show that non-polar crystalline phases are suppressed by the presence of LiTFSI and PC.

  11. Proton Diffusion Model for High-Throughput Calculations

    NASA Astrophysics Data System (ADS)

    Wisesa, Pandu; Mueller, Tim

    2013-03-01

    Solid oxide fuel cells (SOFCs) have many advantages over other fuel cells, including high efficiency, a wide choice of fuels, and low cost. The main issue, however, is the high operating temperature of SOFCs, which can be lowered by using an electrolyte material with high ionic conductivity, such as a proton-conducting oxide. Our goal is to identify promising proton-conducting materials in a manner that is time- and cost-efficient through the utilization of high-throughput calculations. We present a model for proton diffusion developed using machine learning techniques, with training data consisting of density functional theory (DFT) calculations on various metal oxides. The built model is tested against other DFT results to see how it performs. The results of the DFT calculations and how the model fares are discussed, with a focus on hydrogen diffusion pathways inside the bulk material.
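
    As a stand-in for the machine-learned model described above, a minimal least-squares fit of a single hypothetical descriptor against DFT-computed barriers illustrates the train-on-DFT, predict-on-new-oxides workflow (the descriptor, units, and all numbers are invented for illustration):

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b; a one-feature stand-in for
    the paper's machine-learned proton-diffusion model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical descriptor (e.g. an O-O distance in Å) vs. DFT migration barrier (eV)
dists = [2.4, 2.5, 2.6, 2.7, 2.8]
barriers = [0.18, 0.25, 0.34, 0.41, 0.50]
a, b = fit_linear(dists, barriers)
pred = a * 2.65 + b  # screen an unseen oxide without running a new DFT calculation
```

    The high-throughput payoff is in the last line: once trained, the surrogate predicts a barrier in microseconds, so thousands of candidate oxides can be screened and only the most promising sent back to DFT for verification.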

  12. High-throughput flow alignment of barcoded hydrogel microparticles†

    PubMed Central

    Chapin, Stephen C.; Pregibon, Daniel C.

    2010-01-01

    Suspension (particle-based) arrays offer several advantages over conventional planar arrays in the detection and quantification of biomolecules, including the use of smaller sample volumes, more favorable probe-target binding kinetics, and rapid probe-set modification. We present a microfluidic system for the rapid alignment of multifunctional hydrogel microparticles designed to bear one or several biomolecule probe regions, as well as a graphical code to identify the embedded probes. Using high-speed imaging, we have developed and optimized a flow-through system that (1) allows for a high particle throughput, (2) ensures proper particle alignment for decoding and target quantification, and (3) can be reliably operated continuously without clogging. A tapered channel flanked by side focusing streams is used to orient the flexible, tablet-shaped particles into a well-ordered flow in the center of the channel. The effects of channel geometry, particle geometry, particle composition, particle loading density, and barcode design are explored to determine the best combination for eventual use in biological assays. Particles in the optimized system move at velocities of ~50 cm s−1 and with throughputs of ~40 particles s−1. Simple physical models and CFD simulations have been used to investigate flow behavior in the device. PMID:19823726

  13. Evaluation of a high throughput starch analysis optimised for wood.

    PubMed

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 21 µg and 40 mg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes.

  14. High-throughput gene mapping in Caenorhabditis elegans.

    PubMed

    Swan, Kathryn A; Curtis, Damian E; McKusick, Kathleen B; Voinov, Alexander V; Mapa, Felipa A; Cancilla, Michael R

    2002-07-01

    Positional cloning of mutations in model genetic systems is a powerful method for the identification of targets of medical and agricultural importance. To facilitate the high-throughput mapping of mutations in Caenorhabditis elegans, we have identified a further 9602 putative new single nucleotide polymorphisms (SNPs) between two C. elegans strains, Bristol N2 and the Hawaiian mapping strain CB4856, by sequencing inserts from a CB4856 genomic DNA library and using an informatics pipeline to compare sequences with the canonical N2 genomic sequence. When combined with data from other laboratories, our marker set of 17,189 SNPs provides even coverage of the complete worm genome. To date, we have confirmed >1099 evenly spaced SNPs (one every 91 ± 56 kb) across the six chromosomes and validated the utility of our SNP marker set and new fluorescence polarization-based genotyping methods for systematic and high-throughput identification of genes in C. elegans by cloning several proprietary genes. We illustrate our approach by recombination mapping and confirmation of the mutation in the cloned gene, dpy-18.

  15. High-Throughput Gene Mapping in Caenorhabditis elegans

    PubMed Central

    Swan, Kathryn A.; Curtis, Damian E.; McKusick, Kathleen B.; Voinov, Alexander V.; Mapa, Felipa A.; Cancilla, Michael R.

    2002-01-01

    Positional cloning of mutations in model genetic systems is a powerful method for the identification of targets of medical and agricultural importance. To facilitate the high-throughput mapping of mutations in Caenorhabditis elegans, we have identified a further 9602 putative new single nucleotide polymorphisms (SNPs) between two C. elegans strains, Bristol N2 and the Hawaiian mapping strain CB4856, by sequencing inserts from a CB4856 genomic DNA library and using an informatics pipeline to compare sequences with the canonical N2 genomic sequence. When combined with data from other laboratories, our marker set of 17,189 SNPs provides even coverage of the complete worm genome. To date, we have confirmed >1099 evenly spaced SNPs (one every 91 ± 56 kb) across the six chromosomes and validated the utility of our SNP marker set and new fluorescence polarization-based genotyping methods for systematic and high-throughput identification of genes in C. elegans by cloning several proprietary genes. We illustrate our approach by recombination mapping and confirmation of the mutation in the cloned gene, dpy-18. [The sequence data described in this paper have been submitted to the NCBI dbSNP data library under accession nos. 4388625–4389689 and GenBank dbSTS under accession nos. 973810–974874. The following individuals and institutions kindly provided reagents, samples, or unpublished information as indicated in the paper: The C. elegans Sequencing Consortium and The Caenorhabditis Genetics Center.] PMID:12097347

  16. Compression of Structured High-Throughput Sequencing Data

    PubMed Central

    Campagne, Fabien; Dorff, Kevin C.; Chambwe, Nyasha; Robinson, James T.; Mesirov, Jill P.

    2013-01-01

    Large biological datasets are being produced at a rapid pace and create substantial storage challenges, particularly in the domain of high-throughput sequencing (HTS). Most approaches currently used to store HTS data are either unable to quickly adapt to the requirements of new sequencing or analysis methods (because they do not support schema evolution), or fail to provide state-of-the-art compression of the datasets. We have devised new approaches to store HTS data that support seamless data schema evolution and compress datasets substantially better than existing approaches. Building on these new approaches, we discuss and demonstrate how a multi-tier data organization can dramatically reduce the storage, computational and network burden of collecting, analyzing, and archiving large sequencing datasets. For instance, we show that spliced RNA-Seq alignments can be stored in less than 4% the size of a BAM file with perfect data fidelity. Compared to the previous compression state of the art, these methods reduce dataset size more than 40% when storing exome, gene expression or DNA methylation datasets. The approaches have been integrated in a comprehensive suite of software tools (http://goby.campagnelab.org) that support common analyses for a range of high-throughput sequencing assays. PMID:24260313
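
    The general principle behind structure-aware storage of HTS data (exploit the data's regularities before handing bytes to a generic compressor) can be seen with sorted alignment positions. This delta-encoding sketch illustrates the idea only; it is not the Goby codec itself:

```python
import random
import zlib

def compress_positions(positions):
    """Bytes after zlib-compressing absolute 4-byte positions."""
    raw = b"".join(p.to_bytes(4, "little") for p in positions)
    return len(zlib.compress(raw, 9))

def compress_deltas(positions):
    """Bytes after zlib-compressing deltas between consecutive positions.
    Sorted alignment starts have small gaps, so the deltas are tiny,
    repetitive, and far more compressible than the absolute values."""
    deltas = [positions[0]] + [b - a for a, b in zip(positions, positions[1:])]
    raw = b"".join(d.to_bytes(4, "little") for d in deltas)
    return len(zlib.compress(raw, 9))

# Simulated sorted alignment start positions along a chromosome
random.seed(0)
positions, pos = [], 0
for _ in range(50_000):
    pos += random.randint(0, 200)  # small gap to the next read
    positions.append(pos)
```

    On data like this the delta stream compresses to a fraction of the size of the absolute stream; the paper's multi-tier organization applies the same reasoning field by field across the whole alignment schema.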

  17. Evaluation of a High Throughput Starch Analysis Optimised for Wood

    PubMed Central

    Bellasio, Chandra; Fini, Alessio; Ferrini, Francesco

    2014-01-01

    Starch is the most important long-term reserve in trees, and the analysis of starch is therefore a useful source of physiological information. Currently published protocols for wood starch analysis impose several limitations, such as long procedures and a neutralization step. The high-throughput standard protocols for starch analysis in food and feed represent a valuable alternative. However, they have not been optimised or tested with woody samples. These have particular chemical and structural characteristics, including the presence of interfering secondary metabolites, low reactivity of starch, and low starch content. In this study, a standard method for starch analysis used for food and feed (AOAC standard method 996.11) was optimised to improve precision and accuracy for the analysis of starch in wood. Key modifications were introduced in the digestion conditions and in the glucose assay. The optimised protocol was then evaluated through 430 starch analyses of standards at known starch content, matrix polysaccharides, and wood collected from three organs (roots, twigs, mature wood) of four species (coniferous and flowering plants). The optimised protocol proved to be remarkably precise and accurate (3%), suitable for high-throughput routine analysis (35 samples a day) of specimens with a starch content between 21 µg and 40 mg. Samples may include lignified organs of coniferous and flowering plants and non-lignified organs, such as leaves, fruits and rhizomes. PMID:24523863

  18. Discriminative motif analysis of high-throughput dataset

    PubMed Central

    Yao, Zizhen; MacQuarrie, Kyle L.; Fong, Abraham P.; Tapscott, Stephen J.; Ruzzo, Walter L.; Gentleman, Robert C.

    2014-01-01

    Motivation: High-throughput ChIP-seq studies typically identify thousands of peaks for a single transcription factor (TF). It is common for traditional motif discovery tools to predict motifs that are statistically significant against a naïve background distribution but are of questionable biological relevance. Results: We describe a simple yet effective algorithm for discovering differential motifs between two sequence datasets that is effective in eliminating systematic biases and scalable to large datasets. Tested on 207 ENCODE ChIP-seq datasets, our method identifies correct motifs in 78% of the datasets with known motifs, demonstrating improvement in both accuracy and efficiency compared with DREME, another state-of-the-art discriminative motif discovery tool. More interestingly, on the remaining more challenging datasets, we identify common technical or biological factors that compromise the motif search results and use advanced features of our tool to control for these factors. We also present case studies demonstrating the ability of our method to detect single base pair differences in the DNA specificity of two similar TFs. Lastly, we demonstrate discovery of key TF motifs involved in tissue specification by examination of high-throughput DNase accessibility data. Availability: The motifRG package is publicly available via the Bioconductor repository. Contact: yzizhen@fhcrc.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24162561
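
    The discriminative idea (score motifs against a matched background rather than a naïve distribution) can be sketched with k-mer enrichment ratios. The motifRG algorithm is considerably more sophisticated; the sequences below are invented:

```python
from collections import Counter

def kmer_counts(seqs, k):
    """Count all overlapping k-mers across a list of DNA sequences."""
    c = Counter()
    for s in seqs:
        for i in range(len(s) - k + 1):
            c[s[i:i + k]] += 1
    return c

def top_discriminative_kmers(fg, bg, k=5, pseudo=1.0, n=3):
    """Rank k-mers by their pseudocount-smoothed enrichment in foreground
    sequences (e.g. ChIP-seq peaks) over a matched background set."""
    fc, bc = kmer_counts(fg, k), kmer_counts(bg, k)
    ft, bt = sum(fc.values()), sum(bc.values())
    def score(km):
        return ((fc[km] + pseudo) / (ft + pseudo)) / ((bc[km] + pseudo) / (bt + pseudo))
    return sorted(set(fc) | set(bc), key=score, reverse=True)[:n]

fg = ["ACGTGACGTG", "TTACGTGATT", "GGACGTGGGG"]  # peaks sharing ACGTG
bg = ["ACACACACAC", "TTTTGGGGTT", "CCCCAAAACC"]  # background lacking it
best = top_discriminative_kmers(fg, bg, k=5, n=1)[0]
```

    Using a background drawn from matched genomic sequence (rather than a uniform base model) is what suppresses the systematic biases, such as GC content, that trip up naïve motif finders.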

  19. A Quantitative High Throughput Assay for Identifying Gametocytocidal Compounds

    PubMed Central

    Tanaka, Takeshi Q.; Dehdashti, Seameen J.; Nguyen, Dac-Trung; McKew, John C.; Zheng, Wei; Williamson, Kim C.

    2013-01-01

    Current antimalarial drug treatment does not effectively kill mature Plasmodium falciparum gametocytes, the parasite stage responsible for malaria transmission from human to human via a mosquito. Consequently, following standard therapy malaria can still be transmitted for over a week after the clearance of asexual parasites. A new generation of malaria drugs with gametocytocidal properties, or a gametocytocidal drug that could be used in combination therapy with currently available antimalarials, is needed to control the spread of the disease and facilitate eradication efforts. We have developed a 1,536-well gametocyte viability assay for the high throughput screening of large compound collections to identify novel compounds with gametocytocidal activity. The signal-to-basal ratio and Z′-factor for this assay were 3.2-fold and 0.68, respectively. The IC50 value of epoxomicin, the positive control compound, was 1.42 ± 0.09 nM, which is comparable to previously reported values. This miniaturized assay significantly reduces the number of gametocytes required for the alamarBlue viability assay, and enables high throughput screening for lead discovery efforts. Additionally, the screen does not require a specialized parasite line; gametocytes from any strain, including field isolates, can be tested. A pilot screen utilizing the commercially available LOPAC library, consisting of 1,280 known compounds, revealed two selective gametocytocidal compounds having 54- and 7.8-fold gametocytocidal selectivity in comparison to their cytotoxicity against the mammalian SH-SY5Y cell line. PMID:23454872
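
    The Z′-factor quoted above is the standard screening-quality statistic Z′ = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg|. A minimal computation with hypothetical control-well signals:

```python
from statistics import mean, stdev

def z_prime(pos, neg):
    """Z'-factor assay-quality metric. Values above roughly 0.5 are
    conventionally taken to indicate an assay robust enough for
    high-throughput screening."""
    return 1 - 3 * (stdev(pos) + stdev(neg)) / abs(mean(pos) - mean(neg))

# Hypothetical viability signals: untreated (pos) vs. drug-killed (neg) control wells
pos = [100, 102, 98, 101, 99]
neg = [10, 11, 9, 10, 10]
zp = z_prime(pos, neg)
```

    Tight control distributions relative to the window between them drive Z′ toward 1; the assay's reported 0.68 comfortably clears the conventional 0.5 threshold.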

  20. High-throughput technology for novel SO2 oxidation catalysts

    PubMed Central

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F

    2011-01-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable analysis method for monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of the elements used for prescreening and of those remaining after screening of the first catalyst generations. PMID:27877427

  1. Management of High-Throughput DNA Sequencing Projects: Alpheus

    PubMed Central

    Miller, Neil A.; Kingsmore, Stephen F.; Farmer, Andrew; Langley, Raymond J.; Mudge, Joann; Crow, John A.; Gonzalez, Alvaro J.; Schilkey, Faye D.; Kim, Ryan J.; van Velkinburgh, Jennifer; May, Gregory D.; Black, C. Forrest; Myers, M. Kathy; Utsey, John P.; Frost, Nicholas S.; Sugarbaker, David J.; Bueno, Raphael; Gullans, Stephen R.; Baxter, Susan M.; Day, Steve W.; Retzel, Ernest F.

    2009-01-01

    High-throughput DNA sequencing has enabled systems biology to begin to address areas in health, agricultural and basic biological research. Concomitant with the opportunities is an absolute necessity to manage significant volumes of high-dimensional and inter-related data and analysis. Alpheus is an analysis pipeline, database and visualization software for use with massively parallel DNA sequencing technologies that feature multi-gigabase throughput characterized by relatively short reads, such as Illumina-Solexa (sequencing-by-synthesis), Roche-454 (pyrosequencing) and Applied Biosystems' SOLiD (sequencing-by-ligation). Alpheus enables alignment to reference sequence(s), detection of variants and enumeration of sequence abundance, including expression levels in transcriptome sequence. Alpheus is able to detect several types of variants, including non-synonymous and synonymous single nucleotide polymorphisms (SNPs), insertions/deletions (indels), premature stop codons, and splice isoforms. Variant detection is aided by the ability to filter variant calls based on consistency, expected allele frequency, sequence quality, coverage, and variant type in order to minimize false positives while maximizing the identification of true positives. Alpheus also enables comparisons of genes with variants between cases and controls or bulk segregant pools. Sequence-based differential expression comparisons can be developed, with data export to SAS JMP Genomics for statistical analysis. PMID:20151039
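
    The variant-call filtering described above (coverage, quality, allele frequency) amounts to a predicate applied to each call record. A minimal sketch; the field names and thresholds are illustrative, not Alpheus's actual schema:

```python
def filter_variants(calls, min_cov=10, min_qual=20.0, min_af=0.2):
    """Keep variant calls that pass coverage, quality and allele-frequency
    thresholds, suppressing likely false positives from sequencing noise."""
    kept = []
    for v in calls:
        af = v["alt_reads"] / v["coverage"]  # observed allele frequency
        if v["coverage"] >= min_cov and v["qual"] >= min_qual and af >= min_af:
            kept.append(v)
    return kept

calls = [
    {"pos": 101, "coverage": 40, "alt_reads": 18, "qual": 50.0},  # passes all filters
    {"pos": 202, "coverage": 6,  "alt_reads": 3,  "qual": 60.0},  # rejected: low coverage
    {"pos": 303, "coverage": 30, "alt_reads": 2,  "qual": 55.0},  # rejected: AF below threshold
]
passing = filter_variants(calls)
```

    Raising the allele-frequency floor toward the expected value (0.5 for a heterozygote, 1.0 for a homozygote) trades sensitivity for specificity, which is the balance the abstract describes.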

  2. High throughput instruments, methods, and informatics for systems biology.

    SciTech Connect

    Sinclair, Michael B.; Cowie, Jim R.; Van Benthem, Mark Hilary; Wylie, Brian Neil; Davidson, George S.; Haaland, David Michael; Timlin, Jerilyn Ann; Aragon, Anthony D.; Keenan, Michael Robert; Boyack, Kevin W.; Thomas, Edward Victor; Werner-Washburne, Margaret C.; Mosquera-Caro, Monica P.; Martinez, M. Juanita; Martin, Shawn Bryan; Willman, Cheryl L.

    2003-12-01

    High throughput instruments and analysis techniques are required in order to make good use of the genomic sequences that have recently become available for many species, including humans. These instruments and methods must work with tens of thousands of genes simultaneously, and must be able to identify the small subsets of those genes that are implicated in the observed phenotypes, or, for instance, in responses to therapies. Microarrays represent one such high throughput method, which continue to find increasingly broad application. This project has improved microarray technology in several important areas. First, we developed the hyperspectral scanner, which has discovered and diagnosed numerous flaws in techniques broadly employed by microarray researchers. Second, we used a series of statistically designed experiments to identify and correct errors in our microarray data to dramatically improve the accuracy, precision, and repeatability of the microarray gene expression data. Third, our research developed new informatics techniques to identify genes with significantly different expression levels. Finally, natural language processing techniques were applied to improve our ability to make use of online literature annotating the important genes. In combination, this research has improved the reliability and precision of laboratory methods and instruments, while also enabling substantially faster analysis and discovery.

  3. High-throughput characterization for solar fuels materials discovery

    NASA Astrophysics Data System (ADS)

    Mitrovic, Slobodan; Becerra, Natalie; Cornell, Earl; Guevarra, Dan; Haber, Joel; Jin, Jian; Jones, Ryan; Kan, Kevin; Marcin, Martin; Newhouse, Paul; Soedarmadji, Edwin; Suram, Santosh; Xiang, Chengxiang; Gregoire, John; High-Throughput Experimentation Team

    2014-03-01

    In this talk I will present the status of the High-Throughput Experimentation (HTE) project of the Joint Center for Artificial Photosynthesis (JCAP). JCAP is an Energy Innovation Hub of the U.S. Department of Energy with a mandate to deliver a solar fuel generator based on an integrated photoelectrochemical cell (PEC). However, efficient and commercially viable catalysts or light absorbers for the PEC do not exist. The mission of HTE is to provide accelerated discovery through combinatorial synthesis and rapid screening of material properties. The HTE pipeline also features high-throughput material characterization using x-ray diffraction and x-ray photoemission spectroscopy (XPS). In this talk I present the currently operating pipeline and focus on our combinatorial XPS efforts to build the largest free database of spectra from mixed-metal oxides, nitrides, sulfides and alloys. This work was performed at the Joint Center for Artificial Photosynthesis, a DOE Energy Innovation Hub, supported through the Office of Science of the U.S. Department of Energy under Award No. DE-SC0004993.

  4. High-throughput assays for DNA gyrase and other topoisomerases

    PubMed Central

    Maxwell, Anthony; Burton, Nicolas P.; O'Hagan, Natasha

    2006-01-01

    We have developed high-throughput microtitre plate-based assays for DNA gyrase and other DNA topoisomerases. These assays exploit the fact that negatively supercoiled plasmids form intermolecular triplexes more efficiently than when they are relaxed. Two assays are presented, one using capture of a plasmid containing a single triplex-forming sequence by an oligonucleotide tethered to the surface of a microtitre plate and subsequent detection by staining with a DNA-specific fluorescent dye. The other uses capture of a plasmid containing two triplex-forming sequences by an oligonucleotide tethered to the surface of a microtitre plate and subsequent detection by a second oligonucleotide that is radiolabelled. The assays are shown to be appropriate for assaying DNA supercoiling by Escherichia coli DNA gyrase and DNA relaxation by eukaryotic topoisomerases I and II, and E.coli topoisomerase IV. The assays are readily adaptable to other enzymes that change DNA supercoiling (e.g. restriction enzymes) and are suitable for use in a high-throughput format. PMID:16936317

  5. A scalable approach for high throughput branch flow filtration.

    PubMed

    Inglis, David W; Herman, Nick

    2013-05-07

    Microfluidic continuous flow filtration methods have the potential for very high size resolution using minimum feature sizes that are larger than the separation size, thereby circumventing the problem of clogging. Branch flow filtration is particularly promising because it has an unlimited dynamic range (ratio of largest passable particle to the smallest separated particle) but suffers from very poor volume throughput because when many branches are used, they cannot be identical if each is to have the same size cut-off. We describe a new iterative approach to the design of branch filtration devices able to overcome this limitation without large dead volumes. This is demonstrated by numerical modelling, fabrication and testing of devices with 20 branches, with dynamic ranges up to 6.9, and high filtration ratios (14-29%) on beads and fungal spores. The filters have a sharp size cutoff (10× depletion for 12% size difference), with large particle rejection equivalent to a 20th order Butterworth low pass filter. The devices are fully scalable, enabling higher throughput and smaller cutoff sizes and they are compatible with ultra low cost fabrication.
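The Butterworth analogy quoted above can be made concrete: if transmission versus particle size is modeled as a 20th-order Butterworth low-pass response, a particle 12% above the cutoff is depleted roughly tenfold, matching the stated sharpness. This model is an interpretation of the analogy, not the authors' design equation.

```python
import math

def butterworth_transmission(size, cutoff, order=20):
    """Fraction of particles of a given size transmitted, modeling the size
    response as an nth-order Butterworth low-pass curve."""
    return 1.0 / math.sqrt(1.0 + (size / cutoff) ** (2 * order))

# A particle 12% above the cutoff passes at roughly 10%, i.e. ~10x depletion,
# while particles well below the cutoff pass essentially unimpeded.
depletion_12pct = butterworth_transmission(1.12, 1.0)
```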

  6. High-throughput nanoparticle catalysis: partial oxidation of propylene.

    PubMed

    Duan, Shici; Kahn, Michael; Senkan, Selim

    2007-02-01

    Partial oxidation of propylene was investigated at 1 atm pressure over Rh/TiO(2) catalysts as a function of reaction temperature, metal loading and particle size using high-throughput methods. Catalysts were prepared by ablating thin sheets of pure rhodium metal using an excimer laser and by collecting the nanoparticles created on the external surfaces of TiO(2) pellets that were placed inside the ablation plume. Rh nanoparticles before the experiments were characterized by transmission electron microscopy (TEM) by collecting them on carbon film. Catalyst evaluations were performed using a high-throughput array channel microreactor system coupled to quadrupole mass spectrometry (MS) and gas chromatography (GC). The reaction conditions were 23% C(3)H(6), 20% O(2) and the balance helium in the feed, 20,000 h(-1) GHSV and a temperature range of 250-325 degrees C. The reaction products included primarily acetone (AT) and to a lesser degree propionaldehyde (PaL) as the C(3) products, together with deep oxidation products COx.

  7. A medium or high throughput protein refolding assay.

    PubMed

    Cowieson, Nathan P; Wensley, Beth; Robin, Gautier; Guncar, Gregor; Forwood, Jade; Hume, David A; Kobe, Bostjan; Martin, Jennifer L

    2008-01-01

    Expression of insoluble protein in E. coli is a major bottleneck of high throughput structural biology projects. Refolding proteins into native conformations from inclusion bodies could significantly increase the number of protein targets that can be taken on to structural studies. This chapter presents a simple assay for screening insoluble protein targets and identifying those that are most amenable to refolding. The assay is based on the observation that when proteins are refolded while bound to metal affinity resin, misfolded proteins are generally not eluted by imidazole. This difference is exploited here to distinguish between folded and misfolded proteins. The assay fits well into a standard high throughput structural biology pipeline, because it begins with the inclusion body preparations that are a byproduct of small-scale, automated expression and purification trials and does not require additional facilities. Two formats of the assay are described, a manual assay that is useful for screening small numbers of targets, and an automated implementation that is useful for large numbers of targets.

  8. High-throughput screening with micro-x-ray fluorescence

    SciTech Connect

    Havrilla, George J.; Miller, Thomasin C.

    2005-06-15

    Micro-x-ray fluorescence (MXRF) is a useful characterization tool for high-throughput screening of combinatorial libraries. Due to the increasing threat of use of chemical warfare (CW) agents both in military actions and against civilians by terrorist extremists, there is a strong push to improve existing methods and develop means for the detection of a broad spectrum of CW agents in a minimal amount of time to increase national security. This paper describes a combinatorial high-throughput screening technique for CW receptor discovery to aid in sensor development. MXRF can screen materials for elemental composition at the mesoscale level (tens to hundreds of micrometers). The key aspect of this work is the use of commercial MXRF instrumentation coupled with the inherent heteroatom elements within the target molecules of the combinatorial reaction to provide rapid and specific identification of lead species. The method is demonstrated by screening an 11-mer oligopeptide library for selective binding of the degradation products of the nerve agent VX. The identified oligopeptides can be used as selective molecular receptors for sensor development. The MXRF screening method is nondestructive, requires minimal sample preparation or special tags for analysis, and the screening time depends on the desired sensitivity.

  10. Structuring intuition with theory: The high-throughput way

    NASA Astrophysics Data System (ADS)

    Fornari, Marco

    2015-03-01

    First principles methodologies have grown in accuracy and applicability to the point where large databases can be built, shared, and analyzed with the goal of predicting novel compositions, optimizing functional properties, and discovering unexpected relationships between the data. In order to be useful to a large community of users, data should be standardized, validated, and distributed. In addition, tools to easily manage large datasets should be made available to effectively lead to materials development. Within the AFLOW consortium we have developed a simple framework to expand, validate, and mine data repositories: the MTFrame. Our minimalistic approach complements AFLOW and other existing high-throughput infrastructures and aims to integrate data generation with data analysis. We present a few examples from our work on materials for energy conversion. Our intent is to pinpoint the usefulness of high-throughput methodologies to guide the discovery process by quantitatively structuring the scientific intuition. This work was supported by ONR-MURI under Contract N00014-13-1-0635 and the Duke University Center for Materials Genomics.

  11. High-throughput screening to enhance oncolytic virus immunotherapy

    PubMed Central

    Allan, KJ; Stojdl, David F; Swift, SL

    2016-01-01

    High-throughput screens can rapidly scan and capture large amounts of information across multiple biological parameters. Although many screens have been designed to uncover potential new therapeutic targets capable of crippling viruses that cause disease, there have been relatively few directed at improving the efficacy of viruses that are used to treat disease. Oncolytic viruses (OVs) are biotherapeutic agents with an inherent specificity for treating malignant disease. Certain OV platforms – including those based on herpes simplex virus, reovirus, and vaccinia virus – have shown success against solid tumors in advanced clinical trials. Yet, many of these OVs have only undergone minimal engineering to solidify tumor specificity, with few extra modifications to manipulate additional factors. Several aspects of the interaction between an OV and a tumor-bearing host have clear value as targets to improve therapeutic outcomes. At the virus level, these include delivery to the tumor, infectivity, productivity, oncolysis, bystander killing, spread, and persistence. At the host level, these include engaging the immune system and manipulating the tumor microenvironment. Here, we review the chemical- and genome-based high-throughput screens that have been performed to manipulate such parameters during OV infection and analyze their impact on therapeutic efficacy. We further explore emerging themes that represent key areas of focus for future research. PMID:27579293

  12. High-throughput technology for novel SO2 oxidation catalysts

    NASA Astrophysics Data System (ADS)

    Loskyll, Jonas; Stoewe, Klaus; Maier, Wilhelm F.

    2011-10-01

    We review the state of the art and explain the need for better SO2 oxidation catalysts for the production of sulfuric acid. A high-throughput technology has been developed for the study of potential catalysts in the oxidation of SO2 to SO3. High-throughput methods are reviewed and the problems encountered with their adaptation to the corrosive conditions of SO2 oxidation are described. We show that while emissivity-corrected infrared thermography (ecIRT) can be used for primary screening, it is prone to errors because of the large variations in the emissivity of the catalyst surface. UV-visible (UV-Vis) spectrometry was selected instead as a reliable method for monitoring the SO2 conversion. Installing plain sugar absorbents at reactor outlets proved valuable for the detection and quantitative removal of SO3 from the product gas before the UV-Vis analysis. We also give an overview of the elements used for prescreening and of those remaining after screening of the first catalyst generations.

  13. High-Throughput Models for Exposure-Based Chemical ...

    EPA Pesticide Factsheets

    The United States Environmental Protection Agency (U.S. EPA) must characterize potential risks to human health and the environment associated with manufacture and use of thousands of chemicals. High-throughput screening (HTS) for biological activity allows the ToxCast research program to prioritize chemical inventories for potential hazard. Similar capabilities for estimating exposure potential would support rapid risk-based prioritization for chemicals with limited information; here, we propose a framework for high-throughput exposure assessment. To demonstrate application, an analysis was conducted that predicts human exposure potential for chemicals and estimates uncertainty in these predictions by comparison to biomonitoring data. We evaluated 1936 chemicals using far-field mass balance human exposure models (USEtox and RAIDAR) and an indicator for indoor and/or consumer use. These predictions were compared to exposures inferred by Bayesian analysis from urine concentrations for 82 chemicals reported in the National Health and Nutrition Examination Survey (NHANES). Joint regression on all factors provided a calibrated consensus prediction, the variance of which serves as an empirical determination of uncertainty for prioritization on absolute exposure potential. Information on use was found to be most predictive; generally, chemicals above the limit of detection in NHANES had consumer/indoor use. Coupled with hazard HTS, exposure HTS can support risk-based prioritization earlier.
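A toy version of the calibration idea, assuming a single predictor: ordinary least squares maps model predictions onto biomonitoring-inferred exposures, and the residual variance serves as an empirical uncertainty. The numbers are synthetic; the actual analysis jointly regresses on multiple factors.

```python
def calibrate(model_pred, inferred):
    """Least-squares line mapping exposure-model predictions onto exposures
    inferred from biomonitoring; the residual variance acts as an empirical
    uncertainty for prioritization (single-predictor sketch)."""
    n = len(model_pred)
    mx = sum(model_pred) / n
    my = sum(inferred) / n
    sxx = sum((x - mx) ** 2 for x in model_pred)
    sxy = sum((x - mx) * (y - my) for x, y in zip(model_pred, inferred))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid_var = sum((y - (intercept + slope * x)) ** 2
                    for x, y in zip(model_pred, inferred)) / (n - 2)
    return slope, intercept, resid_var

# Hypothetical model predictions vs. biomonitoring-inferred exposures
slope, intercept, var = calibrate([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
```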

  14. A High-Throughput Cidality Screen for Mycobacterium Tuberculosis

    PubMed Central

    Kaur, Parvinder; Ghosh, Anirban; Krishnamurthy, Ramya Vadageri; Bhattacharjee, Deepa Gagwani; Achar, Vijayashree; Datta, Santanu; Narayanan, Shridhar; Anbarasu, Anand; Ramaiah, Sudha

    2015-01-01

    Exposure to Mycobacterium tuberculosis (Mtb) aerosols is a major threat to tuberculosis (TB) researchers, even in bio-safety level-3 (BSL-3) facilities. Automation and high-throughput screens (HTS) in BSL3 facilities are essential for minimizing manual aerosol-generating interventions and facilitating TB research. In the present study, we report the development and validation of a high-throughput, 24-well ‘spot-assay’ for selecting bactericidal compounds against Mtb. The bactericidal screen concept was first validated in the fast-growing surrogate Mycobacterium smegmatis (Msm) and subsequently confirmed in Mtb using the following reference anti-tubercular drugs: rifampicin, isoniazid, ofloxacin and ethambutol (RIOE, acting on different targets). The potential use of the spot-assay to select bactericidal compounds from a large library was confirmed by screening on Mtb, with parallel plating by the conventional gold standard method (correlation, r2 = 0.808). An automated spot-assay further enabled an MBC90 determination on resistant and sensitive Mtb clinical isolates. The implementation of the spot-assay in kinetic screens to enumerate residual Mtb after either genetic silencing (anti-sense RNA, AS-RNA) or chemical inhibition corroborated its ability to detect cidality. This relatively simple, economical and quantitative HTS considerably minimized the bio-hazard risk and enabled the selection of novel vulnerable Mtb targets and mycobactericidal compounds. Thus, spot-assays have great potential to impact the TB drug discovery process. PMID:25693161

  15. Image quantification of high-throughput tissue microarray

    NASA Astrophysics Data System (ADS)

    Wu, Jiahua; Dong, Junyu; Zhou, Huiyu

    2006-03-01

    Tissue microarray (TMA) technology allows rapid visualization of molecular targets in thousands of tissue specimens at a time and provides valuable information on expression of proteins within tissues at a cellular and sub-cellular level. TMA technology overcomes the bottleneck of traditional tissue analysis and allows it to catch up with the rapid advances in lead discovery. Studies using TMA on immunohistochemistry (IHC) can produce a large amount of images for interpretation within a very short time. Manual interpretation does not allow accurate quantitative analysis of staining to be undertaken. Automatic image capture and analysis has been shown to be superior to manual interpretation. The aim of this work is to develop a truly high-throughput and fully automated image capture and analysis system. We develop a robust colour segmentation algorithm using hue-saturation-intensity (HSI) colour space to provide quantification of signal intensity and partitioning of staining on high-throughput TMA. Initial segmentation results and quantification data have been achieved on 16,000 TMA colour images over 23 different tissue types.
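A minimal sketch of HSI-based stain segmentation, assuming the standard RGB-to-HSI conversion and hypothetical thresholds (the paper does not give its actual cutoffs or algorithm details):

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert RGB in [0, 1] to (hue in degrees, saturation, intensity)."""
    i = (r + g + b) / 3.0
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        h = 0.0  # achromatic pixel, hue undefined
    else:
        h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
        if b > g:
            h = 360.0 - h
    return h, s, i

def is_positive_stain(pixel, hue_range=(0.0, 60.0), min_sat=0.2, min_int=0.1):
    """Classify a pixel as positively stained (thresholds are hypothetical,
    chosen here for a reddish-brown stain)."""
    h, s, i = rgb_to_hsi(*pixel)
    return hue_range[0] <= h <= hue_range[1] and s >= min_sat and i >= min_int
```

Counting positively classified pixels per spot then yields the signal-intensity and staining-partition measurements the abstract describes.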

  16. High throughput, quantitative analysis of human osteoclast differentiation and activity.

    PubMed

    Diepenhorst, Natalie A; Nowell, Cameron J; Rueda, Patricia; Henriksen, Kim; Pierce, Tracie; Cook, Anna E; Pastoureau, Philippe; Sabatini, Massimo; Charman, William N; Christopoulos, Arthur; Summers, Roger J; Sexton, Patrick M; Langmead, Christopher J

    2017-02-15

    Osteoclasts are multinuclear cells that degrade bone under both physiological and pathophysiological conditions. Osteoclasts are therefore a major target of osteoporosis therapeutics aimed at preserving bone. Consequently, analytical methods for osteoclast activity are useful for the development of novel biomarkers and/or pharmacological agents for the treatment of osteoporosis. The nucleation state of an osteoclast is indicative of its maturation and activity. To date, activity is routinely measured at the population level with only approximate consideration of the nucleation state (an 'osteoclast population' is typically defined as cells with ≥3 nuclei). Using a fluorescent substrate for tartrate-resistant acid phosphatase (TRAP), a routinely used marker of osteoclast activity, we developed a multi-labelled imaging method for quantitative measurement of osteoclast TRAP activity at the single cell level. Automated image analysis enables interrogation of large osteoclast populations in a high throughput manner using open source software. Using this methodology, we investigated the effects of receptor activator of nuclear factor kappa-B ligand (RANK-L) on osteoclast maturation and activity and demonstrated that TRAP activity directly correlates with osteoclast maturity (i.e. nuclei number). This method can be applied to high throughput screening of osteoclast-targeting compounds to determine changes in maturation and activity.

  17. Piezo-thermal Probe Array for High Throughput Applications

    PubMed Central

    Gaitas, Angelo; French, Paddy

    2012-01-01

    Microcantilevers are used in a number of applications including atomic-force microscopy (AFM). In this work, deflection-sensing elements along with heating elements are integrated onto micromachined cantilever arrays to increase sensitivity, and reduce complexity and cost. An array of probes with 5–10 nm gold ultrathin film sensors on silicon substrates for high throughput scanning probe microscopy is developed. The deflection sensitivity is 0.2 ppm/nm. Plots of the change in resistance of the sensing element with displacement are used to calibrate the probes and determine probe contact with the substrate. Topographical scans demonstrate high throughput and nanometer resolution. The heating elements are calibrated and the thermal coefficient of resistance (TCR) is 655 ppm/K. The melting temperature of a material is measured by locally heating the material with the heating element of the cantilever while monitoring the bending with the deflection sensing element. The melting point value measured with this method is in close agreement with the reported value in literature. PMID:23641125
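Given the calibration figures quoted above (deflection sensitivity 0.2 ppm/nm, TCR 655 ppm/K), a measured fractional resistance change converts to deflection or temperature rise by simple division. A sketch, not the authors' code:

```python
DEFLECTION_SENSITIVITY = 0.2e-6  # fractional dR/R per nm (0.2 ppm/nm, quoted)
TCR = 655e-6                     # fractional dR/R per K (655 ppm/K, quoted)

def deflection_nm(delta_r_over_r):
    """Cantilever deflection implied by a fractional resistance change."""
    return delta_r_over_r / DEFLECTION_SENSITIVITY

def temperature_rise_k(delta_r_over_r):
    """Heater temperature rise implied by a fractional resistance change."""
    return delta_r_over_r / TCR
```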

  18. Adaptation to high throughput batch chromatography enhances multivariate screening.

    PubMed

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
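The multivariate layout of the IBC approach, with every well an independent combination of load, wash, and elution conditions, can be sketched by enumerating a full-factorial design across a 96-well plate. The factor names and levels here are hypothetical.

```python
import itertools
import string

# Hypothetical factor levels for an iterative batch chromatography screen
loads = [5, 20, 40]          # protein load, mg per mL resin
wash_salts = [0.0, 0.25]     # wash buffer NaCl, M
elution_phs = [3.0, 3.5]     # elution buffer pH

# Full-factorial design: every combination is one independent experiment
conditions = list(itertools.product(loads, wash_salts, elution_phs))

# Map the 12 combinations onto a 96-well plate, giving 8 replicates of each
wells = [f"{row}{col}" for row in string.ascii_uppercase[:8] for col in range(1, 13)]
plate_map = dict(zip(wells, conditions * (len(wells) // len(conditions))))
```

Because the factors vary jointly rather than one at a time, their interactions can be estimated from a single plate, which is the statistical-power argument the abstract makes.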

  19. A high-throughput screen for antibiotic drug discovery.

    PubMed

    Scanlon, Thomas C; Dostal, Sarah M; Griswold, Karl E

    2014-02-01

    We describe an ultra-high-throughput screening platform enabling discovery and/or engineering of natural product antibiotics. The methodology involves creation of hydrogel-in-oil emulsions in which recombinant microorganisms are co-emulsified with bacterial pathogens; antibiotic activity is assayed by use of a fluorescent viability dye. We have successfully utilized both bulk emulsification and microfluidic technology for the generation of hydrogel microdroplets that are size-compatible with conventional flow cytometry. Hydrogel droplets are ∼25 pL in volume, and can be synthesized and sorted at rates exceeding 3,000 drops/s. Using this technique, we have achieved screening throughputs exceeding 5 million clones/day. Proof-of-concept experiments demonstrate efficient selection of antibiotic-secreting yeast from a vast excess of negative controls. In addition, we have successfully used this technique to screen a metagenomic library for secreted antibiotics that kill the human pathogen Staphylococcus aureus. Our results establish the practical utility of the screening platform, and we anticipate that the accessible nature of our methods will enable others seeking to identify and engineer the next generation of antibacterial biomolecules. © 2013 Wiley Periodicals, Inc.
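The quoted rates are mutually consistent: at 3,000 drops/s, sorting 5 million clones takes under half an hour of active instrument time, comfortably within the stated throughput of over 5 million clones/day. A quick check:

```python
SORT_RATE_HZ = 3000  # droplet sorting rate quoted in the abstract

def sort_time_hours(clones):
    """Active sorting time at the quoted droplet rate."""
    return clones / SORT_RATE_HZ / 3600.0

# 5 million clones need well under half an hour of active sorting
library_sort_time = sort_time_hours(5_000_000)
```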

  20. High-Throughput Single-Cell Manipulation in Brain Tissue

    PubMed Central

    Steinmeyer, Joseph D.; Yanik, Mehmet Fatih

    2012-01-01

    The complexity of neurons and neuronal circuits in brain tissue requires the genetic manipulation, labeling, and tracking of single cells. However, current methods for manipulating cells in brain tissue are limited to either bulk techniques, lacking single-cell accuracy, or manual methods that provide single-cell accuracy but at significantly lower throughputs and repeatability. Here, we demonstrate high-throughput, efficient, reliable, and combinatorial delivery of multiple genetic vectors and reagents into targeted cells within the same tissue sample with single-cell accuracy. Our system automatically loads nanoliter-scale volumes of reagents into a micropipette from multiwell plates, targets and transfects single cells in brain tissues using a robust electroporation technique, and finally preps the micropipette by automated cleaning for repeating the transfection cycle. We demonstrate multi-colored labeling of adjacent cells, both in organotypic and acute slices, and transfection of plasmids encoding different protein isoforms into neurons within the same brain tissue for analysis of their effects on linear dendritic spine density. Our platform could also be used to rapidly deliver, both ex vivo and in vivo, a variety of genetic vectors, including optogenetic and cell-type specific agents, as well as fast-acting reagents such as labeling dyes, calcium sensors, and voltage sensors to manipulate and track neuronal circuit activity at single-cell resolution. PMID:22536416

  2. A High-Throughput Yeast Halo Assay for Bioactive Compounds.

    PubMed

    Bray, Walter; Lokey, R Scott

    2016-09-01

    When a disk of filter paper is impregnated with a cytotoxic or cytostatic drug and added to solid medium seeded with yeast, a visible clear zone forms around the disk whose size depends on the concentration and potency of the drug. This is the traditional "halo" assay and provides a convenient, if low-throughput, read-out of biological activity that has been the mainstay of antifungal and antibiotic testing for decades. Here, we describe a protocol for a high-throughput version of the halo assay, which uses an array of 384 pins to deliver ∼200 nL of stock solutions from compound plates onto single-well plates seeded with yeast. Using a plate reader in the absorbance mode, the resulting halos can be quantified and the data archived in the form of flat files that can be connected to compound databases with standard software. This assay has the convenience associated with the visual readout of the traditional halo assay but uses far less material and can be automated to screen thousands of compounds per day.
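Quantifying halos from the plate reader's absorbance grid could look like the sketch below, where cleared (growth-inhibited) regions transmit more light and therefore read low. The threshold and grid are illustrative, not the protocol's actual parameters.

```python
def halo_area(absorbance, threshold=0.2):
    """Halo size as the count of grid points below a clearance threshold;
    cleared yeast transmits more light, so absorbance reads low there."""
    return sum(1 for row in absorbance for value in row if value < threshold)

# Toy absorbance readings around one pin position
plate_region = [
    [0.9, 0.9, 0.9, 0.9],
    [0.9, 0.1, 0.1, 0.9],
    [0.9, 0.1, 0.1, 0.9],
    [0.9, 0.9, 0.9, 0.9],
]
area = halo_area(plate_region)
```

Storing one such area per compound position yields the flat files the abstract describes, ready to join against a compound database.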

  3. A high-throughput Raman notch filter set

    NASA Astrophysics Data System (ADS)

    Puppels, G. J.; Huizinga, A.; Krabbe, H. W.; de Boer, H. A.; Gijsbers, G.; de Mul, F. F. M.

    1990-12-01

    A chevron-type Raman notch filter (RNF) set is described. It combines a high signal throughput (up to 90% around 1600 cm-1 and ≳80% between 700 and 2700 cm-1) with a laser line suppression of 10(8)-10(9). The filter set can be used to replace the first two dispersion stages in triple-stage Raman monochromators commonly employed in multichannel detection systems. This yields a gain in intensity of the detected Raman signal of a factor of 4. It is shown that in Raman spectrometers with a backscatter geometry, the filter set can also be used to optically couple the microscope and the spectrometer. This leads to a further increase in signal intensity of a factor of 3-4 as compared to the situation where a beam splitter is used. Additional advantages of the RNF set are the fact that signal throughput is almost polarization independent over a large spectral interval and that it offers the possibility to simultaneously record Stokes and anti-Stokes spectra.

  4. Dimensioning storage and computing clusters for efficient high throughput computing

    NASA Astrophysics Data System (ADS)

    Accion, E.; Bria, A.; Bernabeu, G.; Caubet, M.; Delfino, M.; Espinal, X.; Merino, G.; Lopez, F.; Martinez, F.; Planas, E.

    2012-12-01

    Scientific experiments are producing huge amounts of data, and the size of their datasets and total volume of data continue to increase. These data are then processed by researchers belonging to large scientific collaborations, with the Large Hadron Collider being a good example. The focal point of scientific data centers has shifted from efficiently coping with PetaByte-scale storage to delivering quality data-processing throughput. The dimensioning of the internal components in High Throughput Computing (HTC) data centers is of crucial importance to cope with all the activities demanded by the experiments, both online (data acceptance) and offline (data processing, simulation and user analysis). This requires a precise setup involving disk and tape storage services, a computing cluster and the internal networking to prevent bottlenecks, overloads and undesired slowness that lead to loss of CPU cycles and batch-job failures. In this paper we point out relevant features for running a successful data storage and processing service in an intensive HTC environment.
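A back-of-the-envelope dimensioning check of the kind motivated here: the storage system must sustain the aggregate read rate of the whole farm, or CPU cycles are wasted while jobs wait on I/O. The numbers are hypothetical.

```python
def required_storage_bandwidth(cores, mb_per_s_per_job):
    """Aggregate read bandwidth (MB/s) needed to keep every core busy,
    assuming one single-threaded job per core."""
    return cores * mb_per_s_per_job

def is_bottlenecked(cores, mb_per_s_per_job, storage_mb_per_s):
    """True if the storage backend cannot feed the compute farm."""
    return required_storage_bandwidth(cores, mb_per_s_per_job) > storage_mb_per_s

# Hypothetical farm: 4000 cores each streaming 2.5 MB/s needs 10 GB/s of
# aggregate read bandwidth; an 8 GB/s backend would leave cores idle.
demand = required_storage_bandwidth(4000, 2.5)
```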

  5. A robust robotic high-throughput antibody purification platform.

    PubMed

    Schmidt, Peter M; Abdo, Michael; Butcher, Rebecca E; Yap, Min-Yin; Scotney, Pierre D; Ramunno, Melanie L; Martin-Roussety, Genevieve; Owczarek, Catherine; Hardy, Matthew P; Chen, Chao-Guang; Fabri, Louis J

    2016-07-15

    Monoclonal antibodies (mAbs) have become the fastest growing segment in the drug market with annual sales of more than 40 billion US$ in 2013. The selection of lead candidate molecules involves the generation of large repertoires of antibodies from which to choose a final therapeutic candidate. Improvements in the ability to rapidly produce and purify many antibodies in sufficient quantities reduce the lead time for selection, which ultimately impacts on the speed with which an antibody may transition through the research stage and into product development. Miniaturization and automation of chromatography using micro columns (RoboColumns(®) from Atoll GmbH) coupled to an automated liquid handling instrument (ALH; Freedom EVO(®) from Tecan) has been a successful approach to establish high throughput process development platforms. Recent advances in transient gene expression (TGE) using the high-titre Expi293F™ system have enabled recombinant mAb titres of greater than 500 mg/L. These relatively high protein titres reduce the volume required to generate several milligrams of individual antibodies for initial biochemical and biological downstream assays, making TGE in the Expi293F™ system ideally suited to high throughput chromatography on an ALH. The present publication describes a novel platform for purifying Expi293F™-expressed recombinant mAbs directly from cell-free culture supernatant on a Perkin Elmer JANUS-VariSpan ALH equipped with a plate shuttle device. The purification platform allows automated 2-step purification (Protein A-desalting/size exclusion chromatography) of several hundred mAbs per week. The new robotic method can purify mAbs with high recovery (>90%) at sub-milligram level with yields of up to 2 mg from 4 mL of cell-free culture supernatant.
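The quoted figures are self-consistent: a 500 mg/L transient titre over 4 mL of supernatant at greater than 90% recovery yields roughly 1.8 mg, in line with the stated "up to 2 mg". A sketch, with the recovery fraction as an assumption:

```python
def expected_yield_mg(titre_mg_per_l, volume_ml, recovery=0.9):
    """Protein recovered from a small-scale purification; the 0.9 recovery
    fraction reflects the '>90%' quoted in the abstract."""
    return titre_mg_per_l * (volume_ml / 1000.0) * recovery

yield_mg = expected_yield_mg(500, 4)
```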

  6. High-throughput microcavitation bubble induced cellular mechanotransduction

    NASA Astrophysics Data System (ADS)

    Compton, Jonathan Lee

    inhibitor of IP3-induced Ca2+ release. This capability opens the development of a high-throughput screening platform for molecules that modulate cellular mechanotransduction. We have applied this approach to screen the effects of a small set of small molecules in a 96-well plate in less than an hour. These detailed studies offer a basis for the design, development, and implementation of a novel assay to rapidly screen the effects of small molecules on cellular mechanotransduction at high throughput.

  7. Multifunctional encoded particles for high-throughput biomolecule analysis.

    PubMed

    Pregibon, Daniel C; Toner, Mehmet; Doyle, Patrick S

    2007-03-09

    High-throughput screening for genetic analysis, combinatorial chemistry, and clinical diagnostics benefits from multiplexing, which allows for the simultaneous assay of several analytes but necessitates an encoding scheme for molecular identification. Current approaches for multiplexed analysis involve complicated or expensive processes for encoding, functionalizing, or decoding active substrates (particles or surfaces) and often yield a very limited number of analyte-specific codes. We present a method based on continuous-flow lithography that combines particle synthesis, encoding, and probe incorporation into a single process to generate multifunctional particles bearing over a million unique codes. By using such particles, we demonstrate multiplexed, single-fluorescence detection of DNA oligomers with encoded particle libraries that can be scanned rapidly in a flow-through microfluidic channel. Furthermore, we demonstrate the same multiplexed detection with high specificity using individual multiprobe particles.

  8. High-throughput ballistic injection nanorheology to measure cell mechanics

    PubMed Central

    Wu, Pei-Hsun; Hale, Christopher M; Chen, Wei-Chiang; Lee, Jerry S H; Tseng, Yiider; Wirtz, Denis

    2015-01-01

    High-throughput ballistic injection nanorheology is a method for the quantitative study of cell mechanics. Cell mechanics are measured by ballistic injection of submicron particles into the cytoplasm of living cells and tracking the spontaneous displacement of the particles at high spatial resolution. The trajectories of the cytoplasm-embedded particles are transformed into mean-squared displacements, which are subsequently transformed into frequency-dependent viscoelastic moduli and time-dependent creep compliance of the cytoplasm. This method allows for the study of a wide range of cellular conditions, including cells inside a 3D matrix, cells subjected to shear flows and biochemical stimuli, and cells in a live animal. Ballistic injection lasts < 1 min and is followed by overnight incubation. Multiple particle tracking for one cell lasts < 1 min. Forty cells can be examined in < 1 h. PMID:22222790
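
    The first step of the analysis described above, transforming particle trajectories into time-averaged mean-squared displacements, can be sketched as follows. This is a minimal illustration, not the authors' code, and the diffusive test track is synthetic:

```python
import numpy as np

def msd(track, max_lag=None):
    """Time-averaged mean-squared displacement of one 2D trajectory.

    track: (N, 2) array of particle positions (e.g. in micrometres).
    Returns an array out[k-1] = <|r(t+k) - r(t)|^2> for lag k = 1..max_lag.
    """
    track = np.asarray(track, dtype=float)
    n = len(track)
    max_lag = max_lag or n - 1
    out = np.empty(max_lag)
    for k in range(1, max_lag + 1):
        d = track[k:] - track[:-k]          # all displacements at lag k
        out[k - 1] = np.mean(np.sum(d * d, axis=1))
    return out

# Purely diffusive motion gives MSD proportional to lag time; quick check
# on a synthetic 2D random walk with unit-variance steps per axis:
rng = np.random.default_rng(0)
steps = rng.normal(0, 1, size=(10000, 2))
track = np.cumsum(steps, axis=0)
m = msd(track, max_lag=5)                   # m[k-1] is close to 2*k here
```

    In a real pipeline the MSD curves would then be converted to viscoelastic moduli (e.g. via the generalized Stokes-Einstein relation), which is beyond this sketch.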

  9. Statistically invalid classification of high throughput gene expression data.

    PubMed

    Barbash, Shahar; Soreq, Hermona

    2013-01-01

    Classification analysis based on high throughput data is a common feature in neuroscience and other fields of science, with a rapidly increasing impact on both basic biology and disease-related studies. The outcome of such classifications often serves to delineate novel biochemical mechanisms in health and disease states, identify new targets for therapeutic interference, and develop innovative diagnostic approaches. Given the importance of studies of this type, we screened 111 recently-published high-impact manuscripts involving classification analysis of gene expression, and found that 58 of them (53%) based their conclusions on a statistically invalid method which can lead to bias in a statistical sense (a true classification accuracy lower than the reported classification accuracy). In this report we characterize the potential methodological error and its scope, investigate how it is influenced by different experimental parameters, and describe statistically valid methods for avoiding such classification mistakes.
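
    A common form of the error described above is selecting discriminative genes on the full dataset before cross-validation. The sketch below (our illustration, not the paper's code) demonstrates the bias on pure noise: "peeking" feature selection reports high leave-one-out accuracy, while selection repeated inside each fold stays near chance:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, k = 40, 1000, 10
X = rng.normal(size=(n, p))             # pure-noise "expression" matrix
y = np.array([0, 1] * (n // 2))         # arbitrary labels: no real signal

def top_features(X, y, k):
    """Rank genes by absolute correlation with the class labels."""
    yc = y - y.mean()
    fc = X - X.mean(axis=0)
    score = np.abs(fc.T @ yc) / (np.linalg.norm(fc, axis=0) * np.linalg.norm(yc))
    return np.argsort(score)[-k:]

def nearest_centroid(Xtr, ytr, xte):
    c0, c1 = Xtr[ytr == 0].mean(axis=0), Xtr[ytr == 1].mean(axis=0)
    return 0 if np.linalg.norm(xte - c0) < np.linalg.norm(xte - c1) else 1

def loo_accuracy(select_inside_cv):
    feats_all = top_features(X, y, k)   # invalid: selection sees every sample
    hits = 0
    for i in range(n):
        tr = np.arange(n) != i
        feats = top_features(X[tr], y[tr], k) if select_inside_cv else feats_all
        hits += nearest_centroid(X[tr][:, feats], y[tr], X[i, feats]) == y[i]
    return hits / n

biased = loo_accuracy(select_inside_cv=False)  # selection "peeks" at test sample
valid = loo_accuracy(select_inside_cv=True)    # selection repeated per fold
# On pure noise, `valid` stays near 0.5 while `biased` is well above it.
```

    The gap between the two numbers is exactly the bias the authors quantify: the reported accuracy exceeds the true (chance-level) accuracy.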

  10. Predicting Novel Bulk Metallic Glasses via High- Throughput Calculations

    NASA Astrophysics Data System (ADS)

    Perim, E.; Lee, D.; Liu, Y.; Toher, C.; Gong, P.; Li, Y.; Simmons, W. N.; Levy, O.; Vlassak, J.; Schroers, J.; Curtarolo, S.

    Bulk metallic glasses (BMGs) are materials which may combine key properties from crystalline metals, such as high hardness, with others typically exhibited by plastics, such as easy processability. However, the cost of the known BMGs poses a significant obstacle to the development of applications, which has led to a long search for novel, economically viable BMGs. The emergence of high-throughput DFT calculations, such as the library provided by the AFLOWLIB consortium, has provided new tools for materials discovery. We have used this data to develop a new glass-forming descriptor combining structural factors with thermodynamics in order to quickly screen through a large number of alloy systems in the AFLOWLIB database, identifying the most promising systems and the optimal compositions for glass formation. National Science Foundation (DMR-1436151, DMR-1435820, DMR-1436268).

  11. Automated Transition State Theory Calculations for High-Throughput Kinetics.

    PubMed

    Bhoorasingh, Pierre L; Slakman, Belinda L; Seyedzadeh Khanshan, Fariba; Cain, Jason Y; West, Richard H

    2017-09-21

    A scarcity of known chemical kinetic parameters leads to the use of many reaction rate estimates, which are not always sufficiently accurate, in the construction of detailed kinetic models. To reduce the reliance on these estimates and improve the accuracy of predictive kinetic models, we have developed a high-throughput, fully automated, reaction rate calculation method, AutoTST. The algorithm integrates automated saddle-point geometry search methods and a canonical transition state theory kinetics calculator. The automatically calculated reaction rates compare favorably to existing estimated rates. Comparisons against high-level theoretical calculations show that the new automated method performs better than rate estimates when the estimate is made by a poor analogy. The method will improve by accounting for internal rotor contributions and by improving methods to determine molecular symmetry.
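
    The kinetics calculator in such a workflow evaluates the canonical transition state theory expression k(T) = kappa * (kB*T/h) * (Q_TS/Q_reactants) * exp(-E0/RT). A minimal sketch of that formula, with purely illustrative partition-function and barrier values not taken from the paper:

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H = 6.62607015e-34   # Planck constant, J*s
R = 8.314462618      # gas constant, J/(mol*K)

def tst_rate(T, q_ratio, e0_kj_mol, kappa=1.0):
    """Canonical transition state theory rate constant.

    k(T) = kappa * (kB*T/h) * q_ratio * exp(-E0 / (R*T))
    q_ratio   : Q_TS / Q_reactants (its units set the rate's units)
    e0_kj_mol : barrier height E0 in kJ/mol
    kappa     : transmission/tunnelling coefficient
    """
    return kappa * (KB * T / H) * q_ratio * math.exp(-e0_kj_mol * 1e3 / (R * T))

# Illustrative only: a unimolecular step with a 100 kJ/mol barrier at 1000 K.
# The kB*T/h prefactor alone is ~2.08e13 s^-1 at this temperature.
k_1000 = tst_rate(1000.0, q_ratio=1.0, e0_kj_mol=100.0)
```

    A real implementation would compute the partition-function ratio from the optimized saddle-point and reactant geometries, vibrational frequencies, and symmetry numbers.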

  12. High-throughput plastic microlenses fabricated using microinjection molding techniques

    NASA Astrophysics Data System (ADS)

    Appasamy, Sreeram; Li, Weizhuo; Lee, Se Hwan; Boyd, Joseph T.; Ahn, Chong H.

    2005-12-01

    A novel fabrication scheme to develop high-throughput plastic microlenses using injection-molding techniques is realized. The initial microlens mold is fabricated using the well-known reflow technique. The reflow process is optimized to obtain reliable and repeatable microlens patterns. The master mold insert for the injection-molding process is fabricated using metal electroforming. The electroplating process is optimized for obtaining a low stress electroform. Two new plastic materials, cyclo olefin copolymer (COC) and Poly IR 2, are introduced in this work for fabricating microlenses. The plastic microlenses have been characterized in terms of their focal lengths, which range from 200 µm to 1.9 mm. This technique enables high-volume production of plastic microlenses, with cycle times for a single chip on the order of 60 s.
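
    Reflowed microlenses are spherical caps, so the paraxial focal length follows from the cap geometry and the material's refractive index: R = (a^2 + h^2) / (2h) from base radius a and sag h, then f = R / (n - 1) for a plano-convex lens. A small sketch with illustrative dimensions; the COC index value is an approximation, not a figure from the paper:

```python
def cap_radius(a, h):
    """Radius of curvature of a spherical cap with base radius a and sag h."""
    return (a * a + h * h) / (2.0 * h)

def focal_length(a, h, n):
    """Paraxial focal length of a plano-convex lens: f = R / (n - 1)."""
    return cap_radius(a, h) / (n - 1.0)

# Illustrative: a lens with 50 um base radius and 10 um sag in COC (n ~ 1.53)
R = cap_radius(50e-6, 10e-6)        # 130 um radius of curvature
f = focal_length(50e-6, 10e-6, 1.53)  # ~245 um focal length, in metres
```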

  13. High-throughput drawing and testing of metallic glass nanostructures.

    PubMed

    Hasan, Molla; Kumar, Golden

    2017-03-02

    Thermoplastic embossing of metallic glasses promises direct imprinting of metal nanostructures using templates. However, embossing high-aspect-ratio nanostructures faces unworkable flow resistance due to friction and non-wetting conditions at the template interface. Herein, we show that these inherent challenges of embossing can be reversed by thermoplastic drawing using templates. The flow resistance not only remains independent of wetting but also decreases with increasing feature aspect-ratio. Arrays of assembled nanotips, nanowires, and nanotubes with aspect-ratios exceeding 1000 can be produced through controlled elongation and fracture of metallic glass structures. In contrast to embossing, the drawing approach generates two sets of nanostructures upon final fracture; one set remains anchored to the metallic glass substrate while the second set is assembled on the template. This method can be readily adapted for high-throughput fabrication and testing of nanoscale tensile specimens, enabling rapid screening of size-effects in mechanical behavior.

  14. Towards high throughput screening of nanoparticle flotation collectors.

    PubMed

    Abarca, Carla; Yang, Songtao; Pelton, Robert H

    2015-12-15

    To function as flotation collectors for mineral processing, polymeric nanoparticles require a delicate balance of surface properties to give mineral-specific deposition and colloidal stability in high ionic strength alkaline media, while remaining sufficiently hydrophobic to promote flotation. Combinatorial nanoparticle surface modification, in conjunction with high throughput screening, is a promising approach for nanoparticle development. However, efficient automated screening assays are required to reject ineffective particles without having to undergo time consuming flotation testing. Herein we demonstrate that determining critical coagulation concentrations of sodium carbonate, in combination with measuring the advancing water contact angle of nanoparticle-saturated glass surfaces, can be used to screen out ineffective nanoparticles. Finally, none of the nanoparticles in our first library, based on poly(ethylene glycol) methyl ether methacrylate (PEG-methacrylate), were effective flotation collectors because they were too hydrophilic.

  15. High throughput x-ray optics: an overview.

    PubMed

    Gorenstein, P

    1988-04-15

    Several x-ray astronomy missions of the 1990s will contain focusing telescopes with significantly more collecting power than the Einstein Observatory. There is increasing emphasis on spectroscopy. ESA's XMM, with 10^4 cm^2 of effective area, will be the largest. A high throughput facility with over 10^5 cm^2 of effective area and 20 arcsec angular resolution is ultimately needed for various scientific studies, such as high resolution spectroscopic observations of QSOs. At least one of the techniques currently being developed for fabricating x-ray telescopes (automated figuring of flats as parabolic reflectors, replication of cylindrical shells, and the alignment of thin lacquer-coated conical foils) is likely to permit the construction of modular arrays of telescopes with the required area and angular resolution.

  16. Machine Learning for High-Throughput Stress Phenotyping in Plants.

    PubMed

    Singh, Arti; Ganapathysubramanian, Baskar; Singh, Asheesh Kumar; Sarkar, Soumik

    2016-02-01

    Advances in automated and high-throughput imaging technologies have resulted in a deluge of high-resolution images and sensor data of plants. However, extracting patterns and features from this large corpus of data requires the use of machine learning (ML) tools to enable data assimilation and feature identification for stress phenotyping. Four stages of the decision cycle in plant stress phenotyping and plant breeding activities where different ML approaches can be deployed are (i) identification, (ii) classification, (iii) quantification, and (iv) prediction (ICQP). We provide here a comprehensive overview and user-friendly taxonomy of ML tools to enable the plant community to correctly and easily apply the appropriate ML tools and best-practice guidelines for various biotic and abiotic stress traits. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Numerical techniques for high-throughput reflectance interference biosensing

    NASA Astrophysics Data System (ADS)

    Sevenler, Derin; Ünlü, M. Selim

    2016-06-01

    We have developed a robust and rapid computational method for processing the raw spectral data collected from thin film optical interference biosensors. Applied to Interference Reflectance Imaging Sensor (IRIS) measurements, the method yields a 10,000-fold improvement in processing time, unlocking a variety of clinical and scientific applications. Interference biosensors have advantages over similar technologies in certain applications, for example highly multiplexed measurements of molecular kinetics. However, processing raw IRIS data into useful measurements has been prohibitively time consuming for high-throughput studies. Here we describe the implementation of a lookup table (LUT) technique that provides accurate results in far less time than naive methods. An additional benefit is that the LUT method can be used with a wider range of interference layer thicknesses and with experimental configurations that are incompatible with methods requiring a fit to the spectral response.
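
    The LUT idea can be illustrated with a toy single-layer interference model (our simplification, not the IRIS forward model): precompute the expected spectrum over a grid of film thicknesses once, then recover each pixel's thickness by nearest match instead of running a nonlinear fit per pixel:

```python
import numpy as np

wavelengths = np.linspace(450e-9, 650e-9, 40)   # probe wavelengths (m)
n_film = 1.46                                   # assumed SiO2-like film index

def reflectance(thickness):
    """Toy reflectance spectrum of a thin film: cos^2 of half the phase delay."""
    phase = 2 * np.pi * 2 * n_film * thickness / wavelengths
    return np.cos(phase / 2) ** 2

# Build the lookup table once over a thickness grid (0.1 nm steps)
grid = np.arange(0, 300e-9, 0.1e-9)
lut = np.stack([reflectance(t) for t in grid])  # (n_thickness, n_wavelength)

def lut_thickness(spectrum):
    """Nearest tabulated spectrum wins: no per-pixel curve fitting."""
    err = np.sum((lut - spectrum) ** 2, axis=1)
    return grid[np.argmin(err)]

true_t = 123.4e-9
est = lut_thickness(reflectance(true_t))   # recovers true_t to the grid spacing
```

    The table is built once per experimental configuration, so the per-pixel cost drops to one vectorized distance computation, which is where the large speedup over iterative fitting comes from.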

  18. Single-platelet nanomechanics measured by high-throughput cytometry

    NASA Astrophysics Data System (ADS)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2016-10-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  19. Automated high-throughput nanoliter-scale protein crystallization screening.

    PubMed

    Li, Fenglei; Robinson, Howard; Yeung, Edward S

    2005-12-01

    A highly efficient method is developed for automated high-throughput screening of nanoliter-scale protein crystallization. The system integrates liquid dispensing, crystallization and detection. The automated liquid dispensing system handles nanoliters of protein and various combinations of precipitants in parallel to access diverse regions of the phase diagram. A new detection scheme, native fluorescence, with complementary visible-light detection is employed for monitoring the progress of crystallization. This detection mode can distinguish protein crystals from inorganic crystals in a nondestructive manner. A gas-permeable membrane covering the microwells simplifies evaporation rate control and probes extended conditions in the phase diagram. The system was successfully demonstrated for the screening of lysozyme crystallization under 81 different conditions.

  20. Measuring growth rate in high-throughput growth phenotyping.

    PubMed

    Blomberg, Anders

    2011-02-01

    Growth rate is an important variable and parameter in biology with a central role in evolutionary, functional genomics, and systems biology studies. In this review the pros and cons of the different technologies presently available for high-throughput measurements of growth rate are discussed. Growth rate can be measured in liquid microcultivation of individual strains, in competition between strains, as growing colonies on agar, as division of individual cells, and estimated from molecular reporters. Irrespective of methodology, statistical issues such as spatial biases and batch effects are crucial to investigate and correct for to ensure low false discovery rates. The rather low correlations between studies indicate that cross-laboratory comparison and standardization are pressing issues for assuring high-quality and comparable growth-rate data. Copyright © 2010 Elsevier Ltd. All rights reserved.
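
    Whatever the cultivation platform, the specific growth rate is commonly estimated as the slope of log(OD) versus time during the exponential phase. A minimal sketch on synthetic data, not tied to any particular instrument or to this review's methods:

```python
import numpy as np

def growth_rate(times_h, od):
    """Specific growth rate mu (per hour) from a log-linear fit of OD vs time.

    Assumes the points passed in all lie within the exponential phase,
    where OD(t) = OD0 * exp(mu * t).
    """
    slope, _intercept = np.polyfit(times_h, np.log(od), 1)
    return slope

def doubling_time(mu):
    """Doubling time (hours) from the specific growth rate."""
    return np.log(2) / mu

# Synthetic exponential culture: OD = 0.05 * exp(0.6 * t)
t = np.linspace(0, 5, 11)
od = 0.05 * np.exp(0.6 * t)
mu = growth_rate(t, od)      # recovers 0.6 per hour
td = doubling_time(mu)       # ~1.16 h
```

    Real readings need preprocessing first (blank subtraction, exponential-window detection), and the batch and spatial effects discussed above must be corrected before rates are compared across plates.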

  1. Automated, high-throughput IgG-antibody glycoprofiling platform.

    PubMed

    Stöckmann, Henning; Adamczyk, Barbara; Hayes, Jerrard; Rudd, Pauline M

    2013-09-17

    One of today's key challenges is the ability to decode the functions of complex carbohydrates in various biological contexts. To generate high-quality glycomics data in a high-throughput fashion, we developed a robotized and low-cost N-glycan analysis platform for glycoprofiling of immunoglobulin G antibodies (IgG), which are central players of the immune system and of vital importance in the biopharmaceutical industry. The key features include (a) rapid IgG affinity purification and sample concentration, (b) protein denaturation and glycan release on a multiwell filtration device, (c) glycan purification on solid-supported hydrazide, and (d) glycan quantification by ultra performance liquid chromatography. The sample preparation workflow was automated using a robotic liquid-handling workstation, allowing the preparation of 96 samples (or multiples thereof) in 22 h with excellent reproducibility and, thus, should greatly facilitate biomarker discovery and glycosylation monitoring of therapeutic IgGs.

  2. Single-platelet nanomechanics measured by high-throughput cytometry

    NASA Astrophysics Data System (ADS)

    Myers, David R.; Qiu, Yongzhi; Fay, Meredith E.; Tennenbaum, Michael; Chester, Daniel; Cuadrado, Jonas; Sakurai, Yumiko; Baek, Jong; Tran, Reginald; Ciciliano, Jordan C.; Ahn, Byungwook; Mannino, Robert G.; Bunting, Silvia T.; Bennett, Carolyn; Briones, Michael; Fernandez-Nieves, Alberto; Smith, Michael L.; Brown, Ashley C.; Sulchek, Todd; Lam, Wilbur A.

    2017-02-01

    Haemostasis occurs at sites of vascular injury, where flowing blood forms a clot, a dynamic and heterogeneous fibrin-based biomaterial. Paramount in the clot's capability to stem haemorrhage are its changing mechanical properties, the major drivers of which are the contractile forces exerted by platelets against the fibrin scaffold. However, how platelets transduce microenvironmental cues to mediate contraction and alter clot mechanics is unknown. This is clinically relevant, as overly softened and stiffened clots are associated with bleeding and thrombotic disorders. Here, we report a high-throughput hydrogel-based platelet-contraction cytometer that quantifies single-platelet contraction forces in different clot microenvironments. We also show that platelets, via the Rho/ROCK pathway, synergistically couple mechanical and biochemical inputs to mediate contraction. Moreover, highly contractile platelet subpopulations present in healthy controls are conspicuously absent in a subset of patients with undiagnosed bleeding disorders, and therefore may function as a clinical diagnostic biophysical biomarker.

  3. High-Throughput Genomics Enhances Tomato Breeding Efficiency

    PubMed Central

    Barone, A; Di Matteo, A; Carputo, D; Frusciante, L

    2009-01-01

    Tomato (Solanum lycopersicum) is considered a model plant species for a group of economically important crops, such as potato, pepper, and eggplant, since it exhibits a reduced genomic size (950 Mb), a short generation time, and routine transformation technologies. Moreover, it shares with the other Solanaceous plants the same haploid chromosome number and a high level of conserved genomic organization. Finally, many genomic and genetic resources are currently available for tomato, and the sequencing of its genome is in progress. These features make tomato an ideal species for theoretical studies and practical applications in the genomics field. The present review describes how structural genomics assists the selection of new varieties resistant to pathogens that cause damage to this crop. Many molecular markers highly linked to resistance genes and cloned resistance genes are available and could be used for a high-throughput screening of multiresistant varieties. Moreover, a new genomics-assisted breeding approach for improving fruit quality is presented and discussed. It relies on the identification of genetic mechanisms controlling the trait of interest through functional genomics tools. Following this approach, polymorphisms in major gene sequences responsible for variability in the expression of the trait under study are then exploited for tracking favourable allele combinations simultaneously in breeding programs using high-throughput genomic technologies. The aim is to pyramid, in the genetic background of commercial cultivars, alleles that increase their performance. In conclusion, tomato breeding strategies supported by advanced technologies are expected to target increased productivity and lower costs of improved genotypes, even for complex traits. PMID:19721805

  4. High-throughput purification of affinity-tagged recombinant proteins.

    PubMed

    Wiesler, Simone C; Weinzierl, Robert O J

    2012-08-26

    X-ray crystallography is the method of choice for obtaining a detailed view of the structure of proteins. Such studies need to be complemented by further biochemical analyses to obtain detailed insights into structure/function relationships. Advances in oligonucleotide- and gene synthesis technology make large-scale mutagenesis strategies increasingly feasible, including the substitution of target residues by all 19 other amino acids. Gain- or loss-of-function phenotypes then allow systematic conclusions to be drawn, such as the contribution of particular residues to catalytic activity, protein stability and/or protein-protein interaction specificity. In order to attribute the different phenotypes to the nature of the mutation--rather than to fluctuating experimental conditions--it is vital to purify and analyse the proteins in a controlled and reproducible manner. High-throughput strategies and the automation of manual protocols on robotic liquid-handling platforms have created opportunities to perform such complex molecular biological procedures with little human intervention and minimal error rates. Here, we present a general method for the purification of His-tagged recombinant proteins in a high-throughput manner. In a recent study, we applied this method to a detailed structure-function investigation of TFIIB, a component of the basal transcription machinery. TFIIB is indispensable for promoter-directed transcription in vitro and is essential for the recruitment of RNA polymerase into a preinitiation complex. TFIIB contains a flexible linker domain that penetrates the active site cleft of RNA polymerase. This linker domain confers two biochemically quantifiable activities on TFIIB, namely (i) the stimulation of the catalytic activity during the 'abortive' stage of transcript initiation, and (ii) an additional contribution to the specific recruitment of RNA polymerase into the preinitiation complex. We exploited the high-throughput purification method to

  5. Interpretation of mass spectrometry data for high-throughput proteomics.

    PubMed

    Chamrad, Daniel C; Koerting, Gerhard; Gobom, Johan; Thiele, Herbert; Klose, Joachim; Meyer, Helmut E; Blueggel, Martin

    2003-08-01

    Recent developments in proteomics have revealed a bottleneck in bioinformatics: high-quality interpretation of acquired MS data. The ability to generate thousands of MS spectra per day, and the demand for this, makes manual methods inadequate for analysis and underlines the need to transfer the advanced capabilities of an expert human user into sophisticated MS interpretation algorithms. The identification rate in current high-throughput proteomics studies is not only a matter of instrumentation. We present software for high-throughput PMF identification, which enables robust and confident protein identification at higher rates. This has been achieved by automated calibration, peak rejection, and use of a meta search approach which employs various PMF search engines. The automatic calibration consists of a dynamic, spectral information-dependent algorithm, which combines various known calibration methods and iteratively establishes an optimised calibration. The peak rejection algorithm filters signals that are unrelated to the analysed protein by use of automatically generated and dataset-dependent exclusion lists. In the "meta search" several known PMF search engines are triggered and their results are merged by use of a meta score. The significance of the meta score was assessed by simulation of PMF identification with 10,000 artificial spectra resembling a data situation close to the measured dataset. By means of this simulation the meta score is linked to expectation values as a statistical measure. The presented software is part of the proteome database ProteinScape which links the information derived from MS data to other relevant proteomics data. We demonstrate the performance of the presented system with MS data from 1891 PMF spectra. As a result of automatic calibration and peak rejection the identification rate increased from 6% to 44%.
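
    The exact meta score used in ProteinScape is not reproduced here; the sketch below shows one generic way to merge incomparable per-engine scores, min-max normalizing each engine's scores to [0, 1] before summing. Engine scales, accessions, and score values are invented:

```python
def meta_scores(engine_results):
    """Merge per-engine protein scores into a single ranking.

    engine_results: list of dicts mapping protein accession -> raw score,
    one dict per PMF search engine. Raw scores are not comparable across
    engines, so each engine's scores are min-max normalized to [0, 1]
    (a generic scheme, not the specific meta score used by ProteinScape).
    """
    combined = {}
    for results in engine_results:
        lo, hi = min(results.values()), max(results.values())
        span = (hi - lo) or 1.0             # guard against a constant engine
        for acc, score in results.items():
            combined[acc] = combined.get(acc, 0.0) + (score - lo) / span
    # Highest combined score first
    return sorted(combined.items(), key=lambda kv: kv[1], reverse=True)

ranking = meta_scores([
    {"P1": 120.0, "P2": 80.0, "P3": 60.0},  # e.g. one engine's score scale
    {"P1": 3.0, "P3": 2.1},                 # a second engine, different scale
])
# P1 leads: it is top-scored by both engines
```

    The paper goes further by calibrating the merged score against 10,000 simulated spectra to attach expectation values; that statistical calibration is the part a simple sum cannot replace.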

  6. High-Throughput Next-Generation Sequencing of Polioviruses.

    PubMed

    Montmayeur, Anna M; Ng, Terry Fei Fan; Schmidt, Alexander; Zhao, Kun; Magaña, Laura; Iber, Jane; Castro, Christina J; Chen, Qi; Henderson, Elizabeth; Ramos, Edward; Shaw, Jing; Tatusov, Roman L; Dybdahl-Sissoko, Naomi; Endegue-Zanga, Marie Claire; Adeniji, Johnson A; Oberste, M Steven; Burns, Cara C

    2017-02-01

    The poliovirus (PV) is currently targeted for worldwide eradication and containment. Sanger-based sequencing of the viral protein 1 (VP1) capsid region is currently the standard method for PV surveillance. However, the whole-genome sequence is sometimes needed for higher resolution global surveillance. In this study, we optimized whole-genome sequencing protocols for poliovirus isolates and FTA cards using next-generation sequencing (NGS), aiming for high sequence coverage, efficiency, and throughput. We found that DNase treatment of poliovirus RNA followed by random reverse transcription (RT), amplification, and the use of the Nextera XT DNA library preparation kit produced significantly better results than other preparations. The average viral reads per total reads, a measurement of efficiency, was as high as 84.2% ± 15.6%. PV genomes covering >99 to 100% of the reference length were obtained and validated with Sanger sequencing. A total of 52 PV genomes were generated, multiplexing as many as 64 samples in a single Illumina MiSeq run. This high-throughput, sequence-independent NGS approach facilitated the detection of a diverse range of PVs, especially for those in vaccine-derived polioviruses (VDPV), circulating VDPV, or immunodeficiency-related VDPV. In contrast to results from previous studies on other viruses, our results showed that filtration and nuclease treatment did not discernibly increase the sequencing efficiency of PV isolates. However, DNase treatment after nucleic acid extraction to remove host DNA significantly improved the sequencing results. This NGS method has been successfully implemented to generate PV genomes for molecular epidemiology of the most recent PV isolates. Additionally, the ability to obtain full PV genomes from FTA cards will aid in facilitating global poliovirus surveillance.

  7. High-throughput identification of protein localization dependency networks.

    PubMed

    Christen, Beat; Fero, Michael J; Hillson, Nathan J; Bowman, Grant; Hong, Sun-Hae; Shapiro, Lucy; McAdams, Harley H

    2010-03-09

    Bacterial cells are highly organized with many protein complexes and DNA loci dynamically positioned to distinct subcellular sites over the course of a cell cycle. Such dynamic protein localization is essential for polar organelle development, establishment of asymmetry, and chromosome replication during the Caulobacter crescentus cell cycle. We used a fluorescence microscopy screen optimized for high throughput to find strains with anomalous temporal or spatial protein localization patterns in transposon-generated mutant libraries. Automated image acquisition and analysis allowed us to identify genes that affect the localization of two polar cell cycle histidine kinases, PleC and DivJ, and the pole-specific pili protein CpaE, each tagged with a different fluorescent marker in a single strain. Four metrics characterizing the observed localization patterns of each of the three labeled proteins were extracted for hundreds of cell images from each of 854 mapped mutant strains. Using cluster analysis of the resulting set of 12-element vectors for each of these strains, we identified 52 strains with mutations that affected the localization pattern of the three tagged proteins. This information, combined with quantitative localization data from epistasis experiments, also identified all previously known proteins affecting such localization. These studies provide insights into factors affecting the PleC/DivJ localization network and into regulatory links between the localization of the pili assembly protein CpaE and the kinase localization pathway. Our high-throughput screening methodology can be adapted readily to any sequenced bacterial species, opening the potential for databases of localization regulatory networks across species, and investigation of localization network phylogenies.
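
    Each strain is thus reduced to a 12-element vector (four metrics for each of three tagged proteins) before clustering. As a simplified stand-in for the paper's cluster analysis, the sketch below z-scores each metric across the library and flags strains with any extreme value; all data are synthetic:

```python
import numpy as np

def flag_anomalous(strain_vectors, n_sigma=3.0):
    """Flag strains whose localization-metric vector is an outlier.

    strain_vectors: (n_strains, 12) array, one row per mutant strain
    (4 metrics for each of 3 tagged proteins). A simple stand-in for the
    paper's cluster analysis: z-score each metric against the whole
    library, then flag strains with any |z| above n_sigma.
    """
    v = np.asarray(strain_vectors, dtype=float)
    z = (v - v.mean(axis=0)) / v.std(axis=0)
    return np.where(np.any(np.abs(z) > n_sigma, axis=1))[0]

rng = np.random.default_rng(7)
library = rng.normal(0, 1, size=(854, 12))  # mostly wild-type-like strains
library[10] += 8.0                          # one strain with shifted metrics
hits = flag_anomalous(library, n_sigma=4.0)  # strain 10 is among the hits
```

    A proper reproduction would cluster the vectors (e.g. hierarchically) rather than threshold them, since correlated combinations of moderate shifts also mark real localization phenotypes.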

  8. Surrogate-assisted feature extraction for high-throughput phenotyping.

    PubMed

    Yu, Sheng; Chakrabortty, Abhishek; Liao, Katherine P; Cai, Tianrun; Ananthakrishnan, Ashwin N; Gainer, Vivian S; Churchill, Susanne E; Szolovits, Peter; Murphy, Shawn N; Kohane, Isaac S; Cai, Tianxi

    2017-04-01

    Phenotyping algorithms are capable of accurately identifying patients with specific phenotypes from within electronic medical records systems. However, developing phenotyping algorithms in a scalable way remains a challenge due to the extensive human resources required. This paper introduces a high-throughput unsupervised feature selection method, which improves the robustness and scalability of electronic medical record phenotyping without compromising its accuracy. The proposed Surrogate-Assisted Feature Extraction (SAFE) method selects candidate features from a pool of comprehensive medical concepts found in publicly available knowledge sources. The target phenotype's International Classification of Diseases, Ninth Revision and natural language processing counts, acting as noisy surrogates to the gold-standard labels, are used to create silver-standard labels. Candidate features highly predictive of the silver-standard labels are selected as the final features. Algorithms were trained to identify patients with coronary artery disease, rheumatoid arthritis, Crohn's disease, and ulcerative colitis using various numbers of labels to compare the performance of features selected by SAFE, a previously published automated feature extraction for phenotyping procedure, and domain experts. The out-of-sample area under the receiver operating characteristic curve and F-score from SAFE algorithms were remarkably higher than those from the other two, especially at small label sizes. SAFE advances high-throughput phenotyping methods by automatically selecting a succinct set of informative features for algorithm training, which in turn reduces overfitting and the needed number of gold-standard labels. SAFE also potentially identifies important features missed by automated feature extraction for phenotyping or experts.
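
    A much-simplified sketch of the surrogate idea follows: ICD-9 and NLP mention counts define silver-standard labels, and candidate features most correlated with those labels are kept. All names and data below are hypothetical, and the plain correlation screen stands in for SAFE's actual selection procedure:

```python
import numpy as np

def surrogate_select(features, icd_count, nlp_count, top_k=5, threshold=1):
    """Surrogate-assisted feature screening (simplified sketch, not SAFE itself).

    features : (n_patients, n_features) counts of candidate medical concepts
    icd_count, nlp_count : per-patient counts of the target phenotype's
        ICD-9 code and NLP mentions, used as noisy surrogates
    Patients with both surrogate counts >= threshold form the silver-standard
    cases; the top_k features most correlated with that label are kept.
    """
    silver = ((icd_count >= threshold) & (nlp_count >= threshold)).astype(float)
    sc = silver - silver.mean()
    fc = features - features.mean(axis=0)
    corr = fc.T @ sc / (np.linalg.norm(fc, axis=0) * np.linalg.norm(sc) + 1e-12)
    return np.argsort(np.abs(corr))[::-1][:top_k]

# Synthetic cohort: 500 patients, 50 candidate concepts, one truly informative
rng = np.random.default_rng(3)
n = 500
icd = rng.poisson(1.0, n)
nlp = rng.poisson(1.0, n)
X = rng.poisson(1.0, (n, 50)).astype(float)
X[:, 3] += 4.0 * ((icd >= 1) & (nlp >= 1))   # concept 3 tracks the phenotype
picked = surrogate_select(X, icd, nlp)       # concept 3 ranks first
```

    The appeal is that no gold-standard chart review is needed at this screening stage; expert-labeled patients are only required later, to train the final classifier on the selected features.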

  9. Proteome-wide detection and quantitative analysis of irreversible cysteine oxidation using long column UPLC-pSRM

    PubMed Central

    Lee, Chia-Fang; Paull, Tanya T.; Person, Maria D.

    2013-01-01

    Reactive oxygen species (ROS) play an important role in normal biological functions and pathological processes. ROS are one of the driving forces for oxidizing proteins, especially on cysteine thiols. The labile, transient, and dynamic nature of oxidative modifications poses enormous technical challenges for both accurate modification site determination and quantitation of cysteine thiols. The present study describes a mass spectrometry-based approach that allows effective discovery and quantification of irreversible cysteine modifications. The utilization of a long reverse phase column provides high-resolution chromatography to separate different forms of modified cysteine thiols from protein complexes or cell lysates. This FT-MS approach enabled detection and quantitation of ATM complex cysteine sulfoxidation states using Skyline MS1 filtering. When we applied the long column UPLC-MS/MS analysis, 61 and 44 peptides from cell lysates and cells were identified with cysteine modifications in response to in vitro and in vivo H2O2 oxidation, respectively. Long column Ultra High Pressure Liquid Chromatography-pseudo Selected Reaction Monitoring (UPLC-pSRM) was then developed to monitor the oxidative level of cysteine thiols in cell lysate under varying concentrations of H2O2 treatment. From UPLC-pSRM analysis, the dynamic conversion of sulfinic acid (S-O2H) and sulfonic acid (S-O3H) was observed within nucleoside diphosphate kinase (Nm23-H1) and heat shock 70 kDa protein 8 (Hsc70). These methods are suitable for proteome-wide studies, providing a highly sensitive, straightforward approach to identify proteins containing redox-sensitive cysteine thiols in biological systems. PMID:23964713

  10. A Primer on High-Throughput Computing for Genomic Selection

    PubMed Central

    Wu, Xiao-Lin; Beissinger, Timothy M.; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J. M.; Weigel, Kent A.; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high-throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin–Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized
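As an illustration of the batch-processing idea (not the authors' pipeline), the sketch below runs hypothetical per-trait evaluation jobs serially and then concurrently; a thread pool stands in for the distributed cluster that real HTC middleware would schedule:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def evaluate_trait(trait):
    """Stand-in for one genomic-prediction job (hypothetical workload)."""
    time.sleep(0.2)                      # pretend the model is being trained
    return trait, f"breeding values for {trait}"

traits = ["milk_yield", "fertility", "longevity", "udder_health"]

# Sequential baseline: one trait evaluated after another.
t0 = time.perf_counter()
serial = dict(evaluate_trait(t) for t in traits)
serial_time = time.perf_counter() - t0

# HTC-style batch: the same independent jobs run concurrently.
t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=len(traits)) as pool:
    parallel = dict(pool.map(evaluate_trait, traits))
concurrent_time = time.perf_counter() - t0

print(f"serial: {serial_time:.2f}s, concurrent: {concurrent_time:.2f}s")
```

Because the per-trait evaluations are independent, the concurrent run finishes in roughly the time of the slowest single job rather than the sum of all jobs, which is the throughput gain the abstract describes.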

  11. Quantitative high throughput analytics to support polysaccharide production process development.

    PubMed

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  12. A primer on high-throughput computing for genomic selection.

    PubMed

    Wu, Xiao-Lin; Beissinger, Timothy M; Bauck, Stewart; Woodward, Brent; Rosa, Guilherme J M; Weigel, Kent A; Gatti, Natalia de Leon; Gianola, Daniel

    2011-01-01

    High-throughput computing (HTC) uses computer clusters to solve advanced computational problems, with the goal of accomplishing high-throughput over relatively long periods of time. In genomic selection, for example, a set of markers covering the entire genome is used to train a model based on known data, and the resulting model is used to predict the genetic merit of selection candidates. Sophisticated models are very computationally demanding and, with several traits to be evaluated sequentially, computing time is long, and output is low. In this paper, we present scenarios and basic principles of how HTC can be used in genomic selection, implemented using various techniques from simple batch processing to pipelining in distributed computer clusters. Various scripting languages, such as shell scripting, Perl, and R, are also very useful to devise pipelines. By pipelining, we can reduce total computing time and consequently increase throughput. In comparison to the traditional data processing pipeline residing on the central processors, performing general-purpose computation on a graphics processing unit provides a new-generation approach to massive parallel computing in genomic selection. While the concept of HTC may still be new to many researchers in animal breeding, plant breeding, and genetics, HTC infrastructures have already been built in many institutions, such as the University of Wisconsin-Madison, which can be leveraged for genomic selection, in terms of central processing unit capacity, network connectivity, storage availability, and middleware connectivity. Exploring existing HTC infrastructures as well as general-purpose computing environments will further expand our capability to meet increasing computing demands posed by unprecedented genomic data that we have today. We anticipate that HTC will impact genomic selection via better statistical models, faster solutions, and more competitive products (e.g., from design of marker panels to realized

  13. Microgradient-heaters as tools for high-throughput experimentation.

    PubMed

    Meyer, Robert; Hamann, Sven; Ehmann, Michael; Thienhaus, Sigurd; Jaeger, Stefanie; Thiede, Tobias; Devi, Anjana; Fischer, Roland A; Ludwig, Alfred

    2012-10-08

    A microgradient-heater (MGH) was developed, and its feasibility as a tool for high-throughput materials science experimentation was tested. The MGH is derived from microhot plate (MHP) systems and allows combinatorial thermal processing on the micro/nano scale. The temperature gradient is adjustable by the substrate material. For an Au-coated MGH membrane a temperature drop from 605 to 100 °C was measured over a distance of 965 μm, resulting in an average temperature change of 0.52 K/μm. As a proof of principle, we demonstrate the feasibility of MGHs using the example of a chemical vapor deposition (CVD) process. The results show discontinuous changes in surface morphology within a continuous TiO2 film. Furthermore, the MGH can be used to gain insights into the energetic relations of film growth processes, giving it the potential for microcalorimetry measurements.

  14. Interactive Visual Analysis of High Throughput Text Streams

    SciTech Connect

    Steed, Chad A; Potok, Thomas E; Patton, Robert M; Goodall, John R; Maness, Christopher S; Senter, James K; Potok, Thomas E

    2012-01-01

    The scale, velocity, and dynamic nature of large scale social media systems like Twitter demand a new set of visual analytics techniques that support near real-time situational awareness. Social media systems are credited with escalating social protest during recent large scale riots. Virtual communities form rapidly in these online systems, and they occasionally foster violence and unrest which is conveyed in the users' language. Techniques for analyzing broad trends over these networks or reconstructing conversations within small groups have been demonstrated in recent years, but state-of-the-art tools are inadequate at supporting near real-time analysis of these high throughput streams of unstructured information. In this paper, we present an adaptive system to discover and interactively explore these virtual networks, as well as detect sentiment, highlight change, and discover spatio-temporal patterns.

  15. Writing strategy for a high-throughput SCALPEL system

    NASA Astrophysics Data System (ADS)

    Stanton, Stuart T.; Liddle, James A.; Felker, Joseph A.; Waskiewicz, Warren K.; Harriott, Lloyd R.

    1999-06-01

    Successful deployment of SCALPEL for several post-optical production lithography generations requires a unique optimum writing-strategy. Since the electron optics sub-field and the strutted mask pattern segment are both smaller than the final device image area, SCALPEL utilizes a stitching approach to image-formation. A dynamic sub-field placement scheme, or 'writing strategy', must provide precise 2D stitching at high speed, and eliminate mask strut images on the wafer. It should also provide the extended dynamic lens field necessary for good throughput, while minimizing all non-exposure times per wafer and maintaining the time-averaged current near the instantaneous space-charge limit. The preferred writing-strategy replaces mechanical stage acceleration events with beam deflection wherever possible. The unique writing-strategy presented here also generates the required 2D seam-blending dose-profiles, which are vital to robust CD control with stitching.

  16. High throughput sequencing reveals a novel fabavirus infecting sweet cherry.

    PubMed

    Villamor, D E V; Pillai, S S; Eastwell, K C

    2017-03-01

    The genus Fabavirus currently consists of five species represented by viruses that infect a wide range of hosts but none reported from temperate climate fruit trees. A virus with genomic features resembling fabaviruses (tentatively named Prunus virus F, PrVF) was revealed by high throughput sequencing of extracts from a sweet cherry tree (Prunus avium). PrVF was subsequently shown to be graft transmissible and further identified in three other non-symptomatic Prunus spp. from different geographical locations. Two genetic variants of RNA1 and RNA2 coexisted in the same samples. RNA1 consisted of 6,165 and 6,163 nucleotides, and RNA2 consisted of 3,622 and 3,468 nucleotides.

  17. High-throughput sequencing in veterinary infection biology and diagnostics.

    PubMed

    Belák, S; Karlsson, O E; Leijon, M; Granberg, F

    2013-12-01

    Sequencing methods have improved rapidly since the first versions of the Sanger techniques, facilitating the development of very powerful tools for detecting and identifying various pathogens, such as viruses, bacteria and other microbes. The ongoing development of high-throughput sequencing (HTS; also known as next-generation sequencing) technologies has resulted in a dramatic reduction in DNA sequencing costs, making the technology more accessible to the average laboratory. In this White Paper of the World Organisation for Animal Health (OIE) Collaborating Centre for the Biotechnology-based Diagnosis of Infectious Diseases in Veterinary Medicine (Uppsala, Sweden), several approaches and examples of HTS are summarised, and their diagnostic applicability is briefly discussed. Selected future aspects of HTS are outlined, including the need for bioinformatic resources, with a focus on improving the diagnosis and control of infectious diseases in veterinary medicine.

  18. Resolving postglacial phylogeography using high-throughput sequencing

    PubMed Central

    Emerson, Kevin J.; Merz, Clayton R.; Catchen, Julian M.; Hohenlohe, Paul A.; Cresko, William A.; Bradshaw, William E.; Holzapfel, Christina M.

    2010-01-01

    The distinction between model and nonmodel organisms is becoming increasingly blurred. High-throughput, second-generation sequencing approaches are being applied to organisms based on their interesting ecological, physiological, developmental, or evolutionary properties and not on the depth of genetic information available for them. Here, we illustrate this point using a low-cost, efficient technique to determine the fine-scale phylogenetic relationships among recently diverged populations in a species. This application of restriction site-associated DNA tags (RAD tags) reveals previously unresolved genetic structure and direction of evolution in the pitcher plant mosquito, Wyeomyia smithii, from a southern Appalachian Mountain refugium following recession of the Laurentide Ice Sheet at 22,000–19,000 B.P. The RAD tag method can be used to identify detailed patterns of phylogeography in any organism regardless of existing genomic data, and, more broadly, to identify incipient speciation and genome-wide variation in natural populations in general. PMID:20798348

  19. High-throughput electronic biology: mining information for drug discovery.

    PubMed

    Loging, William; Harland, Lee; Williams-Jones, Bryn

    2007-03-01

    The vast range of in silico resources that are available in life sciences research hold much promise towards aiding the drug discovery process. To fully realize this opportunity, computational scientists must consider the practical issues of data integration and identify how best to apply these resources scientifically. In this article we describe in silico approaches that are driven towards the identification of testable laboratory hypotheses; we also address common challenges in the field. We focus on flexible, high-throughput techniques, which may be initiated independently of 'wet-lab' experimentation, and which may be applied to multiple disease areas. The utility of these approaches in drug discovery highlights the contribution that in silico techniques can make and emphasizes the need for collaboration between the areas of disease research and computational science.

  20. A Colloidal Stability Assay Suitable for High-Throughput Screening.

    PubMed

    Abarca, Carla; Ali, M Monsur; Yang, Songtao; Dong, Xiaofei; Pelton, Robert H

    2016-03-01

    A library of 32 polystyrene copolymer latexes, with diameters ranging between 53 and 387 nm, was used to develop and demonstrate a high-throughput assay using a 96-well microplate platform to measure critical coagulation concentrations, a measure of colloidal stability. The most robust assay involved an automated centrifugation-decantation step to remove latex aggregates before absorbance measurements, eliminating aggregate interference with optical measurements made through the base of the multiwell plates. For smaller nanoparticles (diameter <150 nm), the centrifugation-decantation step was not required as the interference was less than with larger particles. Parallel measurements with a ChemiDoc MP plate scanner gave indications of aggregation; however, the results were less sensitive than the absorbance measurements.

  1. High throughput full Stokes Fourier transform imaging spectropolarimetry.

    PubMed

    Meng, Xin; Li, Jianxin; Xu, Tingting; Liu, Defang; Zhu, Rihong

    2013-12-30

    A complete full Stokes imaging spectropolarimeter is proposed. Four separate polarized spectra are fed into the slitless Sagnac Fourier transform spectrometer using different angle combinations of the polarizing elements. The four polarized spectra are separated without spatial aliasing, and the system's high light throughput gives it good resistance to instrument noise. The mathematical model for the approach is derived and an optimization of the retardance is discussed. To acquire the four spectra simultaneously, an improved robust polarization modulator using aperture division is outlined. The system is then discussed in detail, including the imaging principle and spectral resolution. Lastly, two validation experiments are described and the experimental results in visible light are presented.
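For orientation, full Stokes recovery from four polarization-analyzed intensities can be sketched as below. This uses a textbook measurement set (linear 0°, 90°, 45°, and right-circular analyzers), not the specific angle combinations of the instrument described:

```python
import numpy as np

def stokes_from_intensities(i_0, i_90, i_45, i_rcp):
    """Recover the Stokes vector [S0, S1, S2, S3] from intensities measured
    behind linear 0 deg, linear 90 deg, linear 45 deg, and right-circular
    analyzers (a standard four-measurement scheme)."""
    s0 = i_0 + i_90          # total intensity
    s1 = i_0 - i_90          # horizontal vs vertical linear polarization
    s2 = 2.0 * i_45 - s0     # +45 vs -45 linear polarization
    s3 = 2.0 * i_rcp - s0    # right vs left circular polarization
    return np.array([s0, s1, s2, s3])

# Fully horizontally polarized light: expect S = [1, 1, 0, 0].
s = stokes_from_intensities(1.0, 0.0, 0.5, 0.5)
print(s)   # [1. 1. 0. 0.]
```

An imaging spectropolarimeter repeats this recovery per pixel and per wavelength; the paper's contribution is obtaining all four spectra in a single high-throughput interferometric acquisition.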

  2. Noise and non-linearities in high-throughput data

    NASA Astrophysics Data System (ADS)

    Nguyen, Viet-Anh; Koukolíková-Nicola, Zdena; Bagnoli, Franco; Lió, Pietro

    2009-01-01

    High-throughput data analyses are becoming common in biology, communications, economics and sociology. The vast amounts of data are usually represented in the form of matrices and can be considered as knowledge networks. Spectra-based approaches have proved useful in extracting hidden information within such networks and for estimating missing data, but these methods are based essentially on linear assumptions. The physical models of matching, when applicable, often suggest non-linear mechanisms that may sometimes be mistaken for noise. The use of non-linear models in data analysis, however, may require the introduction of many parameters, which lowers the statistical weight of the model. Depending on the quality of the data, a simpler linear analysis may be more convenient than more complex approaches. In this paper, we show how a simple non-parametric Bayesian model may be used to explore the role of non-linearities and noise in synthetic and experimental data sets.
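The core point, that a linear analysis folds nonlinear structure into its apparent noise, can be shown with a toy example. A moving-average smoother stands in here for the paper's non-parametric Bayesian model; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)  # nonlinear signal + noise

# Linear fit: the nonlinearity ends up in the residuals, inflating apparent "noise".
slope, intercept = np.polyfit(x, y, 1)
lin_resid = np.std(y - (slope * x + intercept))

# Simple non-parametric alternative: a moving-average smoother tracks the
# nonlinear trend, so its residuals reflect the true noise level.
k = 15
smooth = np.convolve(y, np.ones(k) / k, mode="same")
np_resid = np.std(y - smooth)

print(f"linear residual std {lin_resid:.2f}, smoother residual std {np_resid:.2f}")
```

The linear model's residual spread is several times larger than the smoother's, even though the injected noise is the same, illustrating why model choice matters when separating non-linearities from noise.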

  3. High throughput fingerprint analysis of large-insert clones.

    PubMed

    Marra, M A; Kucaba, T A; Dietrich, N L; Green, E D; Brownstein, B; Wilson, R K; McDonald, K M; Hillier, L W; McPherson, J D; Waterston, R H

    1997-11-01

    As part of the Human Genome Project, the Washington University Genome Sequencing Center has commenced systematic sequencing of human chromosome 7. To organize and supply the effort, we have undertaken the construction of sequence-ready physical maps for defined chromosomal intervals. Map construction is a serial process composed of three main activities. First, candidate STS-positive large-insert PAC and BAC clones are identified. Next, these candidate clones are subjected to fingerprint analysis. Finally, the fingerprint data are used to assemble sequence-ready maps. The fingerprinting method we have devised is key to the success of the overall approach. We present here the details of the method and show that the fingerprints are of sufficient quality to permit the construction of megabase-size contigs in defined regions of the human genome. We anticipate that the high throughput and precision characteristic of our fingerprinting method will make it of general utility.
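Fingerprint-based overlap detection can be caricatured as band sharing within a size tolerance. The sketch below is a simplified illustration with invented band sizes and a naive greedy matcher, not the contig-assembly software used in the paper:

```python
def shared_bands(bands_a, bands_b, tol=2):
    """Count fragment sizes shared between two clone fingerprints,
    matching greedily within a size tolerance (arbitrary gel units)."""
    remaining = sorted(bands_b)
    shared = 0
    for size in sorted(bands_a):
        for i, other in enumerate(remaining):
            if abs(size - other) <= tol:
                shared += 1
                del remaining[i]    # each band may be matched only once
                break
    return shared

def overlap_score(bands_a, bands_b, tol=2):
    """Fraction of the smaller fingerprint's bands matched in the other."""
    return shared_bands(bands_a, bands_b, tol) / min(len(bands_a), len(bands_b))

clone1 = [120, 340, 515, 760, 980, 1430]
clone2 = [119, 341, 514, 2050, 2600]        # overlaps clone1 in 3 bands
print(overlap_score(clone1, clone2))         # 0.6
```

Clone pairs whose score exceeds a statistical threshold are treated as overlapping, and chains of such overlaps are assembled into contigs.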

  4. UAV-based high-throughput phenotyping in legume crops

    NASA Astrophysics Data System (ADS)

    Sankaran, Sindhuja; Khot, Lav R.; Quirós, Juan; Vandemark, George J.; McGee, Rebecca J.

    2016-05-01

    In plant breeding, one of the biggest obstacles in genetic improvement is the lack of proven rapid methods for measuring plant responses in field conditions. Therefore, the major objective of this research was to evaluate the feasibility of utilizing high-throughput remote sensing technology for rapid measurement of phenotyping traits in legume crops. The plant responses of several chickpea and pea varieties to the environment were assessed with an unmanned aerial vehicle (UAV) integrated with multispectral imaging sensors. Our preliminary assessment showed that the vegetation indices are strongly correlated (p<0.05) with seed yield of legume crops. Results endorse the potential of UAS-based sensing technology to rapidly measure those phenotyping traits.
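A typical vegetation index computed from such multispectral imagery is NDVI. The sketch below uses invented per-plot reflectances and yields to illustrate the correlation analysis; the specific indices and values in the study may differ:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# Hypothetical per-plot mean reflectances and seed yields (illustrative numbers).
nir = np.array([0.55, 0.48, 0.61, 0.40, 0.52])
red = np.array([0.08, 0.12, 0.06, 0.18, 0.10])
yield_kg = np.array([2.9, 2.3, 3.3, 1.6, 2.6])

vi = ndvi(nir, red)
r = np.corrcoef(vi, yield_kg)[0, 1]
print(f"NDVI-yield correlation r = {r:.2f}")
```

In a breeding trial this correlation is what lets a UAV flight stand in for laborious plot-by-plot yield phenotyping.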

  5. Proposed high throughput electrorefining treatment for spent N- Reactor fuel

    SciTech Connect

    Gay, E.C.; Miller, W.E.; Laidler, J.J.

    1996-05-01

    A high-throughput electrorefining process is being adapted to treat spent N-Reactor fuel for ultimate disposal in a geologic repository. Anodic dissolution tests were made with unirradiated N-Reactor fuel to determine the type of fragmentation necessary to provide fuel segments suitable for this process. Based on these tests, a conceptual design was produced of a plant-scale electrorefiner. In this design, the diameter of an electrode assembly is about 1.07 m (42 in.). Three of these assemblies in an electrorefiner would accommodate a 3-metric-ton batch of N-Reactor fuel that would be processed at a rate of 42 kg of uranium per hour.

  6. Muscle plasticity and high throughput gene expression studies.

    PubMed

    Reggiani, Carlo; Kronnie, Geertruuy Te

    2004-01-01

    Changes in gene expression are known to contribute to muscle plasticity. Until recently, most studies described differences in one or a few genes at a time; in the last few years, however, the development of new high-throughput mRNA expression analysis technology has allowed the study of a large part, if not all, of the transcripts in a single experiment. Knowledge of muscle adaptive responses has already been gained from the application of this novel approach, but the most important new findings have come from studies on muscle atrophy. A new and unexpected group of genes, which increase their expression during atrophy and are therefore designated atrogins, has been discovered. In spite of the impressive power of the new technology, many problems remain to be resolved to optimize the experimental design and to extract all the information provided by the outcome of global mRNA assessment.

  7. High-throughput ab-initio dilute solute diffusion database

    PubMed Central

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-01-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world. PMID:27434308
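The quantities involved can be sketched as follows: dilute-solute diffusivity follows the Arrhenius form D = D0·exp(−Q/kBT), and database accuracy is summarized by a weighted RMS error over activation barriers. The barrier values and weights below are invented for illustration:

```python
import math

K_B = 8.617333e-5  # Boltzmann constant in eV/K

def diffusivity(d0, q_ev, temp_k):
    """Arrhenius dilute-solute diffusivity D = D0 * exp(-Q / (kB * T))."""
    return d0 * math.exp(-q_ev / (K_B * temp_k))

# Hypothetical DFT-calculated vs experimental activation barriers (eV),
# with per-system weights (e.g. reflecting experimental reliability).
calc = [1.35, 2.10, 0.95, 1.60]
expt = [1.20, 2.25, 1.05, 1.55]
w    = [1.0,  2.0,  1.0,  1.0]

wrmse = math.sqrt(sum(wi * (c - e) ** 2 for wi, c, e in zip(w, calc, expt))
                  / sum(w))
print(f"weighted barrier RMS error: {wrmse:.3f} eV")
```

Because the barrier sits in the exponent, even a ~0.1-0.2 eV error changes predicted diffusivities by factors of several at typical processing temperatures, which is why the paper reports accuracy at the barrier level.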

  8. Microfluidic cell chips for high-throughput drug screening.

    PubMed

    Chi, Chun-Wei; Ahmed, Ah Rezwanuddin; Dereli-Korkut, Zeynep; Wang, Sihong

    2016-05-01

    The current state of screening methods for drug discovery is still riddled with several inefficiencies. Although some widely used high-throughput screening platforms may enhance the drug screening process, their cost and oversimplification of cell-drug interactions pose a translational difficulty. Microfluidic cell-chips resolve many issues found in conventional HTS technology, providing benefits such as reduced sample quantity and integration of 3D cell culture physically more representative of the physiological/pathological microenvironment. In this review, we introduce the advantages of microfluidic devices in drug screening, and outline the critical factors which influence device design, highlighting recent innovations and advances in the field including a summary of commercialization efforts on microfluidic cell chips. Future perspectives of microfluidic cell devices are also provided based on considerations of present technological limitations and translational barriers.

  9. Macromolecular Crystallography conventional and high-throughput methods

    SciTech Connect

    Wasserman, Stephen R.; Smith, David W.; D'Amico, Kevin L.; Koss, John W.; Morisco, Laura L.; Burley, Stephen K.

    2007-09-27

    High-throughput data collection requires the seamless interoperation of various hardware components. User-supplied descriptions of protein crystals must also be directly linked with the diffraction data. Such linkages can be achieved efficiently with computer databases. A database that tracks production of the protein samples, crystallization, and diffraction from the resultant crystals serves as the glue that holds the entire gene-to-structure process together. This chapter begins by discussing data collection processes and hardware. It then illustrates how a well-constructed database ensures information flow through the steps of data acquisition. Such a database allows synchrotron beamline measurements to be directly and efficiently integrated into the process of protein crystallographic structure determination.

  10. Automated sample area definition for high-throughput microscopy.

    PubMed

    Zeder, M; Ellrott, A; Amann, R

    2011-04-01

    High-throughput screening platforms based on epifluorescence microscopy are powerful tools in a variety of scientific fields. Although some applications are based on imaging geometrically defined samples such as microtiter plates, multiwell slides, or spotted gene arrays, others need to cope with inhomogeneously located samples on glass slides. The analysis of microbial communities in aquatic systems by sample filtration on membrane filters followed by multiple fluorescent staining, or the investigation of tissue sections are examples. Therefore, we developed a strategy for flexible and fast definition of sample locations by the acquisition of whole slide overview images and automated sample recognition by image analysis. Our approach was tested on different microscopes and the computer programs are freely available (http://www.technobiology.ch). Copyright © 2011 International Society for Advancement of Cytometry.
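Automated sample recognition from an overview image can be reduced, in the simplest case, to intensity thresholding and a bounding box. The sketch below is a minimal stand-in for the published image-analysis programs, using a synthetic overview image:

```python
import numpy as np

def sample_bounding_box(overview, thresh):
    """Bounding box (row0, row1, col0, col1) of pixels brighter than thresh --
    a minimal stand-in for overview-image sample recognition."""
    rows, cols = np.nonzero(overview > thresh)
    if rows.size == 0:
        return None          # no sample found on this slide
    return int(rows.min()), int(rows.max()), int(cols.min()), int(cols.max())

# Synthetic overview: dark slide with one bright filter-section "sample".
img = np.zeros((100, 120))
img[30:70, 40:90] = 0.8
img += 0.05 * np.random.default_rng(1).random(img.shape)  # faint background noise

box = sample_bounding_box(img, thresh=0.5)
print(box)   # (30, 69, 40, 89)
```

The microscope stage then only visits positions inside detected regions, which is what makes screening of inhomogeneously located samples tractable at high throughput.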

  11. Characterizing immune repertoires by high throughput sequencing: strategies and applications

    PubMed Central

    Calis, Jorg J.A.; Rosenberg, Brad R.

    2014-01-01

    As the key cellular effectors of adaptive immunity, T and B lymphocytes utilize specialized receptors to recognize, respond to, and neutralize a diverse array of extrinsic threats. These receptors (immunoglobulins in B lymphocytes, T cell receptors in T lymphocytes) are incredibly variable, the products of specialized genetic diversification mechanisms that generate complex lymphocyte repertoires with extensive collections of antigen specificities. Recent advances in high throughput sequencing (HTS) technologies have transformed our ability to examine antigen receptor repertoires at single nucleotide, and more recently, single cell, resolution. Here we review current approaches to examining antigen receptor repertoires by HTS, and discuss inherent biological and technical challenges. We further describe emerging applications of this powerful methodology for exploring the adaptive immune system. PMID:25306219
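A common repertoire-level summary computed from such HTS data is clonotype diversity; for example, Shannon entropy distinguishes a clonally expanded repertoire from an evenly distributed one. The clonotype counts below are invented for illustration:

```python
import math

def shannon_diversity(clonotype_counts):
    """Shannon entropy (in nats) of a clonotype abundance distribution --
    a standard summary of antigen receptor repertoire diversity."""
    total = sum(clonotype_counts)
    return -sum((c / total) * math.log(c / total)
                for c in clonotype_counts if c > 0)

# Hypothetical repertoires: one dominated by an expanded clone, one even.
expanded = [900, 50, 25, 15, 10]
even = [200, 200, 200, 200, 200]
print(shannon_diversity(expanded) < shannon_diversity(even))  # True
```

Entropy is maximal (log of the clonotype count) for a perfectly even repertoire, so a drop in entropy flags clonal expansion, for instance after infection or vaccination.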

  12. Quantitative High-throughput Luciferase Screening in Identifying CAR Modulators

    PubMed Central

    Lynch, Caitlin; Zhao, Jinghua; Wang, Hongbing; Xia, Menghang

    2017-01-01

    The constitutive androstane receptor (CAR, NR1I3) is responsible for the transcription of multiple drug metabolizing enzymes and transporters. There are two possible methods of activation for CAR, direct ligand binding and a ligand-independent method, which makes this a unique nuclear receptor. Both of these mechanisms require translocation of CAR from the cytoplasm into the nucleus. Interestingly, CAR is constitutively active in immortalized cell lines due to the basal nuclear location of this receptor. This creates an important challenge in most in vitro assay models because immortalized cells cannot be used without inhibiting the basal activity. In this book chapter, we describe in detail how to perform quantitative high-throughput screens to identify hCAR1 modulators through the employment of a double stable cell line. Using this line, we are able to identify activators, as well as deactivators, of the challenging nuclear receptor, CAR. PMID:27518621

  13. High-Throughput Automation in Chemical Process Development.

    PubMed

    Selekman, Joshua A; Qiu, Jun; Tran, Kristy; Stevens, Jason; Rosso, Victor; Simmons, Eric; Xiao, Yi; Janey, Jacob

    2017-06-07

    High-throughput (HT) techniques built upon laboratory automation technology and coupled to statistical experimental design and parallel experimentation have enabled the acceleration of chemical process development across multiple industries. HT technologies are often applied to interrogate wide, often multidimensional experimental spaces to inform the design and optimization of any number of unit operations that chemical engineers use in process development. In this review, we outline the evolution of HT technology and provide a comprehensive overview of how HT automation is used throughout different industries, with a particular focus on chemical and pharmaceutical process development. In addition, we highlight the common strategies of how HT automation is incorporated into routine development activities to maximize its impact in various academic and industrial settings.

  14. High-throughput process development for biopharmaceutical drug substances.

    PubMed

    Bhambure, Rahul; Kumar, Kaushal; Rathore, Anurag S

    2011-03-01

    Quality by Design (QbD) is gaining industry acceptance as an approach towards development and commercialization of biotechnology therapeutic products that are expressed via microbial or mammalian cell lines. In QbD, the process is designed and controlled to deliver specified quality attributes consistently. To acquire the enhanced understanding that is necessary to achieve the above, however, requires more extensive experimentation to establish the design space for the process and the product. With biotechnology companies operating under ever-increasing pressure towards lowering the cost of manufacturing, the use of high-throughput tools has emerged as a necessary enabler of QbD in a time- and resource-constrained environment. We review this topic for those in academia and industry that are engaged in drug substance process development. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Native mass spectrometry: towards high-throughput structural proteomics.

    PubMed

    Kondrat, Frances D L; Struwe, Weston B; Benesch, Justin L P

    2015-01-01

    Native mass spectrometry (MS) has become a sensitive method for structural proteomics, allowing practitioners to gain insight into protein self-assembly, including stoichiometry and three-dimensional architecture, as well as complementary thermodynamic and kinetic aspects. Although MS is typically performed in vacuum, a body of literature has described how native solution-state structure is largely retained on the timescale of the experiment. Native MS offers the benefit that it requires substantially smaller quantities of a sample than traditional structural techniques such as NMR and X-ray crystallography, and is therefore well suited to high-throughput studies. Here we first describe the native MS approach and outline the structural proteomic data that it can deliver. We then provide practical details of experiments to examine the structural and dynamic properties of protein assemblies, highlighting potential pitfalls as well as principles of best practice.

  16. EDITORIAL: Combinatorial and High-Throughput Materials Research

    NASA Astrophysics Data System (ADS)

    Potyrailo, Radislav A.; Takeuchi, Ichiro

    2005-01-01

    The success of combinatorial and high-throughput methodologies relies greatly on the availability of various characterization tools with new and improved capabilities [1]. Indeed, how useful can a combinatorial library of 250, 400, 25 000 or 2 000 000 compounds be [2-5] if one is unable to characterize its properties of interest fairly quickly? How useful can a set of thousands of spectra or chromatograms be if one is unable to analyse them in a timely manner? For these reasons, the development of new approaches for materials characterization is one of the most active areas in combinatorial materials science. The importance of this aspect of research in the field has been discussed in numerous conferences including the Pittsburgh Conferences, the American Chemical Society Meetings, the American Physical Society Meetings, the Materials Research Society Symposia and various Gordon Research Conferences. Naturally, the development of new measurement instrumentation attracts the attention not only of practitioners of combinatorial materials science but also of those who design new software for data manipulation and mining. Experimental designs of combinatorial libraries are pursued with available and realistic synthetic and characterization capabilities in mind. It is becoming increasingly critical to link the design of new equipment for high-throughput parallel materials synthesis with integrated measurement tools in order to enhance the efficacy of the overall experimental strategy. We have received an overwhelming response to our proposal and call for papers for this Special Issue on Combinatorial Materials Science. The papers in this issue of Measurement Science and Technology are a very timely collection that captures the state of modern combinatorial materials science. They demonstrate the significant advances that are taking place in the field. In some cases, characterization tools are now being operated in the factory mode. At the same time, major challenges

  17. High-throughput ab-initio dilute solute diffusion database

    NASA Astrophysics Data System (ADS)

    Wu, Henry; Mayeshiba, Tam; Morgan, Dane

    2016-07-01

    We demonstrate automated generation of diffusion databases from high-throughput density functional theory (DFT) calculations. A total of more than 230 dilute solute diffusion systems in Mg, Al, Cu, Ni, Pd, and Pt host lattices have been determined using multi-frequency diffusion models. We apply a correction method for solute diffusion in alloys using experimental and simulated values of host self-diffusivity. We find good agreement with experimental solute diffusion data, obtaining a weighted activation barrier RMS error of 0.176 eV when excluding magnetic solutes in non-magnetic alloys. The compiled database is the largest collection of consistently calculated ab-initio solute diffusion data in the world.
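The weighted activation-barrier RMS error quoted above is a standard figure of merit for comparing calculated and experimental barriers. A minimal sketch of how such a value is computed (all numbers below are invented for illustration, not taken from the database):

```python
import math

# Hypothetical paired (DFT, experimental) activation barriers in eV,
# with per-system weights (values are illustrative only).
dft  = [1.32, 0.95, 2.10, 1.75]
expt = [1.25, 1.05, 2.02, 1.90]
w    = [1.0, 1.0, 0.5, 1.0]

def weighted_rmse(pred, ref, weights):
    """Weighted root-mean-square error: sqrt(sum(w*(p-r)^2) / sum(w))."""
    num = sum(wi * (p - r) ** 2 for wi, p, r in zip(weights, pred, ref))
    return math.sqrt(num / sum(weights))

print(round(weighted_rmse(dft, expt, w), 3))  # -> 0.108
```

Down-weighting or excluding problematic classes (e.g. magnetic solutes in non-magnetic hosts, as the abstract notes) changes only the weight vector, not the formula.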

  18. Exploring proteome-wide occurrence of clusters of charged residues in eukaryotes.

    PubMed

    Belmabrouk, Sabrine; Kharrat, Najla; Benmarzoug, Riadh; Rebai, Ahmed

    2015-07-01

    Clusters of charged residues are a key feature of protein primary structure, since they have been associated with important functions of proteins. Here, we present a proteome-wide scan for the occurrence of Charge Clusters in Protein sequences using a new score-based search tool (FCCP). The FCCP was run to search charge clusters in seven eukaryotic proteomes: Arabidopsis thaliana, Caenorhabditis elegans, Danio rerio, Drosophila melanogaster, Homo sapiens, Mus musculus, and Saccharomyces cerevisiae. We found that negative charge clusters (NCCs) are three to four times more frequent than positive charge clusters (PCCs). The Drosophila proteome is on average the most charged, whereas the human proteome is the least charged. Only 3 to 8% of the studied protein sequences have negative charge clusters, while 1.6 to 3% have PCCs and only 0.07 to 0.6% have both types of clusters. NCCs are localized predominantly in the N-terminal and C-terminal domains, while PCCs tend to be localized within the functional domains of the protein sequences. Furthermore, the gene ontology classification revealed that the protein sequences with NCCs and PCCs are mainly binding proteins.
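As a simplified stand-in for a score-based charge-cluster scan (not the actual FCCP algorithm), a sliding window over the sequence can flag regions whose net charge exceeds a threshold:

```python
# Simplified sliding-window scan for charge clusters in a protein sequence.
# This is an illustrative stand-in, not the FCCP scoring method itself.
CHARGE = {"D": -1, "E": -1, "K": 1, "R": 1}  # histidine treated as neutral here

def charge_windows(seq, win=10, threshold=5):
    """Return (start, net_charge) for each window with |net charge| >= threshold."""
    hits = []
    for i in range(len(seq) - win + 1):
        net = sum(CHARGE.get(aa, 0) for aa in seq[i:i + win])
        if abs(net) >= threshold:
            hits.append((i, net))
    return hits

# A made-up sequence with an acidic stretch followed by a basic stretch.
seq = "MKTAEDEEDDEDLKRKKRRKSAAGGSA"
for start, net in charge_windows(seq):
    print(start, net)
```

Overlapping hits of the same sign would then be merged into one NCC or PCC; a score-based method additionally weighs cluster length and density rather than using a hard window cutoff.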

  19. Proteome-wide identification of ubiquitylation sites by conjugation of engineered lysine-less ubiquitin.

    PubMed

    Oshikawa, Kiyotaka; Matsumoto, Masaki; Oyamada, Koji; Nakayama, Keiichi I

    2012-02-03

    Ubiquitin conjugation (ubiquitylation) plays important roles not only in protein degradation but also in many other cellular functions. However, the sites of proteins that are targeted for such modification have remained poorly characterized at the proteomic level. We have now developed a method for the efficient identification of ubiquitylation sites in target proteins with the use of an engineered form of ubiquitin (K0-Ub), in which all seven lysine residues are replaced with arginine. K0-Ub is covalently attached to lysine residues of target proteins via an isopeptide bond, but further formation of a polyubiquitin chain does not occur on K0-Ub. We identified a total of 1392 ubiquitylation sites of 794 proteins from HEK293T cells. Profiling of ubiquitylation sites indicated that the sequences surrounding lysine residues targeted for ubiquitin conjugation do not share a common motif or structural feature. Furthermore, we identified a critical ubiquitylation site of the cyclin-dependent kinase inhibitor p27(Kip1). Mutation of this site thus inhibited ubiquitylation of and stabilized p27(Kip1), suggesting that this lysine residue is the target site of p27(Kip1) for ubiquitin conjugation in vivo. In conclusion, our method based on K0-Ub is a powerful tool for proteome-wide identification of ubiquitylation sites of target proteins.

  20. Critical controllability in proteome-wide protein interaction network integrating transcriptome

    NASA Astrophysics Data System (ADS)

    Ishitsuka, Masayuki; Akutsu, Tatsuya; Nacher, Jose C.

    2016-04-01

    Recently, the number of essential gene entries has considerably increased. However, little is known about the relationships between essential genes and their functional roles in critical network control at both the structural (protein interaction network) and dynamic (transcriptional) levels, in part because the large size of the network prevents extensive computational analysis. Here, we present an algorithm that identifies the critical control set of nodes, reducing the computational time by a factor of 180 and expanding the computable network size up to 25-fold, from 1,000 to 25,000 nodes. The developed algorithm allows a critical controllability analysis of large integrated systems composed of a transcriptome- and proteome-wide protein interaction network for the first time. The data-driven analysis captures a direct triad association of the structural controllability of genes, lethality and dynamic synchronization of co-expression. We believe that the identified optimized critical network control subsets may be of interest as drug targets; thus, they may be useful for drug design and development.

  1. Proteome-wide identification of predominant subcellular protein localizations in a bacterial model organism

    SciTech Connect

    Stekhoven, Daniel J.; Omasits, Ulrich; Quebatte, Maxime; Dehio, Christoph; Ahrens, Christian H.

    2014-03-01

    Proteomics data provide unique insights into biological systems, including the predominant subcellular localization (SCL) of proteins, which can reveal important clues about their functions. Here we analyzed data of a complete prokaryotic proteome expressed under two conditions mimicking interaction of the emerging pathogen Bartonella henselae with its mammalian host. Normalized spectral count data from cytoplasmic, total membrane, inner and outer membrane fractions allowed us to identify the predominant SCL for 82% of the identified proteins. The spectral count proportion of total membrane versus cytoplasmic fractions indicated the propensity of cytoplasmic proteins to co-fractionate with the inner membrane, and enabled us to distinguish cytoplasmic, peripheral inner membrane and bona fide inner membrane proteins. Principal component analysis and k-nearest neighbor classification, trained on selected marker proteins or predominantly localized proteins, allowed us to determine an extensive catalog of at least 74 expressed outer membrane proteins, and to extend the SCL assignment to 94% of the identified proteins, including 18% where in silico methods gave no prediction. Suitable experimental proteomics data combined with straightforward computational approaches can thus identify the predominant SCL on a proteome-wide scale. Finally, we present a conceptual approach to identify proteins potentially changing their SCL in a condition-dependent fashion.
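The k-nearest-neighbor step can be illustrated with normalized spectral-count profiles across the four fractions. The marker proteins and all values below are hypothetical:

```python
import math
from collections import Counter

# Hypothetical normalized spectral-count profiles across four fractions:
# (cytoplasmic, total membrane, inner membrane, outer membrane).
# Marker proteins with known localization (values are illustrative).
markers = [
    ((0.80, 0.10, 0.05, 0.05), "cytoplasmic"),
    ((0.70, 0.20, 0.05, 0.05), "cytoplasmic"),
    ((0.10, 0.30, 0.55, 0.05), "inner membrane"),
    ((0.05, 0.35, 0.55, 0.05), "inner membrane"),
    ((0.05, 0.30, 0.05, 0.60), "outer membrane"),
    ((0.10, 0.25, 0.05, 0.60), "outer membrane"),
]

def knn_predict(profile, training, k=3):
    """Assign the majority label among the k nearest marker profiles."""
    dists = sorted((math.dist(profile, p), label) for p, label in training)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# An uncharacterized protein enriched in the inner-membrane fraction:
print(knn_predict((0.08, 0.32, 0.50, 0.10), markers))  # -> inner membrane
```

In the study, training on marker proteins in this fashion is what extends SCL assignments beyond the proteins whose predominant fraction is already unambiguous.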

  2. Critical controllability in proteome-wide protein interaction network integrating transcriptome

    PubMed Central

    Ishitsuka, Masayuki; Akutsu, Tatsuya; Nacher, Jose C.

    2016-01-01

    Recently, the number of essential gene entries has considerably increased. However, little is known about the relationships between essential genes and their functional roles in critical network control at both the structural (protein interaction network) and dynamic (transcriptional) levels, in part because the large size of the network prevents extensive computational analysis. Here, we present an algorithm that identifies the critical control set of nodes by reducing the computational time by 180 times and by expanding the computable network size up to 25 times, from 1,000 to 25,000 nodes. The developed algorithm allows a critical controllability analysis of large integrated systems composed of a transcriptome- and proteome-wide protein interaction network for the first time. The data-driven analysis captures a direct triad association of the structural controllability of genes, lethality and dynamic synchronization of co-expression. We believe that the identified optimized critical network control subsets may be of interest as drug targets; thus, they may be useful for drug design and development. PMID:27040162

  3. High-throughput DNA extraction of forensic adhesive tapes.

    PubMed

    Forsberg, Christina; Jansson, Linda; Ansell, Ricky; Hedman, Johannes

    2016-09-01

    Tape-lifting has since its introduction in the early 2000s become a well-established sampling method in forensic DNA analysis. Sampling is quick and straightforward while the following DNA extraction is more challenging due to the "stickiness", rigidity and size of the tape. We have developed, validated and implemented a simple and efficient direct lysis DNA extraction protocol for adhesive tapes that requires limited manual labour. The method uses Chelex beads and is applied with SceneSafe FAST tape. This direct lysis protocol provided higher mean DNA yields than PrepFiler Express BTA on Automate Express, although the differences were not significant when using clothes worn in a controlled fashion as reference material (p=0.13 and p=0.34 for T-shirts and button-down shirts, respectively). Through in-house validation we show that the method is fit-for-purpose for application in casework, as it provides high DNA yields and amplifiability, as well as good reproducibility and DNA extract stability. After implementation in casework, the proportion of extracts with DNA concentrations above 0.01 ng/μL increased from 71% to 76%. Apart from providing higher DNA yields compared with the previous method, the introduction of the developed direct lysis protocol also reduced the amount of manual labour by half and doubled the potential throughput for tapes at the laboratory. Generally, simplified manual protocols can serve as a cost-effective alternative to sophisticated automation solutions when the aim is to enable high-throughput DNA extraction of complex crime scene samples.
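The before/after comparison of extract concentrations (71% vs. 76% above 0.01 ng/μL) is the kind of proportion change that can be checked with a two-proportion z-test. The sample sizes below are hypothetical, since the abstract reports only percentages:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # two-sided p-value from the standard normal CDF (via erf)
    pval = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, pval

# Hypothetical casework counts: 284/400 extracts above the 0.01 ng/uL
# threshold before the protocol change, 304/400 after (71% -> 76%).
z, pval = two_proportion_z(284, 400, 304, 400)
print(round(z, 2), round(pval, 3))
```

With a few hundred samples per group, a 5-percentage-point shift sits near the edge of significance, which is why casework implementations usually report the yield comparison alongside, not instead of, the proportion change.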

  4. Evaluation of sequencing approaches for high-throughput ...

    EPA Pesticide Factsheets

    Whole-genome in vitro transcriptomics has shown the capability to identify mechanisms of action and estimates of potency for chemical-mediated effects in a toxicological framework, but with limited throughput and high cost. We present the evaluation of three toxicogenomics platforms for potential application to high-throughput screening: 1. TempO-Seq utilizing custom-designed paired probes per gene; 2. Targeted sequencing (TSQ) utilizing Illumina’s TruSeq RNA Access Library Prep Kit containing tiled exon-specific probe sets; 3. Low coverage whole transcriptome sequencing (LSQ) using Illumina’s TruSeq Stranded mRNA Kit. Each platform was required to cover the ~20,000 genes of the full transcriptome, operate directly with cell lysates, and be automatable with 384-well plates. Technical reproducibility was assessed using MAQC control RNA samples A and B, while functional utility for chemical screening was evaluated using six treatments at a single concentration after 6 hr in MCF7 breast cancer cells: 10 µM chlorpromazine, 10 µM ciclopirox, 10 µM genistein, 100 nM sirolimus, 1 µM tanespimycin, and 1 µM trichostatin A. All RNA samples and chemical treatments were run with 5 technical replicates. The three platforms achieved different read depths, with the TempO-Seq having ~34M mapped reads per sample, while TSQ and LSQ averaged 20M and 11M aligned reads per sample, respectively. Inter-replicate correlation averaged ≥0.95 for raw log2 expression values i
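Inter-replicate correlation of log2 expression values, as reported above, reduces to a Pearson correlation on normalized counts. A minimal sketch with invented read counts for a handful of genes:

```python
import math

def log2cpm(counts, pseudo=1):
    """log2 counts-per-million with a pseudocount (a common normalization)."""
    total = sum(counts)
    return [math.log2((c / total) * 1e6 + pseudo) for c in counts]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Two hypothetical technical replicates (raw read counts per gene).
rep1 = [120, 3400, 560, 89, 10200, 75]
rep2 = [130, 3100, 610, 95, 9800, 60]
r = pearson(log2cpm(rep1), log2cpm(rep2))
print(round(r, 3))
```

Computed across all ~20,000 genes and every replicate pair, this per-pair r is what the ≥0.95 average in the abstract summarizes.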

  5. A bioimage informatics platform for high-throughput embryo phenotyping.

    PubMed

    Brown, James M; Horner, Neil R; Lawson, Thomas N; Fiegel, Tanja; Greenaway, Simon; Morgan, Hugh; Ring, Natalie; Santos, Luis; Sneddon, Duncan; Teboul, Lydia; Vibert, Jennifer; Yaikhom, Gagarine; Westerberg, Henrik; Mallon, Ann-Marie

    2016-10-14

    High-throughput phenotyping is a cornerstone of numerous functional genomics projects. In recent years, imaging screens have become increasingly important in understanding gene-phenotype relationships in studies of cells, tissues and whole organisms. Three-dimensional (3D) imaging has risen to prominence in the field of developmental biology for its ability to capture whole embryo morphology and gene expression, as exemplified by the International Mouse Phenotyping Consortium (IMPC). Large volumes of image data are being acquired by multiple institutions around the world that encompass a range of modalities, proprietary software and metadata. To facilitate robust downstream analysis, images and metadata must be standardized to account for these differences. As an open scientific enterprise, making the data readily accessible is essential so that members of biomedical and clinical research communities can study the images for themselves without the need for highly specialized software or technical expertise. In this article, we present a platform of software tools that facilitate the upload, analysis and dissemination of 3D images for the IMPC. Over 750 reconstructions from 80 embryonic lethal and subviable lines have been captured to date, all of which are openly accessible at mousephenotype.org. Although designed for the IMPC, all software is available under an open-source licence for others to use and develop further. Ongoing developments aim to increase throughput and improve the analysis and dissemination of image data. Furthermore, we aim to ensure that images are searchable so that users can locate relevant images associated with genes, phenotypes or human diseases of interest.

  7. High-throughput analysis of peptide binding modules

    PubMed Central

    Liu, Bernard A.; Engelmann, Brett; Nash, Piers D.

    2014-01-01

    Modular protein interaction domains that recognize linear peptide motifs are found in hundreds of proteins within the human genome. Some protein interaction domains such as SH2, 14-3-3, Chromo and Bromo domains serve to recognize post-translational modification of amino acids (such as phosphorylation, acetylation, methylation etc.) and translate these into discrete cellular responses. Other modules such as SH3 and PDZ domains recognize linear peptide epitopes and serve to organize protein complexes based on localization and regions of elevated concentration. In both cases, the ability to nucleate specific signaling complexes is in large part dependent on the selectivity of a given protein module for its cognate peptide ligand. High throughput analysis of peptide-binding domains by peptide or protein arrays, phage display, mass spectrometry or other HTP techniques provides new insight into the potential protein-protein interactions prescribed by individual or even whole families of modules. Systems level analyses have also promoted a deeper understanding of the underlying principles that govern selective protein-protein interactions and how selectivity evolves. Lastly, there is a growing appreciation for the limitations and potential pitfalls of high-throughput analysis of protein-peptide interactomes. This review will examine some of the common approaches utilized for large-scale studies of protein interaction domains and suggest a set of standards for the analysis and validation of datasets from large-scale studies of peptide-binding modules. We will also highlight how data from large-scale studies of modular interaction domain families can provide insight into systems level properties such as the linguistics of selective interactions. PMID:22610655

  8. A Robotic Platform for Quantitative High-Throughput Screening

    PubMed Central

    Michael, Sam; Auld, Douglas; Klumpp, Carleen; Jadhav, Ajit; Zheng, Wei; Thorne, Natasha; Austin, Christopher P.; Inglese, James

    2008-01-01

    High-throughput screening (HTS) is increasingly being adopted in academic institutions, where the decoupling of screening and drug development has led to unique challenges, as well as novel uses of instrumentation, assay formulations, and software tools. Advances in technology have made automated unattended screening in the 1,536-well plate format broadly accessible and have further facilitated the exploration of new technologies and approaches to screening. A case in point is our recently developed quantitative HTS (qHTS) paradigm, which tests each library compound at multiple concentrations to construct concentration-response curves (CRCs) generating a comprehensive data set for each assay. The practical implementation of qHTS for cell-based and biochemical assays across libraries of > 100,000 compounds (e.g., between 700,000 and 2,000,000 sample wells tested) requires maximal efficiency and miniaturization and the ability to easily accommodate many different assay formats and screening protocols. Here, we describe the design and utilization of a fully integrated and automated screening system for qHTS at the National Institutes of Health's Chemical Genomics Center. We report system productivity, reliability, and flexibility, as well as modifications made to increase throughput, add additional capabilities, and address limitations. The combination of this system and qHTS has led to the generation of over 6 million CRCs from > 120 assays in the last 3 years and is a technology that can be widely implemented to increase efficiency of screening and lead generation. PMID:19035846
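The concentration-response curves (CRCs) central to qHTS are typically summarized with a four-parameter logistic (Hill) model. The sketch below generates a hypothetical 7-point titration and recovers the EC50 by log-linear interpolation, a simplification of the full curve-fitting used in practice:

```python
import math

def hill(conc, bottom, top, ec50, n):
    """Four-parameter logistic (Hill) concentration-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n)

# A hypothetical 7-point titration (concentrations in uM), qHTS-style.
concs = [0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0]
resp = [hill(c, bottom=0.0, top=100.0, ec50=0.5, n=1.2) for c in concs]

def ec50_by_interpolation(concs, resp):
    """Estimate EC50 as the concentration where the curve crosses its
    half-maximum, interpolating linearly in log-concentration."""
    half = (min(resp) + max(resp)) / 2.0
    for (c1, r1), (c2, r2) in zip(zip(concs, resp), zip(concs[1:], resp[1:])):
        if (r1 - half) * (r2 - half) <= 0:
            f = (half - r1) / (r2 - r1)
            return 10 ** (math.log10(c1) + f * (math.log10(c2) - math.log10(c1)))
    return None  # curve never crosses half-maximum (inactive compound)

print(round(ec50_by_interpolation(concs, resp), 2))
```

In production qHTS pipelines the four parameters are fit by nonlinear least squares per compound, and curves are then classed by fit quality and efficacy; the interpolation above only illustrates the underlying quantity.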

  9. Emerging metrology for high-throughput nanomaterial genotoxicology.

    PubMed

    Nelson, Bryant C; Wright, Christa W; Ibuki, Yuko; Moreno-Villanueva, Maria; Karlsson, Hanna L; Hendriks, Giel; Sims, Christopher M; Singh, Neenu; Doak, Shareen H

    2017-01-01

    The rapid development of the engineered nanomaterial (ENM) manufacturing industry has accelerated the incorporation of ENMs into a wide variety of consumer products across the globe. Unintentionally or not, some of these ENMs may be introduced into the environment or come into contact with humans or other organisms resulting in unexpected biological effects. It is thus prudent to have rapid and robust analytical metrology in place that can be used to critically assess and/or predict the cytotoxicity, as well as the potential genotoxicity of these ENMs. Many of the traditional genotoxicity test methods [e.g. unscheduled DNA synthesis assay, bacterial reverse mutation (Ames) test, etc.] for determining the DNA damaging potential of chemical and biological compounds are not suitable for the evaluation of ENMs, due to a variety of methodological issues ranging from potential assay interferences to problems centered on low sample throughput. Recently, a number of sensitive, high-throughput genotoxicity assays/platforms (CometChip assay, flow cytometry/micronucleus assay, flow cytometry/γ-H2AX assay, automated 'Fluorimetric Detection of Alkaline DNA Unwinding' (FADU) assay, ToxTracker reporter assay) have been developed, based on substantial modifications and enhancements of traditional genotoxicity assays. These new assays have been used for the rapid measurement of DNA damage (strand breaks), chromosomal damage (micronuclei) and for detecting upregulated DNA damage signalling pathways resulting from ENM exposures. In this critical review, we describe and discuss the fundamental measurement principles and measurement endpoints of these new assays, as well as the modes of operation, analytical metrics and potential interferences, as applicable to ENM exposures. An unbiased discussion of the major technical advantages and limitations of each assay for evaluating and predicting the genotoxic potential of ENMs is also provided.

  10. High throughput single molecule detection for monitoring biochemical reactions

    PubMed Central

    Okagbare, Paul I.; Soper, Steven A.

    2009-01-01

    The design, performance and application of a novel optical system for high throughput single molecule detection (SMD) configured in a continuous flow format using microfluidics is reported. The system consisted of a microfabricated polymer-based multi-channel fluidic network situated within the optical path of a laser source (λex = 660 nm) with photon transduction accomplished using an electron-multiplying charge coupled device (EMCCD) operated in a frame transfer mode that allowed tracking single molecules as they passed through a large field-of-view (FoV) illumination zone. The microfluidic device consisted of 30 microchannels possessing dimensions of 30 μm (width) × 20 μm (depth) with a 25 mm pitch. Individual molecules were electrokinetically driven through the fluidic network and excited within the wide-field illumination area with the resulting fluorescence collected via an objective and imaged onto the EMCCD camera. The detection system demonstrated sufficient sensitivity to detect single DNA molecules labeled with a fluorescent tag (AlexaFluor 660) identified through their characteristic emission wavelength and the burst of photons produced during their transit through the excitation volume. In its present configuration and fluidic architecture, the sample processing throughput was ∼4.02 × 10^5 molecules s^−1, but could be increased dramatically through the use of narrower channels and a smaller pitch. The system was further evaluated using a single molecule-based fluorescence quenching assay for measuring the population differences between duplexed and single-stranded DNA molecules as a function of temperature for determining the duplex melting temperature, Tm. PMID:19082181

  11. High throughput optoelectronic smart pixel systems using diffractive optics

    NASA Astrophysics Data System (ADS)

    Chen, Chih-Hao

    1999-12-01

    Recent developments in digital video, multimedia technology and data networks have greatly increased the demand for high bandwidth communication channels and high throughput data processing. Electronics is particularly suited for switching, amplification and logic functions, while optics is more suitable for interconnections and communications with lower energy and crosstalk. In this research, we present the design, testing, integration and demonstration of several optoelectronic smart pixel devices and system architectures. These systems integrate electronic switching/processing capability with parallel optical interconnections to provide high throughput network communication and pipeline data processing. The Smart Pixel Array Cellular Logic processor (SPARCL) is designed in 0.8 μm CMOS and hybrid integrated with Multiple-Quantum-Well (MQW) devices for pipeline image processing. The Smart Pixel Network Interface (SAPIENT) is designed in 0.6 μm GaAs and monolithically integrated with LEDs to implement a highly parallel optical interconnection network. The Translucent Smart Pixel Array (TRANSPAR) design is implemented in two different versions. The first version, TRANSPAR-MQW, is designed in 0.5 μm CMOS and flip-chip integrated with MQW devices to provide 2-D pipeline processing and translucent networking using the Carrier-Sense-Multiple-Access/Collision-Detection (CSMA/CD) protocol. The other version, TRANSPAR-VM, is designed in 1.2 μm CMOS and discretely integrated with VCSEL-MSM (Vertical-Cavity-Surface-Emitting-Laser and Metal-Semiconductor-Metal detectors) chips and driver/receiver chips on a printed circuit board. The TRANSPAR-VM provides an option of using the token ring network protocol in addition to the embedded functions of TRANSPAR-MQW. These optoelectronic smart pixel systems also require micro-optics devices to provide high resolution, high quality optical interconnections and external source arrays. In this research, we describe an innovative

  12. High-Throughput In Vitro Glycoside Hydrolase (HIGH) Screening for Enzyme Discovery

    SciTech Connect

    Kim, Tae-Wan; Chokhawala, Harshal A.; Hess, Matthias; Dana, Craig M.; Baer, Zachary; Sczyrba, Alexander; Rubin, Edward M.; Blanch, Harvey W.; Clark, Douglas S.

    2011-09-16

    A high-throughput protein-expression and screening method (HIGH method) provides a rapid approach to the discovery of active glycoside hydrolases in environmental samples. HIGH screening combines cloning, protein expression, and enzyme hydrolysis in one pot; thus, the entire process from gene expression to activity detection requires only three hours.

  13. Tiered High-Throughput Screening Approach to Identify ...

    EPA Pesticide Factsheets

    High-throughput screening (HTS) for potential thyroid-disrupting chemicals requires a system of assays to capture multiple molecular-initiating events (MIEs) that converge on perturbed thyroid hormone (TH) homeostasis. Screening for MIEs specific to TH-disrupting pathways is limited in the US EPA ToxCast screening assay portfolio. To fill one critical screening gap, the Amplex UltraRed-thyroperoxidase (AUR-TPO) assay was developed to identify chemicals that inhibit TPO, as decreased TPO activity reduces TH synthesis. The ToxCast Phase I and II chemical libraries, comprised of 1,074 unique chemicals, were initially screened using a single, high concentration to identify potential TPO inhibitors. Chemicals positive in the single concentration screen were retested in concentration-response. Due to high false positive rates typically observed with loss-of-signal assays such as AUR-TPO, we also employed two additional assays in parallel to identify possible sources of nonspecific assay signal loss, enabling stratification of roughly 300 putative TPO inhibitors based upon selective AUR-TPO activity. A cell-free luciferase inhibition assay was used to identify nonspecific enzyme inhibition among the putative TPO inhibitors, and a cytotoxicity assay using a human cell line was used to estimate the cellular tolerance limit. Additionally, the TPO inhibition activities of 150 chemicals were compared between the AUR-TPO and an orthogonal peroxidase oxidation assay using

  14. High-Throughput Screening Using Mass Spectrometry within Drug Discovery.

    PubMed

    Rohman, Mattias; Wingfield, Jonathan

    2016-01-01

    In order to detect a biochemical analyte with a mass spectrometer (MS) it is necessary to ionize the analyte of interest. The analyte can be ionized by a number of different mechanisms, however, one common method is electrospray ionization (ESI). Droplets of analyte are sprayed through a highly charged field, the droplets pick up charge, and this is transferred to the analyte. High levels of salt in the assay buffer will potentially steal charge from the analyte and suppress the MS signal. In order to avoid this suppression of signal, salt is often removed from the sample prior to injection into the MS. Traditional ESI MS relies on liquid chromatography (LC) to remove the salt and reduce matrix effects, however, this is a lengthy process. Here we describe the use of RapidFire™ coupled to a triple-quadrupole MS for high-throughput screening. This system uses solid-phase extraction to de-salt samples prior to injection, reducing processing time such that a sample is injected into the MS approximately every 10 s.

  15. High-throughput purification of single compounds and libraries.

    PubMed

    Schaffrath, Mathias; von Roedern, Erich; Hamley, Peter; Stilz, Hans Ulrich

    2005-01-01

    The need for increasing productivity in medicinal chemistry and associated improvements in automated synthesis technologies for compound library production during the past few years have resulted in a major challenge for compound purification technology and its organization. To meet this challenge, we have recently set up three full-service chromatography units with the aid of in-house engineers, different HPLC suppliers, and several companies specializing in custom laboratory automation technologies. Our goal was to combine high-throughput purification with the high attention to detail which would be afforded by a dedicated purification service. The resulting final purification laboratory can purify up to 1000 compounds/week in amounts ranging from 5 to 300 mg, whereas the two service intermediate purification units take 100 samples per week from 0.3 to 100 g. The technologies consist of normal-phase and reversed-phase chromatography, robotic fraction pooling and reformatting, a bottling system, an automated external solvent supply and removal system, and a customized, high-capacity freeze-dryer. All work processes are linked by an electronic sample registration and tracking system.

  16. High throughput jet singlet oxygen generator for multi kilowatt SCOIL

    NASA Astrophysics Data System (ADS)

    Rajesh, R.; Singhal, Gaurav; Mainuddin; Tyagi, R. K.; Dawar, A. L.

    2010-06-01

    A jet flow singlet oxygen generator (JSOG) capable of handling chlorine flows of nearly 1.5 mol s⁻¹ has been designed, developed, and tested. The generator is designed in a modular configuration taking into consideration the practical aspects of handling high throughput flows without catastrophic BHP carry over. While a cross-flow configuration has been reported for such high flow rates, the generator utilized in the present study is a counter-flow configuration. A near vertical extraction of singlet oxygen is effected at the generator exit, followed by a 90° rotation of the flow, forming a novel verti-horizontal COIL scheme. This allows the COIL to be operated with a vertical extraction SOG followed by the horizontal arrangement of subsequent COIL systems such as the supersonic nozzle, cavity, supersonic diffuser, etc. This enables a more uniform weight distribution from the point of view of mobile and other platform-mounted systems, which is highly relevant for large scale systems. The present study discusses the design aspects of the jet singlet oxygen generator along with its test results for various operating ranges. Typically, for the intended design flow rates, the chlorine utilization and singlet oxygen yield have been observed to be ~94% and ~64%, respectively.

  17. Surface free energy activated high-throughput cell sorting.

    PubMed

    Zhang, Xinru; Zhang, Qian; Yan, Tao; Jiang, Zeyi; Zhang, Xinxin; Zuo, Yi Y

    2014-09-16

    Cell sorting is an important screening process in microbiology, biotechnology, and clinical research. Existing methods are mainly based on single-cell analysis, as in flow cytometric and microfluidic cell sorters. Here we report a label-free bulk method for sorting cells by differentiating their characteristic surface free energies (SFEs). We demonstrated the feasibility of this method by sorting model binary cell mixtures of various bacterial species, including Pseudomonas putida KT2440, Enterococcus faecalis ATCC 29212, Salmonella Typhimurium ATCC 14028, and Escherichia coli DH5α. This method can effectively separate 10¹⁰ bacterial cells within 30 min. Individual bacterial species can be sorted with up to 96% efficiency, and the cell viability ratio can be as high as 99%. In addition to its capacity for sorting evenly mixed bacterial cells, we demonstrated the feasibility of this method in selecting and enriching cells of minor populations in the mixture (present at only 1% in quantity) to a purity as high as 99%. This SFE-activated method may be used as a stand-alone method for quickly sorting a large quantity of bacterial cells or as a prescreening tool for microbial discrimination. Given its label-free operation, high throughput, low cost, and simplicity, this SFE-activated cell sorting method has potential in various applications of sorting cells and abiotic particles.

  18. Aptamers as reagents for high-throughput screening.

    PubMed

    Green, L S; Bell, C; Janjic, N

    2001-05-01

    The identification of new drug candidates from chemical libraries is a major component of discovery research in many pharmaceutical companies. Given the large size of many conventional and combinatorial libraries and the rapid increase in the number of possible therapeutic targets, the speed with which efficient high-throughput screening (HTS) assays can be developed can be a rate-limiting step in the discovery process. We show here that aptamers, nucleic acids that bind other molecules with high affinity, can be used as versatile reagents in competition binding HTS assays to identify and optimize small-molecule ligands to protein targets. To illustrate this application, we have used labeled aptamers to platelet-derived growth factor B-chain and wheat germ agglutinin to screen two sets of potential small-molecule ligands. In both cases, binding affinities of all ligands tested (small molecules and aptamers) were strongly correlated with their inhibitory potencies in functional assays. The major advantages of using aptamers in HTS assays are speed of aptamer identification, high affinity of aptamers for protein targets, relatively large aptamer-protein interaction surfaces, and compatibility with various labeling/detection strategies. Aptamers may be particularly useful in HTS assays with protein targets that have no known binding partners such as orphan receptors. Since aptamers that bind to proteins are often specific and potent antagonists of protein function, the use of aptamers for target validation can be coupled with their subsequent use in HTS.

  19. High-throughput process development: I. Process chromatography.

    PubMed

    Rathore, Anurag S; Bhambure, Rahul

    2014-01-01

    Chromatographic separation serves as "a workhorse" for downstream process development and plays a key role in removal of product-related, host cell-related, and process-related impurities. Complex and poorly characterized raw materials and feed material, low feed concentration, product instability, and poor mechanistic understanding of the processes are some of the critical challenges that are faced during development of a chromatographic step. Traditional process development is performed as trial-and-error-based evaluation and often leads to a suboptimal process. A high-throughput process development (HTPD) platform involves an integration of miniaturization, automation, and parallelization and provides a systematic approach for time- and resource-efficient chromatography process development. Creation of such platforms requires integration of mechanistic knowledge of the process with various statistical tools for data analysis. The relevance of such a platform is high in view of the constraints with respect to time and resources that the biopharma industry faces today. This protocol describes the steps involved in performing HTPD of a process chromatography step. It describes the operation of a commercially available device (PreDictor™ plates from GE Healthcare), available in a 96-well format with 2 or 6 μL well sizes. We also discuss the challenges that one faces when performing such experiments as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data that is gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed. This is intended to demonstrate the usefulness of this protocol in generating data that is representative of the data obtained at the traditional lab scale. The agreement in the

  20. Combinatorial and High Throughput Discovery of High Temperature Piezoelectric Ceramics

    DTIC Science & Technology

    2011-10-10

    new proposed compounds based on our work nearly doubles the known candidate piezoelectric ferroelectric perovskites. Unlike most computational...potential new high temperature ferroelectric piezoelectric perovskite compounds. Our predictions of the Curie temperature (Tc) ranging from 700 °C to 1100 °C are the highest reported in either experimental or theoretical studies, and the number of new proposed compounds based on our work nearly doubles

  1. Silicon microphysiometer for high-throughput drug screening

    NASA Astrophysics Data System (ADS)

    Verhaegen, Katarina; Baert, Christiaan; Puers, Bob; Sansen, Willy; Simaels, Jeannine; Van Driessche, Veerle; Hermans, Lou; Mertens, Robert P.

    1999-06-01

    We report on a micromachined silicon chip that is capable of providing a high-throughput functional assay based on calorimetry. A prototype twin microcalorimeter based on the Seebeck effect has been fabricated by IC technology and micromachined postprocessing techniques. A biocompatible liquid rubber membrane supports two identical 0.5 × 2 cm² measurement chambers, situated at the cold and hot junction of a 666-junction aluminum/p+-polysilicon thermopile. The chambers can house up to 10⁶ eukaryotic cells cultured to confluence. The advantage of the device over microcalorimeters on the market is the integration of the measurement channels on chip, placing microvolume reaction vessels, ranging from 10 to 600 μl, in the closest possible contact with the thermopile sensor (no springs are needed). Power and temperature sensitivity of the sensor are 23 V/W and 130 mV/K, respectively. The small thermal inertia of the microchannels results in the short response time of 70 s when filled with 50 μl of water. Biological experiments were done with cultured kidney cells of Xenopus laevis (A6). The thermal equilibration time of the device is 45 min. Stimulation of transport mechanisms by reducing bath osmolality by 50% increased metabolism by 20%. Our results show that it is feasible to apply this large-area, small-volume whole-cell biosensor for drug discovery, where the binding assays that are commonly used to provide high throughput need to be complemented with a functional assay. Solutions are brought onto the sensor by a simple pipette, making the use of an industrial microtiterplate dispenser feasible on an n×96 array of the microcalorimeter biosensor. Such an array of biosensors has been designed based on a new set of requirements as set forth by people in the field as this project moved on. The results obtained from the prototype large-area sensor were used to obtain an accurate model of the calorimeter, verified with the simulation software ANSYS. At

  2. Improving the specificity of high-throughput ortholog prediction

    PubMed Central

    Fulton, Debra L; Li, Yvonne Y; Laird, Matthew R; Horsman, Benjamin GS; Roche, Fiona M; Brinkman, Fiona SL

    2006-01-01

    Background: Orthologs (genes that have diverged after a speciation event) tend to have similar function, and so their prediction has become an important component of comparative genomics and genome annotation. The gold standard phylogenetic analysis approach of comparing available organismal phylogeny to gene phylogeny is not easily automated for genome-wide analysis; therefore, ortholog prediction for large genome-scale datasets is typically performed using a reciprocal-best-BLAST-hits (RBH) approach. One problem with RBH is that it will incorrectly predict a paralog as an ortholog when incomplete genome sequences or gene loss is involved. In addition, there is an increasing interest in identifying orthologs most likely to have retained similar function. Results: To address these issues, we present here a high-throughput computational method named Ortholuge that further evaluates previously predicted orthologs (including those predicted using an RBH-based approach) – identifying which orthologs most closely reflect species divergence and may more likely have similar function. Ortholuge analyzes phylogenetic distance ratios involving two comparison species and an outgroup species, noting cases where relative gene divergence is atypical. It also identifies some cases of gene duplication after species divergence. Through simulations of incomplete genome data/gene loss, we show that the vast majority of genes falsely predicted as orthologs by an RBH-based method can be identified. Ortholuge was then used to estimate the number of false-positives (predominantly paralogs) in selected RBH-predicted ortholog datasets, identifying approximately 10% paralogs in a eukaryotic data set (mouse-rat comparison) and 5% in a bacterial data set (Pseudomonas putida – Pseudomonas syringae species comparison). Higher quality (more precise) datasets of orthologs, which we term "ssd-orthologs" (supporting-species-divergence-orthologs), were also constructed. These datasets, as well as
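    The core idea of flagging atypical relative gene divergence with an outgroup can be sketched in a few lines. This is an illustrative simplification, not the published Ortholuge implementation; the fixed ratio cutoffs here stand in for the empirical, genome-wide distributions Ortholuge actually uses:

```python
# Illustrative Ortholuge-style check: for each predicted ortholog pair,
# compare the ratio of the two ingroup genes' distances to the outgroup
# ortholog; true orthologs should cluster near a typical value, while
# mispredicted paralogs tend to be outliers.

def divergence_ratio(d_sp1_out, d_sp2_out):
    """Ratio of each ingroup gene's phylogenetic distance to the outgroup."""
    return d_sp1_out / d_sp2_out

def flag_atypical(pairs, lo=0.5, hi=2.0):
    """`pairs` maps gene-pair name -> (distance sp1->outgroup,
    distance sp2->outgroup). Cutoffs are illustrative only."""
    return [name for name, (d1, d2) in pairs.items()
            if not lo <= divergence_ratio(d1, d2) <= hi]

pairs = {"geneA": (0.12, 0.11), "geneB": (0.40, 0.10)}
print(flag_atypical(pairs))  # geneB's ratio (4.0) falls outside [0.5, 2.0]
```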

  3. High throughput illumination systems for solar simulators and photoresist exposure

    NASA Astrophysics Data System (ADS)

    Feldman, Arkady

    2010-08-01

    High throughput illumination systems are critical components in photolithography, solar simulators, UV curing, microscopy, and spectral analysis. A good refractive condenser system has an F/# of 0.60, or an N.A. of 0.80, but it captures only 10 to 15% of the energy emitted by an incandescent or gas-discharge lamp, as these sources emit light in all directions. Systems with ellipsoidal or parabolic reflectors are much more efficient; they capture up to 80% of the total energy emitted by lamps. However, these reflectors have large aberrations when working with real sources of finite dimensions, resulting in poor light-concentrating capability. These aberrations also increase beam divergence, degrade collimation, and affect edge definition in flood exposure systems. The problem is aggravated by the geometry of high-power arc lamps where, for thermal considerations, the anode has a larger diameter than the cathode and absorbs and obscures part of the energy. This results in an asymmetrical energy distribution emitted by the lamp and makes the efficiency of a lamp-reflector configuration dependent on the orientation of the lamp in the reflector. This paper presents an analysis of different lamp-reflector configurations at different power levels and their energy distribution in the image plane. A configuration that yields a significant improvement in brightness is derived.

  4. High-throughput literature mining to support read-across ...

    EPA Pesticide Factsheets

    Building scientific confidence in the development and evaluation of read-across remains an ongoing challenge. Approaches include establishing systematic frameworks to identify sources of uncertainty and ways to address them. One source of uncertainty is related to characterizing biological similarity. Many research efforts are underway, such as structuring mechanistic data in adverse outcome pathways and investigating the utility of high throughput (HT)/high content (HC) screening data. A largely untapped resource for read-across to date is the biomedical literature. This information has the potential to support read-across by facilitating the identification of valid source analogues with similar biological and toxicological profiles as well as providing the mechanistic understanding for any prediction made. A key challenge in using biomedical literature is to convert and translate its unstructured form into a computable format that can be linked to chemical structure. We developed a novel text-mining strategy to represent literature information for read-across. Keywords were used to organize literature into toxicity signatures at the chemical level. These signatures were integrated with HT in vitro data and curated chemical structures. A rule-based algorithm assessed the strength of the literature relationship, providing a mechanism to rank and visualize the signature as literature ToxPIs (LitToxPIs). LitToxPIs were developed for over 6,000 chemicals for a varie
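    The keyword-to-signature step described above can be sketched as a toy example. The categories, keyword lists, and scoring rule here are invented for illustration and are not the LitToxPI algorithm itself:

```python
# Hypothetical sketch of turning literature keyword hits into a
# per-chemical toxicity signature, loosely in the spirit of the
# text-mining strategy described above.
from collections import Counter

TOX_KEYWORDS = {
    "hepatotoxicity": {"liver", "hepatic", "steatosis"},
    "neurotoxicity": {"neuron", "cognitive", "tremor"},
}

def literature_signature(abstract_words):
    """Score each toxicity category by its keyword count in the text;
    a real system would weight by study type, recency, etc."""
    words = Counter(w.lower() for w in abstract_words)
    return {cat: sum(words[k] for k in kws)
            for cat, kws in TOX_KEYWORDS.items()}

sig = literature_signature("Hepatic steatosis and liver injury".split())
print(sig)  # hepatotoxicity scores 3, neurotoxicity 0
```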

  5. Validation of high throughput sequencing and microbial forensics applications.

    PubMed

    Budowle, Bruce; Connell, Nancy D; Bielecka-Oder, Anna; Colwell, Rita R; Corbett, Cindi R; Fletcher, Jacqueline; Forsman, Mats; Kadavy, Dana R; Markotic, Alemka; Morse, Stephen A; Murch, Randall S; Sajantila, Antti; Schmedes, Sarah E; Ternus, Krista L; Turner, Stephen D; Minot, Samuel

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of specific application, and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security.

  6. Analysis of DNA Sequence Variants Detected by High Throughput Sequencing

    PubMed Central

    Adams, David R; Sincan, Murat; Fajardo, Karin Fuentes; Mullikin, James C; Pierson, Tyler M; Toro, Camilo; Boerkoel, Cornelius F; Tifft, Cynthia J; Gahl, William A; Markello, Tom C

    2014-01-01

    The Undiagnosed Diseases Program at the National Institutes of Health uses High Throughput Sequencing (HTS) to diagnose rare and novel diseases. HTS techniques generate large numbers of DNA sequence variants, which must be analyzed and filtered to find candidates for disease causation. Despite the publication of an increasing number of successful exome-based projects, there has been little formal discussion of the analytic steps applied to HTS variant lists. We present the results of our experience with over 30 families for whom HTS sequencing was used in an attempt to find clinical diagnoses. For each family, exome sequence was augmented with high-density SNP-array data. We present a discussion of the theory and practical application of each analytic step and provide example data to illustrate our approach. The paper is designed to provide an analytic roadmap for variant analysis, thereby enabling a wide range of researchers and clinical genetics practitioners to perform direct analysis of HTS data for their patients and projects. PMID:22290882
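    A typical filtering step in the kind of variant-analysis roadmap described above can be sketched as follows. The thresholds, field names, and the choice of a recessive inheritance model are illustrative assumptions, not the program's actual pipeline:

```python
# Illustrative HTS variant-filtering sketch: keep rare, protein-altering
# variants consistent with a recessive disease model.

def filter_candidates(variants, max_pop_freq=0.01):
    keep = []
    for v in variants:
        if v["pop_freq"] > max_pop_freq:   # too common for a rare disease
            continue
        if v["effect"] == "synonymous":    # unlikely to alter the protein
            continue
        if v["genotype"] != "hom":         # recessive model: homozygous only
            continue
        keep.append(v["gene"])
    return keep

variants = [
    {"gene": "GENE1", "pop_freq": 0.001, "effect": "missense", "genotype": "hom"},
    {"gene": "GENE2", "pop_freq": 0.200, "effect": "missense", "genotype": "hom"},
    {"gene": "GENE3", "pop_freq": 0.001, "effect": "synonymous", "genotype": "hom"},
]
print(filter_candidates(variants))  # only GENE1 survives all three filters
```

    In practice each filter is applied, inspected, and relaxed iteratively rather than run once, which is why the paper emphasizes a roadmap rather than a fixed recipe.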

  7. Efficient Management of High-Throughput Screening Libraries with SAVANAH.

    PubMed

    List, Markus; Elnegaard, Marlene Pedersen; Schmidt, Steffen; Christiansen, Helle; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2017-02-01

    High-throughput screening (HTS) has become an indispensable tool for the pharmaceutical industry and for biomedical research. A high degree of automation allows for experiments in the range of a few hundred up to several hundred thousand to be performed in close succession. The basis for such screens is molecular libraries, that is, microtiter plates with solubilized reagents such as siRNAs, shRNAs, miRNA inhibitors or mimics, and sgRNAs, or small compounds, that is, drugs. These reagents are typically concentrated to provide enough material for covering several screens. Library plates thus need to be serially diluted before they can be used as assay plates. This process, however, leads to an explosion in the number of plates and samples to be tracked. Here, we present SAVANAH, the first tool to effectively manage molecular screening libraries across dilution series. It conveniently links sample information from the library to experimental results from the assay plates. All results can be exported to the R statistical environment or piped into HiTSeekR (http://hitseekr.compbio.sdu.dk) for comprehensive follow-up analyses. In summary, SAVANAH supports the HTS community in managing and analyzing HTS experiments with an emphasis on serially diluted molecular libraries.
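    The bookkeeping problem SAVANAH addresses — knowing what concentration sits on each daughter plate after serial dilution — reduces to simple arithmetic. A minimal sketch, with illustrative plate naming and dilution factor (not SAVANAH's data model):

```python
# Minimal sketch of tracking concentrations across a serial dilution
# of a library plate into daughter assay plates.

def dilution_series(stock_uM, factor, steps):
    """Return the concentration on each daughter plate, assuming the
    same dilution factor at every step."""
    plates = {}
    conc = stock_uM
    for i in range(steps):
        conc /= factor
        plates[f"plate_D{i + 1}"] = conc
    return plates

series = dilution_series(stock_uM=1000.0, factor=10.0, steps=3)
print(series)  # {'plate_D1': 100.0, 'plate_D2': 10.0, 'plate_D3': 1.0}
```

    The tracking explosion comes from doing this per well, per plate, per screen, which is exactly what motivates a dedicated tool.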

  8. High-throughput charge exchange recombination spectroscopy system on MAST

    SciTech Connect

    Conway, N. J.; Carolan, P. G.; McCone, J.; Walsh, M. J.; Wisse, M.

    2006-10-15

    A major upgrade to the charge exchange recombination spectroscopy system on MAST has recently been implemented. The new system consists of a high-throughput spectrometer coupled to a total of 224 spatial channels, including toroidal and poloidal views of both neutral heating beams on MAST. Radial resolution is ~1 cm, comparable to the ion Larmor radius. The toroidal views are configured with 64 channels per beam, while the poloidal views have 32 channels per beam. Background channels for both poloidal and toroidal views are also provided. A large transmission grating is at the heart of the new spectrometer, with high quality single lens reflex lenses providing excellent imaging performance and permitting the full exploitation of the available étendue of the camera sensor. The charge-coupled device camera chosen has four-tap readout at a maximum aggregate speed of 8.8 MHz, and it is capable of reading out the full set of 224 channels in less than 4 ms. The system normally operates at 529 nm, viewing the C⁵⁺ emission line, but can operate at any wavelength in the range of 400-700 nm. Results from operating the system on MAST are shown, including impurity ion temperature and velocity profiles. The system's excellent spatial resolution is ideal for the study of transport barrier phenomena on MAST, an activity which has already been advanced significantly by data from the new diagnostic.

  9. High throughput screening for anti-Trypanosoma cruzi drug discovery.

    PubMed

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-12-01

    The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti-T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti-T. cruzi drug entities in the near future, are reviewed here.

  10. Molecular Pathways: Extracting Medical Knowledge from High Throughput Genomic Data

    PubMed Central

    Goldstein, Theodore; Paull, Evan O.; Ellis, Matthew J.; Stuart, Joshua M.

    2013-01-01

    High-throughput genomic data that measures RNA expression, DNA copy number, mutation status and protein levels provide us with insights into the molecular pathway structure of cancer. Genomic lesions (amplifications, deletions, mutations) and epigenetic modifications disrupt biochemical cellular pathways. While the number of possible lesions is vast, different genomic alterations may result in concordant expression and pathway activities, producing common tumor subtypes that share similar phenotypic outcomes. How can these data be translated into medical knowledge that provides prognostic and predictive information? First generation mRNA expression signatures such as Genomic Health's Oncotype DX already provide prognostic information, but do not provide therapeutic guidance beyond the current standard of care – which is often inadequate in high-risk patients. Rather than building molecular signatures based on gene expression levels, evidence is growing that signatures based on higher-level quantities such as from genetic pathways may provide important prognostic and diagnostic cues. We provide examples of how activities for molecular entities can be predicted from pathway analysis and how the composite of all such activities, referred to here as the “activitome,” help connect genomic events to clinical factors in order to predict the drivers of poor outcome. PMID:23430023

  11. High Throughput, Continuous, Mass Production of Photovoltaic Modules

    SciTech Connect

    Kurt Barth

    2008-02-06

    AVA Solar has developed a very low cost solar photovoltaic (PV) manufacturing process and has demonstrated the significant economic and commercial potential of this technology. This I & I Category 3 project provided significant assistance toward accomplishing these milestones. The original goals of this project were to design, construct and test a production prototype system, fabricate PV modules and test the module performance. The original module manufacturing costs in the proposal were estimated at $2/Watt. The objectives of this project have been exceeded. An advanced processing line was designed, fabricated and installed. Using this automated, high throughput system, high efficiency devices and fully encapsulated modules were manufactured. AVA Solar has obtained two rounds of private equity funding, expanded to 50 people and initiated the development of a large scale factory for 100+ megawatts of annual production. Modules will be manufactured at an industry-leading cost which will enable AVA Solar's modules to produce power that is cost-competitive with traditional energy resources. With low manufacturing costs and the ability to scale manufacturing, AVA Solar has been contacted by some of the largest customers in the PV industry to negotiate long-term supply contracts. The current market for PV has continued to grow at 40%+ per year for nearly a decade and is projected to reach $40–$60 billion by 2012. Currently, a crystalline silicon raw material supply shortage is limiting growth and raising costs. Our process does not use silicon, eliminating these limitations.

  12. Edge electrospinning for high throughput production of quality nanofibers.

    PubMed

    Thoppey, N M; Bochinski, J R; Clarke, L I; Gorga, R E

    2011-08-26

    A novel, simple geometry for high throughput electrospinning from a bowl edge is presented that utilizes a vessel filled with a polymer solution and a concentric cylindrical collector. Successful fiber formation is presented for two different polymer systems with differing solution viscosity and solvent volatility. The process of jet initiation, resultant fiber morphology and fiber production rate are discussed for this unconfined feed approach. Under high voltage initiation, the jets spontaneously form directly on the fluid surface and rearrange along the circumference of the bowl to provide approximately equal spacing between spinning sites. Nanofibers currently produced from bowl electrospinning are identical in quality to those fabricated by traditional needle electrospinning (TNE) with a demonstrated ∼ 40 times increase in the production rate for a single batch of solution due primarily to the presence of many simultaneous jets. In the bowl electrospinning geometry, the electric field pattern and subsequent effective feed rate are very similar to those parameters found under optimized TNE experiments. Consequently, the electrospinning process per jet is directly analogous to that in TNE and thereby results in the same quality of nanofibers.

  13. High Throughput Screening for Anti–Trypanosoma cruzi Drug Discovery

    PubMed Central

    Alonso-Padilla, Julio; Rodríguez, Ana

    2014-01-01

    The discovery of new therapeutic options against Trypanosoma cruzi, the causative agent of Chagas disease, stands as a fundamental need. Currently, there are only two drugs available to treat this neglected disease, which represents a major public health problem in Latin America. Both available therapies, benznidazole and nifurtimox, have significant toxic side effects and their efficacy against the life-threatening symptomatic chronic stage of the disease is variable. Thus, there is an urgent need for new, improved anti–T. cruzi drugs. With the objective to reliably accelerate the drug discovery process against Chagas disease, several advances have been made in the last few years. Availability of engineered reporter gene expressing parasites triggered the development of phenotypic in vitro assays suitable for high throughput screening (HTS) as well as the establishment of new in vivo protocols that allow faster experimental outcomes. Recently, automated high content microscopy approaches have also been used to identify new parasitic inhibitors. These in vitro and in vivo early drug discovery approaches, which hopefully will contribute to bring better anti–T. cruzi drug entities in the near future, are reviewed here. PMID:25474364

  14. A rapid transglutaminase assay for high-throughput screening applications.

    PubMed

    Wu, Yu-Wei; Tsai, Yu-Hui

    2006-10-01

    Transglutaminases (TGs) are widely distributed enzymes that catalyze posttranslational modification of proteins by Ca²⁺-dependent cross-linking reactions. The family members of TGs participate in many significant biological processes such as tissue regeneration, cell differentiation, apoptosis, and certain pathologies. A novel technique for TG activity assay was developed in this study. It was based on the rapid capturing, fluorescence quenching, and fast separation of the unreacted fluorescent molecules from the macromolecular product with magnetic dextran-coated charcoal. As few as 3 ng of guinea pig liver transglutaminase (gpTG) could be detected by the method; activities of 96 TG samples could be measured within an hour. The Km values of gpTG determined by this method for monodansylcadaverine (dansyl-CAD) and N,N-dimethylcasein were 14 and 5 μM, respectively. A typical competitive inhibition pattern of cystamine on dansyl-CAD for gpTG activity was also demonstrated. The application of this technique is not limited to the use of dansyl-CAD as the fluorescent substrate of TG; other small fluor-labeled TG substrates may substitute for dansyl-CAD. Finally, this method is rapid, highly sensitive, and inexpensive. It is suitable not only for high-throughput screening of enzymes or enzyme inhibitors but also for enzyme kinetic analysis.
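    The kinetic quantities reported above (Km, competitive inhibition) follow the standard Michaelis-Menten relationships; a small sketch using the reported Km for dansyl-CAD, with Vmax, inhibitor concentration, and Ki chosen purely for illustration:

```python
# Michaelis-Menten kinetics and the competitive-inhibition shift in
# apparent Km, as used when interpreting assays like the one above.

def mm_rate(s, vmax, km):
    """Michaelis-Menten rate: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

def apparent_km(km, inhibitor, ki):
    """Competitive inhibition raises the apparent Km while leaving
    Vmax unchanged: Km_app = Km * (1 + [I]/Ki)."""
    return km * (1 + inhibitor / ki)

km = 14.0  # reported Km of gpTG for dansyl-CAD, in uM
print(mm_rate(14.0, vmax=1.0, km=km))            # at [S] = Km, v = Vmax/2
print(apparent_km(km, inhibitor=20.0, ki=10.0))  # Km_app = 42.0 uM
```

    The diagnostic for competitive inhibition in such data is exactly this pattern: increasing apparent Km with inhibitor concentration at constant Vmax.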

  15. Microfluidic system for high throughput characterisation of echogenic particles.

    PubMed

    Rademeyer, Paul; Carugo, Dario; Lee, Jeong Yu; Stride, Eleanor

    2015-01-21

    Echogenic particles, such as microbubbles and volatile liquid micro/nano droplets, have shown considerable potential in a variety of clinical diagnostic and therapeutic applications. The accurate prediction of their response to ultrasound excitation is, however, extremely challenging, and this has hindered the optimisation of techniques such as quantitative ultrasound imaging and targeted drug delivery. Existing characterisation techniques, such as ultra-high-speed microscopy, provide important insights but suffer from a number of limitations, most significantly the difficulty of obtaining large data sets suitable for statistical analysis and the need to physically constrain the particles, thereby altering their dynamics. Here a microfluidic system is presented that overcomes these challenges to enable the measurement of the response of single echogenic particles to ultrasound excitation. A co-axial flow focusing device is used to direct a continuous stream of unconstrained particles through the combined focal region of an ultrasound transducer and a laser. Both the optical and acoustic scatter from individual particles are then simultaneously recorded. Calibration of the device and example results for different types of echogenic particle are presented, demonstrating a high throughput of up to 20 particles per second and the ability to resolve changes in particle radius down to 0.1 μm with an uncertainty of less than 3%.

  16. Edge electrospinning for high throughput production of quality nanofibers

    NASA Astrophysics Data System (ADS)

    Thoppey, N. M.; Bochinski, J. R.; Clarke, L. I.; Gorga, R. E.

    2011-08-01

    A novel, simple geometry for high throughput electrospinning from a bowl edge is presented that utilizes a vessel filled with a polymer solution and a concentric cylindrical collector. Successful fiber formation is presented for two different polymer systems with differing solution viscosity and solvent volatility. The process of jet initiation, resultant fiber morphology and fiber production rate are discussed for this unconfined feed approach. Under high voltage initiation, the jets spontaneously form directly on the fluid surface and rearrange along the circumference of the bowl to provide approximately equal spacing between spinning sites. Nanofibers currently produced from bowl electrospinning are identical in quality to those fabricated by traditional needle electrospinning (TNE), with a demonstrated ~40-fold increase in the production rate for a single batch of solution, due primarily to the presence of many simultaneous jets. In the bowl electrospinning geometry, the electric field pattern and subsequent effective feed rate are very similar to those parameters found under optimized TNE experiments. Consequently, the electrospinning process per jet is directly analogous to that in TNE and thereby results in the same quality of nanofibers.

  17. Validation of high throughput sequencing and microbial forensics applications

    PubMed Central

    2014-01-01

    High throughput sequencing (HTS) generates large amounts of high quality sequence data for microbial genomics. The value of HTS for microbial forensics is the speed at which evidence can be collected and the power to characterize microbial-related evidence to solve biocrimes and bioterrorist events. As HTS technologies continue to improve, they provide increasingly powerful sets of tools to support the entire field of microbial forensics. Accurate, credible results allow analysis and interpretation, significantly influencing the course and/or focus of an investigation, and can impact the response of the government to an attack having individual, political, economic or military consequences. Interpretation of the results of microbial forensic analyses relies on understanding the performance and limitations of HTS methods, including analytical processes, assays and data interpretation. The utility of HTS must be defined carefully within established operating conditions and tolerances. Validation is essential in the development and implementation of microbial forensics methods used for formulating investigative leads and attribution. HTS strategies vary, requiring guiding principles for HTS system validation. Three initial aspects of HTS, irrespective of chemistry, instrumentation or software, are: 1) sample preparation, 2) sequencing, and 3) data analysis. Criteria that should be considered for HTS validation for microbial forensics are presented here. Validation should be defined in terms of the specific application, and the criteria described here comprise a foundation for investigators to establish, validate and implement HTS as a tool in microbial forensics, enhancing public safety and national security. PMID:25101166

  18. High Throughput Heuristics for Prioritizing Human Exposure to ...

    EPA Pesticide Factsheets

    The risk posed to human health by any of the thousands of untested anthropogenic chemicals in our environment is a function of both the potential hazard presented by the chemical and the possibility of being exposed. Without the capacity to make quantitative, albeit uncertain, forecasts of exposure, the putative risk of adverse health effects from a chemical cannot be evaluated. We used Bayesian methodology to infer ranges of exposure intakes that are consistent with biomarkers of chemical exposures identified in urine samples from the U.S. population by the National Health and Nutrition Examination Survey (NHANES). We perform linear regression on inferred exposure for demographic subsets of NHANES demarcated by age, gender, and weight, using high throughput chemical descriptors gleaned from databases and chemical structure-based calculators. We find that five of these descriptors are capable of explaining roughly 50% of the variability across chemicals for all the demographic groups examined, including children aged 6-11. For the thousands of chemicals with no other source of information, this approach allows rapid and efficient prediction of average exposure intake of environmental chemicals. The methods described by this manuscript provide a highly improved methodology for HTS of human exposure to environmental chemicals. The manuscript includes a ranking of 7785 environmental chemicals with respect to potential human exposure, including most of the Tox21 in vit
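The regression step described above can be sketched as follows. The five descriptors here are random stand-ins, not the study's actual descriptors, and the data are simulated so that the descriptors explain about half the across-chemical variability, mirroring the reported result.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_chem = 200
# Five stand-in chemical descriptors; the study's actual descriptors
# (from databases and structure-based calculators) are not reproduced here.
X = rng.normal(size=(n_chem, 5))
# Simulated log-exposure in which the descriptors account for roughly half
# of the variance (noise scaled to match the signal's spread).
signal = X @ np.array([0.8, 0.5, -0.4, 0.3, 0.2])
y = signal + rng.normal(scale=signal.std(), size=n_chem)

r2 = LinearRegression().fit(X, y).score(X, y)
print(round(r2, 2))  # roughly 0.5 by construction
```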

  19. PRISM: a data management system for high-throughput proteomics.

    PubMed

    Kiebel, Gary R; Auberry, Ken J; Jaitly, Navdeep; Clark, David A; Monroe, Matthew E; Peterson, Elena S; Tolić, Nikola; Anderson, Gordon A; Smith, Richard D

    2006-03-01

    Advanced proteomic research efforts involving areas such as systems biology or biomarker discovery are enabled by the use of high level informatics tools that allow the effective analysis of large quantities of differing types of data originating from various studies. Performing such analyses on a large scale is not feasible without a computational platform that performs data processing and management tasks. Such a platform must be able to provide high-throughput operation while having sufficient flexibility to accommodate evolving data analysis tools and methodologies. The Proteomics Research Information Storage and Management system (PRISM) provides a platform that serves the needs of the accurate mass and time tag approach developed at Pacific Northwest National Laboratory. PRISM incorporates a diverse set of analysis tools and allows a wide range of operations to be incorporated by using a state machine that is accessible to independent, distributed computational nodes. The system has scaled well as data volume has increased over several years, while allowing adaptability for incorporating new and improved data analysis tools for more effective proteomics research.

  20. High Resolution Genotyping of Campylobacter Using PCR and High-Throughput Mass Spectrometry

    USDA-ARS?s Scientific Manuscript database

    In this work we report a high throughput mass spectrometry-based technique for rapid high resolution strain identification of Campylobacter jejuni. This method readily distinguishes C. jejuni from C. coli, has comparable resolving power to multi-locus sequence typing (MLST), is applicable to mixtur...

  1. High-Throughput/High-Content Screening Assays with Engineered Nanomaterials in ToxCast

    EPA Science Inventory

    High-throughput and high-content screens are attractive approaches for prioritizing nanomaterial hazards and informing targeted testing due to the impracticality of using traditional toxicological testing on the large numbers and varieties of nanomaterials. The ToxCast program a...

  3. High-throughput Protein Purification and Quality Assessment for Crystallization

    PubMed Central

    Kim, Youngchang; Babnigg, Gyorgy; Jedrzejczak, Robert; Eschenfeldt, William H.; Li, Hui; Maltseva, Natalia; Hatzos-Skintges, Catherine; Gu, Minyi; Makowska-Grzyska, Magdalena; Wu, Ruiying; An, Hao; Chhor, Gekleng; Joachimiak, Andrzej

    2012-01-01

    The ultimate goal of structural biology is to understand the structural basis of proteins in cellular processes. In structural biology, the most critical issue is the availability of high-quality samples. “Structural biology-grade” proteins must be generated in the quantity and quality suitable for structure determination using X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The purification procedures must reproducibly yield homogeneous proteins or their derivatives containing marker atom(s) in milligram quantities. The choice of protein purification and handling procedures plays a critical role in obtaining high-quality protein samples. With structural genomics emphasizing a genome-based approach in understanding protein structure and function, a number of unique structures covering most of the protein folding space have been determined and new technologies with high efficiency have been developed. At the Midwest Center for Structural Genomics (MCSG), we have developed semi-automated protocols for high-throughput parallel protein expression and purification. A protein, expressed as a fusion with a cleavable affinity tag, is purified in two consecutive immobilized metal affinity chromatography (IMAC) steps: (i) the first step is an IMAC coupled with buffer-exchange, or size exclusion chromatography (IMAC-I), followed by the cleavage of the affinity tag using the highly specific Tobacco Etch Virus (TEV) protease; (ii) the second step is IMAC and buffer exchange (IMAC-II) to remove the cleaved tag and tagged TEV protease. These protocols have been implemented on multidimensional chromatography workstations and, as we have shown, many proteins can be successfully produced at large scale. All methods and protocols used for purification, some developed by MCSG, others adopted and integrated into the MCSG purification pipeline and more recently the Center for Structural Genomics of Infectious Diseases (CSGID) purification pipeline, are

  4. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  5. High-throughput RAD-SNP genotyping for characterization of sugar beet genotypes

    USDA-ARS?s Scientific Manuscript database

    High-throughput SNP genotyping provides a rapid way of developing resourceful set of markers for delineating the genetic architecture and for effective species discrimination. In the presented research, we demonstrate a set of 192 SNPs for effective genotyping in sugar beet using high-throughput mar...

  6. High throughput transmission optical projection tomography using low cost graphics processing unit.

    PubMed

    Vinegoni, Claudio; Fexon, Lyuba; Feruglio, Paolo Fumene; Pivovarov, Misha; Figueiredo, Jose-Luiz; Nahrendorf, Matthias; Pozzo, Antonio; Sbarbati, Andrea; Weissleder, Ralph

    2009-12-07

    We implement the use of a graphics processing unit (GPU) in order to achieve real time data processing for high-throughput transmission optical projection tomography imaging. Using the GPU, we obtained a 300-fold performance enhancement in comparison to a CPU workstation implementation, enabling on-the-fly reconstructions and high throughput imaging.

  7. Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays (SOT)

    EPA Science Inventory

    Alginate Immobilization of Metabolic Enzymes (AIME) for High-Throughput Screening Assays DE DeGroot, RS Thomas, and SO SimmonsNational Center for Computational Toxicology, US EPA, Research Triangle Park, NC USAThe EPA’s ToxCast program utilizes a wide variety of high-throughput s...

  9. Performance of high-throughput DNA quantification methods

    PubMed Central

    Haque, Kashif A; Pfeiffer, Ruth M; Beerman, Michael B; Struewing, Jeff P; Chanock, Stephen J; Bergen, Andrew W

    2003-01-01

    Background The accuracy and precision of estimates of DNA concentration are critical factors for efficient use of DNA samples in high-throughput genotype and sequence analyses. We evaluated the performance of spectrophotometric (OD) DNA quantification and compared it to two fluorometric quantification methods, the PicoGreen® assay (PG) and a novel real-time quantitative genomic PCR assay (QG) specific to a region at the human BRCA1 locus. Twenty-two lymphoblastoid cell line DNA samples with an initial concentration of ~350 ng/μL were diluted to 20 ng/μL. DNA concentration was estimated by OD and further diluted to 5 ng/μL. The concentrations of multiple aliquots of the final dilution were measured by the OD, QG and PG methods. The effects of manual and robotic laboratory sample handling procedures on the estimates of DNA concentration were assessed using variance components analyses. Results The OD method was the DNA quantification method most concordant with the reference sample among the three methods evaluated. A large fraction of the total variance for all three methods (36.0–95.7%) was explained by sample-to-sample variation, whereas the amount of variance attributable to sample handling was small (0.8–17.5%). Residual error (3.2–59.4%), corresponding to un-modelled factors, contributed to a greater extent to the total variation than the sample handling procedures. Conclusion The application of a specific DNA quantification method to a particular molecular genetic laboratory protocol must take into account the accuracy and precision of the specific method, as well as the requirements of the experimental workflow with respect to sample volumes and throughput. While OD was the most concordant and precise DNA quantification method in this study, the information provided by the quantitative PCR assay regarding the suitability of DNA samples for PCR may be an essential factor for some protocols, despite the decreased concordance and precision of this method.
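A minimal method-of-moments version of such a variance-components analysis, run on simulated replicate measurements (all numbers illustrative, chosen so that sample-to-sample variation dominates, as the study found):

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_reps = 22, 4
# Simulated concentrations (ng/μL): the sample-to-sample spread is larger
# than the residual handling/measurement error.
true = rng.normal(loc=5.0, scale=1.0, size=n_samples)
obs = true[:, None] + rng.normal(scale=0.3, size=(n_samples, n_reps))

# One-way ANOVA variance-component estimates (method of moments).
ms_within = obs.var(axis=1, ddof=1).mean()             # residual variance
ms_between = n_reps * obs.mean(axis=1).var(ddof=1)     # between-sample MS
var_sample = max((ms_between - ms_within) / n_reps, 0.0)
frac_sample = var_sample / (var_sample + ms_within)
print(round(frac_sample, 2))  # most of the variance is sample-to-sample
```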

  10. High-throughput microfluidic line scan imaging for cytological characterization

    NASA Astrophysics Data System (ADS)

    Hutcheson, Joshua A.; Powless, Amy J.; Majid, Aneeka A.; Claycomb, Adair; Fritsch, Ingrid; Balachandran, Kartik; Muldoon, Timothy J.

    2015-03-01

    Imaging cells in a microfluidic chamber with an area scan camera is difficult due to motion blur and data loss during frame readout causing discontinuity of data acquisition as cells move at relatively high speeds through the chamber. We have developed a method to continuously acquire high-resolution images of cells in motion through a microfluidics chamber using a high-speed line scan camera. The sensor acquires images in a line-by-line fashion in order to continuously image moving objects without motion blur. The optical setup comprises an epi-illuminated microscope with a 40X oil immersion, 1.4 NA objective and a 150 mm tube lens focused on a microfluidic channel. Samples containing suspended cells fluorescently stained with 0.01% (w/v) proflavine in saline are introduced into the microfluidics chamber via a syringe pump; illumination is provided by a blue LED (455 nm). Images were taken of samples at the focal plane using an ELiiXA+ 8k/4k monochrome line-scan camera at a line rate of up to 40 kHz. The system's line rate and fluid velocity are tightly controlled to reduce image distortion and are validated using fluorescent microspheres. Image acquisition was controlled via MATLAB's Image Acquisition toolbox. Data sets comprise discrete images of every detectable cell which may be subsequently mined for morphological statistics and definable features by a custom texture analysis algorithm. This high-throughput screening method, comparable to cell counting by flow cytometry, provided efficient examination including counting, classification, and differentiation of saliva, blood, and cultured human cancer cells.
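The tight control of line rate against fluid velocity follows from requiring that a cell advance by exactly one (demagnified) sensor row per line readout. The numbers below are illustrative, chosen to land on the camera's quoted 40 kHz maximum line rate; they are not taken from the paper.

```python
# Undistorted line-scan imaging requires:
#   line_rate = velocity * magnification / pixel_pitch
# i.e. the magnified image of a cell moves one pixel row per line readout.
magnification = 40.0        # 40X objective
pixel_pitch_um = 5.0        # assumed sensor pixel size (μm)
velocity_um_s = 5000.0      # assumed cell speed in the channel (μm/s)

line_rate_hz = velocity_um_s * magnification / pixel_pitch_um
print(line_rate_hz)  # 40000.0
```

A mismatch between the two sides of this relation stretches or compresses cells along the flow axis, which is why the fluorescent-microsphere validation step matters.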

  11. Technologies for Proteome-Wide Discovery of Extracellular Host-Pathogen Interactions

    PubMed Central

    2017-01-01

    Pathogens have evolved unique mechanisms to breach the cell surface barrier and manipulate the host immune response to establish a productive infection. Proteins exposed to the extracellular environment, both cell surface-expressed receptors and secreted proteins, are essential targets for initial invasion and play key roles in pathogen recognition and subsequent immunoregulatory processes. The identification of the host and pathogen extracellular molecules and their interaction networks is fundamental to understanding tissue tropism and pathogenesis and to inform the development of therapeutic strategies. Nevertheless, the characterization of the proteins that function in the host-pathogen interface has been challenging, largely due to the technical challenges associated with detection of extracellular protein interactions. This review discusses available technologies for the high throughput study of extracellular protein interactions between pathogens and their hosts, with a focus on mammalian viruses and bacteria. Emerging work illustrates a rich landscape for extracellular host-pathogen interaction and points towards the evolution of multifunctional pathogen-encoded proteins. Further development and application of technologies for genome-wide identification of extracellular protein interactions will be important in deciphering functional host-pathogen interaction networks, laying the foundation for development of novel therapeutics. PMID:28321417

  12. High Throughput Analysis of Integron Gene Cassettes in Wastewater Environments.

    PubMed

    Gatica, Joao; Tripathi, Vijay; Green, Stefan; Manaia, Celia M; Berendonk, Thomas; Cacace, Damiano; Merlin, Christophe; Kreuzinger, Norbert; Schwartz, Thomas; Fatta-Kassinos, Despo; Rizzo, Luigi; Schwermer, Carsten U; Garelick, Hemda; Jurkevitch, Edouard; Cytryn, Eddie

    2016-11-01

    Integrons are extensively targeted as a proxy for anthropogenic impact in the environment. We developed a novel high-throughput amplicon sequencing pipeline that enables characterization of thousands of integron gene cassette-associated reads, and applied it to acquire a comprehensive overview of gene cassette composition in effluents from wastewater treatment facilities across Europe. Between 38 100 and 172 995 reads per sample were generated and functionally characterized by screening against nr, SEED, ARDB and β-lactamase databases. Over 75% of the reads were characterized as hypothetical, but thousands were associated with toxin-antitoxin systems, DNA repair, cell membrane function, detoxification and aminoglycoside and β-lactam resistance. Among the reads characterized as β-lactamases, the carbapenemase blaOXA was dominant in most of the effluents, except for Cyprus and Israel where blaGES was also abundant. Quantitative PCR assessment of blaOXA and blaGES genes in the European effluents revealed similar trends to those displayed in the integron amplicon sequencing pipeline described above, corroborating the robustness of this method and suggesting that these integron-associated genes may be excellent targets for source tracking of effluents in downstream environments. Further application of the above analyses revealed several order-of-magnitude reductions in effluent-associated β-lactamase genes in effluent-saturated soils, suggesting marginal persistence in the soil microbiome.

  13. Translational informatics: enabling high-throughput research paradigms

    PubMed Central

    Embi, Peter J.; Sen, Chandan K.

    2009-01-01

    A common thread throughout the clinical and translational research domains is the need to collect, manage, integrate, analyze, and disseminate large-scale, heterogeneous biomedical data sets. However, well-established and broadly adopted theoretical and practical frameworks and models intended to address such needs are conspicuously absent in the published literature or other reputable knowledge sources. Instead, the development and execution of multidisciplinary, clinical, or translational studies are significantly limited by the propagation of “silos” of both data and expertise. Motivated by this fundamental challenge, we report upon the current state and evolution of biomedical informatics as it pertains to the conduct of high-throughput clinical and translational research and will present both a conceptual and practical framework for the design and execution of informatics-enabled studies. The objective of presenting such findings and constructs is to provide the clinical and translational research community with a common frame of reference for discussing and expanding upon such models and methodologies. PMID:19737991

  14. The JCSG high-throughput structural biology pipeline

    PubMed Central

    Elsliger, Marc-André; Deacon, Ashley M.; Godzik, Adam; Lesley, Scott A.; Wooley, John; Wüthrich, Kurt; Wilson, Ian A.

    2010-01-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications. PMID:20944202

  15. High throughput virus plaque quantitation using a flatbed scanner.

    PubMed

    Sullivan, Kate; Kloess, Johannes; Qian, Chen; Bell, Donald; Hay, Alan; Lin, Yi Pu; Gu, Yan

    2012-01-01

    The plaque assay is a standard technique for measuring influenza virus infectivity and inhibition of virus replication. Counting plaque numbers and quantifying virus infection of cells in multiwell plates quickly, accurately and automatically remain a challenge. Visual inspection relies upon experience, is subjective, often time consuming, and has lower reproducibility than automated methods. In this paper, a simple, high throughput imaging-based alternative is proposed which uses a flatbed scanner and image processing software to quantify the infected cell population and plaque formation. Quantitation results were evaluated with reference to visual counting and achieved better than 80% agreement. The method was shown to be particularly advantageous in titration of the number of plaques and infected cells when influenza viruses produce a heterogeneous population of small plaques. It was also shown to be insensitive to the densities of plaques in determination of neutralization titres and IC(50)s of drug susceptibility. In comparison to other available techniques, this approach is cost-effective, relatively accurate, and readily available.
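The core image-processing step, thresholding a well image and counting connected dark regions, can be sketched as follows. The toy image below stands in for a flatbed scan; the threshold and minimum-size values are illustrative, not the paper's.

```python
import numpy as np
from scipy import ndimage

# Toy well image: dark plaques (low intensity) on a stained monolayer.
img = np.full((60, 60), 200, dtype=float)
for cy, cx in [(15, 15), (15, 45), (45, 30)]:   # three plaques, radius 5
    yy, xx = np.ogrid[:60, :60]
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= 25] = 40

# Threshold, label connected dark regions, and count those above a
# minimum pixel area (rejecting speckle noise).
mask = img < 120
labels, n = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, range(1, n + 1))
plaque_count = int((sizes >= 10).sum())
print(plaque_count)  # 3
```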

  16. High-throughput screening of chemicals as functional ...

    EPA Pesticide Factsheets

    Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer
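The classification-and-screening step can be sketched with a random forest, as used for the QSUR models. The descriptors, labels, and 80% probability cutoff structure below are stand-ins for illustration; only the cutoff value echoes the abstract.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
# Stand-in physicochemical descriptors and a binary "functional use" label
# (e.g., in-category vs. not); the real QSUR models use curated descriptors.
X = rng.normal(size=(300, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Screening step: keep chemicals predicted in-category with >= 80%
# probability, mirroring the paper's 0.8 probability cutoff.
proba = clf.predict_proba(X)[:, 1]
candidates = np.flatnonzero(proba >= 0.8)
print(len(candidates) > 0)  # high-confidence candidates found
```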

  17. High-throughput screening and biophysical interrogation of hepatotropic AAV.

    PubMed

    Murphy, Samuel L; Bhagwat, Anand; Edmonson, Shyrie; Zhou, Shangzhen; High, Katherine A

    2008-12-01

    We set out to analyze the fundamental biological differences between AAV2 and AAV8 that may contribute to their different performances in vivo. High-throughput protein interaction screens were used to identify binding partners for each serotype. Of the >8,000 proteins probed, 115 and 134 proteins were identified that interact with AAV2 and AAV8, respectively. Notably, 76 of these protein interactions were shared between the two serotypes. CDK2/cyclinA kinase was identified as a binding partner for both serotypes in the screen. Subsequent analysis confirmed direct binding of CDK2/cyclinA by AAV2 and AAV8. Inhibition of CDK2/cyclinA resulted in increased levels of vector transduction. Biophysical study of vector particle stability and genome uncoating demonstrated slightly greater thermostability for AAV8 than for AAV2. Heat-induced genome uncoating occurred at the same temperature as particle degradation, suggesting that these two processes may be intrinsically related for adeno-associated virus (AAV). Together, these analyses provide insight into commonalities and divergences in the biology of functionally distinct hepatotropic AAV serotypes.

  18. High Throughput T Epitope Mapping and Vaccine Development

    PubMed Central

    Li Pira, Giuseppina; Ivaldi, Federico; Moretti, Paolo; Manca, Fabrizio

    2010-01-01

    Mapping of antigenic peptide sequences from proteins of relevant pathogens recognized by T helper (Th) and by cytolytic T lymphocytes (CTL) is crucial for vaccine development. In fact, mapping of T-cell epitopes provides useful information for the design of peptide-based vaccines and of peptide libraries to monitor specific cellular immunity in protected individuals, patients and vaccinees. Nevertheless, epitope mapping is a challenging task. In fact, large panels of overlapping peptides need to be tested with lymphocytes to identify the sequences that induce a T-cell response. Since numerous peptide panels from antigenic proteins are to be screened, lymphocytes available from human subjects are a limiting factor. To overcome this limitation, high throughput (HTP) approaches based on miniaturization and automation of T-cell assays are needed. Here we consider the most recent applications of the HTP approach to T epitope mapping. The alternative or complementary use of in silico prediction and experimental epitope definition is discussed in the context of the recent literature. The currently used methods are described with special reference to the possibility of applying the HTP concept to make epitope mapping an easier procedure in terms of time, workload, reagents, cells and overall cost. PMID:20617148
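The overlapping-peptide panels mentioned above are typically generated by tiling the antigen sequence. A sketch using 15-mers offset by 4 residues, a common but by no means universal design choice; the sequence is a toy example:

```python
def overlapping_peptides(seq, length=15, offset=4):
    """Tile a protein sequence into overlapping peptides for an
    epitope-mapping panel (lengths and offsets vary by study)."""
    peptides = [seq[i:i + length] for i in range(0, len(seq) - length + 1, offset)]
    # Ensure the C-terminus is covered even if the tiling does not land on it.
    if peptides and not seq.endswith(peptides[-1]):
        peptides.append(seq[-length:])
    return peptides

protein = "M" + "ACDEFGHIKLNPQRSTVWY" * 3   # toy 58-residue sequence
panel = overlapping_peptides(protein)
print(len(panel), len(panel[0]))  # 12 15
```

Each peptide is then tested against lymphocytes; the panel size scales roughly as (protein length) / offset, which is what makes miniaturization and automation necessary.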

  19. Inter-Individual Variability in High-Throughput Risk ...

    EPA Pesticide Factsheets

    We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have little or no existing TK data. Chemicals are prioritized based on model estimates of hazard and exposure, to decide which chemicals should be first in line for further study. Hazard may be estimated with in vitro HT screening assays, e.g., U.S. EPA’s ToxCast program. Bioactive ToxCast concentrations can be extrapolated to doses that produce equivalent concentrations in body tissues using a reverse TK approach in which generic TK models are parameterized with 1) chemical-specific parameters derived from in vitro measurements and predicted from chemical structure; and 2) physiological parameters for a virtual population. Here we draw physiological parameters from realistic estimates of distributions of demographic and anthropometric quantities in the modern U.S. population, based on the most recent CDC NHANES data. A Monte Carlo approach, accounting for the correlation structure in physiological parameters, is used to estimate ToxCast equivalent doses for the most sensitive portion of the population. To quantify risk, ToxCast equivalent doses are compared to estimates of exposure rates based on Bayesian inferences drawn from NHANES urinary analyte biomonitoring data. The inclusion
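The Monte Carlo reverse-TK step can be sketched with a one-compartment steady-state model, in which the oral equivalent dose is the bioactive concentration times clearance. The lognormal clearance distribution below is an illustrative stand-in for the sampled physiological variability; real HTTK models are considerably more detailed.

```python
import numpy as np

rng = np.random.default_rng(3)
# Reverse toxicokinetics at steady state (one-compartment sketch):
#   oral equivalent dose = bioactive concentration * clearance.
c_bioactive = 1.0   # ToxCast bioactive concentration (arbitrary units)
# Illustrative population spread in clearance (units omitted deliberately).
clearance = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)

dose = c_bioactive * clearance
# Protecting the most sensitive portion of the population means taking a
# low quantile: low-clearance individuals reach the bioactive
# concentration at the lowest dose.
sensitive_dose = np.quantile(dose, 0.05)
print(sensitive_dose < np.median(dose))  # True
```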

  20. High Throughput Profiling of Molecular Shapes in Crystals

    NASA Astrophysics Data System (ADS)

    Spackman, Peter R.; Thomas, Sajesh P.; Jayatilaka, Dylan

    2016-02-01

    Molecular shape is important in both crystallisation and supramolecular assembly, yet its role is not completely understood. We present a computationally efficient scheme to describe and classify the molecular shapes in crystals. The method involves rotation-invariant description of Hirshfeld surfaces in terms of spherical harmonic functions. Hirshfeld surfaces represent the boundaries of a molecule in the crystalline environment, and are widely used to visualise and interpret crystalline interactions. The spherical harmonic descriptions of molecular shapes are compared and classified by means of principal component analysis and cluster analysis. When applied to a series of metals, the method results in a clear classification based on their lattice type. When applied to around 300 crystal structures comprising series of substituted benzenes, naphthalenes and phenylbenzamides, it shows the capacity to classify structures based on chemical scaffolds, chemical isosterism, and conformational similarity. The computational efficiency of the method is demonstrated with an application to over 14 thousand crystal structures. High throughput screening of molecular shapes and interaction surfaces in the Cambridge Structural Database (CSD) using this method has direct applications in drug discovery, supramolecular chemistry and materials design.
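One standard way to make a spherical-harmonic shape description rotation invariant (the general idea behind such descriptors; the paper's exact invariants may differ) is to collapse the coefficients of each degree l into a power P_l = sum over m of |c_lm|^2, which is unchanged by any rotation of the surface. A minimal numpy sketch:

```python
import numpy as np

def rotation_invariant_spectrum(coeffs_by_degree):
    """Rotation-invariant power spectrum of a spherical-harmonic expansion.

    coeffs_by_degree[l] holds the 2l+1 complex coefficients c_{l,-l}..c_{l,l};
    P_l = sum_m |c_{lm}|^2 is invariant under rotation, so the vector
    (P_0, P_1, ...) can be compared across molecules directly.
    """
    return np.array([float(np.sum(np.abs(np.asarray(c)) ** 2))
                     for c in coeffs_by_degree])

# Two hypothetical shape expansions truncated at degree l = 2.
shape_a = rotation_invariant_spectrum([[1.0], [0, 2.0, 0], [0, 0, 1.0, 0, 0]])
shape_b = rotation_invariant_spectrum([[1.0], [0, 1.0, 0], [0, 0, 1.0, 0, 0]])
distance = float(np.linalg.norm(shape_a - shape_b))
```

Distances between such spectra are what the subsequent principal component and cluster analyses would operate on.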

  1. High throughput screening for drug discovery of autophagy modulators.

    PubMed

    Shu, Chih-Wen; Liu, Pei-Feng; Huang, Chun-Ming

    2012-11-01

    Autophagy is an evolutionarily conserved process by which cells clear abnormal proteins and organelles in a lysosome-dependent manner. A growing number of studies has shown that defects in, or induction of, autophagy contribute to many diseases, including aging, neurodegeneration, pathogen infection, and cancer. However, the precise involvement of autophagy in health and disease remains controversial because the theories are built on limited assays and chemical modulators, indicating that the role of autophagy in diseases may require further verification. Many Food and Drug Administration (FDA)-approved drugs modulate autophagy signaling, suggesting that modulation of autophagy with pharmacological agonists or antagonists provides a potential therapy for autophagy-related diseases. This makes the discovery of chemical modulators of autophagy an attractive goal. High throughput screening (HTS) is becoming a powerful tool for drug discovery that may accelerate the screening of specific autophagy modulators and so help clarify the role of autophagy in diseases. Herein, this review lays out current autophagy assays that specifically measure autophagy components such as LC3 (the mammalian homologue of yeast Atg8) and Atg4. These assays are feasible or have proven successful for HTS with certain chemical libraries, which might be informative for this intensively growing field, both as research tools and, hopefully, for developing new drugs for autophagy-related diseases.

  2. Hypothesis testing in high-throughput screening for drug discovery.

    PubMed

    Prummer, Michael

    2012-04-01

    Following the success of small-molecule high-throughput screening (HTS) in drug discovery, other large-scale screening techniques are currently revolutionizing the biological sciences. Powerful new statistical tools have been developed to analyze the vast amounts of data in DNA chip studies, but have not yet found their way into compound screening. In HTS, characterization of single-point hit lists is often done only in retrospect after the results of confirmation experiments are available. However, for prioritization, for optimal use of resources, for quality control, and for comparison of screens it would be extremely valuable to predict the rates of false positives and false negatives directly from the primary screening results. Making full use of the available information about compounds and controls contained in HTS results and replicated pilot runs, the Z score and from it the p value can be estimated for each measurement. Based on this consideration, we have applied the concept of p-value distribution analysis (PVDA), which was originally developed for gene expression studies, to HTS data. PVDA allowed prediction of all relevant error rates as well as the rate of true inactives, and excellent agreement with confirmation experiments was found.
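The core quantities of such an analysis, per-measurement Z scores, their p values, and an estimate of the true-inactive fraction read off the p-value distribution, can be sketched as follows. The pi0 estimator shown is the standard Storey-type histogram estimate from the gene expression literature; the paper's exact PVDA procedure may differ in detail:

```python
import math
import statistics

def z_scores(values, neg_controls):
    """Z score of each measurement relative to the neutral-control distribution."""
    mu = statistics.mean(neg_controls)
    sd = statistics.stdev(neg_controls)
    return [(v - mu) / sd for v in values]

def p_values(zs):
    """Two-sided p value under a standard-normal null: p = erfc(|z| / sqrt(2))."""
    return [math.erfc(abs(z) / math.sqrt(2)) for z in zs]

def pi0_estimate(ps, lam=0.5):
    """Storey-style estimate of the fraction of true inactives, read off the
    flat right-hand part (p > lam) of the p-value histogram."""
    return min(1.0, sum(p > lam for p in ps) / ((1 - lam) * len(ps)))

controls = [0.0, 1.0, -1.0, 0.5, -0.5]       # replicated neutral controls
zs = z_scores([0.2, 8.0, -7.5], controls)    # three compound measurements
ps = p_values(zs)
```

From the full set of p values, pi0 then yields predicted false-positive and false-negative rates for any chosen activity threshold, before any confirmation experiment is run.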

  3. Savant: genome browser for high-throughput sequencing data.

    PubMed

    Fiume, Marc; Williams, Vanessa; Brook, Andrew; Brudno, Michael

    2010-08-15

    The advent of high-throughput sequencing (HTS) technologies has made it affordable to sequence many individuals' genomes. Simultaneously the computational analysis of the large volumes of data generated by the new sequencing machines remains a challenge. While a plethora of tools are available to map the resulting reads to a reference genome, and to conduct primary analysis of the mappings, it is often necessary to visually examine the results and underlying data to confirm predictions and understand the functional effects, especially in the context of other datasets. We introduce Savant, the Sequence Annotation, Visualization and ANalysis Tool, a desktop visualization and analysis browser for genomic data. Savant was developed for visualizing and analyzing HTS data, with special care taken to enable dynamic visualization in the presence of gigabases of genomic reads and references the size of the human genome. Savant supports the visualization of genome-based sequence, point, interval and continuous datasets, and multiple visualization modes that enable easy identification of genomic variants (including single nucleotide polymorphisms, structural and copy number variants), and functional genomic information (e.g. peaks in ChIP-seq data) in the context of genomic annotations. Savant is freely available at http://compbio.cs.toronto.edu/savant.

  4. Advances in High Throughput Screening of Biomass Recalcitrance (Poster)

    SciTech Connect

    Turner, G. B.; Decker, S. R.; Tucker, M. P.; Law, C.; Doeppke, C.; Sykes, R. W.; Davis, M. F.; Ziebell, A.

    2012-06-01

    This was a poster displayed at the Symposium. Advances on previous high-throughput methods for screening biomass recalcitrance have resulted in improved conversion and replicate precision. Changes in plate reactor metallurgy, improved preparation of control biomass, species-specific pretreatment conditions, and enzymatic hydrolysis parameters have reduced overall coefficients of variation to an average of 6% for sample replicates. These method changes have reduced plate-to-plate variation in control biomass recalcitrance and improved confidence in sugar release differences between samples. With smaller errors, plant researchers can have a higher degree of assurance that more low-recalcitrance candidates can be identified. In summary, significant changes in the plate reactor, control biomass preparation, pretreatment conditions, and enzyme have significantly reduced sample and control replicate variability. Reactor plate metallurgy significantly impacts sugar release: aluminum leaching into the reaction during pretreatment degrades sugars and inhibits enzyme activity. Removal of starch and extractives significantly decreases control biomass variability. New enzyme formulations give more consistent and higher conversion levels, but required re-optimization for switchgrass. Pretreatment time and temperature (severity) should be adjusted to the specific biomass type, i.e., woody vs. herbaceous. Desalting of enzyme preps to remove low-molecular-weight stabilizers may improve conversion levels, likely through water-activity effects on enzyme structure and substrate interactions, but was not attempted here because of the need to continually desalt and validate precise enzyme concentration and activity.

  5. Detecting Alu insertions from high-throughput sequencing data

    PubMed Central

    David, Matei; Mustafa, Harun; Brudno, Michael

    2013-01-01

    High-throughput sequencing technologies have allowed for the cataloguing of variation in personal human genomes. In this manuscript, we present alu-detect, a tool that combines read-pair and split-read information to detect novel Alus and their precise breakpoints directly from either whole-genome or whole-exome sequencing data while also identifying insertions directly in the vicinity of existing Alus. To set the parameters of our method, we use simulation of a faux reference, which allows us to compute the precision and recall of various parameter settings using real sequencing data. Applying our method to 100 bp paired Illumina data from seven individuals, including two trios, we detected on average 1519 novel Alus per sample. Based on the faux-reference simulation, we estimate that our method has 97% precision and 85% recall. We identify 808 novel Alus not previously described in other studies. We also demonstrate the use of alu-detect to study the local sequence and global location preferences for novel Alu insertions. PMID:23921633

  6. High Throughput Multispectral Image Processing with Applications in Food Science.

    PubMed

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Recently, machine vision has been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only for estimation and even prediction of food quality but also for detection of adulteration. Toward these applications in food science, we present here a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility, lower the cost of information extraction and speed up quality assessment, without human intervention. The image processing outcome is propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection to optimize the segmentation process. Through the evaluation we demonstrate its efficiency and robustness against the currently available semi-manual software, showing that the developed method is a high throughput approach appropriate for massive data extraction from food samples.
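The unsupervised segmentation step rests on fitting a Gaussian mixture to pixel values. A minimal one-dimensional, two-component EM fit (a toy stand-in: the published method works on multi-band pixel vectors and also selects spectral bands) looks like this:

```python
import numpy as np

def em_gmm_1d(x, n_iter=60):
    """Two-component 1D Gaussian mixture fitted by EM, returning hard labels.

    A toy stand-in for the unsupervised pixel-clustering step; a real
    multispectral segmentation would run the same EM on multi-band vectors.
    """
    mu = np.array([x.min(), x.max()], dtype=float)  # spread the initial means
    var = np.full(2, x.var() / 4 + 1e-9)
    w = np.full(2, 0.5)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each pixel.
        d2 = (x[:, None] - mu) ** 2
        dens = w * np.exp(-0.5 * d2 / var) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and variances.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    return resp.argmax(axis=1)

rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(0.2, 0.03, 500),   # background intensities
                         rng.normal(0.8, 0.05, 500)])  # sample-region intensities
labels = em_gmm_1d(pixels)
```

Hard labels from the fitted mixture give the foreground/background mask that downstream quality analysis consumes.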

  7. New high throughput screening method for drug release measurements.

    PubMed

    Pelczarska, Aleksandra; Delie, Florence; Domańska, Urszula; Carrupt, Pierre-Alain; Martel, Sophie

    2013-09-01

    In the field of drug delivery systems, microparticles made of a polymeric matrix appear to be an attractive approach. The in vitro release kinetic profile is crucial information when developing new particulate formulations. These data are essential for batch-to-batch comparison and quality control, as well as for anticipating in vivo behavior to select the best formulation to take further into preclinical investigations. The methods available share common drawbacks, such as time and compound consumption, that do not fit the formulation screening requirements of early development stages. In this study, a new microscale high throughput screening (HTS) method has been developed to investigate drug release kinetics from piroxicam-loaded polylactic acid (PLA) and polylactic-co-glycolic acid (PLGA) microparticles. The method is a sample- and separation-based method in which separation is performed by filtration using 96-well microfilter plates. 96 experiments can therefore be performed on one plate at one time, in a fully automated way and with very low sample and particle consumption. The influence of different parameters controlling release profiles was also investigated using this technique. The HTS method gave the same release profile as the standard dialysis method. Shaking, particle concentration, and the nature of the release medium were found to influence release. The HTS method appears to be a reliable way to evaluate drug release from particles, with smaller standard deviations and less consumption of material.
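Release profiles measured well by well are typically summarized by fitting a kinetic model so that formulations can be compared by a single rate constant. A first-order model is one common choice (an illustration of the general workflow, not necessarily the model used in this study):

```python
import numpy as np

def fit_first_order_release(t, released_fraction):
    """Estimate the rate constant k in f(t) = 1 - exp(-k t).

    Linearized as ln(1 - f) = -k t and solved by least squares through the
    origin; a sketch of one common way to compare release profiles.
    """
    t = np.asarray(t, dtype=float)
    y = np.log(1.0 - np.asarray(released_fraction, dtype=float))
    return float(-np.sum(t * y) / np.sum(t * t))

# Synthetic release profile with k = 0.30 per hour, sampled over 12 h.
t = np.array([1, 2, 4, 6, 8, 12], dtype=float)
f = 1 - np.exp(-0.30 * t)
k_hat = fit_first_order_release(t, f)
```

With 96 wells per plate, the same fit runs in a loop over wells, yielding one rate constant per formulation for ranking.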

  8. New Lung Cancer Panel for High-Throughput Targeted Resequencing

    PubMed Central

    Kim, Eun-Hye; Lee, Sunghoon; Park, Jongsun; Lee, Kyusang; Bhak, Jong

    2014-01-01

    We present a new next-generation sequencing-based method to identify somatic mutations of lung cancer. It is a comprehensive mutation profiling protocol to detect somatic mutations in 30 genes found frequently in lung adenocarcinoma. The total length of the target regions is 107 kb, and a capture assay was designed to cover 99% of it. This method exhibited about 97% mean coverage at 30× sequencing depth and 42% average specificity when sequencing of more than 3.25 Gb was carried out for the normal sample. We discovered 513 variations from targeted exome sequencing of lung cancer cells, which is 3.9-fold higher than in the normal sample. The variations in cancer cells included previously reported somatic mutations in the COSMIC database, such as variations in TP53, KRAS, and STK11 of sample H-23 and in EGFR of sample H-1650, especially with more than 1,000× coverage. Among the somatic mutations, up to 91% of single nucleotide polymorphisms from the two cancer samples were validated by DNA microarray-based genotyping. Our results demonstrated the feasibility of high-throughput mutation profiling with lung adenocarcinoma samples, and the profiling method can be used as a robust and effective protocol for somatic variant screening. PMID:25031567

  9. PrimerView: high-throughput primer design and visualization.

    PubMed

    O'Halloran, Damien M

    2015-01-01

    High-throughput primer design is routinely performed in a wide number of molecular applications, including genotyping specimens using traditional PCR techniques as well as assembly PCR, nested PCR, and primer walking experiments. Batch primer design is also required in validation experiments from RNA-seq transcriptome sequencing projects, as well as in generating probes for microarray experiments. The growing popularity of next generation sequencing and microarray technology has created a greater need for primer design tools that can validate large numbers of candidate genes and markers. To meet these demands, I present here a tool called PrimerView that designs forward and reverse primers from multi-sequence datasets and generates graphical outputs that map the position and distribution of primers on the target sequence. This module operates from the command line and can collect user-defined input for the design phase of each primer. PrimerView is a straightforward-to-use module that implements a primer design algorithm to return forward and reverse primers from any number of FASTA-formatted sequences, generating text-based output of the features of each primer as well as graphical outputs that map the designed primers to the target sequence. PrimerView is freely available without restrictions.

  10. High-throughput automated refolding screening of inclusion bodies.

    PubMed

    Vincentelli, Renaud; Canaan, Stéphane; Campanacci, Valérie; Valencia, Christel; Maurin, Damien; Frassinetti, Frédéric; Scappucini-Calvo, Loréna; Bourne, Yves; Cambillau, Christian; Bignon, Christophe

    2004-10-01

    One of the main stumbling blocks encountered when attempting to express foreign proteins in Escherichia coli is the occurrence of amorphous aggregates of misfolded proteins, called inclusion bodies (IB). Developing efficient protein native structure recovery procedures based on IB refolding is therefore an important challenge. Unfortunately, there is no "universal" refolding buffer: Experience shows that refolding buffer composition varies from one protein to another. In addition, the methods developed so far for finding a suitable refolding buffer suffer from a number of weaknesses. These include the small number of refolding formulations, which often leads to negative results, solubility assays incompatible with high-throughput, and experiment formatting not suitable for automation. To overcome these problems, it was proposed in the present study to address some of these limitations. This resulted in the first completely automated IB refolding screening procedure to be developed using a 96-well format. The 96 refolding buffers were obtained using a fractional factorial approach. The screening procedure is potentially applicable to any nonmembrane protein, and was validated with 24 proteins in the framework of two Structural Genomics projects. The tests used for this purpose included the use of quality control methods such as circular dichroism, dynamic light scattering, and crystallogenesis. Out of the 24 proteins, 17 remained soluble in at least one of the 96 refolding buffers, 15 passed large-scale purification tests, and five gave crystals.
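A buffer matrix sized to exactly one 96-well plate can be laid out mechanically from factor levels. The factors and levels below are hypothetical examples of typical refolding variables, not the 96 published buffers, and the even-spacing fraction shown is only one simple way to cut a larger factorial down to plate size:

```python
from itertools import product

# Hypothetical factor levels; real screens vary pH, salts, additives, redox pairs.
factors = {
    "pH": [6.0, 7.0, 8.0, 9.0],
    "NaCl_mM": [0, 150, 500],
    "arginine_mM": [0, 400],
    "glycerol_pct": [0, 10],
    "redox": ["none", "GSH/GSSG"],
}

# Full factorial: every combination of levels, one dict per candidate buffer.
full = [dict(zip(factors, combo)) for combo in product(*factors.values())]

# Here 4*3*2*2*2 = 96 runs, exactly one plate; with more factors one would
# keep an evenly spaced fraction instead of the full design.
def fraction(designs, n_wells=96):
    step = max(1, len(designs) // n_wells)
    return designs[::step][:n_wells]

plate = fraction(full)
```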

  11. High Throughput Sequencing: An Overview of Sequencing Chemistry.

    PubMed

    Ambardar, Sheetal; Gupta, Rikita; Trakroo, Deepika; Lal, Rup; Vakhlu, Jyoti

    2016-12-01

    In the present century, sequencing is to DNA science what gel electrophoresis was to it in the last century. From 1977 to 2016, three generations of sequencing technologies of various types have been developed. Second- and third-generation sequencing technologies, commonly referred to as next-generation sequencing, have evolved significantly since their inception in 2004, with increases in sequencing speed and decreases in sequencing cost. GS FLX by 454 Life Sciences/Roche Diagnostics; Genome Analyzer, HiSeq, MiSeq and NextSeq by Illumina, Inc.; SOLiD by ABI; and Ion Torrent by Life Technologies are the various sequencing platforms available for second-generation sequencing. The platforms available for third-generation sequencing are the Helicos™ Genetic Analysis System by SeqLL, LLC; SMRT Sequencing by Pacific Biosciences; nanopore sequencing by Oxford Nanopore; Complete Genomics by Beijing Genomics Institute; and GnuBIO by BioRad, to name a few. The present article is an overview of the principles and sequencing chemistry of these high throughput sequencing technologies, along with a brief comparison of the various sequencing platforms available.

  12. Analysis of High Throughput Screening Assays using Cluster Enrichment

    PubMed Central

    Pu, Minya; Hayashi, Tomoko; Cottam, Howard; Mulvaney, Joseph; Arkin, Michelle; Corr, Maripat; Carson, Dennis; Messer, Karen

    2013-01-01

    In this paper we describe implementation and evaluation of a cluster-based enrichment strategy to call hits from a high-throughput screen (HTS), using a typical cell-based assay of 160,000 chemical compounds. Our focus is on statistical properties of the prospective design choices throughout the analysis, including how to choose the number of clusters for optimal power, the choice of test statistic, the significance thresholds for clusters and the activity threshold for candidate hits, how to rank selected hits for carry-forward to the confirmation screen, and how to identify confirmed hits in a data-driven manner. While previously the literature has focused on choice of test statistic or chemical descriptors, our studies suggest cluster size is the more important design choice. We recommend clusters be ranked by enrichment odds ratio, not p-value. Our conceptually simple test statistic is seen to identify the same set of hits as more complex scoring methods proposed in the literature. We prospectively confirm that such a cluster-based approach can outperform the naive top X approach, and estimate that we improved confirmation rates by about 31.5%, from 813 using the Top X approach to 1187 using our cluster-based method. PMID:22763983

  13. Analysis of high-throughput screening assays using cluster enrichment.

    PubMed

    Pu, Minya; Hayashi, Tomoko; Cottam, Howard; Mulvaney, Joseph; Arkin, Michelle; Corr, Maripat; Carson, Dennis; Messer, Karen

    2012-12-30

    In this paper, we describe the implementation and evaluation of a cluster-based enrichment strategy to call hits from a high-throughput screen using a typical cell-based assay of 160,000 chemical compounds. Our focus is on statistical properties of the prospective design choices throughout the analysis, including how to choose the number of clusters for optimal power, the choice of test statistic, the significance thresholds for clusters and the activity threshold for candidate hits, how to rank selected hits for carry-forward to the confirmation screen, and how to identify confirmed hits in a data-driven manner. Whereas previously the literature has focused on choice of test statistic or chemical descriptors, our studies suggest that cluster size is the more important design choice. We recommend clusters to be ranked by enrichment odds ratio, not by p-value. Our conceptually simple test statistic is seen to identify the same set of hits as more complex scoring methods proposed in the literature do. We prospectively confirm that such a cluster-based approach can outperform the naive top X approach and estimate that we improved confirmation rates by about 31.5% from 813 using the top X approach to 1187 using our cluster-based method. Copyright © 2012 John Wiley & Sons, Ltd.
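The recommended ranking statistic, the per-cluster enrichment odds ratio, is computed from a 2x2 table of hits inside versus outside each chemical cluster. A sketch with toy numbers (the 0.5 continuity correction is one common convention, not necessarily the authors' exact choice):

```python
def enrichment_odds_ratio(hits_in, size, total_hits, total_compounds):
    """Odds ratio comparing the hit rate inside a cluster with the hit rate
    among all remaining compounds; 0.5 is added to each cell of the 2x2
    table as a continuity correction against empty cells."""
    a = hits_in                        # cluster hits
    b = size - hits_in                 # cluster non-hits
    c = total_hits - hits_in           # outside hits
    d = (total_compounds - size) - c   # outside non-hits
    return ((a + 0.5) * (d + 0.5)) / ((b + 0.5) * (c + 0.5))

# Toy screen: 1,000 compounds, 20 actives overall; (hits, size) per cluster.
clusters = {"scaffold_A": (8, 40), "scaffold_B": (2, 40), "scaffold_C": (1, 100)}
ranked = sorted(clusters,
                key=lambda k: enrichment_odds_ratio(*clusters[k], 20, 1000),
                reverse=True)
```

Ranking by odds ratio rather than p-value keeps small, strongly enriched scaffolds from being swamped by large, weakly enriched ones, which is the paper's point.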

  14. A Fully Automated High-Throughput Training System for Rodents

    PubMed Central

    Poddar, Rajesh; Kawai, Risa; Ölveczky, Bence P.

    2013-01-01

    Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal’s home-cage our system dramatically reduces the efforts involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors. PMID:24349451

  15. A Microfluidic, High Throughput Protein Crystal Growth Method for Microgravity

    PubMed Central

    Carruthers Jr, Carl W.; Gerdts, Cory; Johnson, Michael D.; Webb, Paul

    2013-01-01

    The attenuation of sedimentation and convection in microgravity can sometimes decrease irregularities formed during macromolecular crystal growth. Current terrestrial protein crystal growth (PCG) capabilities are very different from those used during the Shuttle era and those currently on the International Space Station (ISS). The focus of this experiment was to demonstrate the use of a commercial off-the-shelf, high throughput PCG method in microgravity. Using Protein BioSolutions’ microfluidic Plug Maker™/CrystalCard™ system, we tested the ability to grow crystals of the regulator of glucose metabolism and adipogenesis: peroxisome proliferator-activated receptor gamma (apo-hPPAR-γ LBD), as well as several PCG standards. Overall, we sent 25 CrystalCards™ to the ISS, containing ~10,000 individual microgravity PCG experiments in a 3U NanoRacks NanoLab (1U = 10³ cm³). After 70 days on the ISS, our samples were returned with 16 of 25 (64%) microgravity cards having crystals, compared to 12 of 25 (48%) of the ground controls. Encouragingly, there were more apo-hPPAR-γ LBD crystals in the microgravity PCG cards than in the 1g controls. These positive results should help introduce the PCG standard of low sample volume and large experimental density to the microgravity environment and provide new opportunities for macromolecular samples that may crystallize poorly in standard laboratories. PMID:24278480

  16. High-Throughput Genotyping with Single Nucleotide Polymorphisms

    PubMed Central

    Ranade, Koustubh; Chang, Mau-Song; Ting, Chih-Tai; Pei, Dee; Hsiao, Chin-Fu; Olivier, Michael; Pesich, Robert; Hebert, Joan; Chen, Yii-Der I.; Dzau, Victor J.; Curb, David; Olshen, Richard; Risch, Neil; Cox, David R.; Botstein, David

    2001-01-01

    To make large-scale association studies a reality, automated high-throughput methods for genotyping with single-nucleotide polymorphisms (SNPs) are needed. We describe PCR conditions that permit the use of the TaqMan or 5′ nuclease allelic discrimination assay for typing large numbers of individuals with any SNP and computational methods that allow genotypes to be assigned automatically. To demonstrate the utility of these methods, we typed >1600 individuals for a G-to-T transversion that results in a glutamate-to-aspartate substitution at position 298 in the endothelial nitric oxide synthase gene, and a G/C polymorphism (newly identified in our laboratory) in intron 8 of the 11–β hydroxylase gene. The genotyping method is accurate—we estimate an error rate of fewer than 1 in 2000 genotypes, rapid—with five 96-well PCR machines, one fluorescent reader, and no automated pipetting, over one thousand genotypes can be generated by one person in one day, and flexible—a new SNP can be tested for association in less than one week. Indeed, large-scale genotyping has been accomplished for 23 other SNPs in 13 different genes using this method. In addition, we identified three “pseudo-SNPs” (WIAF1161, WIAF2566, and WIAF335) that are probably a result of duplication. PMID:11435409

  17. Fulcrum: condensing redundant reads from high-throughput sequencing studies

    PubMed Central

    Burriesci, Matthew S.; Lehnert, Erik M.; Pringle, John R.

    2012-01-01

    Motivation: Ultra-high-throughput sequencing produces duplicate and near-duplicate reads, which can consume computational resources in downstream applications. A tool that collapses such reads should reduce storage and assembly complications and costs. Results: We developed Fulcrum to collapse identical and near-identical Illumina and 454 reads (such as those from PCR clones) into single error-corrected sequences; it can process paired-end as well as single-end reads. Fulcrum is customizable and can be deployed on a single machine, a local network or a commercially available MapReduce cluster, and it has been optimized to maximize ease-of-use, cross-platform compatibility and future scalability. Sequence datasets have been collapsed by up to 71%, and the reduced number and improved quality of the resulting sequences allow assemblers to produce longer contigs while using less memory. Availability and implementation: Source code and a tutorial are available at http://pringlelab.stanford.edu/protocols.html under a BSD-like license. Fulcrum was written and tested in Python 2.6, and the single-machine and local-network modes depend on a modified version of the Parallel Python library (provided). Contact: erik.m.lehnert@gmail.com Supplementary information: Supplementary information is available at Bioinformatics online. PMID:22419786
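The essence of collapsing near-duplicate reads, binning by a shared prefix and then taking a per-position majority-vote consensus, can be sketched as below. This is a toy illustration only; Fulcrum itself additionally uses quality information, handles paired ends, and scales out on MapReduce:

```python
from collections import Counter, defaultdict

def collapse_reads(reads, key_len=12):
    """Group reads sharing a 5' prefix, then emit one consensus per group by
    per-position majority vote: a toy version of prefix-binned duplicate
    collapsing (key_len is an arbitrary illustrative choice)."""
    bins = defaultdict(list)
    for r in reads:
        bins[r[:key_len]].append(r)
    consensi = []
    for group in bins.values():
        length = min(len(r) for r in group)
        consensus = "".join(
            Counter(r[i] for r in group).most_common(1)[0][0]
            for i in range(length))
        consensi.append(consensus)
    return consensi

reads = ["ACGTACGTACGTTTGA",
         "ACGTACGTACGTTTGA",
         "ACGTACGTACGTTTGC",   # one 3' sequencing error, out-voted below
         "GGGGCCCCAAAATTTT"]
collapsed = collapse_reads(reads)
```

Three near-identical reads collapse into one error-corrected sequence, which is how the tool shrinks datasets while improving per-base quality for assembly.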

  18. Comprehensive analysis of high-throughput screening data

    NASA Astrophysics Data System (ADS)

    Heyse, Stephan

    2002-06-01

    High-Throughput Screening (HTS) data in its entirety is a valuable raw material for the drug-discovery process. It provides the most complete information about the biological activity of a company's compounds. However, its quantity, complexity and heterogeneity require novel, sophisticated approaches in data analysis. At GeneData, we are developing methods for large-scale, synoptical mining of screening data in a five-step analysis: (1) Quality Assurance: Checking data for experimental artifacts and eliminating low quality data. (2) Biological Profiling: Clustering and ranking of compounds based on their biological activity, taking into account specific characteristics of HTS data. (3) Rule-based Classification: Applying user-defined rules to biological and chemical properties, and providing hypotheses on the biological mode-of-action of compounds. (4) Joint Biological-Chemical Analysis: Associating chemical compound data to HTS data, providing hypotheses for structure-activity relationships. (5) Integration with Genomic and Gene Expression Data: Linking into other components of GeneData's bioinformatics platform, and assessing the compounds' modes-of-action, toxicity, and metabolic properties. These analyses address issues that are crucial for a correct interpretation and full exploitation of screening data. They lead to a sound rating of assays and compounds at an early stage of the lead-finding process.

  19. Picking Cell Lines for High-Throughput Transcriptomic Toxicity ...

    EPA Pesticide Factsheets

    High throughput, whole genome transcriptomic profiling is a promising approach to comprehensively evaluate chemicals for potential biological effects. To be useful for in vitro toxicity screening, gene expression must be quantified in a set of representative cell types that captures the diversity of potential responses across chemicals. The ideal dataset to select these cell types would consist of hundreds of cell types treated with thousands of chemicals, but does not yet exist. However, basal gene expression data may be useful as a surrogate for representing the relevant biological space necessary for cell type selection. The goal of this study was to identify a small (< 20) number of cell types that capture a large, quantifiable fraction of basal gene expression diversity. Three publicly available collections of Affymetrix U133+2.0 cellular gene expression data were used: 1) 59 cell lines from the NCI60 set; 2) 303 primary cell types from the Mabbott et al (2013) expression atlas; and 3) 1036 cell lines from the Cancer Cell Line Encyclopedia. The data were RMA normalized, log-transformed, and the probe sets mapped to HUGO gene identifiers. The results showed that <20 cell lines capture only a small fraction of the total diversity in basal gene expression when evaluated using either the entire set of 20960 HUGO genes or a subset of druggable genes likely to be chemical targets. The fraction of the total gene expression variation explained was consistent when
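One simple way to quantify "fraction of diversity captured" is the variance explained by the top-k principal components of the expression matrix, which upper-bounds what any k representative cell types can explain. A numpy sketch on synthetic data (the study itself used RMA-normalized Affymetrix data; everything below is an illustrative stand-in):

```python
import numpy as np

def variance_captured(expr, k):
    """Fraction of total variance in a genes-by-cell-lines matrix captured
    by its top-k principal components (computed via SVD of the row-centered
    matrix); an upper bound on what k representative cell types explain."""
    centered = expr - expr.mean(axis=1, keepdims=True)
    s = np.linalg.svd(centered, compute_uv=False)
    return float((s[:k] ** 2).sum() / (s ** 2).sum())

rng = np.random.default_rng(0)
# Synthetic data: 200 "genes" x 50 "cell lines" driven by 5 latent programs.
latent = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 50))
expr = latent + 0.1 * rng.normal(size=(200, 50))
```

On data with only five latent expression programs, five components capture nearly all of the variance; the study's finding is that real basal expression is far higher-dimensional, so <20 cell lines fall well short.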

  20. High-throughput screening and optimization of photoembossed relief structures.

    PubMed

    Adams, Nico; De Gans, Berend-Jan; Kozodaev, Dimitri; Sanchez, Carlos; Bastiaansen, Cees W M; Broer, Dirk J; Schubert, Ulrich S

    2006-01-01

    A methodology for the rapid design, screening, and optimization of coating systems with surface relief structures, using a combination of statistical experimental design, high-throughput experimentation, data mining, and graphical and mathematical optimization routines was developed. The methodology was applied to photopolymers used in photoembossing applications. A library of 72 films was prepared by dispensing a given amount of sample onto a chemically patterned substrate consisting of hydrophilic areas separated by fluorinated hydrophobic barriers. Film composition and film processing conditions were determined using statistical experimental design. The surface topology of the films was characterized by automated AFM. Subsequently, models explaining the dependence of surface topologies on sample composition and processing parameters were developed and used for screening a virtual 4000-membered in silico library of photopolymer lacquers. Simple graphical optimization or Pareto algorithms were subsequently used to find an ensemble of formulations, which were optimal with respect to a predefined set of properties, such as aspect ratio and shape of the relief structures.
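Pareto optimization here simply means keeping every formulation that no other formulation beats on all objectives at once. A minimal non-dominated filter (the objective names are hypothetical placeholders for properties like aspect ratio and relief shape):

```python
def dominates(q, p):
    """q dominates p if q is at least as good in every objective and strictly
    better in at least one (all objectives maximized)."""
    return (all(a >= b for a, b in zip(q, p))
            and any(a > b for a, b in zip(q, p)))

def pareto_front(points):
    """Indices of the non-dominated points."""
    return [i for i, p in enumerate(points)
            if not any(dominates(q, p) for j, q in enumerate(points) if j != i)]

# (aspect_ratio, shape_fidelity) for four hypothetical formulations.
formulations = [(0.9, 0.2), (0.5, 0.5), (0.2, 0.9), (0.4, 0.4)]
best = pareto_front(formulations)
```

The last formulation is dominated by (0.5, 0.5) and drops out; the remaining three form the trade-off front from which a final lacquer would be chosen.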

  1. High Throughput Multispectral Image Processing with Applications in Food Science

    PubMed Central

    Tsakanikas, Panagiotis; Pavlidis, Dimitris; Nychas, George-John

    2015-01-01

    Machine vision has recently been gaining attention in food science as well as in the food industry for food quality assessment and monitoring. Within the framework of implementing Process Analytical Technology (PAT) in the food industry, image processing can be used not only to estimate and even predict food quality but also to detect adulteration. Toward such applications in food science, we present a novel methodology for automated image analysis of several kinds of food products, e.g. meat, vanilla crème and table olives, so as to increase objectivity and data reproducibility, lower the cost of information extraction, and speed up quality assessment, without human intervention. The outcome of image processing is then propagated to the downstream analysis. The developed multispectral image processing method is based on an unsupervised machine learning approach (Gaussian Mixture Models) and a novel unsupervised scheme of spectral band selection for optimizing the segmentation process. Through evaluation we demonstrate its efficiency and robustness against currently available semi-manual software, showing that the developed method is a high-throughput approach appropriate for massive data extraction from food samples. PMID:26466349
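
    The segmentation idea, fitting a Gaussian mixture to pixel intensities and assigning each pixel to a component, can be sketched with a tiny 1-D EM loop on synthetic data. The paper works on multispectral data and also selects bands; this single-band toy, including all numbers, is an illustrative assumption.

    ```python
    import numpy as np

    def gmm_em_1d(x, iters=50):
        """Two-component 1-D Gaussian mixture fitted by EM; returns means,
        standard deviations, and mixing weights."""
        mu = np.percentile(x, [10, 90])          # crude but robust init
        sigma = np.full(2, x.std())
        w = np.full(2, 0.5)
        for _ in range(iters):
            # E-step: responsibility of each component for each pixel
            d = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
            r = w * d
            r /= r.sum(axis=1, keepdims=True)
            # M-step: re-estimate parameters from the responsibilities
            n = r.sum(axis=0)
            w = n / len(x)
            mu = (r * x[:, None]).sum(axis=0) / n
            sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-9
        return mu, sigma, w

    rng = np.random.default_rng(1)
    # toy "image": dark background near 0.2, bright food region near 0.8
    pixels = np.concatenate([rng.normal(0.2, 0.05, 3000),
                             rng.normal(0.8, 0.05, 1000)])
    mu, sigma, w = gmm_em_1d(pixels)
    # segment: pixels assigned to the brighter component
    segmented = np.abs(pixels[:, None] - mu).argmin(axis=1) == mu.argmax()
    ```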

  2. Enzyme assay design for high-throughput screening.

    PubMed

    Williams, Kevin P; Scott, John E

    2009-01-01

    Enzymes continue to be a major drug-target class for the pharmaceutical industry, with high-throughput screening the approach of choice for identifying initial active chemical compounds. Fluorescence- and absorbance-based readouts typically remain the formats of choice for enzyme screens, and a wealth of experience from both industry and academia has led to a comprehensive set of standardized assay development and validation guidelines for enzyme assays. In this chapter, we generalize approaches to developing, validating, and troubleshooting assays that should be applicable in both industrial and academic settings. Real-life examples of various enzyme classes, including kinases, proteases, transferases, and phosphatases, are used to illustrate assay development approaches and solutions. Practical examples are given for how to deal with low-purity enzyme targets, compound interference, and identification of activators. Assay acceptance criteria and a number of notes on pitfalls to avoid should provide pointers on how to develop a suitable enzymatic assay applicable for HTS.
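
    One standard acceptance criterion alluded to above is the Z′-factor (Zhang et al., 1999), which summarizes the separation between the maximum- and minimum-signal control wells; Z′ ≥ 0.5 is a widely used threshold for an HTS-ready assay. A minimal calculation, with made-up plate-control readings:

    ```python
    import statistics

    def z_prime(pos, neg):
        """Z'-factor (Zhang et al., 1999): separation between maximum-signal
        (pos) and minimum-signal (neg) control wells; >= 0.5 is a common
        acceptance criterion for an HTS-ready assay."""
        mp, mn = statistics.mean(pos), statistics.mean(neg)
        sp, sn = statistics.stdev(pos), statistics.stdev(neg)
        return 1.0 - 3.0 * (sp + sn) / abs(mp - mn)

    good = z_prime([100, 98, 102, 101], [10, 12, 9, 11])    # tight controls
    bad = z_prime([100, 70, 130, 100], [60, 90, 30, 60])    # noisy, overlapping
    ```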

  3. A Call for Nominations of Quantitative High-Throughput ...

    EPA Pesticide Factsheets

    The National Research Council of the United States National Academies of Science has recently released a document outlining a long-range vision and strategy for transforming toxicity testing from largely whole-animal-based testing to one based on in vitro assays. “Toxicity Testing in the 21st Century: A Vision and a Strategy” advises a focus on relevant human toxicity pathway assays. Toxicity pathways are defined in the document as “Cellular response pathways that, when sufficiently perturbed, are expected to result in adverse health effects”. Results of such pathway screens would serve as a filter to drive selection of more specific, targeted testing that will complement and validate the pathway assays. In response to this report, the US EPA has partnered with two NIH organizations, the National Toxicology Program and the NIH Chemical Genomics Center (NCGC), in a program named Tox21. A major goal of this collaboration is to screen chemical libraries consisting of known toxicants, chemicals of environmental and occupational exposure concern, and human pharmaceuticals in cell-based pathway assays. Currently, approximately 3000 compounds (increasing to 9000 by the end of 2009) are being validated and screened in quantitative high-throughput screening (qHTS) format at the NCGC, producing extensive concentration-response data for a diverse set of potential toxicity pathways. The Tox21 collaboration is extremely interested in accessing additional toxicity pathway assays.
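
    qHTS differs from single-concentration screening in that each compound is titrated and fitted to a concentration-response model, typically a Hill curve. The sketch below fits synthetic, noise-free data with a crude grid search; the grid bounds and the fixed top asymptote are illustrative assumptions, not the NCGC's actual fitting procedure.

    ```python
    import numpy as np

    def hill(c, top, ac50, n):
        """Hill concentration-response curve (response at concentration c)."""
        return top * c ** n / (ac50 ** n + c ** n)

    def fit_ac50(conc, resp, top=100.0):
        """Crude grid-search fit of AC50 and Hill slope; a stand-in for the
        nonlinear regression used in real qHTS pipelines."""
        best, best_err = None, float("inf")
        for a in np.logspace(-3, 2, 200):            # candidate AC50 values
            for n in np.linspace(0.5, 4.0, 36):      # candidate Hill slopes
                err = ((hill(conc, top, a, n) - resp) ** 2).sum()
                if err < best_err:
                    best, best_err = (a, n), err
        return best

    conc = np.logspace(-2, 1, 8)            # 8-point dilution series
    resp = hill(conc, 100.0, 0.5, 1.2)      # noise-free synthetic responses
    ac50, slope = fit_ac50(conc, resp)
    ```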

  4. RNA Secondary Structure Prediction Using High-throughput SHAPE

    PubMed Central

    Purzycka, Katarzyna J.; Rausch, Jason W.; Le Grice, Stuart F.J.

    2013-01-01

    Understanding the function of RNA involved in biological processes requires a thorough knowledge of RNA structure. Toward this end, the methodology dubbed "high-throughput selective 2' hydroxyl acylation analyzed by primer extension", or SHAPE, allows prediction of RNA secondary structure with single nucleotide resolution. This approach utilizes chemical probing agents that preferentially acylate single stranded or flexible regions of RNA in aqueous solution. Sites of chemical modification are detected by reverse transcription of the modified RNA, and the products of this reaction are fractionated by automated capillary electrophoresis (CE). Since reverse transcriptase pauses at those RNA nucleotides modified by the SHAPE reagents, the resulting cDNA library indirectly maps those ribonucleotides that are single stranded in the context of the folded RNA. Using ShapeFinder software, the electropherograms produced by automated CE are processed and converted into nucleotide reactivity tables that are themselves converted into pseudo-energy constraints used in the RNAStructure (v5.3) prediction algorithm. The two-dimensional RNA structures obtained by combining SHAPE probing with in silico RNA secondary structure prediction have been found to be far more accurate than structures obtained using either method alone. PMID:23748604
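
    The conversion from SHAPE reactivity to a pseudo-energy constraint is commonly the Deigan-style term ΔG(i) = m·ln(reactivity + 1) + b, added to each base pair involving nucleotide i, with m ≈ 2.6 and b ≈ −0.8 kcal/mol as widely used defaults (the exact parameters in a given RNAStructure run may differ):

    ```python
    import math

    def shape_pseudo_energy(reactivity, m=2.6, b=-0.8):
        """Deigan-style pseudo-free-energy term (kcal/mol) added to each base
        pair involving nucleotide i: dG(i) = m * ln(reactivity_i + 1) + b."""
        return m * math.log(reactivity + 1.0) + b

    flexible = shape_pseudo_energy(1.5)  # reactive -> positive penalty on pairing
    rigid = shape_pseudo_energy(0.05)    # unreactive -> small bonus for pairing
    ```

    Highly reactive (flexible, likely single-stranded) positions thus penalize pairings in the thermodynamic folding, while unreactive positions mildly favor them.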

  5. A simple high-throughput enzymeimmunoassay for norethisterone (norethindrone).

    PubMed

    Turkes, A; Read, G F; Riad-Fahmy, D

    1982-05-01

    A direct enzymeimmunoassay having the sensitivity required for determining norethisterone concentrations in small aliquots of plasma (10 µl) has been developed. The assay featured a solid-phase antiserum raised against a norethisterone-11α-hemisuccinyl/bovine serum albumin conjugate. The antiserum was coupled to cyanogen bromide-activated magnetisable cellulose, and antibody-bound and free fractions were separated by a simple magnetic device. A norethisterone/horseradish peroxidase conjugate was used as the label, with o-phenylenediamine/hydrogen peroxide as the substrate for colour development. The results obtained by this direct EIA, which allowed processing of at least 100 samples per day, were compared with those of a well-validated enzymeimmunoassay featuring solvent extraction and centrifugal separation of antibody-bound and free steroid; the results were in excellent agreement (n = 30; r > 0.99), suggesting the usefulness of the simple high-throughput procedure for processing the large sample numbers generated by field investigations and pharmacokinetic studies.

  6. High-throughput screening of chemicals as functional ...

    EPA Pesticide Factsheets

    Identifying chemicals that provide a specific function within a product, yet have minimal impact on the human body or environment, is the goal of most formulation chemists and engineers practicing green chemistry. We present a methodology to identify potential chemical functional substitutes from large libraries of chemicals using machine learning based models. We collect and analyze publicly available information on the function of chemicals in consumer products or industrial processes to identify a suite of harmonized function categories suitable for modeling. We use structural and physicochemical descriptors for these chemicals to build 41 quantitative structure–use relationship (QSUR) models for harmonized function categories using random forest classification. We apply these models to screen a library of nearly 6400 chemicals with available structure information for potential functional substitutes. Using our Functional Use database (FUse), we could identify uses for 3121 chemicals; 4412 predicted functional uses had a probability of 80% or greater. We demonstrate the potential application of the models to high-throughput (HT) screening for “candidate alternatives” by merging the valid functional substitute classifications with hazard metrics developed from HT screening assays for bioactivity. A descriptor set could be obtained for 6356 Tox21 chemicals that have undergone a battery of HT in vitro bioactivity screening assays. By applying QSURs, we wer
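
    The final screening step, keeping only functional-use predictions above the probability cutoff and then intersecting with a hazard metric, can be sketched as below. All chemical names, probabilities, and hazard scores are hypothetical; the real workflow uses 41 random-forest QSUR models and HT bioactivity data.

    ```python
    def screen_candidates(predictions, threshold=0.8):
        """Keep (chemical, function, probability) predictions meeting the
        QSUR probability cutoff, grouped by chemical."""
        valid = {}
        for chem, func, p in predictions:
            if p >= threshold:
                valid.setdefault(chem, []).append((func, p))
        return valid

    preds = [                                  # hypothetical model outputs
        ("chem-A", "surfactant", 0.92),
        ("chem-A", "fragrance", 0.40),
        ("chem-B", "plasticizer", 0.85),
        ("chem-C", "colorant", 0.61),
    ]
    candidates = screen_candidates(preds)

    # merge with a (hypothetical) bioactivity-based hazard score:
    # keep only low-hazard chemicals as "candidate alternatives"
    hazard = {"chem-A": 0.2, "chem-B": 0.9}
    alternatives = {c: f for c, f in candidates.items()
                    if hazard.get(c, 1.0) < 0.5}
    ```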

  7. High-throughput optical screening of cellular mechanotransduction

    NASA Astrophysics Data System (ADS)

    Compton, Jonathan L.; Luo, Justin C.; Ma, Huan; Botvinick, Elliot; Venugopalan, Vasan

    2014-09-01

    We introduce an optical platform for rapid, high-throughput screening of exogenous molecules that affect cellular mechanotransduction. Our method initiates mechanotransduction in adherent cells using single laser-microbeam generated microcavitation bubbles without requiring flow chambers or microfluidics. These microcavitation bubbles expose adherent cells to a microtsunami, a transient microscale burst of hydrodynamic shear stress, which stimulates cells over areas approaching 1 mm2. We demonstrate microtsunami-initiated mechanosignalling in primary human endothelial cells. This observed signalling is consistent with G-protein-coupled receptor stimulation, resulting in Ca2+ release by the endoplasmic reticulum. Moreover, we demonstrate the dose-dependent modulation of microtsunami-induced Ca2+ signalling by introducing a known inhibitor to this pathway. The imaging of Ca2+ signalling and its modulation by exogenous molecules demonstrates the capacity to initiate and assess cellular mechanosignalling in real time. We utilize this capability to screen the effects of a set of small molecules on cellular mechanotransduction in 96-well plates using standard imaging cytometry.

  8. Functional approach to high-throughput plant growth analysis

    PubMed Central

    2013-01-01

    Method: Taking advantage of the current rapid development in imaging systems and computer vision algorithms, we present HPGA, a high-throughput phenotyping platform for plant growth modeling and functional analysis, which produces a better understanding of energy distribution with regard to the balance between growth and defense. HPGA has two components, PAE (Plant Area Estimation) and GMA (Growth Modeling and Analysis). In PAE, taking the complex leaf-overlap problem into consideration, the area of every plant is measured from top-view images in four steps. Given the abundant measurements obtained with PAE, in the second module, GMA, a nonlinear growth model is applied to generate growth curves, followed by functional data analysis. Results: Experimental results on the model plant Arabidopsis thaliana show that, compared to an existing approach, HPGA reduces the error rate of measuring plant area by half. The application of HPGA to cfq mutant plants under fluctuating light reveals a correlation between low photosynthetic rates and small plant area (compared to wild type), raising the hypothesis that knocking out cfq changes the sensitivity of energy distribution under fluctuating light conditions to repress leaf growth. Availability: HPGA is available at http://www.msu.edu/~jinchen/HPGA. PMID:24565437
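
    The GMA idea, fitting a nonlinear growth model to the area time series and then analyzing the fitted curve functionally, can be illustrated with a logistic model, a common choice for sigmoid growth. The specific model and all parameter values here are illustrative assumptions, not necessarily HPGA's.

    ```python
    import numpy as np

    def logistic_area(t, a_max, k, t_mid):
        """Logistic growth model for projected plant area over time."""
        return a_max / (1.0 + np.exp(-k * (t - t_mid)))

    t = np.linspace(0, 20, 201)                             # days after sowing
    area = logistic_area(t, a_max=12.0, k=0.6, t_mid=10.0)  # cm^2, synthetic
    rate = np.gradient(area, t)            # absolute growth-rate curve
    peak_day = t[rate.argmax()]            # day of fastest growth
    ```

    Functional analysis then compares quantities derived from such curves (peak growth day, asymptotic area, growth rate by light-dark period) across genotypes.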

  9. High-throughput screening of chemical effects on ...

    EPA Pesticide Factsheets

    Disruption of steroidogenesis by environmental chemicals can result in altered hormone levels causing adverse reproductive and developmental effects. A high-throughput assay using H295R human adrenocortical carcinoma cells was used to evaluate the effect of 2,060 chemical samples on steroidogenesis via HPLC-MS/MS quantification of 10 steroid hormones, including progestagens, glucocorticoids, androgens, and estrogens. The study employed a three stage screening strategy. The first stage established the maximum tolerated concentration (MTC; >70% viability) per sample. The second stage quantified changes in hormone levels at the MTC while the third stage performed concentration-response (CR) on a subset of samples. At all stages, cells were pre-stimulated with 10 µM forskolin for 48 h to induce steroidogenesis followed by chemical treatment for 48 h. Of the 2,060 chemical samples evaluated, 524 samples were selected for six-point CR screening, based in part on significantly altering at least 4 hormones at the MTC. CR screening identified 232 chemical samples with concentration-dependent effects on 17β-estradiol and/or testosterone, with 411 chemical samples showing an effect on at least one hormone across the steroidogenesis pathway. Clustering of the concentration-dependent chemical-mediated steroid hormone effects grouped chemical samples into five distinct profiles generally representing putative mechanisms of action, including CYP17A1 and HSD3B inhibition. A d

  10. High-Throughput Preparation of New Photoactive Nanocomposites.

    PubMed

    Conterosito, Eleonora; Benesperi, Iacopo; Toson, Valentina; Saccone, Davide; Barbero, Nadia; Palin, Luca; Barolo, Claudia; Gianotti, Valentina; Milanesio, Marco

    2016-06-08

    New low-cost photoactive hybrid materials based on organic luminescent molecules inserted into hydrotalcite (layered double hydroxides; LDH) were produced, exploiting the high-throughput liquid-assisted grinding (LAG) method. These materials are conceived for applications in dye-sensitized solar cells (DSSCs) as co-absorbers and in silicon photovoltaic (PV) panels to improve their efficiency, as they are able to emit where PV modules show maximum efficiency. A molecule showing a large Stokes shift was designed, synthesized, and intercalated into LDH. Two dyes already used in DSSCs were also intercalated to produce two new nanocomposites. LDH intercalation improves the stability of the organic dyes and allows their direct use in polymer melt blending. The prepared nanocomposites absorb sunlight from the UV to the visible and emit from the blue to the near-IR, and can thus be exploited for light-energy management. Finally, one nanocomposite was dispersed by melt blending into a poly(methyl methacrylate)-block-poly(n-butyl acrylate) copolymer to obtain a photoactive film.

  11. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step and can adapt to a wide range of protein targets, from bacterial to human. The construction, expansion, and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  12. High Throughput Interrogation of Behavioral Transitions in C. elegans

    NASA Astrophysics Data System (ADS)

    Liu, Mochi; Shaevitz, Joshua; Leifer, Andrew

    We present a high-throughput method to probe transformations from neural activity to behavior in Caenorhabditis elegans, to better understand how organisms change behavioral states. We optogenetically deliver white-noise stimuli to targeted sensory neurons or interneurons while simultaneously recording the movement of a population of worms. Using all the postural movement data collected, we computationally classify stereotyped behaviors in C. elegans by clustering on the spectral properties of the instantaneous posture (Berman et al., 2014). Transitions between these behavioral clusters indicate discrete behavioral changes. To study the neural correlates dictating these transitions, we perform model-driven experiments and employ Linear-Nonlinear-Poisson cascades that take the white-noise stimulus as input. The parameters of these models are fitted by reverse correlation from our measurements. The parameterized models of behavioral transitions predict the worm's response to novel stimuli and reveal the internal computations the animal makes before carrying out behavioral decisions. Preliminary results are shown that describe the neural-behavioral transformation between activity in mechanosensory neurons and reversal behavior.
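
    Reverse correlation with a Gaussian white-noise stimulus recovers the linear filter of an LNP cascade as a spike-triggered average. A self-contained toy version follows; the filter shape, nonlinearity, and all constants are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    T, L = 20000, 25
    stim = rng.normal(size=T)                 # white-noise stimulus

    # hypothetical biphasic linear filter over the last L time steps
    true_k = np.exp(-np.arange(L) / 5.0) * np.sin(np.arange(L) / 3.0)

    # LNP forward model: linear filter -> exponential nonlinearity -> Poisson
    drive = np.convolve(stim, true_k)[:T]
    rate = np.clip(np.exp(0.5 * drive - 1.0), 0.0, 5.0)
    spikes = rng.poisson(rate)

    # reverse correlation: the spike-triggered average of the stimulus
    # recovers the filter shape (up to scale) for this class of models
    sta = np.zeros(L)
    for t in range(L, T):
        sta += spikes[t] * stim[t - L + 1:t + 1][::-1]
    sta /= spikes[L:].sum()
    ```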

  13. Use of High Throughput Screening Data in IARC Monograph ...

    EPA Pesticide Factsheets

    Purpose: Evaluation of carcinogenic mechanisms serves a critical role in IARC monograph evaluations, and can lead to “upgrade” or “downgrade” of the carcinogenicity conclusions based on human and animal evidence alone. Three recent IARC monograph Working Groups (110, 112, and 113) pioneered analysis of high throughput in vitro screening data from the U.S. Environmental Protection Agency’s ToxCast program in evaluations of carcinogenic mechanisms. Methods: For monograph 110, ToxCast assay data across multiple nuclear receptors were used to test the hypothesis that PFOA acts exclusively through the PPAR family of receptors, with activity profiles compared to several prototypical nuclear receptor-activating compounds. For monographs 112 and 113, ToxCast assays were systematically evaluated and used as an additional data stream in the overall evaluation of the mechanistic evidence. Specifically, ToxCast assays were mapped to 10 “key characteristics of carcinogens” recently identified by an IARC expert group, and chemicals’ bioactivity profiles were evaluated both in absolute terms (number of relevant assays positive for bioactivity) and relative terms (ranking with respect to other compounds evaluated by IARC, using the ToxPi methodology). Results: PFOA activates multiple nuclear receptors in addition to the PPAR family in the ToxCast assays. ToxCast assays offered substantial coverage for 5 of the 10 “key characteristics,” with the greates

  14. The Utilization of Formalin Fixed-Paraffin-Embedded Specimens in High Throughput Genomic Studies

    PubMed Central

    Zhang, Pan

    2017-01-01

    High throughput genomic assays empower us to study the entire human genome in a short time at reasonable cost. Formalin-fixed, paraffin-embedded (FFPE) tissue processing remains the most economical approach for longitudinal tissue specimen storage. Therefore, the ability to apply high throughput genomic applications to FFPE specimens can expand clinical assays and discovery. Many studies have measured the accuracy and repeatability of data generated from FFPE specimens using high throughput genomic assays. Together, these studies demonstrate feasibility and provide crucial guidance for future studies using FFPE specimens. Here, we summarize the findings of these studies and discuss the limitations of high throughput data generated from FFPE specimens across several platforms, including microarray, high throughput sequencing, and NanoString. PMID:28246590

  15. High-throughput phenotyping of root growth dynamics.

    PubMed

    Yazdanbakhsh, Nima; Fisahn, Joachim

    2012-01-01

    Plant organ phenotyping by noninvasive video imaging techniques provides a powerful tool to assess physiological traits, circadian and diurnal rhythms, and biomass production. In particular, growth of individual plant organs is known to exhibit high plasticity and occurs as a result of the interaction between various endogenous and environmental processes. Thus, any investigation aiming to unravel mechanisms that determine plant or organ growth has to accurately control and document the environmental growth conditions. Here we describe challenges in establishing a recently developed plant root monitoring platform (PlaRoM), especially suited for noninvasive high-throughput plant growth analysis, with particular emphasis on detailed documentation of capture time as well as light and temperature conditions. Furthermore, we discuss the experimental procedure for measuring root elongation kinetics and key points that must be considered in such measurements. PlaRoM consists of a robotized imaging platform enclosed in a custom-designed phytochamber and a root extension profiling software application. The platform has been developed for multi-parallel recordings of root growth phenotypes of up to 50 individual seedlings over several days, with high spatial and temporal resolution. Two Petri dishes are mounted on a vertical sample stage in a custom-designed phytochamber that provides exact temperature control. A computer-controlled positioning unit moves these Petri dishes in small increments and enables continuous screening of the surface under a binocular microscope. Detection of the root tip is achieved by applying thresholds to image pixel data and verifying the neighbourhood of each dark pixel. The growth parameters are visualized as position-over-time or growth-rate-over-time graphs and averaged over consecutive days, light-dark periods, and 24 h day periods. This setup enables the investigation of root extension profiles of different genotypes in various growth
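
    The tip-detection step, thresholding dark pixels, requiring at least one dark neighbour to reject isolated noise, and taking the lowest remaining pixel as the root tip, can be sketched on a toy image. The threshold value and the image itself are invented, and "lowest pixel" assumes a root growing downward in the frame.

    ```python
    import numpy as np

    def find_root_tip(img, thresh=0.5):
        """Threshold dark (root) pixels, keep those with at least one dark
        8-neighbour (noise rejection), return the lowest such pixel."""
        dark = img < thresh
        H, W = img.shape
        tip = None
        for r in range(H):
            for c in range(W):
                if not dark[r, c]:
                    continue
                neigh = dark[max(0, r - 1):r + 2, max(0, c - 1):c + 2]
                if neigh.sum() >= 2:      # the pixel itself + >= 1 neighbour
                    tip = (r, c)          # rows scan top->bottom: keep lowest
        return tip

    img = np.ones((20, 10))               # bright background
    img[2:15, 4] = 0.1                    # vertical root, tip at row 14
    img[17, 8] = 0.1                      # isolated dark speck (noise)
    tip = find_root_tip(img)
    ```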

  16. High-throughput process development: II. Membrane chromatography.

    PubMed

    Rathore, Anurag S; Muthukumar, Sampath

    2014-01-01

    Membrane chromatography is gradually emerging as an alternative to conventional column chromatography. It alleviates some of the major disadvantages associated with the latter, including high pressure drop across the column bed and dependence on intra-particle diffusion for the transport of solute molecules to their binding sites within the pores of the separation media. In the last decade, it has emerged as a method of choice for final polishing of biopharmaceuticals, in particular monoclonal antibody products. The relevance of such a platform is high in view of the time and resource constraints that the biopharma industry faces today. This protocol describes the steps involved in performing high-throughput process development (HTPD) of a membrane chromatography step. It describes operation of a commercially available device (AcroPrep™ Advance filter plate with Mustang S membrane from Pall Corporation), available in 96-well format with 7 μL of membrane in each well. We discuss the challenges that one faces when performing such experiments, as well as possible solutions to alleviate them. Besides describing the operation of the device, the protocol also presents an approach for statistical analysis of the data gathered from such a platform. A case study involving use of the protocol for examining ion-exchange chromatography of granulocyte colony-stimulating factor (GCSF), a therapeutic product, is briefly discussed, demonstrating that the protocol generates data representative of the data obtained at the traditional lab scale. The agreement is indeed very significant (regression coefficient 0.99). We think that this protocol will be of significant value to those involved in high-throughput process development of membrane chromatography.

  17. Solion ion source for high-efficiency, high-throughput solar cell manufacturing

    SciTech Connect

    Koo, John; Binns, Brant; Miller, Timothy; Krause, Stephen; Skinner, Wesley; Mullin, James

    2014-02-15

    In this paper, we introduce the Solion ion source for high-throughput solar cell doping. As the source power is increased to enable higher throughput, negative effects degrade the lifetime of the plasma chamber and the extraction electrodes. In order to improve efficiency, we have explored a wide range of electron energies and determined the conditions that best suit production. To extend the lifetime of the source, we have developed an in situ cleaning method using only existing hardware. With these combinations, source lifetimes of >200 h for phosphorus and >100 h for boron ion beams have been achieved while maintaining 1100 cell-per-hour production.

  18. High-throughput neuroimaging-genetics computational infrastructure

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D.; Franco, Joseph; Toga, Arthur W.

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics, and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation of findings, and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at the University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring, validating, and disseminating complex protocols that utilize

  19. Applications of Biophysics in High-Throughput Screening Hit Validation.

    PubMed

    Genick, Christine Clougherty; Barlier, Danielle; Monna, Dominique; Brunner, Reto; Bé, Céline; Scheufler, Clemens; Ottl, Johannes

    2014-06-01

    For approximately a decade, biophysical methods have been used to validate positive hits selected from high-throughput screening (HTS) campaigns, with the goal of verifying binding interactions using label-free assays. By applying label-free readouts, screen artifacts created by compound interference and fluorescence are discovered, enabling further characterization of the hits for their target specificity and selectivity. The use of several biophysical methods to extract this type of high-content information is required to prevent the promotion of false positives to the next level of hit validation and to select the best candidates for further chemical optimization. The typical technologies applied in this arena include dynamic light scattering, turbidimetry, resonance waveguide, surface plasmon resonance, differential scanning fluorimetry, mass spectrometry, and others. Each technology can provide different types of information to enable the characterization of the binding interaction. Thus, these technologies can be incorporated in a hit-validation strategy not only according to the profile of chemical matter that is desired by the medicinal chemists, but also in a manner that is in agreement with the target protein's amenability to the screening format. Here, we present the results of screening strategies using biophysics with the objective to evaluate the approaches, discuss the advantages and challenges, and summarize the benefits in reference to lead discovery. In summary, the biophysics screens presented here demonstrated various hit rates from a list of ~2000 preselected, IC50-validated hits from HTS (an IC50 is the inhibitor concentration at which 50% inhibition of activity is observed). Several lessons learned from these biophysical screens are discussed in this article.

  20. Mining Chemical Activity Status from High-Throughput Screening Assays

    PubMed Central

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B.

    2015-01-01

    High-throughput screening (HTS) experiments provide a valuable resource that reports biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTP activity assays. For a class of HTP assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than the other methods, we carried out a comprehensive comparison with several other methods, evaluating them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed the best and that it can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare. PMID:26658480
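
    The class-imbalance remedy DRAMOTE builds on is minority oversampling in the style of SMOTE: synthesize new minority (active) samples by interpolating between a minority point and one of its nearest minority neighbours. A generic sketch, not DRAMOTE's actual modification, with arbitrary array sizes:

    ```python
    import numpy as np

    def smote_like(X_min, n_new, k=3, seed=0):
        """Generate synthetic minority-class samples by interpolating each
        chosen point toward one of its k nearest minority neighbours."""
        rng = np.random.default_rng(seed)
        synthetic = []
        for _ in range(n_new):
            i = rng.integers(len(X_min))
            d = np.linalg.norm(X_min - X_min[i], axis=1)
            neighbours = np.argsort(d)[1:k + 1]    # skip the point itself
            j = rng.choice(neighbours)
            lam = rng.random()                     # interpolation factor
            synthetic.append(X_min[i] + lam * (X_min[j] - X_min[i]))
        return np.array(synthetic)

    rng = np.random.default_rng(3)
    actives = rng.normal(loc=2.0, size=(15, 4))    # minority: active compounds
    new_actives = smote_like(actives, n_new=30)
    ```

    The oversampled set (originals plus synthetics) is then fed to the classifier, so the active class no longer drowns in the inactive majority.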

  1. High-throughput neuroimaging-genetics computational infrastructure.

    PubMed

    Dinov, Ivo D; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Hobel, Sam; Vespa, Paul; Woo Moon, Seok; Van Horn, John D; Franco, Joseph; Toga, Arthur W

    2014-01-01

    Many contemporary neuroscientific investigations face significant challenges in terms of data management, computational processing, data mining, and results interpretation. These four pillars define the core infrastructure necessary to plan, organize, orchestrate, validate, and disseminate novel scientific methods, computational resources, and translational healthcare findings. Data management includes protocols for data acquisition, archival, query, transfer, retrieval, and aggregation. Computational processing involves the necessary software, hardware, and networking infrastructure required to handle large amounts of heterogeneous neuroimaging, genetics, clinical, and phenotypic data and meta-data. Data mining refers to the process of automatically extracting data features, characteristics and associations, which are not readily visible by human exploration of the raw dataset. Result interpretation includes scientific visualization, community validation of findings and reproducible findings. In this manuscript we describe the novel high-throughput neuroimaging-genetics computational infrastructure available at the Institute for Neuroimaging and Informatics (INI) and the Laboratory of Neuro Imaging (LONI) at University of Southern California (USC). INI and LONI include ultra-high-field and standard-field MRI brain scanners along with an imaging-genetics database for storing the complete provenance of the raw and derived data and meta-data. In addition, the institute provides a large number of software tools for image and shape analysis, mathematical modeling, genomic sequence processing, and scientific visualization. A unique feature of this architecture is the Pipeline environment, which integrates the data management, processing, transfer, and visualization. Through its client-server architecture, the Pipeline environment provides a graphical user interface for designing, executing, monitoring validating, and disseminating of complex protocols that utilize

  2. Mining Chemical Activity Status from High-Throughput Screening Assays.

    PubMed

    Soufan, Othman; Ba-alawi, Wail; Afeef, Moataz; Essack, Magbubah; Rodionov, Valentin; Kalnis, Panos; Bajic, Vladimir B

    2015-01-01

    High-throughput screening (HTS) experiments provide a valuable resource that reports the biological activity of numerous chemical compounds relative to their molecular targets. Building computational models that accurately predict such activity status (active vs. inactive) in specific assays is a challenging task given the large volume of data and the frequently small proportion of active compounds relative to inactive ones. We developed a method, DRAMOTE, to predict the activity status of chemical compounds in HTS activity assays. For a class of HTS assays, our method achieves considerably better results than the current state-of-the-art solutions. We achieved this by modifying a minority oversampling technique. To demonstrate that DRAMOTE performs better than other methods, we carried out a comprehensive comparison analysis with several other methods and evaluated them on data from 11 PubChem assays through 1,350 experiments that involved approximately 500,000 interactions between chemicals and their target proteins. As an example of potential use, we applied DRAMOTE to develop robust models for predicting FDA-approved drugs that have a high probability of interacting with the thyroid stimulating hormone receptor (TSHR) in humans. Our findings are further partially and indirectly supported by 3D docking results and literature information. The results based on approximately 500,000 interactions suggest that DRAMOTE performed best and can be used for developing robust virtual screening models. The datasets and implementation of all solutions are available as a MATLAB toolbox online at www.cbrc.kaust.edu.sa/dramote and can be found on Figshare.
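The abstract above does not specify DRAMOTE's exact modification of minority oversampling, but the underlying idea of synthesizing minority-class (active-compound) examples can be illustrated with a minimal SMOTE-style sketch. Everything here is hypothetical: the function name, the toy two-descriptor compounds, and the interpolation scheme are illustrative stand-ins, not the published algorithm.

```python
import random

def smote_like_oversample(minority, k=3, n_new=10, seed=0):
    """Generate synthetic minority-class samples by interpolating between
    each sampled point and one of its k nearest neighbours (a SMOTE-style
    sketch; DRAMOTE's actual modification differs)."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # k nearest neighbours of a by squared Euclidean distance
        neighbours = sorted(
            (p for p in minority if p is not a),
            key=lambda p: sum((x - y) ** 2 for x, y in zip(a, p)),
        )[:k]
        b = rng.choice(neighbours)
        t = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(x + t * (y - x) for x, y in zip(a, b)))
    return synthetic

# Toy "active" compounds described by two invented descriptors:
actives = [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85), (0.05, 0.95)]
new_points = smote_like_oversample(actives, k=2, n_new=5)
```

Each synthetic point lies on a segment between two real actives, so the oversampled set stays inside the region the minority class already occupies.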

  3. An improved high throughput sequencing method for studying oomycete communities.

    PubMed

    Sapkota, Rumakanta; Nicolaisen, Mogens

    2015-03-01

    Culture-independent studies using next-generation sequencing have revolutionized microbial ecology; however, oomycete ecology in soils lags severely behind. The aim of this study was to improve and validate standard techniques for using high throughput sequencing as a tool for studying oomycete communities. The well-known primer sets ITS4, ITS6 and ITS7 were used in a semi-nested PCR approach to target the internal transcribed spacer (ITS) 1 of ribosomal DNA in a next-generation sequencing protocol. These primers have been used in similar studies before, but with limited success. We were able to increase the proportion of retrieved oomycete sequences dramatically, mainly by increasing the annealing temperature during PCR. The optimized protocol was validated using three mock communities, and the method was further evaluated using total DNA from 26 soil samples collected from different agricultural fields in Denmark and 11 samples from carrot tissue with symptoms of Pythium infection. Sequence data from the Pythium and Phytophthora mock communities showed that our strategy successfully detected all included species. Taxonomic assignments of OTUs from the 26 soil samples showed that 95% of the sequences could be assigned to oomycetes, including Pythium, Aphanomyces, Peronospora, Saprolegnia and Phytophthora. A high proportion of oomycete reads was consistently present in all 26 soil samples, showing the versatility of the strategy. A large diversity of Pythium species, including pathogenic and saprophytic species, dominated in cultivated soil. Finally, we analyzed amplicons from carrots with symptoms of cavity spot. This resulted in 94% of the reads belonging to oomycetes, with a dominance of Pythium species known to be involved in causing cavity spot, thus demonstrating the usefulness of the method not only in soil DNA but also in a plant DNA background. In conclusion, we demonstrate a successful approach for pyrosequencing of oomycete

  4. High-throughput process development for recombinant protein purification.

    PubMed

    Rege, Kaushal; Pepsin, Mike; Falcon, Brandy; Steele, Landon; Heng, Meng

    2006-03-05

    screening of a wide variety of actual bioprocess media and conditions, and represents a novel paradigm for the high-throughput process development of recombinant proteins.

  5. High-throughput screening method for lipases/esterases.

    PubMed

    Mateos-Díaz, Eduardo; Rodríguez, Jorge Alberto; de Los Ángeles Camacho-Ruiz, María; Mateos-Díaz, Juan Carlos

    2012-01-01

    High-throughput screening (HTS) methods for lipases and esterases are generally performed using synthetic chromogenic substrates (e.g., p-nitrophenyl, resorufin, and umbelliferyl esters), which may be misleading since these are not their natural substrates (e.g., partially soluble or insoluble triglycerides). In previous works, we have shown that soluble nonchromogenic substrates and p-nitrophenol (as a pH indicator) can be used to quantify the hydrolysis and estimate the substrate selectivity of lipases and esterases from several sources. However, in order to implement a spectrophotometric HTS method using partially soluble or insoluble triglycerides, it is necessary to find particular conditions that allow quantitative detection of the enzymatic activity. In this work, we used Triton X-100, CHAPS, and N-lauroyl sarcosine as emulsifiers, β-cyclodextrin as a fatty acid captor, and two substrate concentrations, 1 mM of tributyrin (TC4) and 5 mM of trioctanoin (TC8), to improve the test conditions. To demonstrate the utility of this method, we screened 12 enzymes (commercial preparations and culture broth extracts) for the hydrolysis of TC4 and TC8, which are both classical substrates for lipases and esterases (for esterases, only TC4 may be hydrolyzed). Subsequent pH-stat experiments were performed to confirm the substrate preference of the hydrolases tested. We have shown that this method is very useful for screening a high number of lipases (hydrolysis of TC4 and TC8) or esterases (hydrolysis of TC4 only) from wild isolates or variants generated by directed evolution, using nonchromogenic triglycerides directly in the test.

  6. Systematic error detection in experimental high-throughput screening

    PubMed Central

    2011-01-01

    Background High-throughput screening (HTS) is a key part of the drug discovery process during which thousands of chemical compounds are screened and their activity levels measured in order to identify potential drug candidates (i.e., hits). Many technical, procedural or environmental factors can cause systematic measurement error or inequalities in the conditions in which the measurements are taken. Such systematic error has the potential to critically affect the hit selection process. Several error correction methods and software have been developed to address this issue in the context of experimental HTS [1-7]. Despite their power to reduce the impact of systematic error when applied to error-perturbed datasets, those methods also have one disadvantage: they introduce a bias when applied to data not containing any systematic error [6]. Hence, we need first to assess the presence of systematic error in a given HTS assay and then apply a systematic error correction method if and only if the presence of systematic error has been confirmed by statistical tests. Results We tested three statistical procedures to assess the presence of systematic error in experimental HTS data: the χ2 goodness-of-fit test, Student's t-test and the Kolmogorov-Smirnov test [8] preceded by the Discrete Fourier Transform (DFT) method [9]. We applied these procedures first to raw HTS measurements and second to estimated hit distribution surfaces. The three competing tests were applied to analyse simulated datasets containing different types of systematic error, and to a real HTS dataset. Their accuracy was compared under various error conditions. Conclusions A successful assessment of the presence of systematic error in experimental HTS assays is possible when the appropriate statistical methodology is used. Namely, the t-test should be carried out by researchers to determine whether systematic error is present in their HTS data prior to applying any error correction method

  7. Systematic error detection in experimental high-throughput screening.

    PubMed

    Dragiev, Plamen; Nadon, Robert; Makarenkov, Vladimir

    2011-01-19

    High-throughput screening (HTS) is a key part of the drug discovery process during which thousands of chemical compounds are screened and their activity levels measured in order to identify potential drug candidates (i.e., hits). Many technical, procedural or environmental factors can cause systematic measurement error or inequalities in the conditions in which the measurements are taken. Such systematic error has the potential to critically affect the hit selection process. Several error correction methods and software have been developed to address this issue in the context of experimental HTS [1-7]. Despite their power to reduce the impact of systematic error when applied to error-perturbed datasets, those methods also have one disadvantage: they introduce a bias when applied to data not containing any systematic error [6]. Hence, we need first to assess the presence of systematic error in a given HTS assay and then apply a systematic error correction method if and only if the presence of systematic error has been confirmed by statistical tests. We tested three statistical procedures to assess the presence of systematic error in experimental HTS data: the χ2 goodness-of-fit test, Student's t-test and the Kolmogorov-Smirnov test [8] preceded by the Discrete Fourier Transform (DFT) method [9]. We applied these procedures first to raw HTS measurements and second to estimated hit distribution surfaces. The three competing tests were applied to analyse simulated datasets containing different types of systematic error, and to a real HTS dataset. Their accuracy was compared under various error conditions. A successful assessment of the presence of systematic error in experimental HTS assays is possible when the appropriate statistical methodology is used. Namely, the t-test should be carried out by researchers to determine whether systematic error is present in their HTS data prior to applying any error correction method.
This important step can significantly
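As a rough illustration of the pre-check recommended above, the sketch below applies Welch's t statistic to each plate row against the rest of the plate and flags rows whose means deviate strongly, a crude stand-in for the paper's full methodology. The toy plate values and the |t| > 3 cut-off are invented for illustration; real HTS quality control would use proper p-values and multiple-testing correction.

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)
    return (mean(sample_a) - mean(sample_b)) / ((va / na + vb / nb) ** 0.5)

def flag_row_effects(plate, threshold=3.0):
    """Flag rows whose mean differs markedly from the rest of the plate,
    a rough screen for row-wise systematic error (|t| > threshold)."""
    flagged = []
    for i, row in enumerate(plate):
        rest = [v for j, r in enumerate(plate) if j != i for v in r]
        if abs(welch_t(row, rest)) > threshold:
            flagged.append(i)
    return flagged

# 4-row toy plate; row 2 carries a constant positive offset:
plate = [
    [1.0, 1.1, 0.9, 1.0, 1.05],
    [0.95, 1.0, 1.1, 0.9, 1.0],
    [2.0, 2.1, 1.9, 2.05, 2.0],   # simulated systematic shift
    [1.05, 0.9, 1.0, 1.1, 0.95],
]
print(flag_row_effects(plate))  # → [2]
```

Only the shifted row exceeds the cut-off; unshifted rows stay below it even though the shifted row inflates the "rest of plate" mean they are compared against.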

  8. Using In Vitro High-Throughput Screening Data for Predicting ...

    EPA Pesticide Factsheets

    Today there are more than 80,000 chemicals in commerce and the environment. The potential human health risks are unknown for the vast majority of these chemicals as they lack human health risk assessments, toxicity reference values and risk screening values. We aim to use computational toxicology and quantitative high throughput screening (qHTS) technologies to fill these data gaps, and begin to prioritize these chemicals for additional assessment. By coupling qHTS data with adverse outcome pathways (AOPs) we can use ontologies to make predictions about potential hazards and to identify those assays which are sufficient to infer these same hazards. Once those assays are identified, we can use bootstrap natural spline-based metaregression to integrate the evidence across multiple replicates or assays (if a combination of assays are together necessary to be sufficient). In this pilot, we demonstrate how we were able to identify that benzo[k]fluoranthene (B[k]F) may induce DNA damage and steatosis using qHTS data and two separate AOPs. We also demonstrate how bootstrap natural spline-based metaregression can be used to integrate the data across multiple assay replicates to generate a concentration-response curve. We used this analysis to calculate an internal point of departure of 0.751 µM and risk-specific concentrations of 0.378 µM for both 1:1,000 and 1:10,000 additive risk for B[k]F induced DNA damage based on the p53 assay. Based on the available evidence, we
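The bootstrap step described above can be approximated in miniature: resample the replicate responses at each concentration, refit a curve, and collect the concentration at which the refit crosses a response threshold (a stand-in point of departure). The sketch below substitutes piecewise-linear interpolation for the paper's natural-spline metaregression, and every assay number in it is invented.

```python
import random

def interp_crossing(concs, resps, threshold):
    """Concentration at which the piecewise-linear curve through the
    (concentration, response) points first crosses `threshold`."""
    points = list(zip(concs, resps))
    for (c0, r0), (c1, r1) in zip(points, points[1:]):
        if (r0 - threshold) * (r1 - threshold) <= 0 and r0 != r1:
            return c0 + (threshold - r0) * (c1 - c0) / (r1 - r0)
    return None  # curve never reaches the threshold

def bootstrap_pod(concs, replicates, threshold, n_boot=500, seed=1):
    """Resample replicate responses at each concentration, refit, and
    collect the crossing concentration. Returns the median point of
    departure and a 95% percentile interval."""
    rng = random.Random(seed)
    pods = []
    for _ in range(n_boot):
        resps = [sum(rng.choices(reps, k=len(reps))) / len(reps)
                 for reps in replicates]
        pod = interp_crossing(concs, resps, threshold)
        if pod is not None:
            pods.append(pod)
    pods.sort()
    return (pods[len(pods) // 2],
            (pods[int(0.025 * len(pods))], pods[int(0.975 * len(pods))]))

# Invented assay: % response at four concentrations (µM), three replicates each.
concs = [0.1, 0.3, 1.0, 3.0]
replicates = [[2, 3, 2], [10, 12, 11], [45, 50, 48], [80, 85, 82]]
pod, (lo, hi) = bootstrap_pod(concs, replicates, threshold=25)
```

The percentile interval conveys how replicate-to-replicate variability propagates into uncertainty on the point of departure, which is the purpose of the bootstrap in the original analysis.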

  9. Scanning fluorescence detector for high-throughput DNA genotyping

    NASA Astrophysics Data System (ADS)

    Rusch, Terry L.; Petsinger, Jeremy; Christensen, Carl; Vaske, David A.; Brumley, Robert L., Jr.; Luckey, John A.; Weber, James L.

    1996-04-01

    A new scanning fluorescence detector (SCAFUD) was developed for high-throughput genotyping of short tandem repeat polymorphisms (STRPs). Fluorescent dyes are incorporated into relatively short DNA fragments via polymerase chain reaction (PCR) and are separated by electrophoresis in short, wide polyacrylamide gels (144 lanes with well to read distances of 14 cm). Excitation light from an argon laser with primary lines at 488 and 514 nm is introduced into the gel through a fiber optic cable, dichroic mirror, and 40X microscope objective. Emitted fluorescent light is collected confocally through a second fiber. The confocal head is translated across the bottom of the gel at 0.5 Hz. The detection unit utilizes dichroic mirrors and band pass filters to direct light with 10 - 20 nm bandwidths to four photomultiplier tubes (PMTs). PMT signals are independently amplified with variable gain and then sampled at a rate of 2500 points per scan using a computer-based A/D board. LabView software (National Instruments) is used for instrument operation. Currently, three fluorescent dyes (Fam, Hex and Rox) are simultaneously detected with peak detection wavelengths of 543, 567, and 613 nm, respectively. The detection limit for fluorescein-labeled primers is about 100 attomoles. Planned SCAFUD upgrades include rearrangement of laser head geometry, use of additional excitation lasers for simultaneous detection of more dyes, and the use of detector arrays instead of individual PMTs. Extensive software has been written for automatic analysis of SCAFUD images. The software enables background subtraction, band identification, multiple-dye signal resolution, lane finding, band sizing and allele calling. Whole genome screens are currently underway to search for loci influencing such complex diseases as diabetes, asthma, and hypertension. Seven production SCAFUDs are currently in operation. Genotyping output for the coming year is projected to be about one million total genotypes (DNA

  10. Towards Chip Scale Liquid Chromatography and High Throughput Immunosensing

    SciTech Connect

    Ni, Jing

    2000-09-21

    This work describes several research projects aimed towards developing new instruments and novel methods for high throughput chemical and biological analysis. Approaches are taken in two directions. The first direction takes advantage of well-established semiconductor fabrication techniques and applies them to miniaturize instruments that are workhorses in analytical laboratories. Specifically, the first part of this work focused on the development of micropumps and microvalves for controlled fluid delivery. The mechanism of these micropumps and microvalves relies on the electrochemically-induced surface tension change at a mercury/electrolyte interface. A miniaturized flow injection analysis device was integrated and flow injection analyses were demonstrated. In the second part of this work, microfluidic chips were also designed, fabricated, and tested. Separations of two fluorescent dyes were demonstrated in microfabricated channels, based on an open-tubular liquid chromatography (OT LC) or an electrochemically-modulated liquid chromatography (EMLC) format. A reduction in instrument size can potentially increase analysis speed, and allow exceedingly small amounts of sample to be analyzed under diverse separation conditions. The second direction explores the surface enhanced Raman spectroscopy (SERS) as a signal transduction method for immunoassay analysis. It takes advantage of the improved detection sensitivity as a result of surface enhancement on colloidal gold, the narrow width of Raman band, and the stability of Raman scattering signals to distinguish several different species simultaneously without exploiting spatially-separated addresses on a biochip. By labeling gold nanoparticles with different Raman reporters in conjunction with different detection antibodies, a simultaneous detection of a dual-analyte immunoassay was demonstrated. 
The use of this scheme for quantitative analysis was also studied, and preliminary dose-response curves from an immunoassay of a

  12. Automated Segmentation and Classification of High Throughput Yeast Assay Spots

    PubMed Central

    Jafari-Khouzani, Kourosh; Soltanian-Zadeh, Hamid; Fotouhi, Farshad; Parrish, Jodi R.; Finley, Russell L.

    2009-01-01

    Several technologies for characterizing genes and proteins from humans and other organisms use yeast growth or color development as readouts. The yeast two-hybrid assay, for example, detects protein-protein interactions by measuring the growth of yeast on a specific solid medium, or the ability of the yeast to change color when grown on a medium containing a chromogenic substrate. Current systems for analyzing the results of these types of assays rely on subjective and inefficient scoring of growth or color by human experts. Here an image analysis system is described for scoring yeast growth and color development in high throughput biological assays. The goal is to locate the spots and score them in color images of two types of plates named “X-Gal” and “growth assay” plates, with uniformly placed spots (cell areas) on each plate (both plates in one image). The scoring system relies on color for the X-Gal spots, and texture properties for the growth assay spots. A maximum likelihood projection-based segmentation is developed to automatically locate spots of yeast on each plate. Then color histogram and wavelet texture features are extracted for scoring using an optimal linear transformation. Finally an artificial neural network is used to score the X-Gal and growth assay spots using the extracted features. The performance of the system is evaluated using spots from 60 images. After training the networks using training and validation sets, the system was assessed on the test set. Overall accuracies of 95.4% and 88.2% were achieved for scoring the X-Gal and growth assay spots, respectively. PMID:17948730
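The full pipeline above (segmentation, then color/texture features, then neural-network scoring) is beyond a snippet, but the final classification step can be caricatured with a nearest-centroid scorer on toy color-histogram features. The feature values and class labels below are invented, and the nearest-centroid rule is a deliberately simple stand-in for the trained ANN.

```python
def nearest_centroid(train, sample):
    """Assign `sample` to the class whose feature centroid is closest in
    squared Euclidean distance (a simple stand-in for an ANN scorer)."""
    best, best_d = None, float("inf")
    for label, vectors in train.items():
        centroid = [sum(col) / len(vectors) for col in zip(*vectors)]
        d = sum((s - c) ** 2 for s, c in zip(sample, centroid))
        if d < best_d:
            best, best_d = label, d
    return best

# Toy colour-histogram features (fraction of blue vs. white pixels in a spot):
train = {
    "positive": [(0.8, 0.2), (0.7, 0.3), (0.75, 0.25)],   # blue X-Gal spots
    "negative": [(0.1, 0.9), (0.2, 0.8), (0.15, 0.85)],   # white spots
}
print(nearest_centroid(train, (0.72, 0.28)))  # → positive
```

A real scorer would learn a decision boundary from the training and validation sets rather than rely on class centroids, but the input/output contract (feature vector in, class label out) is the same.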

  13. Missing call bias in high-throughput genotyping.

    PubMed

    Fu, Wenqing; Wang, Yi; Wang, Ying; Li, Rui; Lin, Rong; Jin, Li

    2009-03-13

    The advent of high-throughput and cost-effective genotyping platforms made genome-wide association (GWA) studies a reality. While the primary focus has been on reducing genotyping error, the problems associated with missing calls are largely overlooked. To probe the effect of missing calls on GWA studies, we demonstrated experimentally the prevalence and severity of the problem of missing call bias (MCB) in four genotyping technologies (Affymetrix 500 K SNP array, SNPstream, TaqMan, and Illumina Beadlab). Subsequently, we showed theoretically that MCB leads to biased conclusions in the subsequent analyses, including estimation of allele/genotype frequencies, measurement of Hardy-Weinberg equilibrium (HWE), and association tests under various modes of inheritance. We showed that MCB usually leads to power loss in association tests, and that this loss is greater than what would result from an equivalent unbiased reduction in sample size. We also compared the bias in allele frequency estimation and in association tests introduced by MCB with that introduced by genotyping errors. Our results illustrated that in most cases the bias can be greatly reduced by increasing the call rate at the cost of a higher genotyping error rate. The commonly used 'no-call' procedure for observations of borderline quality should be modified. If the objective is to minimize the bias, the cut-offs for call rate and genotyping error rate should be properly coupled in GWA studies. We suggest that the current QC cut-off for call rate should be increased, while the cut-off for genotyping error rate can be relaxed accordingly.
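The mechanics of missing call bias can be reproduced in a few lines: if borderline heterozygote calls are dropped preferentially, the estimated allele frequency shifts even though every genotype that *is* called is correct. The simulation below is a deliberately extreme illustration, not the paper's model: the dropout rate, allele frequency, and heterozygote-only no-call rule are all arbitrary assumptions.

```python
import random

def estimate_freq(called_genotypes):
    """Frequency of allele 'A' among successfully called genotypes."""
    alleles = "".join(called_genotypes)
    return alleles.count("A") / len(alleles)

def simulate_mcb(n=20000, p=0.3, het_nocall=0.3, seed=7):
    """Draw n genotypes under Hardy-Weinberg with allele-A frequency p,
    then drop heterozygotes as 'no-calls' at rate `het_nocall` while
    homozygotes always call (an extreme illustration of MCB)."""
    rng = random.Random(seed)
    called = []
    for _ in range(n):
        g = "".join(sorted("A" if rng.random() < p else "B" for _ in range(2)))
        if g == "AB" and rng.random() < het_nocall:
            continue  # borderline-quality heterozygote scored as no-call
        called.append(g)
    return estimate_freq(called)

unbiased = simulate_mcb(het_nocall=0.0)  # close to the true p = 0.3
biased = simulate_mcb(het_nocall=0.3)    # pulled toward the majority homozygote
```

With p = 0.3 and 30% heterozygote dropout, the expected estimate falls to about 0.27, even though no individual call is wrong, which is exactly why call-rate cut-offs matter alongside error-rate cut-offs.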

  14. High-throughput mode liquid microjunction surface sampling probe.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos; King, Richard C

    2009-08-15

    A simple and automated spot sampling operation mode for a liquid microjunction surface sampling probe/electrospray ionization mass spectrometry (LMJ-SSP/ESI-MS) system is reported. Prior manual and automated spot sampling methods with this probe relied on a careful, relatively slow alignment of the probe and surface distance (<20 microm spacing) to form the probe-to-surface liquid microjunction critical to successful surface sampling. Moreover, sampling multiple spots required retraction of the surface from the probe and a repeat of this careful probe-to-surface distance alignment at the next sampling position. With the method described here, the probe was not positioned as close to the surface, the exact probe-to-surface positioning was found to be less critical (spanning distances from about 100-300 microm), and this distance was not altered during the sampling of an entire array of sample spots. With the probe positioned within the appropriate distance from the surface, the liquid microjunction was formed by letting the liquid from the sampling end of the probe extend out from the probe to the surface. This was accomplished by reducing the self-aspiration liquid flow rate of the probe to a value less than the volume flow rate pumped into the probe. When the self-aspiration rate of the probe was subsequently increased, analytes on the surface that dissolved at the liquid microjunction were aspirated back into the probe with the liquid that created the liquid microjunction and electrosprayed. Presented here are the basics of this new sampling mode, as well as data that illustrate the potential analytical capabilities of the device to conduct high-throughput quantitative analysis.

  15. High-throughput screening of solid-state catalyst libraries

    NASA Astrophysics Data System (ADS)

    Senkan, Selim M.

    1998-07-01

    Combinatorial synthesis methods allow the rapid preparation and processing of large libraries of solid-state materials. The use of these methods, together with the appropriate screening techniques, has recently led to the discovery of materials with promising superconducting, magnetoresistive, luminescent and dielectric properties. Solid-state catalysts, which play an increasingly important role in the chemical and oil industries, represent another class of material amenable to combinatorial synthesis. Yet typically, catalyst discovery still involves inefficient trial-and-error processes, because catalytic activity is inherently difficult to screen. In contrast to superconductivity, magnetoresistivity and dielectric properties, which can be tested by contact probes, or luminescence, which can be observed directly, the assessment of catalytic activity requires the unambiguous detection of a specific product molecule above a small catalyst site on a large library. Screening by in situ infrared thermography and by microprobe sampling mass spectrometry has been suggested, but the first method, while probing activity, provides no information on reaction products, whereas the second is difficult to implement because it requires the transport of minute gas samples from each library site to the detection system. Here I describe the use of laser-induced resonance-enhanced multiphoton ionization for sensitive, selective and high-throughput screening of a library of solid-state catalysts that activate the dehydrogenation of cyclohexane to benzene. I show that benzene, the product molecule, can be selectively photoionized in the vicinity of the catalytic sites, and that the detection of the resultant photoions by an array of microelectrodes provides information on the activity of individual sites.
Adaptation of this technique for the screening of other catalytic reactions and larger libraries with smaller site size seems feasible, thus opening up the possibility of exploiting

  16. Low inlet gas velocity high throughput biomass gasifier

    DOEpatents

    Feldmann, Herman F.; Paisley, Mark A.

    1989-01-01

    The present invention discloses a novel method of operating a gasifier for the production of fuel gas from carbonaceous fuels. The process disclosed enables operating in an entrained mode using inlet gas velocities of less than 7 feet per second, feedstock throughputs exceeding 4000 lbs/ft²-hr, and pressures below 100 psia.

  17. HT-Paxos: High Throughput State-Machine Replication Protocol for Large Clustered Data Centers

    PubMed Central

    Agarwal, Ajay

    2015-01-01

    Paxos is a prominent theory of state-machine replication. Recent data-intensive systems that implement state-machine replication generally require high throughput. Earlier versions of Paxos, such as classical Paxos, fast Paxos, and generalized Paxos, focus mainly on fault tolerance and latency but fall short in throughput and scalability. A major reason for this is the heavyweight leader; by offloading the leader, we can further increase the throughput of the system. Ring Paxos, Multiring Paxos, and S-Paxos are prominent attempts in this direction for clustered data centers. In this paper, we propose HT-Paxos, a variant of Paxos best suited for large clustered data centers. HT-Paxos offloads the leader far more significantly and hence increases the throughput and scalability of the system, while at the same time, among high-throughput state-machine replication protocols, providing reasonably low latency and response time. PMID:25821856
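Why offloading the leader raises throughput can be shown with a back-of-envelope sketch: if replicas disseminate request payloads among themselves and the leader only sequences small batch identifiers (the intuition behind S-Paxos-style designs), the leader's byte load shrinks dramatically. This is not HT-Paxos itself; the 16-byte batch ID, request sizes, and batch size below are arbitrary assumptions for illustration.

```python
def classic_leader_bytes(requests):
    """Classical Paxos-style flow: the leader handles every request payload."""
    return sum(len(r) for r in requests)

def offloaded_leader_bytes(requests, batch_size, id_size=16):
    """Offloaded flow: replicas disseminate payloads; the leader orders
    only fixed-size batch IDs (heavily simplified)."""
    n_batches = -(-len(requests) // batch_size)  # ceiling division
    return n_batches * id_size

reqs = [b"x" * 1024] * 100                    # 100 requests of 1 KiB each
print(classic_leader_bytes(reqs))             # → 102400
print(offloaded_leader_bytes(reqs, 10))       # → 160
```

On this toy workload the leader's ordering traffic drops by a factor of several hundred, which is the sense in which a lighter leader stops being the throughput bottleneck.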

  18. Development and implementation of industrialized, fully automated high throughput screening systems

    PubMed Central

    2003-01-01

    Automation has long been a resource for high-throughput screening at Bristol-Myers Squibb. However, with growing deck sizes and decreasing timelines, a new generation of more robust, supportable automated systems was necessary for accomplishing high-throughput screening goals. Implementation of this new generation of automated systems required numerous decisions concerning hardware, software and the value of in-house automation expertise. This project has resulted in fast, flexible, industrialized automation systems with a strong in-house support structure that we believe meets our current high-throughput screening requirements and will continue to meet them well into the future. PMID:18924614

  19. High-sensitivity high-throughput chip based biosensor array for multiplexed detection of heavy metals

    NASA Astrophysics Data System (ADS)

    Yan, Hai; Tang, Naimei; Jairo, Grace A.; Chakravarty, Swapnajit; Blake, Diane A.; Chen, Ray T.

    2016-03-01

    Heavy metal ions released into the environment from industrial processes lead to various health hazards. We propose an on-chip label-free detection approach that allows high-sensitivity and high-throughput detection of heavy metals. The sensing device consists of 2-dimensional photonic crystal microcavities that are combined by multimode interferometer to form a sensor array. We experimentally demonstrate the detection of cadmium-chelate conjugate with concentration as low as 5 parts-per-billion (ppb).

  20. High-throughput analysis of algal crude oils using high resolution mass spectrometry.

    PubMed

    Lee, Young Jin; Leverence, Rachael C; Smith, Erica A; Valenstein, Justin S; Kandel, Kapil; Trewyn, Brian G

    2013-03-01

    Lipid analysis often needs to be specifically optimized for each class of compounds due to the wide variety of their chemical and physical properties. This becomes a serious bottleneck in the development of algae-based next-generation biofuels, where high-throughput analysis is essential for optimizing various process conditions. We propose a high-resolution mass spectrometry-based high-throughput assay as a 'quick-and-dirty' protocol to monitor various lipid classes in algal crude oils. Atmospheric pressure chemical ionization was determined to be most effective for this purpose, covering a wide range of lipid classes. With an autosampler-LC pump set-up, we could analyze algal crude samples every one and a half minutes, monitoring several lipid species such as TAG, DAG, squalene, sterols, and chlorophyll a. The high mass resolution and mass accuracy of the Orbitrap mass analyzer provide confidence in the identification of these lipid compounds. MS/MS and MS3 analysis could be performed in parallel for further structural information, as demonstrated for TAG and DAG. This high-throughput method was successfully demonstrated for semi-quantitative analysis of algal oils after treatment with various nanoparticles.

  1. High-throughput imaging of neuronal activity in Caenorhabditis elegans

    PubMed Central

    Larsch, Johannes; Ventimiglia, Donovan; Bargmann, Cornelia I.; Albrecht, Dirk R.

    2013-01-01

    Neuronal responses to sensory inputs can vary based on genotype, development, experience, or stochastic factors. Existing neuronal recording techniques examine a single animal at a time, limiting understanding of the variability and range of potential responses. To scale up neuronal recordings, we here describe a system for simultaneous wide-field imaging of neuronal calcium activity from at least 20 Caenorhabditis elegans animals under precise microfluidic chemical stimulation. This increased experimental throughput was used to perform a systematic characterization of chemosensory neuron responses to multiple odors, odor concentrations, and temporal patterns, as well as responses to pharmacological manipulation. The system allowed recordings from sensory neurons and interneurons in freely moving animals, whose neuronal responses could be correlated with behavior. Wide-field imaging provides a tool for comprehensive circuit analysis with elevated throughput in C. elegans. PMID:24145415

  2. High-throughput metal susceptibility testing of microbial biofilms.

    PubMed

    Harrison, Joe J; Turner, Raymond J; Ceri, Howard

    2005-10-03

    Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this reality, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO3(2-)) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic cell E. coli JM109 to metals was time

  3. High-throughput metal susceptibility testing of microbial biofilms

    PubMed Central

    Harrison, Joe J; Turner, Raymond J; Ceri, Howard

    2005-01-01

    Background: Microbial biofilms exist all over the natural world, a distribution that is paralleled by metal cations and oxyanions. Despite this, very few studies have examined how biofilms withstand exposure to these toxic compounds. This article describes a batch culture technique for biofilm and planktonic cell metal susceptibility testing using the MBEC assay. This device is compatible with standard 96-well microtiter plate technology. As part of this method, a two-part, metal-specific neutralization protocol is summarized. This procedure minimizes residual biological toxicity arising from the carry-over of metals from challenge to recovery media. Neutralization consists of treating cultures with a chemical compound known to react with or to chelate the metal. Treated cultures are plated onto rich agar to allow metal complexes to diffuse into the recovery medium while bacteria remain on top to recover. Two difficulties associated with metal susceptibility testing were the focus of two applications of this technique. First, assays were calibrated to allow comparisons of the susceptibility of different organisms to metals. Second, the effects of exposure time and growth medium composition on the susceptibility of E. coli JM109 biofilms to metals were investigated. Results: This high-throughput method generated 96 statistically equivalent biofilms in a single device and thus allowed for comparative and combinatorial experiments of media, microbial strains, exposure times and metals. By adjusting growth conditions, it was possible to examine biofilms of different microorganisms that had similar cell densities. In one example, Pseudomonas aeruginosa ATCC 27853 was up to 80 times more resistant to heavy metalloid oxyanions than Escherichia coli TG1. Further, biofilms were up to 133 times more tolerant to tellurite (TeO3(2-)) than corresponding planktonic cultures. Regardless of the growth medium, the tolerance of biofilm and planktonic cell E. coli JM109 to metals was time dependent.
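    The fold-tolerance figures quoted in this record (e.g., biofilms up to 133 times more tolerant to tellurite) are ratios of minimum bactericidal concentrations (MBCs); a minimal sketch of that arithmetic, using hypothetical MBC values rather than data from the study:

```python
# Fold tolerance of biofilm vs. planktonic cells, computed as the ratio of
# minimum bactericidal concentrations (MBCs). The values below are
# hypothetical placeholders, not measurements from the paper.

def fold_tolerance(mbc_biofilm, mbc_planktonic):
    """Return how many times more metal a biofilm withstands than
    planktonic cells of the same strain."""
    return mbc_biofilm / mbc_planktonic

# Example: a biofilm killed only at 66.5 mM tellurite vs. 0.5 mM for
# planktonic cells would be 133-fold more tolerant.
print(fold_tolerance(66.5, 0.5))  # -> 133.0
```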

  4. Emerging high throughput analyses of cyanobacterial toxins and toxic cyanobacteria.

    PubMed

    Sivonen, Kaarina

    2008-01-01

    The common occurrence of toxic cyanobacteria causes problems for the health of animals and humans. More research and good monitoring systems are needed to protect water users. It is important to have rapid, reliable and accurate analyses, i.e. high-throughput methods, to identify the toxins as well as the toxin producers in the environment. Excellent methods such as ELISA already exist to analyse cyanobacterial hepatotoxins and saxitoxins, and PPIA for microcystins and nodularins. The LC/MS method can be fast in identifying the toxicants in the samples. Further development of this area should resolve the problems with sampling and sample preparation, which are still the bottlenecks of rapid analyses. In addition, the availability of reliable reference materials and standards should be resolved. Molecular detection methods are now routine in clinical and criminal laboratories and may also become important in environmental diagnostics. One prerequisite for the development of molecular analysis is that pure cultures of the producer organisms are available for identification of the biosynthetic genes responsible for toxin production and for proper testing of the diagnostic methods. Good methods are already available for the microcystin- and nodularin-producing cyanobacteria, such as conventional PCR, quantitative real-time PCR and microarrays/DNA chips. The DNA-chip technology offers an attractive monitoring system for toxic and non-toxic cyanobacteria. Only with these new technologies (PCR + DNA chips) will we be able to study toxic cyanobacteria populations in situ and the effects of environmental factors on the occurrence and proliferation of especially toxic cyanobacteria. This is likely to yield important information for mitigation purposes. Further development of these methods should include all cyanobacterial biodiversity, including all toxin producers and primers/probes to detect producers of neurotoxins, cylindrospermopsins etc. (genes are unknown). The on

  5. High-throughput, high-sensitivity analysis of gene expression in Arabidopsis.

    PubMed

    Kris, Richard Martin; Felder, Stephen; Deyholos, Michael; Lambert, Georgina M; Hinton, James; Botros, Ihab; Martel, Ralph; Seligmann, Bruce; Galbraith, David W

    2007-07-01

    High-throughput gene expression analysis of genes expressed during salt stress was performed using a novel multiplexed quantitative nuclease protection assay that involves customized DNA microarrays printed within the individual wells of 96-well plates. The levels of expression of the transcripts from 16 different genes were quantified within crude homogenates prepared from Arabidopsis (Arabidopsis thaliana) plants also grown in a 96-well plate format. Examples are provided of the high degree of reproducibility of quantitative dose-response data and of the sensitivity of detection of changes in gene expression within limiting amounts of tissue. The lack of requirement for RNA purification renders the assay particularly suited for high-throughput gene expression analysis and for the discovery of novel chemical compounds that specifically modulate the expression of endogenous target genes.
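    Dose-response readouts like those described above are typically reduced to fold changes after within-well normalization to a reference gene; a generic sketch of that calculation (the gene names and signal values are hypothetical, and this is not the qNPA-specific pipeline from the paper):

```python
# Generic fold-change calculation for plate-based expression assays:
# normalize each gene's signal to a reference (housekeeping) gene within
# the same well, then compare treated vs. control wells. All values are
# hypothetical placeholders.

def relative_expression(signals, reference_gene):
    """Normalize every gene's signal to the reference gene's signal."""
    ref = signals[reference_gene]
    return {g: s / ref for g, s in signals.items() if g != reference_gene}

control = {"ACT2": 1000.0, "RD29A": 50.0}    # untreated well
treated = {"ACT2": 1100.0, "RD29A": 440.0}   # salt-stressed well

ctrl = relative_expression(control, "ACT2")
trt = relative_expression(treated, "ACT2")
fold_change = {g: trt[g] / ctrl[g] for g in trt}
print(fold_change)  # RD29A induced 8-fold under stress in this toy example
```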

  6. Implementation of an Automated High-Throughput Plasmid DNA Production Pipeline.

    PubMed

    Billeci, Karen; Suh, Christopher; Di Ioia, Tina; Singh, Lovejit; Abraham, Ryan; Baldwin, Anne; Monteclaro, Stephen

    2016-12-01

    Biologics sample management facilities are often responsible for a diversity of large-molecule reagent types, such as DNA, RNAi, and protein libraries. Historically, the management of large molecules was dispersed into multiple laboratories. As methodologies to support pathway discovery, antibody discovery, and protein production have become high throughput, the implementation of automation and centralized inventory management tools has become important. To this end, to improve sample tracking, throughput, and accuracy, we have implemented a module-based automation system integrated into inventory management software using multiple platforms (Hamilton, Hudson, Dynamic Devices, and Brooks). Here we describe the implementation of these systems with a focus on high-throughput plasmid DNA production management.

  7. Integration of Dosimetry, Exposure and High-Throughput Screening Data in Chemical Toxicity Assessment

    EPA Science Inventory

    High-throughput in vitro toxicity screening can provide an efficient way to identify potential biological targets for chemicals. However, relying on nominal assay concentrations may misrepresent potential in vivo effects of these chemicals due to differences in bioavailability, c...

  8. High-Throughput Industrial Coatings Research at The Dow Chemical Company.

    PubMed

    Kuo, Tzu-Chi; Malvadkar, Niranjan A; Drumright, Ray; Cesaretti, Richard; Bishop, Matthew T

    2016-09-12

    At The Dow Chemical Company, high-throughput research is an active area for developing new industrial coatings products. Using the principles of automation (i.e., using robotic instruments), parallel processing (i.e., preparing, processing, and evaluating samples in parallel), and miniaturization (i.e., reducing sample size), high-throughput tools for synthesizing, formulating, and applying coating compositions have been developed at Dow. In addition, high-throughput workflows for measuring various coating properties, such as cure speed, hardness development, scratch resistance, impact toughness, resin compatibility, pot life, and surface defects, have also been developed in-house. These workflows correlate well with the traditional coatings tests, but they do not necessarily mimic those tests. The use of such high-throughput workflows in combination with smart experimental designs allows accelerated discovery and commercialization.

  10. Techniques: high-throughput measurement of intracellular Ca(2+) -- back to basics.

    PubMed

    Monteith, Gregory R; Bird, Gary St J

    2005-04-01

    High-throughput screening techniques continue to provide important tools to the pharmaceutical industry for the efficient identification of drug leads. However, high-throughput techniques are now being exploited to address a variety of pharmacological and cellular signaling research questions, including the regulation and role of intracellular Ca(2+) in a plethora of biological systems. Although an awareness of specific assay conditions is crucial for reliable and reproducible measurements of intracellular free Ca(2+) whichever system of study is used, the complex temporal nature of Ca(2+) signals has posed some unique limitations for its measurement in high-throughput mode. Progress in high-throughput design has overcome many of these limitations and will complement other technical approaches to understanding the underlying regulation and role of intracellular Ca(2+).

  11. EMBRYONIC VASCULAR DISRUPTION ADVERSE OUTCOMES: LINKING HIGH THROUGHPUT SIGNALING SIGNATURES WITH FUNCTIONAL CONSEQUENCES

    EPA Science Inventory

    Embryonic vascular disruption is an important adverse outcome pathway (AOP) given the knowledge that chemical disruption of early cardiovascular system development leads to broad prenatal defects. High throughput screening (HTS) assays provide potential building blocks for AOP d...

  12. Picking Cell Lines for High-Throughput Transcriptomic Toxicity Screening (SOT)

    EPA Science Inventory

    High throughput, whole genome transcriptomic profiling is a promising approach to comprehensively evaluate chemicals for potential biological effects. To be useful for in vitro toxicity screening, gene expression must be quantified in a set of representative cell types that captu...

  13. An image analysis toolbox for high-throughput C. elegans assays

    PubMed Central

    Wählby, Carolina; Kamentsky, Lee; Liu, Zihan H.; Riklin-Raviv, Tammy; Conery, Annie L.; O’Rourke, Eyleen J.; Sokolnicki, Katherine L.; Visvikis, Orane; Ljosa, Vebjorn; Irazoqui, Javier E.; Golland, Polina; Ruvkun, Gary; Ausubel, Frederick M.; Carpenter, Anne E.

    2012-01-01

    We present a toolbox for high-throughput screening of image-based Caenorhabditis elegans phenotypes. The image analysis algorithms measure morphological phenotypes in individual worms and are effective for a variety of assays and imaging systems. This WormToolbox is available via the open-source CellProfiler project and enables objective scoring of whole-animal high-throughput image-based assays of C. elegans for the study of diverse biological pathways relevant to human disease. PMID:22522656
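    Toolboxes like this automate segmentation and per-object measurement of whole animals; a generic sketch of the basic segmentation step (thresholding plus connected-component counting with scipy, not the actual CellProfiler/WormToolbox API):

```python
# A minimal object-segmentation step of the kind such toolboxes automate:
# binarize a grayscale image and count connected foreground objects.
# This is a generic illustration on a synthetic image, not WormToolbox code.
import numpy as np
from scipy import ndimage

def count_objects(image, threshold):
    """Binarize an image and count connected foreground objects."""
    mask = image > threshold
    labels, n_objects = ndimage.label(mask)
    return n_objects

# Synthetic "plate" with two bright worm-like blobs on a dark background.
img = np.zeros((64, 64))
img[10:12, 5:40] = 1.0   # worm 1
img[40:42, 20:60] = 1.0  # worm 2
print(count_objects(img, 0.5))  # -> 2
```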

  14. High throughput electrospinning of high-quality nanofibers via an aluminum disk spinneret

    NASA Astrophysics Data System (ADS)

    Zheng, Guokuo

    In this work, a simple and efficient needleless high-throughput electrospinning process using an aluminum disk spinneret with 24 holes is described. Electrospun mats produced by this setup consisted of fine, nano-sized fibers of the highest quality, while the productivity (yield) was many times that obtained from conventional single-needle electrospinning. The goal was to produce, at scale, nanofibers of the same or better quality than those produced with a single-needle laboratory setup, under variable solution concentration, voltage, and working distance. The fiber mats produced were either polymer or ceramic (such as molybdenum trioxide nanofibers). Through experimentation, the optimum process conditions were defined to be a voltage of 24 kV and a distance to collector of 15 cm. More diluted solutions resulted in smaller-diameter fibers. Comparing the morphologies of the MoO3 nanofibers produced by the traditional and the high-throughput setups showed that they were very similar. Moreover, the nanofiber production rate was nearly 10 times that of traditional needle electrospinning. Thus, the high-throughput process has the potential to become an industrial nanomanufacturing process, and the materials processed by it may be used in filtration devices, in tissue engineering, and as sensors.

  15. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    SciTech Connect

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar; Ozcan, Aydogan

    2015-04-13

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.
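    Numerical refocusing of in-line holograms of the kind recorded here is commonly performed by angular-spectrum propagation; a minimal sketch of that standard step, with illustrative parameters rather than the paper's actual values:

```python
# Angular-spectrum propagation: the standard numerical step for refocusing
# an in-line hologram, as used in lensfree on-chip microscopy generally.
# Wavelength, pixel size, and propagation distance below are illustrative.
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel, z):
    """Propagate a complex field by distance z (negative z back-propagates)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel)           # spatial frequencies
    FX, FY = np.meshgrid(fx, fx)
    arg = 1 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0))
    H = np.exp(1j * kz * z) * (arg > 0)       # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

hologram = np.ones((256, 256), dtype=complex)  # placeholder hologram field
refocused = angular_spectrum_propagate(hologram, 532e-9, 1.12e-6, -1e-3)
print(refocused.shape)  # (256, 256)
```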

  16. High throughput on-chip analysis of high-energy charged particle tracks using lensfree imaging

    NASA Astrophysics Data System (ADS)

    Luo, Wei; Shabbir, Faizan; Gong, Chao; Gulec, Cagatay; Pigeon, Jeremy; Shaw, Jessica; Greenbaum, Alon; Tochitsky, Sergei; Joshi, Chandrashekhar; Ozcan, Aydogan

    2015-04-01

    We demonstrate a high-throughput charged particle analysis platform, which is based on lensfree on-chip microscopy for rapid ion track analysis using allyl diglycol carbonate, i.e., CR-39 plastic polymer as the sensing medium. By adopting a wide-area opto-electronic image sensor together with a source-shifting based pixel super-resolution technique, a large CR-39 sample volume (i.e., 4 cm × 4 cm × 0.1 cm) can be imaged in less than 1 min using a compact lensfree on-chip microscope, which detects partially coherent in-line holograms of the ion tracks recorded within the CR-39 detector. After the image capture, using highly parallelized reconstruction and ion track analysis algorithms running on graphics processing units, we reconstruct and analyze the entire volume of a CR-39 detector within ∼1.5 min. This significant reduction in the entire imaging and ion track analysis time not only increases our throughput but also allows us to perform time-resolved analysis of the etching process to monitor and optimize the growth of ion tracks during etching. This computational lensfree imaging platform can provide a much higher throughput and more cost-effective alternative to traditional lens-based scanning optical microscopes for ion track analysis using CR-39 and other passive high energy particle detectors.

  17. A Guinier Camera for SR Powder Diffraction: High Resolution and High Throughput.

    SciTech Connect

    Siddons, D.P.; Hulbert, S.L.; Stephens, P.W.

    2006-05-28

    The paper describes a new powder diffraction instrument for synchrotron radiation sources which combines the high throughput of a position-sensitive detector system with the high resolution normally provided only by a crystal analyzer. It uses the Guinier geometry, which is traditionally used with an x-ray tube source. This geometry adapts well to the synchrotron source, provided proper beam conditioning is applied. The high brightness of the SR source allows a high resolution to be achieved. When combined with a photon-counting silicon microstrip detector array, the system becomes a powerful instrument for radiation-sensitive samples or time-dependent phase transition studies.

  18. High Throughput 600 Watt Hall Effect Thruster for Space Exploration

    NASA Technical Reports Server (NTRS)

    Szabo, James; Pote, Bruce; Tedrake, Rachel; Paintal, Surjeet; Byrne, Lawrence; Hruby, Vlad; Kamhawi, Hani; Smith, Tim

    2016-01-01

    A nominal 600-Watt Hall Effect Thruster was developed to propel unmanned space vehicles. Both xenon and iodine compatible versions were demonstrated. With xenon, peak measured thruster efficiency is 46-48% at 600-W, with specific impulse from 1400 s to 1700 s. Evolution of the thruster channel due to ion erosion was predicted through numerical models and calibrated with experimental measurements. Estimated xenon throughput is greater than 100 kg. The thruster is well sized for satellite station keeping and orbit maneuvering, either by itself or within a cluster.

  19. Strategies for high-throughput focused-beam ptychography

    DOE PAGES

    Jacobsen, Chris; Deng, Junjing; Nashed, Youssef

    2017-08-08

    X-ray ptychography is being utilized for a wide range of imaging experiments with a resolution beyond the limit of the X-ray optics used. Introducing a parameter for the ptychographic resolution gain Gp (the ratio of the beam size to the achieved pixel size in the reconstructed image), strategies for data sampling and for increasing imaging throughput when the specimen is at the focus of an X-ray beam are considered. As a result, the tradeoffs between large and small illumination spots are examined.
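    The resolution gain defined in this abstract is a simple ratio of beam size to reconstructed pixel size; a minimal sketch with illustrative numbers (not values from the paper):

```python
# The ptychographic resolution gain G_p: beam size divided by the pixel
# size achieved in the reconstructed image. Numbers below are illustrative.

def resolution_gain(beam_size_nm, pixel_size_nm):
    """Return G_p, the ratio of illumination size to reconstructed pixel size."""
    return beam_size_nm / pixel_size_nm

# A 120 nm focused beam reconstructed at 10 nm pixels:
print(resolution_gain(120.0, 10.0))  # -> 12.0
```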

  20. High Throughput Light Absorber Discovery, Part 1: An Algorithm for Automated Tauc Analysis.

    PubMed

    Suram, Santosh K; Newhouse, Paul F; Gregoire, John M

    2016-11-14

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. The applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by automated algorithm for 60 optical spectra.
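    The core of a Tauc analysis for a direct-allowed gap is a linear fit to (αhν)² versus photon energy, extrapolated to zero absorption; a bare-bones sketch on a synthetic spectrum (the published algorithm selects the fitting window automatically, whereas here it is chosen by hand):

```python
# Bare-bones Tauc extrapolation for a direct-allowed band gap: fit a line
# to the steep region of (alpha*h*nu)^2 vs. photon energy and take its
# x-intercept as the band-gap estimate. The spectrum is synthetic and the
# fitting window is hand-picked; this is not the authors' algorithm.
import numpy as np

def tauc_direct_gap(energy_eV, alpha, fit_window):
    """Estimate a direct-allowed gap by linear extrapolation of a Tauc plot."""
    lo, hi = fit_window
    y = (alpha * energy_eV) ** 2
    sel = (energy_eV >= lo) & (energy_eV <= hi)
    slope, intercept = np.polyfit(energy_eV[sel], y[sel], 1)
    return -intercept / slope  # x-intercept = estimated E_g

# Synthetic direct-gap absorber with E_g = 2.1 eV.
E = np.linspace(1.5, 3.0, 200)
alpha = np.sqrt(np.clip(E - 2.1, 0, None)) / E
print(round(tauc_direct_gap(E, alpha, (2.2, 2.8)), 2))  # -> 2.1
```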

  1. High throughput light absorber discovery, Part 1: An algorithm for automated Tauc analysis

    DOE PAGES

    Suram, Santosh K.; Newhouse, Paul F.; Gregoire, John M.

    2016-09-23

    High-throughput experimentation provides efficient mapping of composition-property relationships, and its implementation for the discovery of optical materials enables advancements in solar energy and other technologies. In a high throughput pipeline, automated data processing algorithms are often required to match experimental throughput, and we present an automated Tauc analysis algorithm for estimating band gap energies from optical spectroscopy data. The algorithm mimics the judgment of an expert scientist, which is demonstrated through its application to a variety of high throughput spectroscopy data, including the identification of indirect or direct band gaps in Fe2O3, Cu2V2O7, and BiVO4. Here, the applicability of the algorithm to estimate a range of band gap energies for various materials is demonstrated by a comparison of direct-allowed band gaps estimated by expert scientists and by automated algorithm for 60 optical spectra.

  2. Outlook for development of high-throughput cryopreservation for small-bodied biomedical model fishes.

    PubMed

    Tiersch, Terrence R; Yang, Huiping; Hu, E

    2012-01-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions of this approach.

  3. Outlook for development of high-throughput cryopreservation for small-bodied biomedical model fishes.

    PubMed

    Tiersch, Terrence R; Yang, Huiping; Hu, E

    2011-08-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions of this approach.

  4. High-throughput screening based on label-free detection of small molecule microarrays

    NASA Astrophysics Data System (ADS)

    Zhu, Chenggang; Fei, Yiyan; Zhu, Xiangdong

    2017-02-01

    Based on small-molecule microarrays (SMMs) and an oblique-incidence reflectivity difference (OI-RD) scanner, we have developed a novel high-throughput preliminary drug screening platform based on label-free monitoring of direct interactions between target proteins and immobilized small molecules. The screening platform is especially attractive for screening compounds against targets of unknown function and/or structure that are not compatible with functional assay development. In this platform, the OI-RD scanner serves as a label-free detection instrument that is able to monitor about 15,000 biomolecular interactions in a single experiment without the need to label any biomolecule. In addition, SMMs serve as a novel format for high-throughput screening through the immobilization of tens of thousands of different compounds on a single phenyl-isocyanate-functionalized glass slide. Using this platform, we sequentially screened five target proteins (purified target proteins or cell lysates containing the target protein) in high-throughput, label-free mode. We found hits for each target protein, and the inhibitory effects of some hits were confirmed by follow-up functional assays. Compared to traditional high-throughput screening assays, this platform has many advantages, including minimal sample consumption, minimal distortion of interactions through label-free detection, and multi-target screening analysis, and it has great potential to be a complementary screening platform in the field of drug discovery.

  5. Outlook for Development of High-throughput Cryopreservation for Small-bodied Biomedical Model Fishes

    PubMed Central

    Tiersch, Terrence R.; Yang, Huiping; Hu, E.

    2011-01-01

    With the development of genomic research technologies, comparative genome studies among vertebrate species are becoming commonplace for human biomedical research. Fish offer unlimited versatility for biomedical research. Extensive studies are done using these fish models, yielding tens of thousands of specific strains and lines, and the number is increasing every day. Thus, high-throughput sperm cryopreservation is urgently needed to preserve these genetic resources. Although high-throughput processing has been widely applied for sperm cryopreservation in livestock for decades, application in biomedical model fishes is still in the concept-development stage because of the limited sample volumes and the biological characteristics of fish sperm. High-throughput processing in livestock was developed based on advances made in the laboratory and was scaled up for increased processing speed, capability for mass production, and uniformity and quality assurance. Cryopreserved germplasm combined with high-throughput processing constitutes an independent industry encompassing animal breeding, preservation of genetic diversity, and medical research. Currently, there is no specifically engineered system available for high-throughput processing of cryopreserved germplasm for aquatic species. This review discusses the concepts and needs for high-throughput technology for model fishes, proposes approaches for technical development, and outlines future directions of this approach. PMID:21440666

  6. Cheminformatics aspects of high throughput screening: from robots to models: symposium summary.

    PubMed

    Tseng, Y. Jane; Martin, Eric; Bologa, Cristian G.; Shelat, Anang A.

    2013-05-01

    The "Cheminformatics aspects of high throughput screening (HTS): from robots to models" symposium was part of the computers in chemistry technical program at the American Chemical Society National Meeting in Denver, Colorado during the fall of 2011. This symposium brought together researchers from high throughput screening centers and molecular modelers from academia and industry to discuss the integration of currently available high throughput screening data and assays with computational analysis. The topics discussed at this symposium covered the data-infrastructure at various academic, hospital, and National Institutes of Health-funded high throughput screening centers, the cheminformatics and molecular modeling methods used in real world examples to guide screening and hit-finding, and how academic and non-profit organizations can benefit from current high throughput screening cheminformatics resources. Specifically, this article also covers the remarks and discussions in the open panel discussion of the symposium and summarizes the following talks on "Accurate Kinase virtual screening: biochemical, cellular and selectivity", "Selective, privileged and promiscuous chemical patterns in high-throughput screening" and "Visualizing and exploring relationships among HTS hits using network graphs".

  7. Cheminformatics Aspects of High Throughput Screening: from Robots to Models: Symposium Summary

    PubMed Central

    Tseng, Y. Jane; Martin, Eric; Bologa, Cristian; Shelat, Anang A.

    2014-01-01

    The “Cheminformatics aspects of high throughput screening (HTS): from robots to models” symposium was part of the Computers in Chemistry (COMP) technical program at the American Chemical Society National Meeting in Denver, Colorado during the fall of 2011. This symposium brought together researchers from high throughput screening centers and molecular modelers from academia and industry to discuss the integration of currently available high throughput screening data and assays with computational analysis. The topics discussed at this symposium covered the data-infrastructure at various academic, hospital, and NIH-funded high throughput screening centers, the cheminformatics and molecular modeling methods used in real world examples to guide screening and hit-finding, and how academic and non-profit organizations can benefit from current high throughput screening cheminformatics resources. Specifically, this article also covers the remarks and discussions in the open panel discussion of the symposium and summarizes the following talks on “Accurate Kinase virtual screening: biochemical, cellular and selectivity”, “Selective, privileged and promiscuous chemical patterns in high-throughput screening” and “Visualizing and exploring relationships among HTS hits using network graphs”. PMID:23636795

  8. Lessons from high-throughput protein crystallization screening: 10 years of practical experience

    PubMed Central

    Luft, J.R.; Snell, E.H.; DeTitta, G.T.

    2011-01-01

    Introduction: X-ray crystallography provides the majority of our structural biological knowledge at a molecular level and in terms of pharmaceutical design is a valuable tool to accelerate discovery. It is the premier technique in the field, but its usefulness is significantly limited by the need to grow well-diffracting crystals. It is for this reason that high-throughput crystallization has become a key technology that has matured over the past 10 years through the field of structural genomics. Areas covered: The authors describe their experiences in high-throughput crystallization screening in the context of structural genomics and the general biomedical community. They focus on the lessons learnt from the operation of a high-throughput crystallization screening laboratory, which to date has screened over 12,500 biological macromolecules. They also describe the approaches taken to maximize success while minimizing effort. Through this, the authors hope that the reader will gain an insight into the efficient design of a laboratory and protocols to accomplish high-throughput crystallization on a single- or multiuser-laboratory or industrial scale. Expert opinion: High-throughput crystallization screening is readily available but, despite the power of the crystallographic technique, getting crystals is still not a solved problem. High-throughput approaches can help when used skillfully; however, they still require human input in the detailed analysis and interpretation of results to be more successful. PMID:22646073

  9. Cheminformatics aspects of high throughput screening: from robots to models: symposium summary

    NASA Astrophysics Data System (ADS)

    Tseng, Y. Jane; Martin, Eric; Bologa, Cristian G.; Shelat, Anang A.

    2013-05-01

    The "Cheminformatics aspects of high throughput screening (HTS): from robots to models" symposium was part of the computers in chemistry technical program at the American Chemical Society National Meeting in Denver, Colorado during the fall of 2011. This symposium brought together researchers from high throughput screening centers and molecular modelers from academia and industry to discuss the integration of currently available high throughput screening data and assays with computational analysis. The topics discussed at this symposium covered the data-infrastructure at various academic, hospital, and National Institutes of Health-funded high throughput screening centers, the cheminformatics and molecular modeling methods used in real world examples to guide screening and hit-finding, and how academic and non-profit organizations can benefit from current high throughput screening cheminformatics resources. Specifically, this article also covers the remarks and discussions in the open panel discussion of the symposium and summarizes the following talks on "Accurate Kinase virtual screening: biochemical, cellular and selectivity", "Selective, privileged and promiscuous chemical patterns in high-throughput screening" and "Visualizing and exploring relationships among HTS hits using network graphs".

  10. High-Throughput Assessment of Cellular Mechanical Properties.

    PubMed

    Darling, Eric M; Di Carlo, Dino

    2015-01-01

    Traditionally, cell analysis has focused on using molecular biomarkers for basic research, cell preparation, and clinical diagnostics; however, new microtechnologies are enabling evaluation of the mechanical properties of cells at throughputs that make them amenable to widespread use. We review the current understanding of how the mechanical characteristics of cells relate to underlying molecular and architectural changes, describe how these changes evolve with cell-state and disease processes, and propose promising biomedical applications that will be facilitated by the increased throughput of mechanical testing: from diagnosing cancer and monitoring immune states to preparing cells for regenerative medicine. We provide background about techniques that laid the groundwork for the quantitative understanding of cell mechanics and discuss current efforts to develop robust techniques for rapid analysis that aim to implement mechanophenotyping as a routine tool in biomedicine. Looking forward, we describe additional milestones that will facilitate broad adoption, as well as new directions not only in mechanically assessing cells but also in perturbing them to passively engineer cell state.

  11. High-throughput miniaturized microfluidic microscopy with radially parallelized channel geometry.

    PubMed

    Jagannadh, Veerendra Kalyan; Bhat, Bindu Prabhath; Nirupa Julius, Lourdes Albina; Gorthi, Sai Siva

    2016-03-01

    In this article, we present a novel approach to throughput enhancement in miniaturized microfluidic microscopy systems. Using the presented approach, we demonstrate an inexpensive yet high-throughput analytical instrument. With this instrument, we have been able to achieve a throughput of about 125,880 cells per minute, even while employing cost-effective low frame rate cameras (120 fps). The throughput achieved here is a notable progression in the field of diagnostics as it enables rapid quantitative testing and analysis. We demonstrate the applicability of the instrument to point-of-care diagnostics by performing blood cell counting. We report a comparative analysis of the counts (in cells per μl) obtained from our instrument against those of a commercially available hematology analyzer.
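
    The quoted throughput is simply frame rate × cells imaged per frame × 60. A quick sanity check (the cells-per-frame figure below is inferred from the stated numbers, not given in the abstract):

```python
def throughput_cells_per_min(frames_per_sec, cells_per_frame):
    """Cells analyzed per minute for a frame-based imaging flow system."""
    return frames_per_sec * cells_per_frame * 60

# Reaching ~125,880 cells/min with a 120 fps camera requires capturing
# roughly 17.5 cells per frame across the radially parallelized channels.
cells_per_frame = 125880 / (120 * 60)
print(round(cells_per_frame, 2))  # -> 17.48
```

    The radial channel geometry is what packs enough cells into each camera frame to compensate for the low frame rate.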

  12. A high-throughput in vitro ring assay for vasoactivity using magnetic 3D bioprinting

    PubMed Central

    Tseng, Hubert; Gage, Jacob A.; Haisler, William L.; Neeley, Shane K.; Shen, Tsaiwei; Hebel, Chris; Barthlow, Herbert G.; Wagoner, Matthew; Souza, Glauco R.

    2016-01-01

    Vasoactive liabilities are typically assayed using wire myography, which is limited by its high cost and low throughput. To meet the demand for higher throughput in vitro alternatives, this study introduces a magnetic 3D bioprinting-based vasoactivity assay. The principle behind this assay is the magnetic printing of vascular smooth muscle cells into 3D rings that functionally represent blood vessel segments, whose contraction can be altered by vasodilators and vasoconstrictors. A cost-effective imaging modality employing a mobile device is used to capture contraction with high throughput. The goal of this study was to validate ring contraction as a measure of vasoactivity, using a small panel of known vasoactive drugs. In vitro responses of the rings matched outcomes predicted by in vivo pharmacology, and were supported by immunohistochemistry. Altogether, this ring assay robustly models vasoactivity, which could meet the need for higher throughput in vitro alternatives. PMID:27477945

  13. Space Link Extension Protocol Emulation for High-Throughput, High-Latency Network Connections

    NASA Technical Reports Server (NTRS)

    Tchorowski, Nicole; Murawski, Robert

    2014-01-01

    New space missions require higher data rates and new protocols to meet these requirements. These high data rate space communication links push the limitations of not only the space communication links, but of the ground communication networks and protocols which forward user data to remote ground stations (GS) for transmission. The Consultative Committee for Space Data Systems (CCSDS) Space Link Extension (SLE) standard protocol is one protocol that has been proposed for use by the NASA Space Network (SN) Ground Segment Sustainment (SGSS) program. New protocol implementations must be carefully tested to ensure that they provide the required functionality, especially because of the remote nature of spacecraft. The SLE protocol standard has been tested in the NASA Glenn Research Center's SCENIC Emulation Lab in order to observe its operation under realistic network delay conditions. More specifically, the delay between the NASA Integrated Services Network (NISN) and spacecraft has been emulated. The round trip time (RTT) delay for the continental NISN network has been shown to be up to 120ms; as such, the SLE protocol was tested with network delays ranging from 0ms to 200ms. Both a base network condition and an SLE connection were tested with these RTT delays, and the reaction of both network tests to the delay conditions was recorded. Throughput for both of these links was set at 1.2Gbps. The results will show that, in the presence of realistic network delay, the SLE link throughput is significantly reduced, while the base network throughput remained at the 1.2Gbps specification. The decrease in SLE throughput has been attributed to the implementation's use of blocking calls. The decrease in throughput is not acceptable for high data rate links, as the link requires constant data flow in order for spacecraft and ground radios to stay synchronized, unless significant data is queued at the ground station. In cases where queuing the data is not an option
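
    The throughput collapse attributed to blocking calls is the classic stop-and-wait penalty: each data unit must complete a full round trip before the next is sent, so effective throughput is bounded by block size divided by RTT. A minimal model of this effect (the 64 kB transfer unit is a hypothetical choice; only the 1.2 Gbps rate and the 0-200 ms RTT range come from the text):

```python
def blocking_throughput_bps(line_rate_bps, rtt_s, block_bits):
    """Effective throughput when each block must be acknowledged across
    a full round trip before the next transfer begins (blocking calls)."""
    transfer_time = block_bits / line_rate_bps
    return block_bits / (transfer_time + rtt_s)

LINE_RATE = 1.2e9    # 1.2 Gbps link used in the emulation tests
BLOCK = 8 * 64_000   # hypothetical 64 kB transfer unit, in bits
for rtt_ms in (0, 50, 120, 200):
    eff = blocking_throughput_bps(LINE_RATE, rtt_ms / 1000, BLOCK)
    print(f"RTT {rtt_ms:3d} ms -> {eff / 1e6:9.2f} Mbps")
```

    At zero delay the link runs at full line rate, but at the continental 120 ms RTT the same blocking transfer manages only a few Mbps, mirroring the reported SLE degradation.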

  14. A high-throughput, high-quality plant genomic DNA extraction protocol.

    PubMed

    Li, H; Li, J; Cong, X H; Duan, Y B; Li, L; Wei, P C; Lu, X Z; Yang, J B

    2013-10-15

    The isolation of high-quality genomic DNA (gDNA) is a crucial technique in plant molecular biology. The quality of gDNA determines the reliability of real-time polymerase chain reaction (PCR) analysis. In this paper, we reported a high-quality gDNA extraction protocol optimized for real-time PCR in a variety of plant species. Performed in a 96-well block, our protocol provides high throughput. Without the need for phenol-chloroform and liquid nitrogen or dry ice, our protocol is safer and more cost-efficient than traditional DNA extraction methods. The method takes 10 mg leaf tissue to yield 5-10 µg high-quality gDNA. Spectral measurement and electrophoresis were used to demonstrate gDNA purity. The extracted DNA was qualified in a restriction enzyme digestion assay and conventional PCR. The real-time PCR amplification was sufficiently sensitive to detect gDNA at very low concentrations (3 pg/µL). The standard curve of gDNA dilutions from our phenol-chloroform-free protocol showed better linearity (R(2) = 0.9967) than the phenol-chloroform protocol (R(2) = 0.9876). The results indicate that the gDNA was of high quality and fit for real-time PCR. This safe, high-throughput plant gDNA extraction protocol could be used to isolate high-quality gDNA for real-time PCR and other downstream molecular applications.
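
    The reported linearity (R² = 0.9967 vs. 0.9876) comes from fitting Ct values against log10 of template concentration; the slope of the same fit also yields PCR amplification efficiency. A sketch of that evaluation (the dilution series below is hypothetical):

```python
import statistics

def standard_curve(log10_conc, ct):
    """Least-squares fit of Ct vs. log10(concentration); returns slope,
    R^2, and amplification efficiency (10**(-1/slope) - 1)."""
    mx, my = statistics.mean(log10_conc), statistics.mean(ct)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_conc, ct))
    sxx = sum((x - mx) ** 2 for x in log10_conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(log10_conc, ct))
    ss_tot = sum((y - my) ** 2 for y in ct)
    r2 = 1 - ss_res / ss_tot
    return slope, r2, 10 ** (-1 / slope) - 1

# Hypothetical 10-fold dilution series of extracted gDNA
log_conc = [1, 2, 3, 4, 5]               # log10 of template amount
ct_vals  = [33.1, 29.8, 26.4, 23.1, 19.7]
slope, r2, eff = standard_curve(log_conc, ct_vals)
print(f"slope={slope:.2f}  R^2={r2:.4f}  efficiency={eff:.1%}")
```

    A slope near -3.32 with R² close to 1 indicates roughly 100% amplification efficiency, which is the practical criterion for gDNA being "fit for real-time PCR".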

  15. 3D nanochannel electroporation for high-throughput cell transfection with high uniformity and dosage control

    NASA Astrophysics Data System (ADS)

    Chang, Lingqian; Bertani, Paul; Gallego-Perez, Daniel; Yang, Zhaogang; Chen, Feng; Chiang, Chiling; Malkoc, Veysi; Kuang, Tairong; Gao, Keliang; Lee, L. James; Lu, Wu

    2015-12-01

    Of great interest to modern medicine and biomedical research is the ability to inject individual target cells with the desired genes or drug molecules. Some advances in cell electroporation allow for high throughput, high cell viability, or excellent dosage control, yet no platform is available for the combination of all three. In an effort to solve this problem, here we show a ``3D nano-channel electroporation (NEP) chip'' on a silicon platform designed to meet these three criteria. This NEP chip can simultaneously deliver the desired molecules into 40 000 cells per cm2 on the top surface of the device. Each 650 nm pore aligns to a cell and can be used to deliver extremely small biological elements to very large plasmids (>10 kbp). When compared to conventional bulk electroporation (BEP), the NEP chip shows a 20 fold improvement in dosage control and uniformity, while still maintaining high cell viability (>90%) even in cells such as cardiac cells which are characteristically difficult to transfect. This high-throughput 3D NEP system provides an innovative and medically valuable platform with uniform and reliable cellular transfection, allowing for a steady supply of healthy, engineered cells.

  16. Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing.

    PubMed

    Shafer, Aaron B A; Northrup, Joseph M; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B W

    2016-01-01

    Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations.

  17. Forecasting Ecological Genomics: High-Tech Animal Instrumentation Meets High-Throughput Sequencing

    PubMed Central

    Shafer, Aaron B. A.; Northrup, Joseph M.; Wikelski, Martin; Wittemyer, George; Wolf, Jochen B. W.

    2016-01-01

    Recent advancements in animal tracking technology and high-throughput sequencing are rapidly changing the questions and scope of research in the biological sciences. The integration of genomic data with high-tech animal instrumentation comes as a natural progression of traditional work in ecological genetics, and we provide a framework for linking the separate data streams from these technologies. Such a merger will elucidate the genetic basis of adaptive behaviors like migration and hibernation and advance our understanding of fundamental ecological and evolutionary processes such as pathogen transmission, population responses to environmental change, and communication in natural populations. PMID:26745372

  18. Identification of functional modules using network topology and high-throughput data

    PubMed Central

    Ulitsky, Igor; Shamir, Ron

    2007-01-01

    Background With the advent of systems biology, biological knowledge is often represented today by networks. These include regulatory and metabolic networks, protein-protein interaction networks, and many others. At the same time, high-throughput genomics and proteomics techniques generate very large data sets, which require sophisticated computational analysis. Usually, separate and different analysis methodologies are applied to each of the two data types. An integrated investigation of network and high-throughput information together can improve the quality of the analysis by accounting simultaneously for topological network properties alongside intrinsic features of the high-throughput data. Results We describe a novel algorithmic framework for this challenge. We first transform the high-throughput data into similarity values, (e.g., by computing pairwise similarity of gene expression patterns from microarray data). Then, given a network of genes or proteins and similarity values between some of them, we seek connected sub-networks (or modules) that manifest high similarity. We develop algorithms for this problem and evaluate their performance on the osmotic shock response network in S. cerevisiae and on the human cell cycle network. We demonstrate that focused, biologically meaningful and relevant functional modules are obtained. In comparison with extant algorithms, our approach has higher sensitivity and higher specificity. Conclusion We have demonstrated that our method can accurately identify functional modules. Hence, it carries the promise to be highly useful in analysis of high throughput data. PMID:17408515
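
    The framework above (a network plus pairwise similarity values, from which connected high-similarity sub-networks are extracted) can be illustrated with a deliberately simplified greedy seed-and-expand heuristic; this is a stand-in for intuition, not the authors' actual algorithm:

```python
def greedy_module(graph, similarity, seed, min_gain=0.0):
    """Grow a connected module from `seed`: repeatedly add the neighboring
    node whose total similarity to the current module is largest, stopping
    when no neighbor adds more than `min_gain`.

    graph:      dict node -> set of adjacent nodes
    similarity: dict frozenset({a, b}) -> similarity score
    """
    module = {seed}
    while True:
        frontier = {n for m in module for n in graph[m]} - module
        best, best_gain = None, min_gain
        for cand in frontier:
            gain = sum(similarity.get(frozenset({cand, m}), 0.0)
                       for m in module)
            if gain > best_gain:
                best, best_gain = cand, gain
        if best is None:
            return module
        module.add(best)

graph = {"a": {"b", "c"}, "b": {"a", "c"}, "c": {"a", "b", "d"}, "d": {"c"}}
sim = {frozenset(p): s for p, s in [(("a", "b"), 0.9), (("a", "c"), 0.8),
                                    (("b", "c"), 0.7), (("c", "d"), -0.5)]}
print(sorted(greedy_module(graph, sim, "a")))  # -> ['a', 'b', 'c']
```

    Node "d" is connected but dissimilar, so it is excluded: exactly the behavior that separates topology-only clustering from the integrated network-plus-data analysis described in the abstract.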

  19. Accelerator mass spectrometry targets of submilligram carbonaceous samples using the high-throughput Zn reduction method.

    PubMed

    Kim, Seung-Hyun; Kelly, Peter B; Clifford, Andrew J

    2009-07-15

    The high-throughput Zn reduction method was developed and optimized for various biological/biomedical accelerator mass spectrometry (AMS) applications of mg of C size samples. However, high levels of background carbon from the high-throughput Zn reduction method were not suitable for sub-mg of C size samples in environmental, geochronology, and biological/biomedical AMS applications. This study investigated the effect of background carbon mass (mc) and background 14C level (Fc) from the high-throughput Zn reduction method. Background mc was 0.011 mg of C and background Fc was 1.5445. Background subtraction, two-component mixing, and expanded formulas were used for background correction. All three formulas accurately corrected for backgrounds to 0.025 mg of C in the aerosol standard (NIST SRM 1648a). Only the background subtraction and the two-component mixing formulas accurately corrected for backgrounds to 0.1 mg of C in the IAEA-C6 and -C7 standards. After the background corrections, our high-throughput Zn reduction method was suitable for biological (diet)/biomedical (drug) and environmental (fine particulate matter) applications of sub-mg of C samples (> or = 0.1 mg of C) in keeping with a balance between throughput (270 samples/day/analyst) and sensitivity/accuracy/precision of AMS measurement. The development of a high-throughput method for examination of > or = 0.1 mg of C size samples opens up a range of applications for 14C AMS studies. While other methods do exist for > or = 0.1 mg of C size samples, the low throughput has made them cost prohibitive for many applications.
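
    The correction formulas named above share a mass-balance form: the measured target is modeled as true sample carbon plus a constant process blank. A sketch of background subtraction using the blank values reported here (mc = 0.011 mg C, Fc = 1.5445); this is one standard form, and the paper compares several variants:

```python
def background_subtract(f_measured, m_sample_mgC,
                        f_blank=1.5445, m_blank_mgC=0.011):
    """Mass-balance background subtraction for AMS 14C measurements:
    F_meas * (m_s + m_b) = F_true * m_s + F_blank * m_b, solved for F_true."""
    total = m_sample_mgC + m_blank_mgC
    return (f_measured * total - f_blank * m_blank_mgC) / m_sample_mgC

# The blank dominates as sample size shrinks toward the 0.025 mg C limit:
for m in (1.0, 0.1, 0.025):
    print(f"{m:5.3f} mg C: corrected F = {background_subtract(1.10, m):.4f}")
```

    For a 1 mg C sample the correction is in the third decimal place; at 0.025 mg C it shifts the result by roughly 20%, which is why the authors restrict the method to samples of at least 0.1 mg C.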

  20. High-throughput analysis of yeast replicative aging using a microfluidic system.

    PubMed

    Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong

    2015-07-28

    Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cells, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction.

  1. High-throughput analysis of yeast replicative aging using a microfluidic system

    PubMed Central

    Jo, Myeong Chan; Liu, Wei; Gu, Liang; Dang, Weiwei; Qin, Lidong

    2015-01-01

    Saccharomyces cerevisiae has been an important model for studying the molecular mechanisms of aging in eukaryotic cells. However, the laborious and low-throughput methods of current yeast replicative lifespan assays limit their usefulness as a broad genetic screening platform for research on aging. We address this limitation by developing an efficient, high-throughput microfluidic single-cell analysis chip in combination with high-resolution time-lapse microscopy. This innovative design enables, to our knowledge for the first time, the determination of the yeast replicative lifespan in a high-throughput manner. Morphological and phenotypical changes during aging can also be monitored automatically with a much higher throughput than previous microfluidic designs. We demonstrate highly efficient trapping and retention of mother cells, determination of the replicative lifespan, and tracking of yeast cells throughout their entire lifespan. Using the high-resolution and large-scale data generated from the high-throughput yeast aging analysis (HYAA) chips, we investigated particular longevity-related changes in cell morphology and characteristics, including critical cell size, terminal morphology, and protein subcellular localization. In addition, because of the significantly improved retention rate of yeast mother cells, the HYAA-Chip was capable of demonstrating replicative lifespan extension by calorie restriction. PMID:26170317

  2. High throughput screening of ligand binding to macromolecules using high resolution powder diffraction

    DOEpatents

    Von Dreele, Robert B.; D'Amico, Kevin

    2006-10-31

    A process is provided for the high throughput screening of binding of ligands to macromolecules using high resolution powder diffraction data including producing a first sample slurry of a selected polycrystalline macromolecule material and a solvent, producing a second sample slurry of a selected polycrystalline macromolecule material, one or more ligands and the solvent, obtaining a high resolution powder diffraction pattern on each of said first sample slurry and the second sample slurry, and, comparing the high resolution powder diffraction pattern of the first sample slurry and the high resolution powder diffraction pattern of the second sample slurry whereby a difference in the high resolution powder diffraction patterns of the first sample slurry and the second sample slurry provides a positive indication for the formation of a complex between the selected polycrystalline macromolecule material and at least one of the one or more ligands.
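
    The screening decision in this process reduces to comparing two powder patterns and flagging differences beyond the measurement noise, since complex formation perturbs the lattice and hence peak positions and intensities. A toy version of that comparison (patterns and noise levels are invented):

```python
def patterns_differ(pattern_a, pattern_b, sigma, threshold=3.0):
    """Positive hit if any 2-theta bin differs by more than `threshold`
    standard deviations between the two slurry diffraction patterns."""
    return any(abs(a - b) > threshold * s
               for a, b, s in zip(pattern_a, pattern_b, sigma))

apo   = [100, 250, 80, 400, 60]   # intensities, macromolecule-only slurry
bound = [100, 230, 85, 460, 60]   # intensities, macromolecule + ligand
noise = [5, 5, 5, 5, 5]           # counting uncertainty per bin
print(patterns_differ(apo, bound, noise))  # -> True
print(patterns_differ(apo, apo, noise))    # -> False
```

    In practice the comparison is made over full high-resolution patterns with proper profile fitting; the bin-wise test above only captures the screening logic.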

  3. High-throughput atomic force microscopes operating in parallel

    NASA Astrophysics Data System (ADS)

    Sadeghian, Hamed; Herfst, Rodolf; Dekker, Bert; Winters, Jasper; Bijnagte, Tom; Rijnbeek, Ramon

    2017-03-01

    Atomic force microscopy (AFM) is an essential nanoinstrument technique for several applications such as cell biology and nanoelectronics metrology and inspection. The need for statistically significant sample sizes means that data collection can be an extremely lengthy process in AFM. A single AFM instrument is notoriously slow and ill-suited to scanning large areas, resulting in very-low-throughput measurements. We address this challenge by parallelizing AFM instruments. The parallelization is achieved by miniaturizing the AFM instrument and operating many of them simultaneously. This instrument has the advantages that each miniaturized AFM can be operated independently and that the advances in the field of AFM, both in terms of speed and imaging modalities, can be implemented more easily. Moreover, a parallel AFM instrument also allows one to measure several physical parameters simultaneously; while one instrument measures nano-scale topography, another instrument can measure mechanical, electrical, or thermal properties, making it a lab-on-an-instrument. In this paper, a proof of principle of such a parallel AFM instrument has been demonstrated by analyzing the topography of large samples such as semiconductor wafers. This nanoinstrument provides new research opportunities in the nanometrology of wafers and nanolithography masks by enabling real die-to-die and wafer-level measurements and in cell biology by measuring the nano-scale properties of a large number of cells.

  4. Design of a High-Throughput CABAC Encoder

    NASA Astrophysics Data System (ADS)

    Lo, Chia-Cheng; Zeng, Ying-Jhong; Shieh, Ming-Der

    Context-based Adaptive Binary Arithmetic Coding (CABAC) is one of the algorithmic improvements that the H.264/AVC standard provides to enhance the compression ratio of video sequences. Compared with the context-based adaptive variable length coding (CAVLC), CABAC can obtain a better compression ratio at the price of higher computation complexity. In particular, the inherent data dependency and various types of syntax elements in CABAC result in a dramatically increased complexity if two bins obtained from binarized syntax elements are handled at a time. By analyzing the distribution of binarized bins in different video sequences, this work shows how to effectively improve the encoding rate with limited hardware overhead by allowing only a certain type of syntax element to be processed two bins at a time. Together with the proposed context memory management scheme and range renovation method, experimental results reveal that an encoding rate of up to 410M-bin/s can be obtained with a limited increase in hardware requirement. Compared with related works that do not support multi-symbol encoding, our development can achieve nearly twice their throughput rates with less than 25% hardware overhead.

  5. Proteome-wide association studies identify biochemical modules associated with a wing-size phenotype in Drosophila melanogaster

    PubMed Central

    Okada, Hirokazu; Ebhardt, H. Alexander; Vonesch, Sibylle Chantal; Aebersold, Ruedi; Hafen, Ernst

    2016-01-01

    The manner by which genetic diversity within a population generates individual phenotypes is a fundamental question of biology. To advance the understanding of the genotype–phenotype relationships towards the level of biochemical processes, we perform a proteome-wide association study (PWAS) of a complex quantitative phenotype. We quantify the variation of wing imaginal disc proteomes in Drosophila genetic reference panel (DGRP) lines using SWATH mass spectrometry. In spite of the very large genetic variation (1/36 bp) between the lines, proteome variability is surprisingly small, indicating strong molecular resilience of protein expression patterns. Proteins associated with adult wing size form tight co-variation clusters that are enriched in fundamental biochemical processes. Wing size correlates with some basic metabolic functions, positively with glucose metabolism but negatively with mitochondrial respiration and not with ribosome biogenesis. Our study highlights the power of PWAS to filter functional variants from the large genetic variability in natural populations. PMID:27582081
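
    The "tight co-variation clusters" are, at bottom, groups of proteins whose abundance profiles correlate across the DGRP lines. A stripped-down sketch of that computation using Pearson correlation and single-linkage grouping (abundance values and protein names are hypothetical):

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation of two equal-length abundance profiles."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

def covariation_clusters(abundance, cutoff=0.8):
    """Single-linkage grouping: proteins join a cluster when they correlate
    above `cutoff` with any existing member (union-find over protein pairs)."""
    parent = {n: n for n in abundance}
    def find(n):
        while parent[n] != n:
            n = parent[n]
        return n
    names = list(abundance)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if pearson(abundance[a], abundance[b]) >= cutoff:
                parent[find(b)] = find(a)
    clusters = {}
    for n in names:
        clusters.setdefault(find(n), set()).add(n)
    return sorted(sorted(c) for c in clusters.values())

# Hypothetical protein abundances across five DGRP lines
data = {
    "rpL3": [5.0, 5.2, 4.9, 5.1, 5.0],
    "rpS6": [4.8, 5.0, 4.7, 4.9, 4.8],   # tracks rpL3 exactly
    "cytC": [2.0, 1.8, 3.5, 2.2, 3.0],   # unrelated profile
}
print(covariation_clusters(data))  # -> [['cytC'], ['rpL3', 'rpS6']]
```

    The study then asks which of these clusters co-vary with wing size; the clustering step itself is the piece sketched here.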

  6. High throughput chromatography strategies for potential use in the formal process characterization of a monoclonal antibody.

    PubMed

    Petroff, Matthew G; Bao, Haiying; Welsh, John P; van Beuningen-de Vaan, Miranda; Pollard, Jennifer M; Roush, David J; Kandula, Sunitha; Machielsen, Peter; Tugcu, Nihal; Linden, Thomas O

    2016-06-01

    High throughput experimental strategies are central to the rapid optimization of biologics purification processes. In this work, we extend common high throughput technologies towards the characterization of a multi-column chromatography process for a monoclonal antibody (mAb). Scale-down strategies were first evaluated by comparing breakthrough, retention, and performance (yields and clearance of aggregates and host cell protein) across miniature and lab scale columns. The process operating space was then evaluated using several integrated formats, with batch experimentation to define process testing ranges, miniature columns to evaluate the operating space, and comparison to traditional scale columns to establish scale-up correlations and verify the determined operating space. When compared to an independent characterization study at traditional lab column scale, the high throughput approach identified the same control parameters and similar process sensitivity. Importantly, the high throughput approach significantly decreased time and material needs while improving prediction robustness. Miniature columns and manufacturing scale centerpoint data comparisons support the validity of this approach, making the high throughput strategy an attractive and appropriate scale-down tool for the formal characterization of biotherapeutic processes in the future if regulatory acceptance of the miniature column data can be achieved. Biotechnol. Bioeng. 2016;113: 1273-1283. © 2015 Wiley Periodicals, Inc.

  7. High-Throughput Screening of Perovskite Alloys for Piezoelectric Performance and Formability

    NASA Astrophysics Data System (ADS)

    Armiento, Rickard; Kozinsky, Boris; Hautier, Geoffroy; Fornari, Marco; Ceder, Gerbrand

    2014-03-01

    We use high-throughput computational density functional theory to screen a large chemical space of perovskite alloys for systems with the right properties to accommodate a morphotropic phase boundary (MPB) in their composition-temperature phase diagram, a crucial feature for high piezoelectric performance. We start from alloy end-points previously identified in a high-throughput computational search. An interpolation scheme is used to estimate the relative energies between different perovskite distortions for alloy compositions with a minimum of computational effort. Suggested alloys are further screened for thermodynamic stability. The screening identifies alloy systems already known to host a MPB, and suggests a few new ones that may be promising candidates for future experiments. Our method of investigation may be extended to other perovskite systems, e.g., (oxy-)nitrides, and provides a useful methodology for any application of high-throughput screening of isovalent alloy systems. Preprint available at http://arxiv.org/abs/1309.1727
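
    The interpolation scheme can be sketched concretely: interpolate each distortion's energy linearly between the alloy end-points, then scan composition for the point where the lowest-energy distortion changes, the signature of a morphotropic phase boundary. End-point energies here are invented for illustration:

```python
def mpb_composition(e_end_a, e_end_b, n=101):
    """Linearly interpolate each distortion's energy between alloy
    end-points A (x=0) and B (x=1); return (x, old, new) at the first
    composition where the ground-state distortion changes, else None."""
    def ground_state(x):
        return min(e_end_a,
                   key=lambda d: (1 - x) * e_end_a[d] + x * e_end_b[d])
    prev = ground_state(0.0)
    for i in range(1, n):
        x = i / (n - 1)
        cur = ground_state(x)
        if cur != prev:
            return x, prev, cur
        prev = cur
    return None

# Invented end-point energies (eV): A favors rhombohedral, B tetragonal
e_a = {"rhombohedral": -1.00, "tetragonal": -0.90}
e_b = {"rhombohedral": -0.80, "tetragonal": -0.95}
print(mpb_composition(e_a, e_b))
```

    The actual screening interpolates DFT energies of several perovskite distortions and adds a thermodynamic-stability filter; this sketch only shows how a crossover composition falls out of the interpolation.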

  8. High-throughput imaging: Focusing in on drug discovery in 3D.

    PubMed

    Li, Linfeng; Zhou, Qiong; Voss, Ty C; Quick, Kevin L; LaBarbera, Daniel V

    2016-03-01

    3D organotypic culture models such as organoids and multicellular tumor spheroids (MCTS) are becoming more widely used for drug discovery and toxicology screening. As a result, 3D culture technologies adapted for high-throughput screening formats are prevalent. While a multitude of assays have been reported and validated for high-throughput imaging (HTI) and high-content screening (HCS) for novel drug discovery and toxicology, limited HTI/HCS with large compound libraries have been reported. Nonetheless, 3D HTI instrumentation technology is advancing and this technology is now on the verge of allowing for 3D HCS of thousands of samples. This review focuses on the state-of-the-art high-throughput imaging systems, including hardware and software, and recent literature examples of 3D organotypic culture models employing this technology for drug discovery and toxicology screening.

  9. A High Through-put Combinatorial Growth Technique for Semiconductor Thin Film Search

    NASA Astrophysics Data System (ADS)

    Ma, Z. X.; Hao, H. Y.; Xiao, P.; Oehlerking, L. J.; Liu, D. F.; Zhang, X. J.; Yu, K.-M.; Walukiewicz, W.; Mao, S. S.; Yu, P. Y.

    2011-12-01

    Conventional semiconductor material growth techniques are costly and time-consuming. Here we developed a new method to grow semiconductor thin films using a high through-put combinatorial technique. In this way, we have successfully fabricated tens of semiconductor libraries with high crystallinity and a high μτ product for the purpose of radiation detection.

  10. High-throughput measurements of the optical redox ratio using a commercial microplate reader

    NASA Astrophysics Data System (ADS)

    Cannon, Taylor M.; Shah, Amy T.; Walsh, Alex J.; Skala, Melissa C.

    2015-01-01

    There is a need for accurate, high-throughput, functional measures to gauge the efficacy of potential drugs in living cells. As an early marker of drug response in cells, cellular metabolism provides an attractive platform for high-throughput drug testing. Optical techniques can noninvasively monitor NADH and FAD, two autofluorescent metabolic coenzymes. The autofluorescent redox ratio, defined as the autofluorescence intensity of NADH divided by that of FAD, quantifies relative rates of cellular glycolysis and oxidative phosphorylation. However, current microscopy methods for redox ratio quantification are time-intensive and low-throughput, limiting their practicality in drug screening. Alternatively, high-throughput commercial microplate readers quickly measure fluorescence intensities for hundreds of wells. This study found that a commercial microplate reader can differentiate the receptor status of breast cancer cell lines (p<0.05) based on redox ratio measurements without extrinsic contrast agents. Furthermore, microplate reader redox ratio measurements resolve response (p<0.05) and lack of response (p>0.05) in cell lines that are responsive and nonresponsive, respectively, to the breast cancer drug trastuzumab. These studies indicate that the microplate readers can be used to measure the redox ratio in a high-throughput manner and are sensitive enough to detect differences in cellular metabolism that are consistent with microscopy results.
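
    The assay arithmetic is per-well: divide NADH autofluorescence intensity by FAD intensity, then compare groups statistically. A minimal sketch using Welch's t statistic (all intensities are hypothetical; a full analysis would also convert t to a p-value):

```python
import statistics

def redox_ratio(nadh, fad):
    """Optical redox ratio for one well: NADH autofluorescence
    intensity divided by FAD intensity."""
    return nadh / fad

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

# Hypothetical plate-reader intensities, 4 wells per condition
control = [redox_ratio(n, f) for n, f in [(820, 400), (790, 410),
                                          (805, 395), (815, 405)]]
treated = [redox_ratio(n, f) for n, f in [(610, 480), (590, 470),
                                          (620, 490), (600, 475)]]
t = welch_t(control, treated)
print(f"mean control={statistics.mean(control):.2f}  "
      f"mean treated={statistics.mean(treated):.2f}  t={t:.1f}")
```

    Because the microplate reader measures hundreds of wells in minutes, the same per-well computation scales directly to the high-throughput drug-response comparisons described above.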

  12. A High-Throughput Biological Calorimetry Core: Steps to Startup, Run, and Maintain a Multiuser Facility.

    PubMed

    Yennawar, Neela H; Fecko, Julia A; Showalter, Scott A; Bevilacqua, Philip C

    2016-01-01

    Many labs have conventional calorimeters where denaturation and binding experiments are set up and run one at a time. While these systems are highly informative about biopolymer folding and ligand interaction, they require considerable manual intervention for cleaning and setup. As such, the throughput of such setups is typically limited to a few runs a day. With a large number of experimental parameters to explore, including different buffers, macromolecule concentrations, temperatures, ligands, mutants, controls, replicates, and instrument tests, the need for high-throughput automated calorimeters is on the rise. Lower sample-volume requirements and reduced user intervention time compared to manual instruments have improved the turnover of calorimetry experiments in a high-throughput format, where 25 or more runs can be conducted per day. The cost and effort of maintaining high-throughput equipment typically demand that these instruments be housed in a multiuser core facility. We describe here the steps taken to successfully start and run an automated biological calorimetry facility at Pennsylvania State University. Scientists from various departments at Penn State, including Chemistry, Biochemistry and Molecular Biology, Bioengineering, Biology, Food Science, and Chemical Engineering, are benefiting from this core facility. Samples studied include proteins, nucleic acids, sugars, lipids, synthetic polymers, small molecules, natural products, and virus capsids. The facility has yielded a higher throughput of data, which has been leveraged into grant support, has helped attract new faculty hires, and has led to several exciting publications.

  13. A novel organelle map framework for high-content cell morphology analysis in high throughput.

    PubMed

    Schauer, Kristine; Grossier, Jean-Philippe; Duong, Tarn; Chapuis, Violaine; Degot, Sébastien; Lescure, Aurianne; Del Nery, Elaine; Goud, Bruno

    2014-02-01

    A screening procedure was developed that takes advantage of cellular normalization by micropatterning and a novel quantitative organelle mapping approach that allows unbiased and automated cell morphology comparison using black-box statistical testing. Micropatterns of extracellular matrix proteins force cells to adopt a reproducible shape and distribution of intracellular compartments, avoiding the strong cell-to-cell variation that is a major limitation of classical culture conditions. To detect changes in cell morphology induced by compound treatment, fluorescently labeled intracellular structures from several tens of micropatterned cells were transformed into probabilistic density maps. Then, the similarity or difference between two given density maps was quantified using statistical testing that evaluates differences directly from the data without additional analysis or any subjective decision. The versatility of this organelle mapping approach at different magnifications and its performance for different cell shapes were assessed. Density-based analysis detected changes in cell morphology due to compound treatment in a small-scale proof-of-principle screen, demonstrating its compatibility with high-throughput screening. This novel tool for high-content and high-throughput cellular phenotyping can potentially be used for a wide range of applications, from drug screening to the careful characterization of cellular processes.
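
The probabilistic density maps described above can be pictured as normalized 2D histograms of labeled-structure positions pooled across micropatterned cells. The sketch below is a toy version under that assumption; the `map_distance` function is a naive L1 dissimilarity, a stand-in for (not a reproduction of) the paper's black-box statistical test:

```python
def density_map(points, bins=4):
    """Bin (x, y) positions, normalized to the unit square, into a 2D
    probability map whose entries sum to 1."""
    grid = [[0.0] * bins for _ in range(bins)]
    for x, y in points:
        i = min(int(y * bins), bins - 1)  # clamp points on the top edge
        j = min(int(x * bins), bins - 1)  # clamp points on the right edge
        grid[i][j] += 1.0
    total = sum(sum(row) for row in grid)
    return [[v / total for v in row] for row in grid]

def map_distance(a, b):
    """Naive L1 distance between two density maps (illustrative only)."""
    return sum(abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb))

# Two hypothetical organelle position sets pooled from micropatterned cells.
perinuclear = density_map([(0.45, 0.5), (0.5, 0.55), (0.55, 0.5)])
dispersed = density_map([(0.1, 0.1), (0.9, 0.9), (0.1, 0.9)])
print(map_distance(perinuclear, dispersed))
```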

  14. A novel imaging-based high-throughput screening approach to anti-angiogenic drug discovery.

    PubMed

    Evensen, Lasse; Micklem, David R; Link, Wolfgang; Lorens, James B

    2010-01-01

    The successful progression to the clinic of angiogenesis inhibitors for cancer treatment has spurred interest in developing new classes of anti-angiogenic compounds. The resulting surge in available candidate therapeutics highlights the need for robust, high-throughput angiogenesis screening systems that adequately capture the complexity of new vessel formation while providing quantitative evaluation of the potency of these agents. Available in vitro angiogenesis assays either are cumbersome, impeding adaptation to high-throughput screening formats, or inadequately model the complex multistep process of new vessel formation. We therefore developed an organotypic endothelial-mural cell co-culture assay system that reflects several facets of angiogenesis while remaining compatible with high-throughput/high-content image screening. Co-culture of primary human endothelial cells (EC) and vascular smooth muscle cells (vSMC) results in the assembly of a network of tubular endothelial structures enveloped in vascular basement membrane proteins, thus comprising the three main components of blood vessels. Initially, EC are dependent on vSMC-derived VEGF and sensitive to clinical anti-angiogenic therapeutics. A subsequent phenotypic VEGF-switch renders EC networks resistant to anti-VEGF therapeutics, demarcating a mature vascular phenotype. Conversely, mature EC networks remain sensitive to vascular disrupting agents. Therefore, candidate anti-angiogenic compounds can be interrogated for their relative potency on immature and mature networks and classified as either vascular normalizing or vascular disrupting agents. Here, we demonstrate that the EC-vSMC co-culture assay represents a robust high-content imaging high-throughput screening system for the identification of novel anti-angiogenic agents. A pilot high-throughput screening campaign was used to define informative imaging parameters and develop a follow-up dose-response scheme for hit characterization.

  15. Correlation of proteome-wide changes with social immunity behaviors provides insight into resistance to the parasitic mite, Varroa destructor, in the honey bee (Apis mellifera)

    PubMed Central

    2012-01-01

    Background Disease is a major factor driving the evolution of many organisms. In honey bees, selection for social behavioral responses is the primary adaptive process facilitating disease resistance. One such process, hygienic behavior, enables bees to resist multiple diseases, including the damaging parasitic mite Varroa destructor. The genetic elements and biochemical factors that drive the expression of these adaptations are currently unknown. Proteomics provides a tool to identify proteins that control behavioral processes, and these proteins can be used as biomarkers to aid identification of disease-tolerant colonies. Results We sampled a large cohort of commercial queen lineages, recording overall mite infestation, hygiene, and the specific hygienic response to V. destructor. We performed proteome-wide correlation analyses in larval integument and adult antennae, identifying several proteins highly predictive of behavior and reduced hive infestation. In the larva, response to wounding was identified as a key adaptive process leading to reduced infestation, and chitin biosynthesis and immune responses appear to represent important disease-resistance adaptations. The speed of hygienic behavior may be underpinned by changes in the antenna proteome, and chemosensory and neurological processes could also provide specificity for detection of V. destructor in antennae. Conclusions Our results provide, for the first time, some insight into how complex behavioural adaptations manifest in the proteome of honey bees. The most important biochemical correlations provide clues as to the underlying molecular mechanisms of social and innate immunity of honey bees. Such changes are indicative of potential divergence in processes controlling hive-worker maturation. PMID:23021491

  16. Fulfilling the promise of the materials genome initiative with high-throughput experimental methodologies

    NASA Astrophysics Data System (ADS)

    Green, M. L.; Choi, C. L.; Hattrick-Simpers, J. R.; Joshi, A. M.; Takeuchi, I.; Barron, S. C.; Campo, E.; Chiang, T.; Empedocles, S.; Gregoire, J. M.; Kusne, A. G.; Martin, J.; Mehta, A.; Persson, K.; Trautt, Z.; Van Duren, J.; Zakutayev, A.

    2017-03-01

    The Materials Genome Initiative, a national effort to introduce new materials into the market faster and at lower cost, has made significant progress in computational simulation and modeling of materials. To build on this progress, a large amount of experimental data for validating these models, and informing more sophisticated ones, will be required. High-throughput experimentation generates large volumes of experimental data using combinatorial materials synthesis and rapid measurement techniques, making it an ideal experimental complement to bring the Materials Genome Initiative vision to fruition. This paper reviews the state-of-the-art results, opportunities, and challenges in high-throughput experimentation for materials design. A major conclusion is that an effort to deploy a federated network of high-throughput experimental (synthesis and characterization) tools, which are integrated with a modern materials data infrastructure, is needed.

  17. Microfluidics for cell-based high throughput screening platforms - A review.

    PubMed

    Du, Guansheng; Fang, Qun; den Toonder, Jaap M J

    2016-01-15

    In recent decades, the basic microfluidic techniques for the study of cells, such as cell culture, cell separation, and cell lysis, have been well developed. Building on these cell-handling techniques, microfluidics has been widely applied in the fields of PCR (polymerase chain reaction), immunoassays, organ-on-chip systems, stem cell research, and the analysis and identification of circulating tumor cells. As a major step in drug discovery, high-throughput screening allows the rapid analysis of thousands of chemical, biochemical, genetic, or pharmacological tests in parallel. In this review, we summarize the application of microfluidics to cell-based high-throughput screening. The screening methods covered include approaches using the perfusion flow mode, the droplet mode, and the microarray mode. We also discuss the future development of microfluidics-based high-throughput screening platforms for drug discovery.

  18. Parallelized ultra-high throughput microfluidic emulsifier for multiplex kinetic assays

    PubMed Central

    Lim, Jiseok; Caen, Ouriel; Vrignon, Jérémy; Konrad, Manfred; Baret, Jean-Christophe

    2015-01-01

    Droplet-based microfluidic technologies are powerful tools for applications requiring high throughput, for example, in biochemistry or materials science. Several systems have been proposed for the high-throughput production of monodisperse emulsions by parallelizing multiple droplet makers. However, these systems have two main limitations: (1) they allow the use of only a single disperse phase; (2) they are based on multilayer microfabrication techniques. We present here a pipette-and-play solution offering the possibility of manipulating 10 different disperse phases simultaneously on a single-layer device. This system allows high-throughput emulsion production at aqueous flow rates of up to 26 ml/h (>110 000 drops/s), leading to emulsions with user-defined complex chemical composition. We demonstrate the multiplex capabilities of our system by measuring the kinetics of β-galactosidase in droplets using nine different concentrations of a fluorogenic substrate. PMID:26015838
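
Measuring initial rates at several substrate concentrations, as in the multiplexed β-galactosidase assay above, is the classic setup for Michaelis-Menten analysis. A minimal sketch follows, fitting Km and Vmax by least squares on the Lineweaver-Burk linearization 1/v = (Km/Vmax)(1/[S]) + 1/Vmax; the concentrations and rates are synthetic, not the paper's measurements:

```python
def fit_michaelis_menten(concentrations, rates):
    """Estimate (Vmax, Km) by ordinary least squares on the
    Lineweaver-Burk linearization of the Michaelis-Menten equation."""
    xs = [1.0 / s for s in concentrations]  # 1/[S]
    ys = [1.0 / v for v in rates]           # 1/v
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept       # intercept = 1/Vmax
    km = slope * vmax            # slope = Km/Vmax
    return vmax, km

# Noise-free synthetic rates generated from Vmax = 2.0, Km = 0.5,
# so the fit should recover those parameters.
s = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0]
v = [2.0 * c / (0.5 + c) for c in s]
print(fit_michaelis_menten(s, v))
```

With real (noisy) droplet data, a direct nonlinear fit to v = Vmax[S]/(Km+[S]) is usually preferred over the linearization, which distorts error weighting.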

  19. High-throughput screening of small molecule libraries using SAMDI mass spectrometry.

    PubMed

    Gurard-Levin, Zachary A; Scholle, Michael D; Eisenberg, Adam H; Mrksich, Milan

    2011-07-11

    High-throughput screening is a common strategy used to identify compounds that modulate biochemical activities, but many approaches depend on cumbersome fluorescent reporters or antibodies and often produce false-positive hits. The development of "label-free" assays addresses many of these limitations, but current approaches still lack the throughput needed for applications in drug discovery. This paper describes a high-throughput, label-free assay that combines self-assembled monolayers with mass spectrometry, in a technique called SAMDI, as a tool for screening libraries of 100,000 compounds in one day. This method is fast, has high discrimination, and is amenable to a broad range of chemical and biological applications.

  20. High throughput screening to investigate the interaction of stem cells with their extracellular microenvironment

    PubMed Central

    Ankam, Soneela; Teo, Benjamin KK; Kukumberg, Marek; Yim, Evelyn KF

    2013-01-01

    Stem cells in vivo are housed within a functional microenvironment termed the “stem cell niche.” As the niche components can modulate stem cell behaviors like proliferation, migration and differentiation, evaluating these components would be important to determine the most optimal platform for their maintenance or differentiation. In this review, we have discussed methods and technologies that have aided in the development of high throughput screening assays for stem cell research, including enabling technologies such as the well-established multiwell/microwell plates and robotic spotting, and emerging technologies like microfluidics, micro-contact printing and lithography. We also discuss the studies that utilized high throughput screening platform to investigate stem cell response to extracellular matrix, topography, biomaterials and stiffness gradients in the stem cell niche. The combination of the aforementioned techniques could lay the foundation for new perspectives in further development of high throughput technology and stem cell research. PMID:23899508